WO2023189125A1 - Camera device, image generation method, and system - Google Patents

Camera device, image generation method, and system

Info

Publication number
WO2023189125A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
illumination
irradiation
camera device
captured
Application number
PCT/JP2023/007444
Other languages
French (fr)
Japanese (ja)
Inventor
利章 篠原
雄一 畑瀬
Original Assignee
i-PRO株式会社
Application filed by i-PRO株式会社
Publication of WO2023189125A1

Classifications

    • G01B 11/245: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • H04N 23/56: Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H04N 23/60: Control of cameras or camera modules

Definitions

  • the present disclosure relates to a camera device, an image generation method, and a system.
  • Patent Document 1 discloses a first illumination device that emits first light including a first wavelength, a second illumination device that emits second light including a second wavelength different from the first wavelength, an imaging device that captures an image of a target object, and an image processing device. The imaging device images the object in a state in which the object is irradiated with the first light from the first illumination device and with the second light from the second illumination device.
  • the image processing device generates a gray-scale image consisting of pixels imaged through a first optical filter that transmits the first wavelength, and a distance image containing pixels imaged through a second optical filter that transmits the second wavelength.
  • according to Patent Document 1, even in an environment in which a complex background is reflected (for example, a background in which not only the parts to be inspected but also various elements such as the pattern on the surface of the inspection jig, the structure of the machine, and the tray are reflected), image inspections such as determining the quality of parts can be performed stably and quickly.
  • Patent Document 1 does not consider the case where the background is an area that has substantially no texture (texture such as a pattern).
  • the area where there is substantially no texture as used herein is not limited to an area where there is no texture at all, but also conceptually includes an area where the presence of texture is so inconspicuous within the angle of view that it is difficult to distinguish.
  • in Patent Document 1, when an object (for example, an object to be picked up by an end effector such as a robot hand) is placed against a background that is an area with substantially no texture, the distance to the object is not always recognized accurately, which sometimes makes it difficult to pick up the object.
  • the present disclosure has been devised in view of the above-mentioned conventional circumstances, and aims to recognize the distance to an object with high precision.
  • the present disclosure provides a camera device including: an illumination section that irradiates a target object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode; an illumination control section that controls the illumination section; an imaging unit that generates a first captured image of the object while it is irradiated with light in the first irradiation mode and a second captured image of the object while it is irradiated with light in the second irradiation mode; and a generation unit that generates, based on the first captured image and the second captured image, a distance image for specifying the distance between a reference point and the target object.
  • the present disclosure also provides an image generation method including: a step of controlling an illumination unit that irradiates a target object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode; a step of generating a first captured image of the target object while it is irradiated with light in the first irradiation mode and a second captured image of the target object while it is irradiated with light in the second irradiation mode; and a step of generating, based on the first captured image and the second captured image, a distance image for specifying the distance between a reference point and the target object.
  • the present disclosure also provides a system including: an illumination unit that irradiates a target object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode; an illumination control unit that controls the illumination unit; an imaging unit that generates a first captured image of the object while it is irradiated with light in the first irradiation mode and a second captured image of the object while it is irradiated with light in the second irradiation mode; and a generation unit that generates, based on the first captured image and the second captured image, a distance image for specifying the distance between a reference point and the target object.
  • according to the present disclosure, the distance to the target object can be recognized with high accuracy.
  • FIG. 1 is a diagram showing an example of the system configuration of the system.
  • FIG. 2A is a diagram showing an example of a time chart in which illumination of the illumination pattern P1 and illumination of the illumination pattern P2 are performed alternately.
  • FIG. 2B is a diagram showing another example of a time chart in which the illumination of the illumination pattern P1 and the illumination of the illumination pattern P2 are alternately performed.
  • FIG. 3 is a flowchart showing an example of the first operation procedure of the camera device.
  • FIG. 4 is a flowchart illustrating an example of the second operation procedure of the camera device.
  • FIG. 5 is a diagram showing an example of a UI setting screen when lighting is performed using a plurality of different lighting patterns.
  • FIG. 6 is a diagram showing an example of a UI setting screen when the same illumination pattern is irradiated with shifted positions.
  • FIG. 7 is a diagram showing an example of a UI setting screen when emitting light with the dot pitch reduced by a specified magnification.
  • FIG. 8 is a diagram showing an example of an operation outline and an example of a UI setting screen when controlling a lighting area using the processing results of the AI processing unit.
  • FIG. 9A is a flowchart illustrating an example of the third operation procedure of the camera device.
  • FIG. 9B is a flowchart showing another example of the third operation procedure of the camera device.
  • FIG. 10 is a diagram showing an example of an operation outline and an example of a UI setting screen when controlling illumination intensity using the processing result of the AI processing unit.
  • FIG. 11A is a flowchart illustrating an example of the fourth operation procedure of the camera device.
  • FIG. 11B is a flowchart showing another example of the fourth operation procedure of the camera device.
  • FIG. 12 is a flowchart illustrating an example of the fifth operation procedure of the camera device.
  • in Embodiment 1, a use case in which an object is imaged by a camera device fixed to a robot hand (not shown) that is the tip of a robot arm (not shown) will be described.
  • the target object corresponds to an object photographed by the camera device, such as a background object that is an area with substantially no texture (for example, a white table or floor surface without a pattern, or a black table or floor surface without a pattern), or a placed object such as a workpiece placed on the table as an example of a background object.
  • the objects are, for example, a white table with virtually no texture and a workpiece placed on the white table (for example, a white vase or a red apple).
  • the object is, for example, an industrial product or an industrial part, and may be picked up by a robot hand.
  • the target object does not have to be limited to industrial products or industrial parts.
  • FIG. 1 is a diagram showing an example of the system configuration of the system 100.
  • the system 100 includes at least a camera device 10 and a PC (Personal Computer) 50.
  • the camera device 10 and the PC 50 are connected so that data signals can be input and output to each other.
  • the camera device 10 may be connected to a robot controller (not shown) that can control the movement of a robot hand (not shown) to which the camera device 10 is fixed.
  • the camera device 10 is, for example, fixed to a robot hand (not shown; the same applies hereinafter), which is the tip of a robot arm (not shown; the same applies hereinafter), and is movable integrally with the robot hand.
  • the camera device 10 is fixed so as not to interfere with the robot hand (that is, so that the robot hand (not shown) is not included within the field of view of the camera device 10).
  • the camera device 10 images the object with compound eyes while alternately irradiating, in a time-division manner, structured illumination (see below) corresponding to each of a plurality of types of irradiation modes toward the object included within its angle of view.
  • the camera device 10 corresponds to a so-called two-lens type stereo camera that includes a plurality of imaging modules (for example, a first imaging module CAP1 and a second imaging module CAP2, see below).
  • the camera device 10 generates an image (for example, a distance image and/or a parallax image) based on the captured images. Further, the camera device 10 can specify the position information of the object based on the distance image and/or the parallax image.
  • the camera device 10 outputs output data including the generated distance image and target object position information to the robot controller.
  • the system 100 may include a plurality of camera devices 10 having the same configuration. In this case, the plurality of camera devices 10 and the PC 50 are connected to each other so that data signals can be input and output.
  • the robot controller recognizes the distance from the camera device 10 (in other words, the robot hand) to the target object based on the output data from the camera device 10. Subsequently, the robot controller controls the movement of the robot hand (for example, picking up the object) according to the position of the object. Note that the movements performed by the robot hand are adaptively determined according to the use case of the system 100, and are not limited to the above-mentioned picking up of objects.
  • the PC 50 has executable browser software (not shown) installed therein for setting various parameters used by the camera device 10.
  • the PC 50 starts up browser software in response to an operation by a user of the system 100 (not shown; the same applies hereinafter), and displays various UI (User Interface) setting screens for the camera device 10. Details of various parameters and the UI setting screen will be described later.
  • the camera device 10 includes a first imaging module CAP1, a second imaging module CAP2, an SoC (System on a Chip) 15, a lighting control section 16, and a lighting section 17.
  • the first imaging module CAP1 includes an L lens 11 and an L image sensor 13, and images the object from a first viewpoint.
  • the second imaging module CAP2 includes an R lens 12 and an R image sensor 14, and images the object from a second viewpoint different from the first viewpoint.
  • the first imaging module CAP1 and the second imaging module CAP2 are arranged with an interval (module installation interval) between them.
  • the second viewpoint corresponds to a viewpoint for capturing an image of the object from a position separated by the module installation distance from the first viewpoint in a direction parallel to the installation surface of the camera device 10.
  • the module installation interval may be defined with respect to a direction other than the direction parallel to the installation surface of the camera device 10.
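For reference, the module installation interval acts as the baseline of the two-lens stereo arrangement, and the distance to a point follows the standard stereo relationship Z = f × B / d (f: focal length in pixels, B: baseline, d: parallax in pixels). The following is a minimal sketch of this relationship; the focal length, baseline, and parallax values are assumptions introduced here for illustration and are not taken from this publication.

```python
# Illustrative only: the standard stereo relationship between parallax and distance.
# The focal length, baseline (module installation interval), and parallax below are
# assumed example values, not values from this publication.

def depth_from_parallax(focal_length_px: float, baseline_m: float, parallax_px: float) -> float:
    """Return the distance Z [m] to a point, given its parallax [pixels]."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return focal_length_px * baseline_m / parallax_px

# Example: 1200 px focal length, 0.05 m module installation interval, 12 px parallax -> 5.0 m.
print(depth_from_parallax(1200.0, 0.05, 12.0))
```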
  • the first imaging module CAP1 and the second imaging module CAP2 correspond to the imaging section of this embodiment.
  • the L lens 11 includes, for example, a focus lens and a zoom lens.
  • incident light, which is light reflected by a subject (for example, the object) included within the angle of view while structured illumination (see below) corresponding to the first irradiation mode is emitted from the illumination unit 17 toward the object, enters the L lens 11. If either a visible light cut filter or an IR cut filter is disposed between the L lens 11 and the L image sensor 13, the incident light that entered the L lens 11 passes through the filter, and an optical image of the subject is formed on the light-receiving surface (imaging surface) of the L image sensor 13.
  • the camera device 10 may include an L lens drive section (not shown; the same applies hereinafter) that controls the drive of the L lens 11.
  • in that case, the CPU (Central Processing Unit) 152 or the ISP (Image Signal Processor) 153 may drive the L lens 11 via the L lens drive section by adjusting (changing) internal parameters related to driving the L lens 11 (for example, the position of the focus lens, or the position of the zoom lens corresponding to the zoom magnification). Alternatively, the L lens 11 may be fixedly arranged.
  • the R lens 12 includes, for example, a focus lens and a zoom lens.
  • incident light, which is light reflected by a subject (for example, the object) included within the angle of view while structured illumination (see below) corresponding to the second irradiation mode is emitted from the illumination unit 17 toward the object, enters the R lens 12. If either a visible light cut filter or an IR cut filter is disposed between the R lens 12 and the R image sensor 14, the incident light that entered the R lens 12 passes through the filter, and an optical image of the subject is formed on the light-receiving surface (imaging surface) of the R image sensor 14.
  • the camera device 10 may include an R lens drive section (not shown; the same applies hereinafter) that controls the drive of the R lens 12.
  • in that case, the CPU 152 or the ISP 153 may drive the R lens 12 via the R lens drive section by adjusting (changing) internal parameters related to driving the R lens 12 (for example, the position of the focus lens, or the position of the zoom lens corresponding to the zoom magnification). Alternatively, the R lens 12 may be fixedly arranged.
  • the visible light cut filter has a property of blocking visible light (for example, light having a wavelength of 400 to 760 [nm]) out of the incident light that has passed through the L lens 11 or the R lens 12 (that is, the light reflected by the subject). In other words, the visible light cut filter blocks the visible light component of the incident light that has passed through the L lens 11 or the R lens 12.
  • the camera device 10 may include a filter drive unit (not shown; the same applies hereinafter) that controls the drive of the visible light cut filter.
  • in that case, the visible light cut filter is inserted, via the filter drive section, between the L lens 11 and the L image sensor 13 and between the R lens 12 and the R image sensor 14 during a predetermined period (for example, at night) based on a control signal from the CPU 152 or the ISP 153.
  • the IR cut filter has the property of passing visible light (for example, light having a wavelength of 400 to 760 [nm]) and blocking near-infrared light (for example, light having a wavelength of 780 [nm] or more).
  • the IR cut filter blocks near-infrared light of the incident light that has passed through the L lens 11 or the R lens 12 and allows visible light to pass through.
  • the camera device 10 may include a filter drive unit that controls the drive of the IR cut filter. In that case, the IR cut filter is inserted, via the filter drive section, between the L lens 11 and the L image sensor 13 and between the R lens 12 and the R image sensor 14 during a predetermined period (for example, during the day) based on a control signal from the CPU 152 or the ISP 153.
  • the L image sensor 13 includes a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor in which a plurality of pixels suitable for imaging visible light or near-infrared light are arranged, an exposure control circuit (not shown), and a signal processing circuit (not shown).
  • the L image sensor 13 performs photoelectric conversion at predetermined intervals to convert light received by a light-receiving surface (imaging surface) made up of a plurality of pixels into an electrical signal.
  • the predetermined interval of photoelectric conversion is determined according to the so-called frame rate (fps: frame per second). For example, if the frame rate is 120 [fps], the predetermined interval is 1/120 [second].
  • while the illumination unit 17 is emitting structured illumination (see below) corresponding to the first irradiation mode toward the object, the L image sensor 13 acquires, from the light reflected by the object, a red component signal (R signal), a green component signal (G signal), and a blue component signal (B signal) as electrical signals, temporally continuously for each pixel.
  • the L image sensor 13 may acquire a monochrome electrical signal instead of RGB electrical signals.
  • a signal processing circuit (not shown) of the L image sensor 13 converts an electrical signal (analog signal) into digital imaging data.
  • the L image sensor 13 transfers digital image data to the memory 151 at predetermined intervals depending on the frame rate.
  • the memory 151 stores digital image data received from the L image sensor 13.
  • the L image sensor 13 may send digital image data to the CPU 152 at predetermined intervals depending on the frame rate. Further, the L image sensor 13 adjusts (changes) internal parameters indicating the exposure conditions (for example, exposure time, gain, and frame rate) using the exposure control circuit based on an exposure control signal from the CPU 152 or the ISP 153. Thereby, the camera device 10 can change the exposure conditions according to the surrounding environment and can obtain imaging data with good image quality.
  • the R image sensor 14 includes a CCD sensor or a CMOS sensor in which a plurality of pixels suitable for imaging visible light or near-infrared light are arranged, an exposure control circuit (not shown), and a signal processing circuit (not shown).
  • the R image sensor 14 performs photoelectric conversion at predetermined intervals to convert light received by a light receiving surface (imaging surface) made up of a plurality of pixels into an electrical signal.
  • the predetermined interval of photoelectric conversion is determined according to the so-called frame rate (fps). For example, if the frame rate is 120 [fps], the predetermined interval is 1/120 [second].
  • while the illumination unit 17 is emitting structured illumination (see below) corresponding to the second irradiation mode toward the object, the R image sensor 14 acquires, from the light reflected by the object, a red component signal (R signal), a green component signal (G signal), and a blue component signal (B signal) as electrical signals, temporally continuously for each pixel.
  • the R image sensor 14 may acquire a monochrome electrical signal instead of RGB electrical signals.
  • a signal processing circuit (not shown) of the R image sensor 14 converts an electrical signal (analog signal) into digital imaging data.
  • the R image sensor 14 transfers digital image data to the memory 151 at predetermined intervals depending on the frame rate.
  • the memory 151 stores digital format imaging data received from the R image sensor 14.
  • the R image sensor 14 may send digital image data to the CPU 152 at predetermined intervals depending on the frame rate. Further, the R image sensor 14 adjusts (changes) internal parameters related to the exposure conditions (for example, exposure time, gain, and frame rate) using the exposure control circuit based on an exposure control signal from the CPU 152 or the ISP 153. Thereby, the camera device 10 can change the exposure conditions according to the surrounding environment and can obtain imaging data with good image quality.
  • the SoC 15 is an integrated circuit product in which various electronic components are integrated on a single chip.
  • the SoC 15 includes a memory 151, a CPU 152, an ISP 153, an AI processing section 154, and an external interface 155. In FIG. 1, the interface is illustrated as "I/F" for convenience.
  • the memory 151 includes at least a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the memory 151 temporarily stores programs and control data necessary for executing the operations of the camera device 10, as well as data or information generated during the operation of each part of the camera device 10.
  • the RAM is, for example, a work memory used when each part of the camera device 10 operates.
  • the ROM stores and retains programs and control data for controlling each part of the camera device 10 in advance, for example.
  • the memory 151 also stores a structured illumination pattern (not shown) set by the PC 50 or attribute information for specifying the pattern.
  • the memory 151 also stores information on the waiting time (minute time Δta) from when the illumination unit 17 irradiates the first structured illumination of the illumination pattern P1 until it irradiates the second structured illumination of the illumination pattern P2, and information on the waiting time (minute time Δtb) from when the illumination unit 17 irradiates the second structured illumination of the illumination pattern P2 until it irradiates the first structured illumination of the illumination pattern P1.
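A minimal sketch of how such settings might be held follows; the field names, identifiers, and numerical values are assumptions introduced here for illustration and do not reflect the actual data layout of the memory 151.

```python
# Hypothetical container for the illumination-related settings described above.
from dataclasses import dataclass

@dataclass
class IlluminationSettings:
    pattern_p1_id: str    # attribute information specifying illumination pattern P1
    pattern_p2_id: str    # attribute information specifying illumination pattern P2
    delta_ta_s: float     # wait time from irradiating pattern P1 until irradiating pattern P2
    delta_tb_s: float     # wait time from irradiating pattern P2 until irradiating pattern P1

settings = IlluminationSettings(
    pattern_p1_id="EX1",
    pattern_p2_id="EX2",
    delta_ta_s=1.0 / 480,  # example value only
    delta_tb_s=1.0 / 480,  # example value only
)
```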
  • the CPU 152 is a processor that functions as a controller that controls the overall operation of the camera device 10.
  • the CPU 152 performs control processing for unifying the operations of each part of the camera device 10, data input/output processing with respect to each part of the camera device 10, data calculation processing, and data storage processing.
  • the CPU 152 operates according to programs and control data stored in the memory 151.
  • the CPU 152 uses the memory 151 during operation, and transfers data or information generated or acquired by the CPU 152 to the memory 151 for temporary storage. Further, the CPU 152 acquires the image data from each of the L image sensor 13 and the R image sensor 14 from the memory 151, and calculates the parallax between the image data from the L image sensor 13 and the image data from the R image sensor 14.
  • the CPU 152 generates a distance image of the target object included within the angle of view based on the parallax calculation result.
  • the CPU 152 may have a timer (not shown) or an illuminance sensor (not shown), and may generate a control signal for arranging either the visible light cut filter or the IR cut filter based on the output of the timer or the illuminance sensor and send it to the filter drive unit. Details of various processes performed by the CPU 152 will be described later.
  • the ISP 153 is a processor that manages various image processes performed within the camera device 10.
  • the ISP 153 reads imaging data from the memory 151 and performs various image processing using the read imaging data.
  • the ISP 153 uses the memory 151 during operation, and transfers data or information generated or acquired by the ISP 153 to the memory 151 for temporary storage.
  • the ISP 153 performs resizing processing to convert the size of the imaging data read from the memory 151 into a size suitable for the area determination processing performed by the AI processing unit 154. Details of the area determination process will be described later.
  • the ISP 153 determines the irradiation area or irradiation intensity of the second structured illumination (see below) corresponding to the second irradiation mode to be emitted by the illumination unit 17 based on the processing result of the AI processing unit 154, and sends it to the illumination control unit 16. The ISP 153 also generates a distance image for specifying the distance from the reference point to the target object based on the image data captured by the L image sensor 13 (first left image P1L) and the image data captured by the R image sensor 14 (first right image P1R) while the first structured illumination corresponding to the first irradiation mode is irradiated.
  • the reference point corresponds to the imaging planes and focal points of the L image sensor 13 and the R image sensor 14, the lens interfaces of the L lens 11 and the R lens 12, and the like.
  • the reference point corresponds to a position definable by an optical element included in the camera device 10.
  • the definition of the reference point is not limited to the example described above.
  • the ISP 153 may also generate, based on the image data captured by the L image sensor 13 (first left image P1L) and the image data captured by the R image sensor 14 (first right image P1R) while the first structured illumination corresponding to the first irradiation mode is irradiated, an exposure control signal for adjusting the internal parameters that determine the exposure conditions for imaging (that is, exposure) by each of the L image sensor 13 and the R image sensor 14, and send it to each of the L image sensor 13 and the R image sensor 14. Details of various processes performed by the ISP 153 will be described later.
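The publication does not specify how the exposure control signal is derived; the following is a rough sketch, under the assumption that the mean brightness of a captured image is compared with a target value, of one way such an adjustment could be computed. It is not the ISP 153's actual algorithm.

```python
# Assumed auto-exposure heuristic: scale the exposure time so that the mean pixel
# value of the captured frame approaches a target brightness.
import numpy as np

def next_exposure_time(frame: np.ndarray, current_exposure_s: float,
                       target_mean: float = 118.0,
                       min_s: float = 1e-5, max_s: float = 1.0 / 60) -> float:
    mean = float(frame.mean())
    if mean <= 0.0:
        return max_s                      # fully dark frame: fall back to the longest exposure
    scaled = current_exposure_s * (target_mean / mean)
    return float(np.clip(scaled, min_s, max_s))

# Usage (hypothetical): exposure = next_exposure_time(first_left_image, exposure)
```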
  • the AI processing unit 154 is configured using, for example, a GPU (Graphics Processing Unit) and memory. Note that a DSP (Digital Signal Processor) may be used instead of or together with the GPU.
  • the AI processing unit 154 uses AI (for example, a trained model for segmentation) to perform area determination processing on each image, based on the image data captured by the L image sensor 13 (first left image P1L) and the image data captured by the R image sensor 14 (first right image P1R) while the first structured illumination corresponding to the first irradiation mode is irradiated.
  • the area determination process corresponds to, for example, a process of determining the image area corresponding to the table and the image area corresponding to the workpiece in imaging data obtained by imaging a workpiece placed on a table.
  • the AI processing unit 154 reads out the image data (for example, the first left image P1L, first right image P1R) that has been resized by the ISP 153 from the memory 151 and performs the above-described area determination process.
  • the learned model used by the AI processing unit 154 is generated in advance through machine learning, for example.
  • the learned model corresponds to a model provided with parameters (for example, various weighting coefficients) obtained through the machine learning.
  • the AI processing unit 154 stores the result of the area determination process in the memory 151.
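As a hedged illustration of the area determination process, the sketch below labels each pixel as background (table) or placed object (workpiece); the two-class convention and the trivial stand-in segmenter are assumptions for explanation, not the trained model actually used by the AI processing unit 154.

```python
# Illustrative area determination: split an image into a table mask and a workpiece mask.
import numpy as np

def determine_areas(image: np.ndarray, segment_fn) -> dict:
    """segment_fn returns a per-pixel class map (assumed convention: 0 = table, 1 = workpiece)."""
    class_map = segment_fn(image)
    return {"table_mask": class_map == 0, "workpiece_mask": class_map == 1}

def naive_segmenter(image: np.ndarray) -> np.ndarray:
    """Trivial stand-in for a trained segmentation model: bright pixels are treated as table."""
    gray = image.mean(axis=2) if image.ndim == 3 else image
    return (gray < 200).astype(np.uint8)

areas = determine_areas(np.full((480, 640, 3), 255, np.uint8), naive_segmenter)
```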
  • the external interface 155 receives data requests from the PC 50 and transmits data generated or acquired by the CPU 152 or the ISP 153 to the PC 50 in accordance with the data requests. Although not shown in FIG. 1, when the robot controller and the camera device 10 are connected, the external interface 155 may transmit data generated or acquired by the CPU 152 or the ISP 153 to the robot controller.
  • the illumination control unit 16 is constituted by a control circuit for performing various controls on the first structured illumination and the second structured illumination from the illumination unit 17.
  • the lighting control unit 16 generates, based on the output from the SoC 15, control signals for controlling the pattern, irradiation mode, and irradiation timing of each of the first structured illumination and the second structured illumination irradiated by the illumination unit 17, and sends them to the illumination section 17.
  • the irradiation mode is an operation mode that determines which light, the first structured illumination or the second structured illumination, is irradiated by the illumination unit 17.
  • the irradiation mode is divided into a first irradiation mode in which the illumination unit 17 emits the first structured illumination, and a second irradiation mode in which the illumination unit 17 emits the second structured illumination.
  • the illumination control unit 16 switches between the first irradiation mode and the second irradiation mode, for example, according to the time chart shown in FIG. 2A or the time chart shown in FIG. 2B.
  • the illumination unit 17 is configured by a projector including a light emitting element capable of illuminating each of the first structured illumination LG1 and the second structured illumination LG2.
  • the illumination unit 17 irradiates the object with each of the first structured illumination LG1 and the second structured illumination LG2 based on the control signal from the illumination control unit 16.
  • the illumination unit 17 also irradiates the entire object with the first structured illumination LG1, and further irradiates an irradiation area (for example, an area corresponding to a placed object) controlled by the illumination control unit 16 with the second structured illumination LG2.
  • alternatively, the illumination unit 17 irradiates the entire object with the first structured illumination LG1, and further irradiates the entire object with the second structured illumination LG2 having an irradiation intensity (for example, irradiation brightness) controlled by the illumination control unit 16.
  • the first structured illumination LG1 is, for example, illumination having a texture in which a plurality of dots are arranged in a predetermined pattern.
  • the second structured illumination LG2 is, for example, illumination having a texture in which a plurality of dots are arranged in a predetermined pattern. The arrangement positions of the plurality of dots are different between the first structured illumination LG1 and the second structured illumination LG2.
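For illustration only, the following sketch generates two dot-pattern images in the spirit of the illumination patterns P1 and P2 (same style, different dot arrangements); the resolution, dot count, and random seeds are assumptions and are not taken from this publication.

```python
# Two random dot patterns with a black background and white dots at different positions.
import numpy as np

def dot_pattern(height: int, width: int, num_dots: int, seed: int) -> np.ndarray:
    rng = np.random.default_rng(seed)
    img = np.zeros((height, width), dtype=np.uint8)  # black background
    ys = rng.integers(0, height, num_dots)
    xs = rng.integers(0, width, num_dots)
    img[ys, xs] = 255                                # white dots
    return img

pattern_p1 = dot_pattern(720, 1280, 3000, seed=1)    # in the spirit of illumination pattern P1
pattern_p2 = dot_pattern(720, 1280, 3000, seed=2)    # in the spirit of illumination pattern P2
```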
  • although FIG. 1 shows an example in which each of the lighting control section 16 and the lighting section 17 is included in the camera device 10, each of the lighting control section 16 and the lighting section 17 may be provided separately from the camera device 10. In this case, the lighting control section 16 and the external interface 155 of the SoC 15 are connected so that data signals can be input and output.
  • FIG. 2A is a diagram showing an example of a time chart in which illumination of the illumination pattern P1 and illumination of the illumination pattern P2 are alternately performed.
  • FIG. 2B is a diagram showing another example of a time chart in which the illumination of the illumination pattern P1 and the illumination of the illumination pattern P2 are alternately performed.
  • the illumination of the illumination pattern P1 corresponds to the first structured illumination LG1
  • the illumination of the illumination pattern P2 corresponds to the second structured illumination LG2.
  • the horizontal axes in FIGS. 2A and 2B indicate time.
  • the illumination control unit 16 causes the illumination unit 17 to alternately irradiate, in a time-division manner, the structured illumination of the two irradiation modes (that is, the illumination pattern P1 corresponding to the first irradiation mode and the illumination pattern P2 corresponding to the second irradiation mode).
  • in FIG. 2A, the illumination pattern P1 is irradiated at time t1, and the illumination pattern P2 is irradiated at time t2 after a minute time Δta has elapsed from time t1. The illumination pattern P1 is then irradiated at time t3 after a minute time Δtb has elapsed from time t2, the illumination pattern P2 is irradiated at time t4 after the minute time Δta has elapsed from time t3, the illumination pattern P1 is irradiated at time t5 after the minute time Δtb has elapsed from time t4, the illumination pattern P2 is irradiated at time t6 after the minute time Δta has elapsed from time t5, and the illumination pattern P1 is irradiated again at a time after the minute time Δtb has elapsed from time t6.
  • the minute time Δta and the minute time Δtb may be the same or different, and their magnitude relationship is not particularly limited.
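A pseudocode-style sketch of this alternating, time-division irradiation follows; illuminate() and capture_stereo_pair() are placeholders standing in for the illumination control unit 16, the illumination unit 17, and the two imaging modules, and the use of a simple sleep for the wait times is an assumption for illustration.

```python
# Alternating irradiation of illumination patterns P1 and P2 with wait times Δta and Δtb.
import time

def alternate_irradiation(illuminate, capture_stereo_pair, delta_ta_s: float, delta_tb_s: float):
    """Yield one ((left, right) under P1, (left, right) under P2) set per cycle."""
    while True:
        illuminate("P1")                        # first irradiation mode
        left1, right1 = capture_stereo_pair()   # first left image / first right image
        time.sleep(delta_ta_s)                  # wait minute time Δta
        illuminate("P2")                        # second irradiation mode
        left2, right2 = capture_stereo_pair()   # second left image / second right image
        time.sleep(delta_tb_s)                  # wait minute time Δtb
        yield (left1, right1), (left2, right2)
```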
  • the first left image P1L is generated by the L image sensor 13 under illumination of the illumination pattern P1 (the first structured illumination LG1 corresponding to the first irradiation mode), and the first right image P1R is generated by the R image sensor 14.
  • the first left image P1L and the first right image P1R have parallax with each other.
  • the first left image P1L is an example of a first captured image captured from the first viewpoint.
  • the first right image P1R is an example of a second captured image captured from the second viewpoint.
  • the second left image P2L is generated by the L image sensor 13 under illumination of the illumination pattern P2 (the second structured illumination LG2 corresponding to the second irradiation mode), and the second right image P2R is generated by the R image sensor 14.
  • the second left image P2L and the second right image P2R have parallax with each other.
  • the second left image P2L is an example of the first captured image captured from the first viewpoint.
  • the second right image P2R is an example of a second captured image captured from the second viewpoint.
  • the CPU 152 or the ISP 153 combines the first left image P1L captured during the irradiation with the first structured illumination LG1 and the second left image P2L captured during the irradiation with the second structured illumination LG2. Through this combining process, the CPU 152 or the ISP 153 generates a combined left image P3L that has more image features than the image features on the first left image P1L or the image features on the second left image P2L.
  • the composite left image P3L is an example of the first composite image.
  • the CPU 152 or the ISP 153 combines the first right image P1R captured during the irradiation with the first structured illumination LG1 and the second right image P2R captured during the irradiation with the second structured illumination LG2. Through this combining process, the CPU 152 or the ISP 153 generates a combined right image P3R that has more image features than the image features on the first right image P1R or the image features on the second right image P2R.
  • the composite right image P3R is an example of the second composite image. Therefore, the composite left image P3L corresponds to an image having parallax with respect to the composite right image P3R.
  • the CPU 152 or the ISP 153 calculates the parallax between the composite left image P3L and the composite right image P3R. Further, using the calculated parallax, the CPU 152 or the ISP 153 generates a distance image for specifying the distance from the camera device 10 to the target object.
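As a rough sketch of the composition and parallax steps described above: the two left images and the two right images are merged (here by per-pixel maximum, an assumption, since the publication does not specify the combining rule), a parallax map is computed with a general-purpose block matcher, and the parallax is converted to distance. The matcher and its parameters are illustrative choices, not the processing actually performed by the CPU 152 or the ISP 153.

```python
# Composite left/right images -> parallax (disparity) map -> distance image.
import cv2
import numpy as np

def compose(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Merge two grayscale captures so that dot features from both patterns remain visible."""
    return np.maximum(img_a, img_b)

def distance_image(left1, left2, right1, right2, focal_px: float, baseline_m: float) -> np.ndarray:
    composite_left = compose(left1, left2)      # corresponds to the composite left image P3L
    composite_right = compose(right1, right2)   # corresponds to the composite right image P3R
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
    disparity = matcher.compute(composite_left, composite_right).astype(np.float32) / 16.0
    depth_m = np.zeros_like(disparity)
    valid = disparity > 0
    depth_m[valid] = focal_px * baseline_m / disparity[valid]   # standard stereo relationship
    return depth_m
```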
  • assume that a situation in which an object (for example, a white vase) is placed on a background object (for example, a white table) that is an area with virtually no texture, and a situation in which an object (for example, a red apple) is placed on the same background object (for example, a white table), are each imaged using a two-lens camera such as a stereo camera. In this case, the captured image of the white vase and the white table has fewer colors, that is, fewer image features (feature points), than the captured image of the red apple and the white table.
  • therefore, the camera device 10 alternately irradiates the first structured illumination LG1 having the illumination pattern P1 provided with a plurality of dots (for example, see FIG. 5) and the second structured illumination LG2 having the illumination pattern P2 different from the illumination pattern P1.
  • the camera device 10 generates the composite left image P3L from the left images (that is, the first left image P1L and the second left image P2L) and the composite right image P3R from the right images (that is, the first right image P1R and the second right image P2R), and calculates the parallax.
  • the parallax is calculated in a state where the image features on each of the composite left image P3L and the composite right image P3R have been apparently increased by the composition process.
  • the camera device 10 can therefore generate a distance image that allows highly accurate recognition of the distance to an object that has substantially no texture.
  • as for the ratio between the minute time Δta and the minute time Δtb, assuming the shortest imaging interval (for example, 1/480 [second] if the frame rate is 480 [fps]), the interval from the illumination pattern P1 to the illumination pattern P2 may be made shorter than in FIG. 2A, and the interval from the illumination pattern P2 to the illumination pattern P1 may be made longer than in FIG. 2A.
  • thereby, the temporal blurring of the object (for example, a moving object) between the images to be combined is suppressed, so the accuracy of the distance image generated based on the image synthesis improves.
  • FIG. 3 is a flowchart illustrating an example of the first operation procedure of the camera device 10.
  • illumination of the illumination pattern P1 and imaging under the illumination, and illumination of the illumination pattern P2 and imaging under the illumination, are each performed asynchronously.
  • the illumination control unit 16 of the camera device 10 switches to the first illumination mode at time t1 (see FIG. 2), for example, and directs the illumination of the illumination pattern P1 (first structured illumination LG1) toward the target object. Irradiation is performed from the illumination unit 17 (step St1).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while being irradiated with the first structured illumination LG1 from the illumination unit 17 in step St1 (step St2).
  • as a result, two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2.
  • the CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for the elapse of the minute time Δta from the execution timing of step St1 or the execution timing of step St2 (step St3).
  • the illumination control unit 16 of the camera device 10 switches to the second illumination mode and directs illumination of the illumination pattern P2 (second structured illumination LG2) from the illumination unit 17 toward the object. irradiate (step St4).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while being irradiated with the second structured illumination LG2 from the illumination unit 17 in step St4 (step St5).
  • as a result, two pieces of imaging data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
  • the CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with each other captured in step St2 (for example, the first left image P1L) with the corresponding one of the two pieces of imaging data having parallax with each other captured in step St5 (for example, the second left image P2L) (step St6).
  • similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two pieces of imaging data captured in step St2 (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St5 (for example, the second right image P2R) (step St6).
  • the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images (for example, the composite left image P3L and the composite right image P3R) obtained by the composition.
  • the CPU 152 or ISP 153 of the camera device 10 uses the parallax calculation results to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50. (Step St6).
  • the CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for the elapse of the minute time Δtb from the execution timing of step St4, the execution timing of step St5, or the execution timing of step St6 (step St7). After step St7, the processing of the camera device 10 returns to step St1.
  • note that in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P1.
  • similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P2.
  • the CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object by combining the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, and output it to the PC 50 (a sketch of one such combination follows).
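The following sketch illustrates this variation (calculating a parallax per illumination pattern and then combining the parallaxes); the merge rule of averaging where both maps are valid and otherwise keeping whichever value exists is an assumption introduced here for illustration, and the block matcher is a general-purpose stand-in.

```python
# Combine per-pattern parallax (disparity) maps instead of combining the images first.
import cv2
import numpy as np

_matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)

def parallax(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    return _matcher.compute(left, right).astype(np.float32) / 16.0

def combined_parallax(left1, right1, left2, right2) -> np.ndarray:
    d1 = parallax(left1, right1)   # parallax under illumination pattern P1
    d2 = parallax(left2, right2)   # parallax under illumination pattern P2
    valid1, valid2 = d1 > 0, d2 > 0
    return np.where(valid1 & valid2, (d1 + d2) / 2.0,
                    np.where(valid1, d1, np.where(valid2, d2, 0.0)))
```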
  • alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P1, and the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P2.
  • the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and further generate a distance image from the parallax based on the second left image P2L and the second right image P2R.
  • the CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object by combining these two distance images, and output it to the PC 50.
  • FIG. 4 is a flowchart illustrating an example of the second operation procedure of the camera device.
  • illumination of the illumination pattern P1 and imaging under the illumination and illumination of the illumination pattern P2 and imaging under the illumination are executed in synchronization.
  • the same step numbers are assigned to the same processes as those in FIG. 3 to simplify or omit the description, and different contents will be described.
  • the illumination control unit 16 of the camera device 10 switches to the first illumination mode at time t1 (see FIG. 2), for example, and directs the illumination of the illumination pattern P1 (first structured illumination LG1) toward the target object. Irradiation is performed from the illumination unit 17 (step St1A).
  • each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while it is irradiated with the first structured illumination LG1 from the illumination unit 17 (step St1A).
  • that is, in step St1A, two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2.
  • the CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for the elapse of the minute time Δta from the execution timing of step St1A (step St3A).
  • the illumination control unit 16 of the camera device 10 switches to the second illumination mode and directs illumination of the illumination pattern P2 (second structured illumination LG2) from the illumination unit 17 toward the object. irradiate (step St4A).
  • each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while it is irradiated with the second structured illumination LG2 from the illumination unit 17 (step St4A). That is, in step St4A, two pieces of imaging data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
  • the CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with each other captured in step St1A (for example, the first left image P1L) with the corresponding one of the two pieces of imaging data having parallax with each other captured in step St4A (for example, the second left image P2L) (step St6).
  • similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two pieces of imaging data captured in step St1A (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St4A (for example, the second right image P2R) (step St6).
  • the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images (for example, the composite left image P3L and the composite right image P3R) obtained by the composition.
  • the CPU 152 or ISP 153 of the camera device 10 uses the parallax calculation results to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50. (Step St6).
  • the CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for the elapse of the minute time Δtb from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
  • note that in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P1.
  • similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P2.
  • the CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object by combining the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, and output it to the PC 50.
  • alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P1, and the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during the irradiation with the illumination pattern P2.
  • the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and further generate a distance image from the parallax based on the second left image P2L and the second right image P2R.
  • the CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object by combining these two distance images, and output it to the PC 50. A sketch of one way such a combination could be implemented follows.
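The following is a minimal sketch of combining two per-pattern distance images into one; the merge rule (averaging where both are valid, otherwise keeping whichever value exists) is an assumption for illustration, as the publication does not specify how the two distance images are combined.

```python
# Merge two distance images generated under illumination patterns P1 and P2.
import numpy as np

def merge_distance_images(dist_p1: np.ndarray, dist_p2: np.ndarray) -> np.ndarray:
    valid1, valid2 = dist_p1 > 0, dist_p2 > 0
    return np.where(valid1 & valid2, (dist_p1 + dist_p2) / 2.0,
                    np.where(valid1, dist_p1, np.where(valid2, dist_p2, 0.0)))
```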
  • FIG. 5 is a diagram showing an example of a UI setting screen when lighting is performed using a plurality of different lighting patterns.
  • FIG. 6 is a diagram showing an example of a UI setting screen when the same illumination pattern is irradiated with shifted positions.
  • FIG. 7 is a diagram showing an example of a UI setting screen when emitting light with the dot pitch reduced by a specified magnification.
  • the UI setting screen WD1 shown in FIG. 5 is displayed on the display (not shown) of the PC 50 by the user's operation.
  • the PC 50 displays a pattern image EX1 indicating the illumination pattern P1 corresponding to the first structured illumination LG1 and a pattern image EX2 indicating the illumination pattern P2 corresponding to the second structured illumination LG2 on the UI setting screen WD1.
  • the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1, and sets the pattern image EX2 as the illumination pattern P2 corresponding to the second structured illumination LG2.
  • the first irradiation mode is a mode in which a pattern (see pattern image EX1 in FIG. 5) in which a plurality of dots (an example of an irradiation figure) are arranged at a first interval is irradiated.
  • the second irradiation mode is a mode in which a pattern (see pattern image EX2 in FIG. 5) in which a plurality of dots (an example of an irradiation figure) are arranged at second intervals is irradiated.
  • pattern image EX1 and pattern image EX2 both have a black background and a pattern in which a plurality of white dots are irregularly arranged, but at least one of the arrangement positions, spacing, and number of the dots is different.
  • in other words, two different types of structured illumination, each having a black background and a plurality of white dots, are irradiated onto the object.
  • as a result, even if the object has substantially no texture, such as a white vase placed on a white table, the captured image taken under the structured illumination differs from a captured image taken without the structured illumination.
  • that is, the structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD1 of FIG. 5 is reflected in the captured images and the image features increase, so the accuracy of the parallax calculation for the object (for example, a white table and a white vase) improves, and the accuracy of generating the distance image of the object also improves.
  • the pattern image EX1 of the illumination pattern P1 and the pattern image EX2 of the illumination pattern P2 may be provided in units of frames as shown in FIG. 5, in units of lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
  • the dots in the pattern image are just an example of the irradiation figure, and the irradiation figure is not limited to dots (dots or minute circles).
  • the irradiation figure may be, for example, not only a circle but also a polygon.
  • the set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155.
  • the CPU 152 or the ISP 153 sets the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2 in the illumination control unit 16.
  • thereby, the illumination unit 17 can irradiate the first structured illumination LG1 having the illumination pattern P1 (see pattern image EX1) and the second structured illumination LG2 having the illumination pattern P2 (see pattern image EX2).
  • the NG button BT2 is pressed by the user's operation.
  • the UI setting screen WD2 shown in FIG. 6 is displayed on the display (not shown) of the PC 50 by the user's operation.
  • The PC 50 displays, on the UI setting screen WD2, a pattern image EX1 indicating the illumination pattern P1 corresponding to the first structured illumination LG1, and a pattern image EX1A indicating the illumination pattern P2 corresponding to the second structured illumination LG2, which is obtained by shifting the pattern image EX1 by Δm along the two-dimensional coordinate axes (see FIG. 6).
  • The designation of Δm is confirmed, for example, when, after the pattern image EX1 has been designated by the user's operation, the value (numerical value) of Δm is input into the designation field FLD1 by the user's operation and the OK button BT1 is pressed.
  • the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1, and sets the pattern image EX1A as the illumination pattern P2 corresponding to the second structured illumination LG2.
  • the first irradiation mode is a mode in which a pattern (see pattern image EX1 in FIG. 6) in which a plurality of dots (an example of an irradiation figure) are arranged at third intervals is irradiated.
  • The second irradiation mode is a mode in which a pattern (see pattern image EX1A in FIG. 6) obtained by shifting, by a predetermined amount, a plurality of dots (an example of an irradiation figure) arranged at the third interval is irradiated.
  • The relationship between pattern image EX1 and pattern image EX1A is similar to the relationship between pattern image EX1 and pattern image EX2 in that both have a black background with a plurality of white dots arranged irregularly, but the position, spacing, and number of the dots differ. Therefore, in the setting example of FIG. 6 as well, the object is irradiated with two different types of structured illumination in which a plurality of white dots are arranged on a black background. As a result, even if the object has virtually no texture, such as a white vase placed on a white table, the captured image taken under structured illumination appears different from an image taken without structured illumination.
  • The structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD2 of FIG. 6 is reflected in the image and the image features increase, so the accuracy of the parallax calculation for the object (for example, a white table and a white vase) is improved, and the accuracy of generating a distance image of the object is also improved.
  • The pattern image EX1 of the illumination pattern P1 and the pattern image EX1A of the illumination pattern P2 may be provided in units of frames as shown in FIG. 6, in units of lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
  • the set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155.
  • the CPU 152 or the ISP 153 sets the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2 in the illumination control unit 16.
  • The illumination unit 17 can thereby irradiate the first structured illumination LG1 having the illumination pattern P1 (see pattern image EX1) and the second structured illumination LG2 having the illumination pattern P2 (see pattern image EX1A).
  • Note that when re-specifying the shift amount Δm, for example, the NG button BT2 is pressed by the user's operation.
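  • As a non-limiting reference, a minimal Python sketch of deriving a shifted pattern such as EX1A from EX1 is shown below, under the assumptions that the pattern is held as a binary image and that Δm is expressed in pixels.

    import numpy as np

    def shift_pattern(pattern, dx, dy):
        """Shift a dot-pattern image by (dx, dy) pixels along the two-dimensional axes."""
        # np.roll wraps dots that leave one edge back in on the opposite edge;
        # padding with black and cropping could be used instead if wrap-around is undesirable.
        return np.roll(pattern, shift=(dy, dx), axis=(0, 1))

    ex1 = np.zeros((720, 1280), dtype=np.uint8)
    ex1[100:104, 200:204] = 255              # a single example dot
    ex1a = shift_pattern(ex1, dx=7, dy=7)    # EX1 shifted by a hypothetical Δm of 7 px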
  • the UI setting screen WD3 shown in FIG. 7 is displayed on the display (not shown) of the PC 50 by the user's operation.
  • The PC 50 displays, on the UI setting screen WD3, a pattern image EX1 showing the illumination pattern P1 corresponding to the first structured illumination LG1, and a pattern image EX1B showing the illumination pattern P2 corresponding to the second structured illumination LG2, which is obtained by reducing the pitch between any two dots in the pattern image EX1 by a factor of Δr.
  • The designation of Δr is confirmed, for example, when, after the pattern image EX1 has been designated by the user's operation, the value (numerical value) of Δr is input into the designation field FLD2 by the user's operation and the OK button BT1 is pressed.
  • the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1, and sets the pattern image EX1B as the illumination pattern P2 corresponding to the second structured illumination LG2.
  • the first irradiation mode is a mode in which a pattern (see pattern image EX1 in FIG. 7) in which a plurality of dots (an example of an irradiation figure) having a first size are arranged is irradiated.
  • the second irradiation mode is a mode in which a pattern (see pattern image EX1B in FIG. 7) in which a plurality of dots (an example of an irradiation figure) having a second size smaller than the first size is arranged is irradiated.
  • The relationship between pattern image EX1 and pattern image EX1B is similar to the relationship between pattern image EX1 and pattern image EX2 or pattern image EX1A in that both have a black background with a plurality of white dots arranged irregularly, but the position and number of the dots differ. Therefore, in the setting example of FIG. 7 as well, the object is irradiated with two different types of structured illumination in which a plurality of white dots are arranged on a black background.
  • As a result, the captured image taken under structured illumination appears different from an image taken without structured illumination.
  • The structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD3 of FIG. 7 is reflected in the image and the image features increase, so the accuracy of the parallax calculation for the object is improved, and the accuracy of generating a distance image of the object is also improved.
  • The pattern image EX1 of the illumination pattern P1 and the pattern image EX1B of the illumination pattern P2 may be provided in units of frames as shown in FIG. 7, in units of lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
  • the set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155.
  • the CPU 152 or the ISP 153 sets the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2 in the illumination control unit 16.
  • The illumination unit 17 can thereby irradiate the first structured illumination LG1 having the illumination pattern P1 (see pattern image EX1) and the second structured illumination LG2 having the illumination pattern P2 (see pattern image EX1B).
  • Note that when re-specifying the reduction factor Δr, for example, the NG button BT2 is pressed by the user's operation.
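  • As a non-limiting reference, reducing the pitch between any two dots by a factor Δr can be approximated by scaling the dot coordinates toward a fixed center, as in the following Python sketch; representing the pattern by its dot-center coordinates and the choice of the image center are assumptions made only for illustration.

    import numpy as np

    def scale_dot_pitch(dot_xy, delta_r, center):
        """Scale dot coordinates toward `center`; the pitch between any two dots
        is then reduced by the same factor delta_r (0 < delta_r < 1)."""
        dot_xy = np.asarray(dot_xy, dtype=float)
        return center + delta_r * (dot_xy - center)

    dots_ex1 = np.array([[200.0, 100.0], [600.0, 400.0], [900.0, 650.0]])   # example dot centers (x, y)
    dots_ex1b = scale_dot_pitch(dots_ex1, delta_r=0.5, center=np.array([640.0, 360.0]))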
  • The camera device 10 may also control the illumination pattern P2 corresponding to the second structured illumination LG2 so that it changes dynamically based on the processing result of the AI processing unit 154 on the captured image taken while the object is irradiated with the first structured illumination LG1. Examples of this control will be explained with reference to the set of FIG. 8 and FIGS. 9A/9B and the set of FIG. 10 and FIGS. 11A/11B, respectively.
  • FIG. 8 is a diagram showing an example of an operation outline and an example of a UI setting screen when controlling a lighting area using the processing results of the AI processing unit 154.
  • FIG. 9A is a flowchart illustrating an example of the third operation procedure of the camera device 10.
  • FIG. 9B is a flowchart showing another example of the third operation procedure of the camera device 10.
  • In the explanation of FIGS. 9A and 9B, the irradiation of the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the irradiation of the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed synchronously.
  • In FIG. 9B, the same step numbers are assigned to processes that overlap with those in FIG. 9A, and their explanation is simplified or omitted; only the differences are explained.
  • The captured image IMG0 captured by either the first imaging module CAP1 or the second imaging module CAP2 shows a background object BKG1 with substantially no texture and a workpiece OB1. Since there are few image features in the captured image IMG0 from the first imaging module CAP1 and the second imaging module CAP2, it is difficult to accurately grasp the distance from the camera device 10 to the workpiece OB1.
  • The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the first structured illumination LG1 (for example, the illumination pattern P1 corresponding to the pattern image EX1) is irradiated from the illumination unit 17. Through this imaging, a captured image IMG1 is obtained.
  • the AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform region determination processing for determining the region of the workpiece OB1 in the captured image IMG1.
  • the CPU 152 or the ISP 153 of the camera device 10 sets the third irradiation mode to the illumination control unit 16.
  • The CPU 152 or the ISP 153 of the camera device 10 determines the irradiation area to be irradiated with the second structured illumination LG2 (an example of first light) based on the result of the area determination process by the AI processing unit 154, and controls the illumination control unit 16 so that the determined irradiation area is irradiated with the second structured illumination LG2. Further, based on the result of the area determination process by the AI processing unit 154, the CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that an area that does not correspond to the area of the workpiece OB1 in the captured image IMG1 is irradiated with second light (an example of light different from the first light).
  • The illumination control unit 16 causes the illumination unit 17 to irradiate the second structured illumination LG2 with its irradiation range narrowed down (restricted) to the determined irradiation area, and to irradiate areas other than the irradiation area with the second light (for example, white light).
  • The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the second structured illumination LG2 (for example, the part of the illumination pattern P1 corresponding to the pattern image EX1 that corresponds to the irradiation area) is irradiated from the illumination unit 17. Through this imaging, a captured image IMG2 is obtained.
  • In the captured image IMG2, an image region CT1 of the portion irradiated with the second structured illumination LG2 and an image region CT0 of the portion not irradiated with the second structured illumination LG2 exist within the entire object within the angle of view.
  • The CPU 152 or the ISP 153 of the camera device 10 combines the captured image IMG1 and the captured image IMG2 obtained from the first imaging module CAP1 so that the workpiece OB1 overlaps to generate one composite image, and combines the captured image IMG1 and the captured image IMG2 obtained from the second imaging module CAP2 so that the workpiece OB1 overlaps to generate the other composite image.
  • the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance from the camera device 10 to the workpiece OB1 based on the two composite images.
  • The CPU 152 or the ISP 153 of the camera device 10 may recognize the workpiece OB1 (for example, a screw) from the two composite images, generate a recognition result screen WD4 that specifies the size and distance of the workpiece OB1 (for example, a screw) from the recognition result, and send it to the PC 50.
  • the PC 50 displays the recognition result screen WD4 sent from the camera device 10.
  • the recognition result screen WD4 includes at least an image of the image area CT1 of the portion irradiated with the second structured illumination LG2, for example, and a display area DTL1 of detailed information on the workpiece OB1 (for example, a screw). Furthermore, a frame WK1 indicating the shape of the workpiece OB1 (for example, a screw) is shown in the image of the image area CT1. This frame WK1 is added by the CPU 152 or the ISP 153 of the camera device 10.
  • The display area DTL1 shows, as detailed information about the workpiece OB1 (for example, a screw), the size of the recognized workpiece OB1 (for example, 100 mm x 50 mm) and its distance from the camera device 10 (for example, 30 cm).
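  • As a non-limiting reference, the following Python sketch shows one way the irradiation area could be derived from the region determination result and used to restrict the structured pattern, assuming the AI processing unit returns a binary segmentation mask of the workpiece; the bounding-box approach, the function names, and the use of uniform white light as the fill value for the remaining area are illustrative assumptions.

    import numpy as np

    def irradiation_area_from_mask(mask):
        """Bounding box (x, y, w, h) of the workpiece region in a binary mask,
        or None if no workpiece pixel was found."""
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            return None
        x0, x1 = xs.min(), xs.max()
        y0, y1 = ys.min(), ys.max()
        return int(x0), int(y0), int(x1 - x0 + 1), int(y1 - y0 + 1)

    def build_illumination(pattern, area, fill_value=255):
        """Keep the structured dot pattern only inside `area` and fill the rest with
        uniform (e.g. white) light, mimicking the third irradiation mode."""
        out = np.full_like(pattern, fill_value)
        x, y, w, h = area
        out[y:y + h, x:x + w] = pattern[y:y + h, x:x + w]
        return out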
  • The illumination control unit 16 of the camera device 10 switches to the third irradiation mode at time t1 (see FIG. 2), for example, and causes the illumination unit 17 to irradiate the illumination of the illumination pattern P1 (first structured illumination LG1) toward the target object (step St1A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is irradiated from the illumination unit 17 in step St1A. That is, two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
  • the AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform region determination processing to determine the region of the workpiece OB1 in the captured image IMG1 (step St11).
  • the CPU 152 or the ISP 153 of the camera device 10 determines the irradiation area for irradiating the second structured illumination LG2 based on the result of the area determination process by the AI processing unit 154 (Step St12).
  • The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the irradiation area determined in step St12 is irradiated with the second structured illumination LG2 and areas other than the irradiation area are irradiated with the second light (for example, white light).
  • The illumination control unit 16 switches to the second irradiation mode at time t2 (see FIG. 2), for example, and causes the illumination unit 17 to irradiate the second structured illumination LG2 with its irradiation range narrowed down (restricted) to only the irradiation area determined in step St12, and to irradiate regions other than the irradiation area with the second light (for example, white light) (step St4A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is irradiated from the illumination unit 17 in step St4A. That is, in step St4A, two pieces of imaging data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with each other captured in step St1A (for example, the first left image P1L) with one of the two pieces of imaging data having parallax with each other captured in step St4A (for example, the second left image P2L) (step St6).
  • Similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two pieces of imaging data captured in step St1A (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St4A (for example, the second right image P2R) (step St6).
  • Further, the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images obtained by the composition (for example, the composite left image P3L and the composite right image P3R).
  • Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50 (step St6).
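  • As a non-limiting reference, the following Python sketch illustrates step St6 under the assumptions that the composition can be realized as a per-pixel maximum of the two captures (so the dots of both patterns remain visible) and that the parallax is obtained with a generic block matcher (OpenCV's StereoSGBM is used here merely as a stand-in); the focal length fx (in pixels) and the baseline between the two imaging modules are hypothetical calibration values.

    import numpy as np
    import cv2

    def compose(img_a, img_b):
        """Per-pixel maximum keeps the dots of both illumination patterns visible."""
        return np.maximum(img_a, img_b)

    def distance_image(left, right, fx=1000.0, baseline_m=0.05):
        """Disparity via semi-global block matching, then Z = fx * baseline / disparity."""
        sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
        disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point
        depth = np.zeros_like(disparity)
        valid = disparity > 0
        depth[valid] = fx * baseline_m / disparity[valid]
        return depth

    # p1l, p1r: grayscale captures under illumination pattern P1; p2l, p2r: under pattern P2.
    # composite_left = compose(p1l, p2l); composite_right = compose(p1r, p2r)
    # depth_m = distance_image(composite_left, composite_right)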
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
  • In step St6, the CPU 152 or the ISP 153 of the camera device 10 may instead calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • Likewise, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • By combining the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
  • Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • The CPU 152 or the ISP 153 of the camera device 10 may likewise calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • The CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and generate another distance image from the parallax based on the second left image P2L and the second right image P2R.
  • By combining these two distance images, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
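  • As a non-limiting reference, one possible way of combining the two distance images mentioned above is to average them where both are valid and keep whichever is valid elsewhere, as in the following Python sketch; treating a depth of zero as invalid is an assumption.

    import numpy as np

    def fuse_distance_images(d1, d2):
        """Average the two distance images where both have valid (non-zero) depth,
        otherwise keep whichever one is valid."""
        d1 = np.asarray(d1, dtype=np.float32)
        d2 = np.asarray(d2, dtype=np.float32)
        both = (d1 > 0) & (d2 > 0)
        return np.where(both, 0.5 * (d1 + d2), np.maximum(d1, d2))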
  • The AI processing unit 154 of the camera device 10 may perform region determination processing to determine the region of the workpiece OB1 using AI (for example, a trained model for segmentation) that takes as input the distance image generated in step St6 (step St11A).
  • the CPU 152 or the ISP 153 of the camera device 10 may determine the irradiation area for irradiating the second structured illumination LG2 based on the result of the area determination process by the AI processing unit 154 (Step St12A). After step St12A, the processing of the camera device 10 returns to step St1A.
  • FIG. 10 is a diagram showing an example of an operation outline and an example of a UI setting screen when controlling the illumination intensity using the processing results of the AI processing unit 154.
  • FIG. 11A is a flowchart illustrating an example of the fourth operation procedure of the camera device 10.
  • FIG. 11B is a flowchart showing another example of the fourth operation procedure of the camera device 10.
  • In the explanation of FIGS. 11A and 11B, the irradiation of the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the irradiation of the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed synchronously.
  • In FIG. 11B, the same step numbers are assigned to processes that overlap with those in FIG. 11A, and their explanation is simplified or omitted; only the differences are explained.
  • The captured image IMG0 captured by either the first imaging module CAP1 or the second imaging module CAP2 shows a background object BKG1 with substantially no texture and a workpiece OB1. Since there are few image features in the captured image IMG0 from the first imaging module CAP1 and the second imaging module CAP2, it is difficult to accurately grasp the distance from the camera device 10 to the workpiece OB1.
  • The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the first structured illumination LG1 (for example, the illumination pattern P1 corresponding to the pattern image EX1) is irradiated from the illumination unit 17. Through this imaging, a captured image IMG1 is obtained.
  • the AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform region determination processing for determining the region of the workpiece OB1 in the captured image IMG1.
  • the CPU 152 or the ISP 153 of the camera device 10 sets the fourth irradiation mode to the illumination control unit 16.
  • The CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination process by the AI processing unit 154 (for example, the brightness in the area of the workpiece OB1), the first irradiation intensity (for example, brightness) of the second structured illumination LG2 and the irradiation area to be irradiated with the second structured illumination LG2.
  • The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the determined irradiation area is irradiated with the second structured illumination LG2 having the determined first irradiation intensity (for example, a first brightness).
  • Further, the CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that an area that does not correspond to the area of the workpiece OB1 is irradiated with the second structured illumination LG2 at a second irradiation intensity (an example of an irradiation intensity different from the first irradiation intensity).
  • The illumination control unit 16 causes the illumination unit 17 to irradiate the determined irradiation area of the workpiece OB1 with the second structured illumination LG2 at the first irradiation intensity (for example, a first brightness), and to irradiate areas other than the irradiation area with the second structured illumination LG2 at the second irradiation intensity (for example, a second brightness different from the first brightness).
  • The camera device 10 thereby aims to improve the recognition accuracy of the workpiece by irradiating it with the second structured illumination LG2, which has a higher irradiation intensity (for example, brightness) than the first structured illumination LG1.
  • The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the second structured illumination LG2 is irradiated from the illumination unit 17. Through this imaging, a captured image IMG3 is obtained. The captured image IMG3 is an image of the object within the same angle of view as the captured image IMG1, captured while being irradiated with the second structured illumination LG2 whose irradiation intensity (for example, brightness) differs from that of the first structured illumination LG1.
  • The CPU 152 or the ISP 153 of the camera device 10 combines the captured image IMG1 and the captured image IMG3 obtained from the first imaging module CAP1 so that the workpiece OB1 overlaps to generate one composite image, and combines the captured image IMG1 and the captured image IMG3 obtained from the second imaging module CAP2 so that the workpiece OB1 overlaps to generate the other composite image.
  • the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance from the camera device 10 to the workpiece OB1 based on the two composite images.
  • the CPU 152 or the ISP 153 of the camera device 10 may generate a UI setting screen WD5 for setting the irradiation intensity (for example, brightness) of the second structured illumination LG2 according to a user's operation, and send it to the PC 50.
  • the PC 50 displays the UI setting screen WD5 sent from the camera device 10.
  • the UI setting screen WD5 shown in FIG. 10 is displayed on the display (not shown) of the PC 50 when sent from the camera device 10.
  • The PC 50 displays a UI setting screen WD5 including a specification field FLD3 for prompting the user to input the relative ratio of the irradiation intensity (for example, brightness) of the second structured illumination LG2, an OK button BT1, and an NG button BT2.
  • the designation of the relative ratio of illumination intensity is confirmed by inputting it into the designation field FLD3 through a user operation and pressing the OK button BT1.
  • the PC 50 sets parameter values that respectively indicate the irradiation intensity (for example, brightness) of the first structured illumination LG1 and the irradiation intensity (for example, brightness) of the second structured illumination LG2.
  • This set parameter value is input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155.
  • The CPU 152 or the ISP 153 sets, in the illumination control unit 16, the irradiation intensity (for example, brightness) of the first structured illumination LG1 together with the pattern image corresponding to its illumination pattern, and the irradiation intensity (for example, brightness) of the second structured illumination LG2.
  • The illumination unit 17 can thereby irradiate the first structured illumination LG1 having the illumination pattern P1 (see pattern image EX1), and the second structured illumination LG2 having the same illumination pattern P1 but a different irradiation intensity (for example, brightness). Note that when re-specifying the relative ratio of irradiation intensity, for example, the NG button BT2 is pressed by the user's operation.
  • The illumination control unit 16 of the camera device 10 switches to the fourth irradiation mode at time t1 (see FIG. 2), for example, and causes the illumination unit 17 to irradiate the illumination of the illumination pattern P1 (first structured illumination LG1) toward the target object (step St1A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is irradiated from the illumination unit 17 in step St1A. That is, two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
  • the AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform region determination processing to determine the region of the workpiece OB1 in the captured image IMG1 (step St11).
  • The CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination processing by the AI processing unit 154, the first irradiation intensity (for example, brightness) of the second structured illumination LG2 and the irradiation area to be irradiated with the second structured illumination LG2 (step St21).
  • the CPU 152 or the ISP 153 of the camera device 10 determines a synthesis coefficient indicating the synthesis ratio of the captured image IMG1 during irradiation with the first structured illumination LG1 and the captured image during irradiation with the second structured illumination LG2 (step St21).
  • This synthesis coefficient may be stored in the memory 151 in advance, or may be dynamically determined based on the brightness within the region of the workpiece OB1 in the captured image IMG1.
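  • As a non-limiting reference, the following Python sketch shows one possible way of deriving such a synthesis coefficient from the brightness within the workpiece region and using it to blend the two captures; the linear brightness-to-coefficient mapping and the target mean value are illustrative assumptions, not values taken from the embodiment.

    import numpy as np
    import cv2

    def synthesis_coefficient(img1, mask, target_mean=128.0):
        """Heuristic: the darker the workpiece region in the first capture, the more
        weight the brighter second capture receives. Returns alpha in [0, 1]."""
        region_mean = float(img1[mask > 0].mean()) if np.any(mask) else target_mean
        return float(np.clip(region_mean / (2.0 * target_mean), 0.0, 1.0))

    def blend(img1, img2, alpha):
        """Weighted composition of the two captures with the synthesis coefficient."""
        return cv2.addWeighted(img1, alpha, img2, 1.0 - alpha, 0.0)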
  • The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the determined irradiation area is irradiated with the second structured illumination LG2 at the first irradiation intensity (for example, brightness) determined in step St21, and areas other than the irradiation area are irradiated with the second structured illumination LG2 at the second irradiation intensity (for example, brightness).
  • The illumination control unit 16 causes the illumination unit 17 to irradiate the determined irradiation area with the second structured illumination LG2 at the determined first irradiation intensity (for example, brightness), and to irradiate areas other than the irradiation area with the second structured illumination LG2 at the second irradiation intensity (for example, brightness) (step St4A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is irradiated from the illumination unit 17 in step St4A. That is, in step St4A, two pieces of imaging data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with each other captured in step St1A (for example, the first left image P1L) with one of the two pieces of imaging data having parallax with each other captured in step St4A (for example, the second left image P2L) (step St6).
  • Using the synthesis coefficient determined in step St21, the CPU 152 or the ISP 153 of the camera device 10 likewise combines the other of the two pieces of imaging data captured in step St1A (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St4A (for example, the second right image P2R) (step St6). Further, the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images obtained by the composition (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50 (step St6).
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
  • In step St6, the CPU 152 or the ISP 153 of the camera device 10 may instead calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • Likewise, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • By combining the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
  • Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • The CPU 152 or the ISP 153 of the camera device 10 may likewise calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • The CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and generate another distance image from the parallax based on the second left image P2L and the second right image P2R.
  • By combining these two distance images, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
  • The AI processing unit 154 of the camera device 10 may perform region determination processing to determine the region of the workpiece OB1 using AI (for example, a trained model for segmentation) that takes as input the distance image generated in step St6 (step St11A).
  • The CPU 152 or the ISP 153 of the camera device 10 may determine, based on the result of the area determination processing by the AI processing unit 154, the first irradiation intensity (for example, brightness) of the second structured illumination LG2 and the irradiation area to be irradiated with the second structured illumination LG2 (step St21A). After step St21A, the processing of the camera device 10 returns to step St1A.
  • The CPU 152 or the ISP 153 of the camera device 10 may set the fifth irradiation mode in the illumination control unit 16. That is, the CPU 152 or the ISP 153 of the camera device 10 may control the illumination pattern P2 corresponding to the second structured illumination LG2 so that it changes dynamically based on the result of distance image generation for a captured image taken while the object is irradiated with the first structured illumination LG1. An example of this control will be explained with reference to FIG. 12.
  • FIG. 12 is a flowchart illustrating an example of the fifth operation procedure of the camera device 10. In the explanation of FIG. 12, the irradiation of the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the irradiation of the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed synchronously.
  • The illumination control unit 16 of the camera device 10 switches to the fifth irradiation mode at time t1 (see FIG. 2), for example, and causes the illumination unit 17 to irradiate the illumination of the illumination pattern P1 (first structured illumination LG1) toward the object (step St1A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is irradiated from the illumination unit 17 in step St1A. That is, two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
  • The CPU 152 or the ISP 153 of the camera device 10 calculates parallax based on the two pieces of imaging data having parallax with each other (for example, the first left image P1L and the first right image P1R) captured during the irradiation with the first structured illumination LG1 in step St1A, and generates a distance image for specifying the distance to the target object (step St31).
  • the CPU 152 or the ISP 153 of the camera device 10 measures the distance from the camera device 10 to the object based on the distance image generated in step St31 (step St32). However, the accuracy of the distance measured in this step St32 does not necessarily have to be high.
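  • As a non-limiting reference, such a rough distance in step St32 could be taken, for example, as the median of the valid pixels of the distance image (optionally restricted to a region of interest), as in the following Python sketch; treating zero depth as invalid and the use of the median are assumptions made only for illustration.

    import numpy as np

    def rough_distance(distance_image, roi=None):
        """Median of valid (non-zero) depth values, optionally restricted to a region
        of interest given as (x, y, w, h). Returns None if no valid pixel exists."""
        d = np.asarray(distance_image, dtype=np.float32)
        if roi is not None:
            x, y, w, h = roi
            d = d[y:y + h, x:x + w]
        valid = d[d > 0]
        return float(np.median(valid)) if valid.size else None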
  • the CPU 152 or ISP 153 of the camera device 10 determines the irradiation area for irradiating the second structured illumination LG2 based on the distance to the object measured in step St32.
  • the CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 to irradiate the irradiation area determined in step St12 with the second structured illumination LG2.
  • the illumination control unit 16 causes the illumination unit 17 to irradiate the second structured illumination LG2 so as to narrow down (restrict) the irradiation range to only the determined irradiation area (step St4A).
  • Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is irradiated from the illumination unit 17 in step St4A. That is, in step St4A, two pieces of imaging data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
  • The CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with each other captured in step St1A (for example, the first left image P1L) with one of the two pieces of imaging data having parallax with each other captured in step St4A (for example, the second left image P2L) (step St6).
  • Similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two pieces of imaging data captured in step St1A (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St4A (for example, the second right image P2R) (step St6).
  • Further, the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images obtained by the composition (for example, the composite left image P3L and the composite right image P3R).
  • Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50 (step St6).
  • The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
  • In step St6, the CPU 152 or the ISP 153 of the camera device 10 may instead calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • Likewise, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • By combining the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
  • Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1.
  • The CPU 152 or the ISP 153 of the camera device 10 may likewise calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2.
  • The CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and generate another distance image from the parallax based on the second left image P2L and the second right image P2R.
  • By combining these two distance images, the CPU 152 or the ISP 153 of the camera device 10 may generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and output it to the PC 50.
  • As described above, the camera device 10 according to the present embodiment includes: the illumination unit 17 that irradiates the object with light in the first irradiation mode and in the second irradiation mode different from the first irradiation mode; the illumination control unit 16 that controls the illumination unit 17; an imaging unit (for example, the first imaging module CAP1 and the second imaging module CAP2) that generates a first captured image of the object taken while irradiated with light in the first irradiation mode and a second captured image of the object taken while irradiated with light in the second irradiation mode; and a generation unit (for example, the CPU 152 or the ISP 153) that generates a distance image for specifying the distance between a reference point and the object based on the first captured image and the second captured image.
  • the imaging unit also includes a first imaging module CAP1 that images the object from a first viewpoint, and a second imaging module CAP2 that images the object from a second viewpoint different from the first viewpoint.
  • The generation unit generates the distance image based on a first composite image (for example, the composite left image P3L) obtained by combining a first captured image captured from the first viewpoint and a second captured image captured from the first viewpoint, and a second composite image (for example, the composite right image P3R) obtained by combining a first captured image captured from the second viewpoint and a second captured image captured from the second viewpoint.
  • Accordingly, the camera device 10 can calculate the parallax with the image features on the composite left image P3L and the composite right image P3R apparently increased through the composition process, recognize the distance to the object with high accuracy, and generate a highly accurate distance image.
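  • For reference, with a calibrated stereo pair the distance Z recovered from the calculated parallax generally follows the standard stereo relation Z = f × B / d, where f is the focal length in pixels, B is the baseline between the first viewpoint and the second viewpoint, and d is the parallax (disparity) in pixels; as a purely illustrative example, f = 1000 px, B = 0.05 m, and d = 10 px give Z = 5 m. Increasing the image features through the composition described above increases the number of pixels for which a reliable parallax d can be found, which is why the accuracy of the generated distance image improves.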
  • the first irradiation mode is a mode in which a pattern (for example, pattern image EX1) in which a plurality of irradiation figures are arranged at a first interval is irradiated.
  • the second irradiation mode is a mode in which a pattern (for example, pattern image EX2) in which a plurality of irradiation figures are arranged at a second interval different from the first interval is irradiated.
  • Accordingly, by using the first structured illumination LG1 and the second structured illumination LG2, which have two types of illumination patterns with different arrangements of the plurality of dots, the camera device 10 can improve the accuracy of the parallax calculation for the object and also improve the accuracy of generating a distance image of the object.
  • the first irradiation mode is a mode in which a pattern (for example, pattern image EX1) in which a plurality of irradiation figures are arranged at a third interval is irradiated.
  • the second irradiation mode is a mode in which a pattern in which a plurality of irradiation figures arranged at third intervals are shifted by a predetermined amount is irradiated.
  • the first irradiation mode is a mode in which a pattern in which irradiation figures having a first size are arranged is irradiated.
  • the second irradiation mode is a mode in which a pattern in which irradiation figures having a second size smaller than the first size are arranged is irradiated.
  • the camera device 10 further includes an AI processing unit 154 that determines the area of the object in the image captured under the first structured illumination LG1.
  • the illumination control unit 16 causes the illumination unit 17 to execute a third irradiation mode in which a region of the object is irradiated with the first light and a region that does not correspond to the object is irradiated with the second light different from the first light.
  • Accordingly, even if the object is a movable object, the camera device 10 can dynamically determine the area of the object in the image captured under the first structured illumination LG1 and irradiate the appropriate area of the object with the second structured illumination LG2.
  • the camera device 10 further includes an AI processing unit 154 that determines the area of the object in the image captured under the first structured illumination LG1.
  • The illumination control unit 16 causes the illumination unit 17 to execute a fourth irradiation mode in which the area of the object is irradiated at a first irradiation intensity determined based on the brightness of the area of the object in the image, and an area that does not correspond to the object is irradiated at a second irradiation intensity different from the first irradiation intensity.
  • Accordingly, the camera device 10 can irradiate the second structured illumination LG2 with an irradiation intensity (for example, brightness) increased compared to the first structured illumination LG1, and can thereby improve the recognition accuracy of the workpiece.
  • The illumination control unit 16 determines the distance between the reference point and the object from a first distance image (for example, corresponding to the distance image generated in step St31 in FIG. 12) based on the first captured image taken from the first viewpoint, and causes the illumination unit 17 to execute a fifth irradiation mode in which an irradiation area determined based on the distance between the reference point and the object is irradiated.
  • Accordingly, the camera device 10 measures the distance to the object from the above-described first distance image, which is generated only from imaging under the first structured illumination LG1, and can change the irradiation area of the second structured illumination LG2 using this distance measurement result. Therefore, even if the object is a movable object, for example, the camera device 10 can roughly grasp the distance and position of the object from the imaging result under the first structured illumination LG1, and can appropriately irradiate the second structured illumination LG2 onto the area where the object is highly likely to be present.
  • the irradiation figure is either circular or polygonal.
  • Accordingly, the camera device 10 can increase the image features of the captured image under the first structured illumination LG1 and of the captured image under the second structured illumination LG2, and can generate a highly accurate distance image.
  • the elements included in the camera device 10 of this embodiment may be operated as a system.
  • the system 100 may be operated by mounting the first imaging module CAP1, the second imaging module CAP2, and elements other than the first imaging module CAP1 and the second imaging module CAP2 in different devices.
  • the present disclosure is useful as a camera device, image generation method, and system that recognize the distance to an object with high accuracy.

Abstract

This camera device comprises: an irradiation unit that irradiates an object with light using a first irradiation mode and a second irradiation mode different from the first irradiation mode; an irradiation control unit that controls the irradiation unit; an image capturing unit that generates a first captured image in which the object is captured while irradiated with light using the first irradiation mode, and a second captured image in which the object is captured while irradiated with light using the second irradiation mode; and a generation unit that generates a range image for identifying the range between a reference point and the object on the basis of the first captured image and the second captured image.

Description

Camera device, image generation method, and system
The present disclosure relates to a camera device, an image generation method, and a system.
Patent Document 1 discloses an image processing device that includes a first illumination device that emits first light including a first wavelength, a second illumination device that emits second light including a second wavelength different from the first wavelength, and an imaging device that captures an image of a target object. The imaging device images the object in a state in which the first light from the first illumination device and the second light from the second illumination device are irradiated onto the object. The image processing device generates a gray-scale image consisting of pixels imaged through a first optical filter that transmits the first wavelength, and generates a distance image containing pixels imaged through a second optical filter that transmits the second wavelength.
Japanese Patent Application Publication No. 2020-193867
 特許文献1によれば、複雑な背景(例えば、検査対象となるべき部品だけでなく、検査治具表面の模様、機械の構造物、トレイ等の様々な要素が写り込む背景)が写り込む環境でも、部品の良否判定等の画像検査を安定的かつ高速に実行できる。しかし、特許文献1では、テクスチャ(模様等の質感)が実質的に無い領域が背景である場合については考慮されていない。ここでいうテクスチャが実質的に無い領域とは、テクスチャが全く無い領域に限らず、テクスチャの存在が判別しにくい程度に画角内で目立たない領域も概念的に含まれる。つまり、特許文献1では、テクスチャが実質的に無い領域である背景に対象物(例えばロボットハンド等のエンドエフェクタがピックアップする物)が置かれている場合、対象物までの距離を正確に認識することが難しく、対象物のピックアップが困難となることがあった。 According to Patent Document 1, an environment in which a complex background (for example, a background in which not only the parts to be inspected but also various elements such as the pattern on the surface of the inspection jig, the structure of the machine, the tray, etc. are reflected) is reflected. However, image inspections such as determining the quality of parts can be performed stably and quickly. However, Patent Document 1 does not consider the case where the background is an area that has substantially no texture (texture such as a pattern). The area where there is substantially no texture as used herein is not limited to an area where there is no texture at all, but also conceptually includes an area where the presence of texture is so inconspicuous within the angle of view that it is difficult to distinguish. In other words, in Patent Document 1, when an object (for example, an object to be picked up by an end effector such as a robot hand) is placed in a background that is an area with substantially no texture, the distance to the object is accurately recognized. This sometimes made it difficult to pick up the object.
 本開示は、上述した従来の事情に鑑みて案出され、対象物までの距離を高精度に認識することを目的とする。 The present disclosure has been devised in view of the above-mentioned conventional circumstances, and aims to recognize the distance to an object with high precision.
 本開示は、第1照射モードと、前記第1照射モードとは異なる第2照射モードとにより、対象物に光を照射する照明部と、前記照明部を制御する照明制御部と、前記第1照射モードにより光を照射した状態で前記対象物を撮像した第1撮像画像と、前記第2照射モードにより光を照射した状態で前記対象物を撮像した第2撮像画像とを生成する撮像部と、前記第1撮像画像と前記第2撮像画像とを基に、基準地点と前記対象物との間の距離を特定するための距離画像を生成する生成部と、を備える、カメラ装置を提供する。 The present disclosure includes: an illumination section that irradiates light to a target object using a first irradiation mode and a second irradiation mode that is different from the first irradiation mode; an illumination control section that controls the illumination section; an imaging unit that generates a first captured image of the object while being irradiated with light in an irradiation mode and a second captured image of the object while being irradiated with light in the second irradiation mode; , a generation unit that generates a distance image for specifying a distance between a reference point and the target object based on the first captured image and the second captured image. .
 また、本開示は、第1照射モードと、前記第1照射モードとは異なる第2照射モードとにより、対象物に光を照射する照明部を制御するステップと、前記第1照射モードにより光を照射した状態で前記対象物を撮像した第1撮像画像と、前記第2照射モードにより光を照射した状態で前記対象物を撮像した第2撮像画像とを生成するステップと、前記第1撮像画像と前記第2撮像画像とを基に、基準地点と前記対象物との間の距離を特定するための距離画像を生成するステップと、を含む、画像生成方法を提供する。 The present disclosure also provides a step of controlling an illumination unit that irradiates light to a target object using a first irradiation mode and a second irradiation mode different from the first irradiation mode; generating a first captured image of the target object in a irradiated state and a second captured image of the target object in a state of irradiation with light in the second irradiation mode; and a step of generating the first captured image. and the step of generating a distance image for specifying the distance between the reference point and the target object based on the second captured image.
 また、本開示は、第1照射モードと、前記第1照射モードとは異なる第2照射モードとにより、対象物に光を照射する照明部と、前記照明部を制御する照明制御部と、前記第1照射モードにより光を照射した状態で前記対象物を撮像した第1撮像画像と、前記第2照射モードにより光を照射した状態で前記対象物を撮像した第2撮像画像とを生成する撮像部と、前記第1撮像画像と前記第2撮像画像とを基に、基準地点と前記対象物との間の距離を特定するための距離画像を生成する生成部と、を備える、システムを提供する。 The present disclosure also provides an illumination unit that irradiates light to a target object using a first irradiation mode and a second irradiation mode that is different from the first irradiation mode, an illumination control unit that controls the illumination unit, and Imaging that generates a first captured image that captures the object while being irradiated with light in a first irradiation mode, and a second captured image that captures the object while being irradiated with light in the second irradiation mode. and a generation unit that generates a distance image for specifying the distance between the reference point and the target object based on the first captured image and the second captured image. do.
 Note that these general or specific aspects may be realized by any combination of one or more of a system, a device, a method, an integrated circuit, a computer program, and a recording medium.
 According to the present disclosure, the distance to a target object can be recognized with high accuracy.
 Further advantages and effects of one aspect of the present disclosure will become apparent from the specification and drawings. Such advantages and/or effects are each provided by the features described in some of the embodiments and in the specification and drawings, but not all of them necessarily need to be provided in order to obtain one or more identical features.
FIG. 1 is a diagram showing an example of the system configuration of the system.
FIG. 2A is a diagram showing an example of a time chart in which illumination with the illumination pattern P1 and illumination with the illumination pattern P2 are performed alternately.
FIG. 2B is a diagram showing another example of a time chart in which illumination with the illumination pattern P1 and illumination with the illumination pattern P2 are performed alternately.
FIG. 3 is a flowchart showing an example of a first operation procedure of the camera device.
FIG. 4 is a flowchart showing an example of a second operation procedure of the camera device.
FIG. 5 is a diagram showing an example of a UI setting screen for the case of illuminating with a plurality of different illumination patterns.
FIG. 6 is a diagram showing an example of a UI setting screen for the case of irradiating the same illumination pattern with a positional shift.
FIG. 7 is a diagram showing an example of a UI setting screen for the case of irradiating with the dot pitch reduced by a specified magnification.
FIG. 8 is a diagram showing an example of an operation outline and an example of a UI setting screen for the case of controlling the illumination area using the processing result of the AI processing unit.
FIG. 9A is a flowchart showing an example of a third operation procedure of the camera device.
FIG. 9B is a flowchart showing another example of the third operation procedure of the camera device.
FIG. 10 is a diagram showing an example of an operation outline and an example of a UI setting screen for the case of controlling the illumination intensity using the processing result of the AI processing unit.
FIG. 11A is a flowchart showing an example of a fourth operation procedure of the camera device.
FIG. 11B is a flowchart showing another example of the fourth operation procedure of the camera device.
FIG. 12 is a flowchart showing an example of a fifth operation procedure of the camera device.
 Hereinafter, embodiments that specifically disclose a camera device, an image generation method, and a system according to the present disclosure will be described in detail with reference to the drawings as appropriate. However, more detailed description than necessary may be omitted. For example, detailed description of already well-known matters or redundant description of substantially identical configurations may be omitted. This is to avoid the following description becoming unnecessarily redundant and to facilitate understanding by those skilled in the art. Note that the accompanying drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter recited in the claims.
 In Embodiment 1 below, a use case in which a target object is imaged by a camera device fixed to a robot hand (not shown) that is the tip portion of a robot arm (not shown) will be described as an example. The target object corresponds to an object photographed by the camera device, such as a background object that is an area having substantially no texture (for example, a white table or floor surface without a pattern, or a black table or floor surface without a pattern), or a workpiece as an example of a placed object arranged on a table as an example of the background object. To simplify the following description, unless otherwise specified, a white table having substantially no texture and workpieces placed on the white table (for example, a white vase, a red apple, and the like) are taken as examples of target objects. Note that the use case of Embodiment 1 need not be limited to the above example. The target object is, for example, an industrial product or an industrial part, and may be picked up by the robot hand. Note that the target object need not be limited to industrial products or industrial parts.
 FIG. 1 is a diagram showing an example of the system configuration of a system 100. As shown in FIG. 1, the system 100 includes at least a camera device 10 and a PC (Personal Computer) 50. The camera device 10 and the PC 50 are connected so that data signals can be input to and output from each other. Although not illustrated in FIG. 1, a robot controller (not shown) capable of controlling the movement of a robot hand (not shown) to which the camera device 10 is fixed may be connected to the camera device 10.
 The camera device 10 is fixed to a robot hand (not shown; the same applies hereinafter), which is the tip portion of, for example, a robot arm (not shown; the same applies hereinafter), so as to be movable integrally with the robot hand. The camera device 10 is fixed so as not to interfere with the robot hand (that is, so that the robot hand (not shown) is not included within the angle of view of the camera device 10). The camera device 10 images the target object included within its angle of view with compound eyes while alternately switching, in a time-division manner, among a plurality of types of irradiation modes (in other words, types of structured illumination (see below)) and emitting the structured illumination corresponding to each irradiation mode toward the target object. That is, the camera device 10 corresponds to a so-called twin-lens stereo camera including a plurality of imaging modules (for example, a first imaging module CAP1 and a second imaging module CAP2, see below). The camera device 10 generates an image for specifying the distance to the target object (for example, a distance image and/or a parallax image) based on the captured images of the target object captured during irradiation with the structured illumination corresponding to each irradiation mode. The camera device 10 can also specify position information of the target object based on the distance image and/or the parallax image. The camera device 10 outputs output data including the generated distance image and the position information of the target object to the robot controller. Although only one camera device 10 is illustrated in FIG. 1, the system 100 may include a plurality of camera devices 10 having the same configuration. In this case, the plurality of camera devices 10 and the PC 50 are connected so that data signals can be input and output between them.
 The robot controller recognizes the distance from the camera device 10 (in other words, the robot hand) to the target object based on the output data from the camera device 10. The robot controller then controls the movement of the robot hand (for example, picking up the target object) in accordance with the position of the target object. Note that the movement performed by the robot hand is determined adaptively according to the use case of the system 100 and is not limited to the above-described picking up of the target object.
 The PC 50 has browser software (not shown) installed in an executable form for setting the various parameters used by the camera device 10. The PC 50 starts up the browser software in response to an operation by a user of the system 100 (not shown; the same applies hereinafter), and displays various UI (User Interface) setting screens for the camera device 10. Details of the various parameters and the UI setting screens will be described later.
 The camera device 10 includes a first imaging module CAP1, a second imaging module CAP2, an SoC (System on a Chip) 15, an illumination control unit 16, and an illumination unit 17. The first imaging module CAP1 includes an L lens 11 and an L image sensor 13, and images the target object from a first viewpoint. The second imaging module CAP2 includes an R lens 12 and an R image sensor 14, and images the target object from a second viewpoint different from the first viewpoint. In the direction parallel to the installation surface of the camera device 10, the first imaging module CAP1 and the second imaging module CAP2 are arranged with an interval (for example, a module installation interval) between them. That is, the second viewpoint corresponds to a viewpoint for imaging the target object from a position separated from the first viewpoint by the module installation interval in the direction parallel to the installation surface of the camera device 10. Note that the module installation interval may be defined with respect to a direction other than the direction parallel to the installation surface of the camera device 10. The first imaging module CAP1 and the second imaging module CAP2 correspond to the imaging unit of the present embodiment.
 The L lens 11 includes, for example, a focus lens and a zoom lens. While structured illumination (see below) corresponding to the first irradiation mode is being emitted from the illumination unit 17 toward the target object, incident light, which is light reflected by a subject (for example, the target object) included within the angle of view, enters the L lens 11. If either a visible light cut filter or an IR cut filter is disposed between the L lens 11 and the L image sensor 13, the incident light that has entered the L lens 11 forms an optical image of the subject on the light-receiving surface (imaging surface) of the L image sensor 13 through that filter. As the L lens 11, lenses with various focal lengths or imaging ranges can be used depending on the installation location of the camera device 10, the imaging application, and the like.
 Note that the camera device 10 may include an L lens drive unit (not shown; the same applies hereinafter) that controls the driving of the L lens 11. Further, a CPU (Central Processing Unit) 152 or an ISP (Image Signal Processor) 153 may adjust (change) internal parameters related to the driving of the L lens 11 (for example, the position of the focus lens, or the position of the zoom lens corresponding to the zoom magnification) and drive the L lens 11 via the L lens drive unit. The L lens 11 may also be fixedly arranged.
 The R lens 12 includes, for example, a focus lens and a zoom lens. While structured illumination (see below) corresponding to the second irradiation mode is being emitted from the illumination unit 17 toward the target object, incident light, which is light reflected by a subject (for example, the target object) included within the angle of view, enters the R lens 12. If either a visible light cut filter or an IR cut filter is disposed between the R lens 12 and the R image sensor 14, the incident light that has entered the R lens 12 forms an optical image of the subject on the light-receiving surface (imaging surface) of the R image sensor 14 through that filter. As the R lens 12, lenses with various focal lengths or imaging ranges can be used depending on the installation location of the camera device 10, the imaging application, and the like.
 Note that the camera device 10 may include an R lens drive unit (not shown; the same applies hereinafter) that controls the driving of the R lens 12. Further, the CPU 152 or the ISP 153 may adjust (change) internal parameters related to the driving of the R lens 12 (for example, the position of the focus lens, or the position of the zoom lens corresponding to the zoom magnification) and drive the R lens 12 via the R lens drive unit. The R lens 12 may also be fixedly arranged.
 The visible light cut filter has a characteristic of blocking visible light (for example, light having a wavelength of 400 to 760 [nm]) out of the incident light that has passed through the L lens 11 or the R lens 12 (that is, light reflected by the subject). The visible light cut filter blocks the visible light component of the incident light that has passed through the L lens 11 or the R lens 12. The camera device 10 may include a filter drive unit (not shown; the same applies hereinafter) that controls the driving of the visible light cut filter. When the filter drive unit is provided, the visible light cut filter is placed, via the filter drive unit, between the L lens 11 and the L image sensor 13 and between the R lens 12 and the R image sensor 14 during a predetermined period (for example, at night) based on a control signal from the CPU 152 or the ISP 153.
 The IR cut filter has a characteristic of passing visible light (for example, light having a wavelength of 400 to 760 [nm]) and blocking near-infrared light (for example, light having a wavelength of 780 [nm] or more). The IR cut filter blocks the near-infrared component of the incident light that has passed through the L lens 11 or the R lens 12 and passes the visible light component. The camera device 10 may include a filter drive unit that controls the driving of the IR cut filter. When the filter drive unit is provided, the IR cut filter is placed, via the filter drive unit, between the L lens 11 and the L image sensor 13 and between the R lens 12 and the R image sensor 14 during a predetermined period (for example, during the day) based on a control signal from the CPU 152 or the ISP 153.
 The L image sensor 13 includes a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor in which a plurality of pixels suitable for imaging visible light or near-infrared light are arrayed, an exposure control circuit (not shown), and a signal processing circuit (not shown). The L image sensor 13 performs, at predetermined intervals, photoelectric conversion that converts light received on the light-receiving surface (imaging surface) made up of the plurality of pixels into an electrical signal. The predetermined interval of the photoelectric conversion is determined according to the so-called frame rate (fps: frames per second). For example, when the frame rate is 120 [fps], the predetermined interval is 1/120 [second]. Thereby, while the structured illumination (see below) corresponding to the first irradiation mode is being emitted from the illumination unit 17 toward the target object, the L image sensor 13 acquires, as electrical signals and temporally continuously for each pixel, a red component signal (R signal), a green component signal (G signal), and a blue component signal (B signal) according to the light reflected by the target object. Note that the L image sensor 13 may acquire a monochrome electrical signal instead of an RGB electrical signal. The signal processing circuit (not shown) of the L image sensor 13 converts the electrical signal (analog signal) into digital imaging data. The L image sensor 13 transfers the digital imaging data to a memory 151 at predetermined intervals according to the frame rate. The memory 151 stores the digital imaging data received from the L image sensor 13. Note that the L image sensor 13 may send the digital imaging data to the CPU 152 at predetermined intervals according to the frame rate. Further, based on an exposure control signal from the CPU 152 or the ISP 153, the L image sensor 13 adjusts (changes), with its exposure control circuit, internal parameters indicating the exposure conditions (for example, exposure time, gain, frame rate). Thereby, the camera device 10 can change the exposure conditions according to the surrounding environment and obtain imaging data with good image quality.
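 Purely as an illustrative note on the exposure-condition internal parameters mentioned above, the following minimal sketch groups them into one structure and derives the capture interval from the frame rate; the structure and field names are assumptions made here for the example and are not part of the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class ExposureConditions:
    """Internal parameters adjusted via the exposure control signal (assumed names)."""
    exposure_time_s: float   # e.g. 1/240
    gain_db: float           # e.g. 6.0
    frame_rate_fps: float    # e.g. 120.0

    @property
    def capture_interval_s(self) -> float:
        # At 120 fps the photoelectric conversion interval is 1/120 s.
        return 1.0 / self.frame_rate_fps
```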
 The R image sensor 14 includes a CCD sensor or a CMOS sensor in which a plurality of pixels suitable for imaging visible light or near-infrared light are arrayed, an exposure control circuit (not shown), and a signal processing circuit (not shown). The R image sensor 14 performs, at predetermined intervals, photoelectric conversion that converts light received on the light-receiving surface (imaging surface) made up of the plurality of pixels into an electrical signal. The predetermined interval of the photoelectric conversion is determined according to the so-called frame rate (fps). For example, when the frame rate is 120 [fps], the predetermined interval is 1/120 [second]. Thereby, while the structured illumination (see below) corresponding to the second irradiation mode is being emitted from the illumination unit 17 toward the target object, the R image sensor 14 acquires, as electrical signals and temporally continuously for each pixel, a red component signal (R signal), a green component signal (G signal), and a blue component signal (B signal) according to the light reflected by the target object. Note that the R image sensor 14 may acquire a monochrome electrical signal instead of an RGB electrical signal. The signal processing circuit (not shown) of the R image sensor 14 converts the electrical signal (analog signal) into digital imaging data. The R image sensor 14 transfers the digital imaging data to the memory 151 at predetermined intervals according to the frame rate. The memory 151 stores the digital imaging data received from the R image sensor 14. Note that the R image sensor 14 may send the digital imaging data to the CPU 152 at predetermined intervals according to the frame rate. Further, based on an exposure control signal from the CPU 152 or the ISP 153, the R image sensor 14 adjusts (changes), with its exposure control circuit, internal parameters related to the exposure conditions (for example, exposure time, gain, frame rate). Thereby, the camera device 10 can change the exposure conditions according to the surrounding environment and obtain imaging data with good image quality.
 The SoC 15 is an integrated circuit product in which various electronic components are integrated on a single chip constituted by an integrated circuit. The SoC 15 includes the memory 151, the CPU 152, the ISP 153, an AI processing unit 154, and an external interface 155. In FIG. 1, the interface is illustrated as "I/F" for convenience.
 The memory 151 includes at least a RAM (Random Access Memory) and a ROM (Read Only Memory). The memory 151 temporarily holds programs and control data necessary for executing the operation of the camera device 10, as well as data or information generated during the operation of each unit of the camera device 10. The RAM is, for example, a work memory used when each unit of the camera device 10 operates. The ROM stores in advance, for example, programs and control data for controlling each unit of the camera device 10. The memory 151 also stores the structured illumination patterns (not shown) set by the PC 50 or attribute information for specifying those patterns. The memory 151 further stores information on the waiting time (minute time ΔTa) from when the illumination unit 17 emits the first structured illumination of the illumination pattern P1 until it emits the second structured illumination of the illumination pattern P2, and information on the waiting time (minute time ΔTb) from when the illumination unit 17 emits the second structured illumination of the illumination pattern P2 until it emits the first structured illumination of the illumination pattern P1.
 The CPU 152 is a processor that functions as a controller governing the overall operation of the camera device 10. The CPU 152 performs control processing for supervising the operation of each unit of the camera device 10, data input/output processing with each unit of the camera device 10, data arithmetic processing, and data storage processing. The CPU 152 operates according to the programs and control data stored in the memory 151. The CPU 152 uses the memory 151 during operation, and transfers data or information generated or acquired by the CPU 152 to the memory 151 for temporary storage. The CPU 152 also acquires, from the memory 151, the imaging data from each of the L image sensor 13 and the R image sensor 14, and calculates the parallax between the imaging data from the L image sensor 13 and the imaging data from the R image sensor 14. Based on the parallax calculation result, the CPU 152 generates a distance image of the objects included in the target object within the angle of view. Note that the CPU 152 may have a timer (not shown) or an illuminance sensor (not shown), and may generate a control signal for placing either the visible light cut filter or the IR cut filter based on the output of the timer or the illuminance sensor and send it to the filter drive unit. Details of the various kinds of processing performed by the CPU 152 will be described later.
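 For reference only, the following is a minimal sketch of how a per-pixel parallax (disparity) value could be converted into a distance value, assuming rectified stereo images and the standard pinhole relation Z = f * B / d; the focal length, baseline (corresponding to the module installation interval), and function name here are illustrative assumptions and are not taken from the present disclosure.

```python
import numpy as np

def disparity_to_distance(disparity_px: np.ndarray,
                          focal_length_px: float,
                          baseline_m: float) -> np.ndarray:
    """Convert a per-pixel disparity map [pixels] into a distance map [meters].

    Assumes rectified stereo images and the pinhole relation Z = f * B / d.
    Pixels with zero or negative disparity are marked as invalid (0.0).
    """
    distance_m = np.zeros_like(disparity_px, dtype=np.float32)
    valid = disparity_px > 0
    distance_m[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return distance_m
```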
 The ISP 153 is a processor that governs the various kinds of image processing performed within the camera device 10. The ISP 153 reads the imaging data from the memory 151 and performs various kinds of image processing using the read imaging data. The ISP 153 uses the memory 151 during operation, and transfers data or information generated or acquired by the ISP 153 to the memory 151 for temporary storage. The ISP 153 performs resizing processing that converts the size of the imaging data read from the memory 151 into a size suitable for the area determination processing performed by the AI processing unit 154. Details of the area determination processing will be described later. Based on the processing result of the AI processing unit 154, the ISP 153 determines the irradiation area or the irradiation intensity of the second structured illumination (see below) corresponding to the second irradiation mode to be emitted by the illumination unit 17, and sends it to the illumination control unit 16. Based on the imaging data captured by the L image sensor 13 while the first structured illumination corresponding to the first irradiation mode is being emitted (the first left image P1L) and the imaging data captured by the R image sensor 14 in the same state (the first right image P1R), the ISP 153 generates a distance image for specifying the distance from a reference point to the target object. Here, the reference point corresponds to, for example, the imaging surface or the focal point of each of the L image sensor 13 and the R image sensor 14, or the lens interface of each of the L lens 11 and the R lens 12. In other words, the reference point corresponds to a position that can be defined by an optical element included in the camera device 10. Note that the definition of the reference point is not limited to the above example. Further, based on the imaging data captured by the L image sensor 13 while the first structured illumination corresponding to the first irradiation mode is being emitted (the first left image P1L) and the imaging data captured by the R image sensor 14 in the same state (the first right image P1R), the ISP 153 may generate an exposure control signal for adjusting the internal parameters that determine the exposure conditions of the imaging (that is, the exposure) by each of the L image sensor 13 and the R image sensor 14, and may send it to each of the L image sensor 13 and the R image sensor 14. Details of the various kinds of processing performed by the ISP 153 will be described later.
 The AI processing unit 154 is configured using, for example, a GPU (Graphics Processing Unit) and a memory. Note that a DSP (Digital Signal Processor) may be used instead of or together with the GPU. Using AI (for example, a trained model for segmentation), the AI processing unit 154 performs area determination processing on each piece of imaging data (image), based on the imaging data captured by the L image sensor 13 while the first structured illumination corresponding to the first irradiation mode is being emitted (the first left image P1L) and the imaging data captured by the R image sensor 14 in the same state (the first right image P1R). The area determination processing corresponds to, for example, processing for determining, in an image obtained by imaging a workpiece placed on a table, the image area corresponding to the table and the image area corresponding to the workpiece. The AI processing unit 154 reads the imaging data resized by the ISP 153 (for example, the first left image P1L and the first right image P1R) from the memory 151 and performs the above-described area determination processing. The trained model used by the AI processing unit 154 is generated in advance, for example, through machine learning. Here, the trained model corresponds to a model provided, through machine learning based on a plurality of pieces of imaging data (image data), with parameters (for example, various weighting coefficients) with which the AI processing unit 154 determines the area corresponding to the background object and the area corresponding to the placed object. The AI processing unit 154 stores the result of the area determination processing in the memory 151.
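 As an illustration only, the sketch below shows one way the area determination result of a segmentation model could be turned into a mask of the area corresponding to the placed object; the model output format, class indices, and function name are assumptions introduced here, not definitions from the present disclosure.

```python
import numpy as np

# Hypothetical class indices for a segmentation model trained to separate
# the background object (e.g. table) from the placed object (e.g. workpiece).
CLASS_BACKGROUND_OBJECT = 0
CLASS_PLACED_OBJECT = 1

def placed_object_mask(segmentation_logits: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels judged to belong to the placed object.

    segmentation_logits: (H, W, num_classes) output of a trained model,
    assumed to have been computed on the resized first left image P1L.
    """
    class_map = np.argmax(segmentation_logits, axis=-1)
    return class_map == CLASS_PLACED_OBJECT
```

Such a mask could then, for example, serve as the basis for the illumination area controlled by the illumination control unit 16 (see the description of FIG. 8).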
 The external interface 155 receives data requests from the PC 50 and transmits data generated or acquired by the CPU 152 or the ISP 153 to the PC 50 in accordance with those data requests. Although not illustrated in FIG. 1, when a robot controller is connected to the camera device 10, the external interface 155 may transmit data generated or acquired by the CPU 152 or the ISP 153 to the robot controller.
 The illumination control unit 16 is constituted by a control circuit for performing various kinds of control of the first structured illumination and the second structured illumination from the illumination unit 17. Based on the output from the SoC 15, the illumination control unit 16 generates control signals for controlling the pattern, the irradiation mode, and the irradiation timing of each of the first structured illumination and the second structured illumination emitted by the illumination unit 17, and sends them to the illumination unit 17. Here, the irradiation mode is an operation mode that determines which of the first structured illumination and the second structured illumination the illumination unit 17 emits. Specifically, the irradiation modes are divided into a first irradiation mode for emitting the first structured illumination from the illumination unit 17 and a second irradiation mode for emitting the second structured illumination from the illumination unit 17. The illumination control unit 16 switches between the first irradiation mode and the second irradiation mode, for example, according to the time chart shown in FIG. 2A or the time chart shown in FIG. 2B.
 The illumination unit 17 is constituted by a projector including light emitting elements capable of emitting each of a first structured illumination LG1 and a second structured illumination LG2. Based on the control signal from the illumination control unit 16, the illumination unit 17 emits each of the first structured illumination LG1 and the second structured illumination LG2 toward the target object. The illumination unit 17 may also irradiate the entire target object with the first structured illumination LG1 and further emit the second structured illumination LG2 toward an illumination area controlled by the illumination control unit 16 (for example, the area corresponding to the placed object). The illumination unit 17 may also irradiate the entire target object with the first structured illumination LG1 and further irradiate the entire target object with the second structured illumination LG2 having an illumination intensity (for example, irradiation luminance) controlled by the illumination control unit 16. Here, the first structured illumination LG1 is, for example, illumination having a texture in which a plurality of dots are arranged in a predetermined pattern. The second structured illumination LG2 is, for example, illumination having a texture in which a plurality of dots are arranged in a predetermined pattern. The arrangement positions of the plurality of dots differ between the first structured illumination LG1 and the second structured illumination LG2.
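 As a purely illustrative sketch, the following shows one way two dot-pattern projector images with differing dot placements could be generated; the resolution, dot count, and the use of pseudo-random placement are assumptions made here for the example and are not specified by the present disclosure.

```python
import numpy as np

def make_dot_pattern(height: int, width: int, num_dots: int,
                     seed: int) -> np.ndarray:
    """Create a binary projector image with num_dots randomly placed dots.

    Using a different seed yields a pattern whose dot positions differ,
    as required for the two structured illuminations LG1 and LG2.
    """
    rng = np.random.default_rng(seed)
    pattern = np.zeros((height, width), dtype=np.uint8)
    ys = rng.integers(0, height, size=num_dots)
    xs = rng.integers(0, width, size=num_dots)
    pattern[ys, xs] = 255
    return pattern

# Two patterns with different dot placements (illustrative values).
pattern_p1 = make_dot_pattern(720, 1280, num_dots=5000, seed=1)
pattern_p2 = make_dot_pattern(720, 1280, num_dots=5000, seed=2)
```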
 Note that although FIG. 1 illustrates an example in which the illumination control unit 16 and the illumination unit 17 are each provided so as to be included in the camera device 10, the illumination control unit 16 and the illumination unit 17 may each be provided separately from the camera device 10. In this case, the illumination control unit 16 and the external interface 155 of the SoC 15 are connected so that data signals can be input and output.
 FIG. 2A is a diagram showing an example of a time chart in which illumination with the illumination pattern P1 and illumination with the illumination pattern P2 are performed alternately. FIG. 2B is a diagram showing another example of a time chart in which illumination with the illumination pattern P1 and illumination with the illumination pattern P2 are performed alternately. The illumination with the illumination pattern P1 corresponds to the first structured illumination LG1, and the illumination with the illumination pattern P2 corresponds to the second structured illumination LG2. The horizontal axes in FIGS. 2A and 2B indicate time.
 As shown in FIG. 2A, the illumination control unit 16 causes the illumination unit 17 to emit the structured illuminations of the two irradiation modes (that is, the illumination pattern P1 corresponding to the first irradiation mode and the illumination pattern P2 corresponding to the second irradiation mode) alternately in a time-division manner, in the order of illumination with the illumination pattern P1, illumination with the illumination pattern P2, illumination with the illumination pattern P1, and so on. For example, when the illumination with the illumination pattern P1 is emitted at time t1, the illumination with the illumination pattern P2 is emitted at time t2 after a minute time Δta has elapsed from time t1. When the illumination with the illumination pattern P2 is emitted at time t2, the illumination with the illumination pattern P1 is emitted at time t3 after a minute time Δtb has elapsed from time t2. When the illumination with the illumination pattern P1 is emitted at time t3, the illumination with the illumination pattern P2 is emitted at time t4 after the minute time Δta has elapsed from time t3. When the illumination with the illumination pattern P2 is emitted at time t4, the illumination with the illumination pattern P1 is emitted at time t5 after the minute time Δtb has elapsed from time t4. When the illumination with the illumination pattern P1 is emitted at time t5, the illumination with the illumination pattern P2 is emitted at time t6 after the minute time Δta has elapsed from time t5. When the illumination with the illumination pattern P2 is emitted at time t6, the illumination with the illumination pattern P1 is emitted at a time after the minute time Δtb has elapsed from time t6. The same applies thereafter. Here, the minute time Δta and the minute time Δtb may be the same or different, and their magnitude relationship is not particularly limited.
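 Only as a schematic sketch of the alternation described above, the loop below alternates between the two patterns with the waits Δta and Δtb; the projector interface and the timing values are assumptions introduced for the example, not an implementation defined by the present disclosure.

```python
import time

def alternate_illumination(projector, pattern_p1, pattern_p2,
                           delta_ta_s: float, delta_tb_s: float,
                           cycles: int) -> None:
    """Alternately project pattern P1 and pattern P2 in a time-division manner.

    projector.show(pattern) is a hypothetical call that makes the illumination
    unit emit the given structured illumination pattern.
    """
    for _ in range(cycles):
        projector.show(pattern_p1)   # first irradiation mode (e.g. time t1)
        time.sleep(delta_ta_s)       # wait the minute time Δta
        projector.show(pattern_p2)   # second irradiation mode (e.g. time t2)
        time.sleep(delta_tb_s)       # wait the minute time Δtb
```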
 At time t1, with the illumination of the illumination pattern P1 (the first structured illumination LG1 corresponding to the first irradiation mode) being emitted, a first left image P1L is generated by the L image sensor 13 and a first right image P1R is generated by the R image sensor 14. The first left image P1L and the first right image P1R have parallax with respect to each other. The first left image P1L is an example of a first captured image captured from the first viewpoint. The first right image P1R is an example of a second captured image captured from the second viewpoint.
 At time t2, with the illumination of the illumination pattern P2 (the second structured illumination LG2 corresponding to the second irradiation mode) being emitted, a second left image P2L is generated by the L image sensor 13 and a second right image P2R is generated by the R image sensor 14. The second left image P2L and the second right image P2R have parallax with respect to each other. The second left image P2L is an example of a first captured image captured from the first viewpoint. The second right image P2R is an example of a second captured image captured from the second viewpoint.
 The CPU 152 or the ISP 153 combines the first left image P1L captured during irradiation with the first structured illumination LG1 and the second left image P2L captured during irradiation with the second structured illumination LG2. By this combining processing, the CPU 152 or the ISP 153 generates a composite left image P3L having more image features than the image features on the first left image P1L or on the second left image P2L. The composite left image P3L is an example of a first composite image.
 The CPU 152 or the ISP 153 combines the first right image P1R captured during irradiation with the first structured illumination LG1 and the second right image P2R captured during irradiation with the second structured illumination LG2. By this combining processing, the CPU 152 or the ISP 153 generates a composite right image P3R having more image features than the image features on the first right image P1R or on the second right image P2R. The composite right image P3R is an example of a second composite image. Therefore, the composite left image P3L corresponds to an image having parallax with respect to the composite right image P3R.
 Based on the composite left image P3L and the composite right image P3R, the CPU 152 or the ISP 153 calculates the parallax between the composite left image P3L and the composite right image P3R. The CPU 152 or the ISP 153 also generates a distance image for specifying the distance from the camera device 10 to the target object.
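 For illustration only, the sketch below shows one possible combining step and a disparity computation on the combined pair, assuming rectified 8-bit grayscale images and using OpenCV's semi-global matcher as a stand-in stereo matcher; the specific combining rule (per-pixel maximum) and the matcher parameters are assumptions, not the method fixed by the present disclosure.

```python
import cv2
import numpy as np

def combine_and_match(p1l, p2l, p1r, p2r):
    """Combine per-side images, then compute disparity on the combined pair.

    p1l/p2l: left images under patterns P1/P2; p1r/p2r: right images.
    All are assumed to be rectified 8-bit grayscale arrays of equal size.
    """
    # One simple combining rule: keep the brighter value per pixel so that
    # the dots from both patterns remain visible (an assumption for this sketch).
    p3l = np.maximum(p1l, p2l)   # composite left image P3L
    p3r = np.maximum(p1r, p2r)   # composite right image P3R

    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=7)
    # OpenCV returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(p3l, p3r).astype(np.float32) / 16.0
    return p3l, p3r, disparity
```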
 Here, assume that a situation in which a placed object (for example, a white vase) is placed on a background object (for example, a white table) that is an area having substantially no texture, and a situation in which a placed object (for example, a red apple) is placed on a background object (for example, a white table) that is an area having substantially no texture, are each imaged using a twin-lens camera such as a stereo camera. In this case, the captured image of the white vase and the white table has fewer colors, that is, fewer image features (feature points), than the captured image of the red apple and the white table. Therefore, when the white vase and the white table are imaged, the parallax for each of the white vase and the white table cannot be calculated accurately, and it has been difficult to accurately recognize the distance from the camera device to the target objects (the white vase and the white table).
 To address this problem, the camera device 10 according to Embodiment 1 alternately emits the first structured illumination LG1 having the illumination pattern P1 provided with a plurality of dots (see, for example, FIG. 5) and the second structured illumination LG2 having the illumination pattern P2 different from the illumination pattern P1. Furthermore, the camera device 10 generates the composite left image P3L obtained by combining the left images (that is, the first left image P1L and the second left image P2L) and the composite right image P3R obtained by combining the right images (that is, the first right image P1R and the second right image P2R), and calculates the parallax. The parallax is therefore calculated in a state in which the image features on the composite left image P3L and on the composite right image P3R have apparently increased through the combining processing. Thereby, the camera device 10 can generate a distance image with which the distance to a target object having substantially no texture can be recognized with high accuracy.
 Further, as shown in FIG. 2B, the ratio between the minute time Δta and the minute time Δtb may be set so that the interval from the illumination pattern P1 to the illumination pattern P2 is made shorter than in FIG. 2A, for example set to the shortest imaging interval (for example, 1/480 [second] if the frame rate is 480 [fps]), and the interval from the illumination pattern P2 to the illumination pattern P1 is made longer than in FIG. 2A. This shortens the temporal blur of a target object (for example, a moving object) appearing in the distance image generated by irradiation with the illumination pattern P1 and the illumination pattern P2, and improves the accuracy of the distance image generated based on the image combining.
 Next, an example of a first operation procedure of the camera device 10 according to Embodiment 1 will be described with reference to FIG. 3. FIG. 3 is a flowchart showing an example of the first operation procedure of the camera device 10. In the description of FIG. 3, an example will be described in which the illumination with the illumination pattern P1 and the imaging under that illumination, and the illumination with the illumination pattern P2 and the imaging under that illumination, are each executed asynchronously.
 In FIG. 3, the illumination control unit 16 of the camera device 10 switches to the first irradiation mode at, for example, time t1 (see FIG. 2) and emits the illumination with the illumination pattern P1 (the first structured illumination LG1) from the illumination unit 17 toward the target object (step St1). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the target object while the first structured illumination LG1 is being emitted from the illumination unit 17 in step St1 (step St2). In step St2, two pieces of imaging data having parallax with respect to each other (for example, the first left image P1L and the first right image P1R) are obtained. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2. The CPU 152 or the ISP 153 of the camera device 10 waits for the minute time Δta to elapse from the execution timing of step St1 or the execution timing of step St2 (step St3).
 The illumination control unit 16 of the camera device 10 switches to the second irradiation mode at, for example, time t2 (see FIG. 2) and emits the illumination with the illumination pattern P2 (the second structured illumination LG2) from the illumination unit 17 toward the target object (step St4). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the target object while the second structured illumination LG2 is being emitted from the illumination unit 17 in step St4 (step St5). In step St5, two pieces of imaging data having parallax with respect to each other (for example, the second left image P2L and the second right image P2R) are obtained. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
 The CPU 152 or the ISP 153 of the camera device 10 combines one of the two pieces of imaging data having parallax with respect to each other captured in step St2 (for example, the first left image P1L) with one of the two pieces of imaging data having parallax with respect to each other captured in step St5 (for example, the second left image P2L) (step St6). Similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two pieces of imaging data captured in step St2 (for example, the first right image P1R) with the other of the two pieces of imaging data captured in step St5 (for example, the second right image P2R) (step St6). Furthermore, the CPU 152 or the ISP 153 of the camera device 10 calculates the parallax between the two composite images obtained by the combining (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and outputs it to the PC 50 (step St6).
 The CPU 152 or the ISP 153 of the camera device 10 waits for the minute time Δtb to elapse from the execution timing of step St4, the execution timing of step St5, or the execution timing of step St6 (step St7). After step St7, the processing of the camera device 10 returns to step St1.
 Note that in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may then combine the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, thereby generating a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object, and may output it to the PC 50.
 Also, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and may further generate a distance image from the parallax based on the second left image P2L and the second right image P2R. The CPU 152 or the ISP 153 of the camera device 10 may then generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the target object by combining these two distance images, and may output it to the PC 50.
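 As a sketch of these alternative orderings only, the following shows one simple way two per-pattern parallax (or distance) maps could be merged into one map; the fill-in rule (prefer the map from pattern P1 and use the map from pattern P2 where the first is invalid) is an assumption for illustration and is not prescribed by the present disclosure.

```python
import numpy as np

def merge_maps(map_p1: np.ndarray, map_p2: np.ndarray,
               invalid_value: float = 0.0) -> np.ndarray:
    """Merge two per-pattern disparity or distance maps into one.

    Where the map computed under pattern P1 is invalid (e.g. no match found),
    the value from the map computed under pattern P2 is used instead.
    """
    merged = map_p1.copy()
    use_p2 = merged == invalid_value
    merged[use_p2] = map_p2[use_p2]
    return merged
```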
Next, a second example of the operation procedure of the camera device 10 according to the first embodiment will be described with reference to FIG. 4. FIG. 4 is a flowchart showing an example of the second operation procedure of the camera device. In the description of FIG. 4, an example will be described in which the illumination with the illumination pattern P1 and the imaging under that illumination, and the illumination with the illumination pattern P2 and the imaging under that illumination, are each executed in synchronization. In the description of FIG. 4, the same step numbers are assigned to the same processes as those in FIG. 3 and their description is simplified or omitted, and only the differing contents are described.
In FIG. 4, the illumination control unit 16 of the camera device 10 switches to the first irradiation mode at, for example, time t1 (see FIG. 2) and causes the illumination unit 17 to irradiate the object with the illumination of the illumination pattern P1 (the first structured illumination LG1) (step St1A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is being irradiated from the illumination unit 17 in step St1A (step St1A). That is, in step St1A, two pieces of image data having a parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2. The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
The illumination control unit 16 of the camera device 10 switches to the second irradiation mode at, for example, time t2 (see FIG. 2) and causes the illumination unit 17 to irradiate the object with the illumination of the illumination pattern P2 (the second structured illumination LG2) (step St4A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is being irradiated from the illumination unit 17 in step St4A (step St4A). That is, in step St4A, two pieces of image data having a parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
The CPU 152 or the ISP 153 of the camera device 10 combines one of the two parallax image data captured in step St1A (for example, the first left image P1L) with the corresponding one of the two parallax image data captured in step St4A (for example, the second left image P2L) (step St6). Similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two parallax image data captured in step St1A (for example, the first right image P1R) with the other of the two parallax image data captured in step St4A (for example, the second right image P2R) (step St6). The CPU 152 or the ISP 153 of the camera device 10 then calculates the parallax between the two composite images obtained by these combinations (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and outputs it to the PC 50 (step St6).
The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
Note that in step St6, the CPU 152 or the ISP 153 of the camera device 10 may instead calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may then combine the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, thereby generating a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and may output it to the PC 50.
Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may generate one distance image from the parallax based on the first left image P1L and the first right image P1R, and another distance image from the parallax based on the second left image P2L and the second right image P2R. The CPU 152 or the ISP 153 of the camera device 10 may then combine these two distance images to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and may output it to the PC 50.
Next, setting examples of the first structured illumination LG1 and the second structured illumination LG2 emitted by the illumination unit 17 will be described with reference to FIGS. 5, 6, and 7. FIG. 5 is a diagram showing an example of a UI setting screen used when illuminating with a plurality of different illumination patterns. FIG. 6 is a diagram showing an example of a UI setting screen used when the same illumination pattern is irradiated with its position shifted. FIG. 7 is a diagram showing an example of a UI setting screen used when the dot pitch is reduced by a designated factor before irradiation.
The UI setting screen WD1 shown in FIG. 5 is displayed on the display (not shown) of the PC 50 in response to a user operation. The PC 50 displays, on the UI setting screen WD1, a pattern image EX1 showing the illumination pattern P1 corresponding to the first structured illumination LG1 and a pattern image EX2 showing the illumination pattern P2 corresponding to the second structured illumination LG2. When the pattern image EX1 and the pattern image EX2 are designated and the OK button BT1 is pressed by a user operation, the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1 and sets the pattern image EX2 as the illumination pattern P2 corresponding to the second structured illumination LG2. The first irradiation mode is a mode in which a pattern in which a plurality of dots (an example of an irradiation figure) are arranged at a first interval (see the pattern image EX1 in FIG. 5) is irradiated. The second irradiation mode is a mode in which a pattern in which a plurality of dots (an example of an irradiation figure) are arranged at a second interval (see the pattern image EX2 in FIG. 5) is irradiated. The pattern image EX1 and the pattern image EX2 are both patterns in which a plurality of white dots are irregularly arranged on a black background, but they differ in at least one of the positions, the spacing, and the number of the dots. The object is thus irradiated with two different types of structured illumination each consisting of a plurality of white dots on a black background. As a result, even for an object with substantially no texture, such as a white vase placed on a white table, the images captured during the irradiation contain the structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD1 of FIG. 5 and therefore contain more image features than images captured without structured illumination, so the accuracy of the parallax calculation for the object (for example, the white table and the white vase) improves and the accuracy of generating the distance image of the object also improves. Note that the pattern image EX1 of the illumination pattern P1 and the pattern image EX2 of the illumination pattern P2 may be provided in units of frames as shown in FIG. 5, in units of the lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
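As one hedged illustration of what such dot patterns might look like in practice (the text does not prescribe how they are generated), a pseudo-random dot pattern image with a given minimum spacing could be produced as follows; the resolution, dot radius, dot count, and spacing values are assumptions.

```python
import numpy as np

def random_dot_pattern(width=1280, height=720, n_dots=800, min_gap=12, radius=2, seed=0):
    """Hedged sketch: white dots irregularly placed on a black background,
    rejecting candidates closer than min_gap pixels to an existing dot.
    All numeric values are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    pattern = np.zeros((height, width), dtype=np.uint8)
    centers = []
    while len(centers) < n_dots:
        x, y = rng.integers(0, width), rng.integers(0, height)
        if all((x - cx) ** 2 + (y - cy) ** 2 >= min_gap ** 2 for cx, cy in centers):
            centers.append((x, y))
            y0, y1 = max(0, y - radius), min(height, y + radius + 1)
            x0, x1 = max(0, x - radius), min(width, x + radius + 1)
            pattern[y0:y1, x0:x1] = 255  # draw the dot as a small white square
    return pattern

# Illustrative use: EX1 and EX2 differ in the spacing and the number of dots.
ex1 = random_dot_pattern(n_dots=800, min_gap=12, seed=1)
ex2 = random_dot_pattern(n_dots=1200, min_gap=8, seed=2)
```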
In the following description, the dots in the pattern image are merely one example of the irradiation figure, and the irradiation figure is not limited to dots (points or minute circles). The irradiation figure may be, for example, a polygon as well as a circle.
The set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155. The CPU 152 or the ISP 153 sets, in the illumination control unit 16, the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2. As a result, the illumination unit 17 can irradiate the first structured illumination LG1 having the illumination pattern P1 (see the pattern image EX1) and can irradiate the second structured illumination LG2 having the illumination pattern P2 (see the pattern image EX2). When a pattern image is to be re-designated, the NG button BT2 is pressed by a user operation, for example.
The UI setting screen WD2 shown in FIG. 6 is displayed on the display (not shown) of the PC 50 in response to a user operation. The PC 50 displays, on the UI setting screen WD2, a pattern image EX1 showing the illumination pattern P1 corresponding to the first structured illumination LG1 and a pattern image EX1A showing the illumination pattern P2 corresponding to the second structured illumination LG2, which is obtained by shifting the pattern image EX1 by Δm on the two-dimensional coordinate axes (see FIG. 6). The designation of Δm is confirmed, for example, by designating the pattern image EX1 by a user operation, entering the value (numerical value) of Δm into the designation field FLD1 by a user operation, and then pressing the OK button BT1. After this confirmation, the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1 and sets the pattern image EX1A as the illumination pattern P2 corresponding to the second structured illumination LG2. The first irradiation mode is a mode in which a pattern in which a plurality of dots (an example of an irradiation figure) are arranged at a third interval (see the pattern image EX1 in FIG. 6) is irradiated. The second irradiation mode is a mode in which a pattern obtained by shifting the plurality of dots (an example of an irradiation figure) arranged at the third interval by a predetermined amount (see the pattern image EX1A in FIG. 6) is irradiated. As with the relationship between the pattern image EX1 and the pattern image EX2, the pattern image EX1 and the pattern image EX1A are both patterns in which a plurality of white dots are irregularly arranged on a black background, but they differ in the positions, the spacing, and the number of the dots. Therefore, in the setting example of FIG. 6 as well, the object is irradiated with two different types of structured illumination each consisting of a plurality of white dots on a black background. As a result, even for an object with substantially no texture, such as a white vase placed on a white table, the images captured during the irradiation contain the structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD2 of FIG. 6 and therefore contain more image features than images captured without structured illumination, so the accuracy of the parallax calculation for the object (for example, the white table and the white vase) improves and the accuracy of generating the distance image of the object also improves. Note that the pattern image EX1 of the illumination pattern P1 and the pattern image EX1A of the illumination pattern P2 may be provided in units of frames as shown in FIG. 6, in units of the lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
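A minimal sketch of the position-shift variant follows, assuming the shift Δm is applied as an integer pixel offset along both axes of the frame-unit pattern image (the exact shift convention is not fixed in the text).

```python
import numpy as np

def shifted_pattern(ex1, delta_m=5):
    """Hedged sketch: derive pattern image EX1A by shifting EX1 by delta_m pixels
    on both coordinate axes. np.roll wraps dots around the frame edge; whether the
    real device wraps or crops the shifted pattern is an open assumption here."""
    return np.roll(ex1, shift=(delta_m, delta_m), axis=(0, 1))

# Illustrative use with the random_dot_pattern() sketch shown earlier.
# ex1a = shifted_pattern(ex1, delta_m=5)
```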
The set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155. The CPU 152 or the ISP 153 sets, in the illumination control unit 16, the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2. As a result, the illumination unit 17 can irradiate the first structured illumination LG1 having the illumination pattern P1 (see the pattern image EX1) and can irradiate the second structured illumination LG2 having the illumination pattern P2 (see the pattern image EX1A). When a pattern image is to be re-designated, the NG button BT2 is pressed by a user operation, for example.
The UI setting screen WD3 shown in FIG. 7 is displayed on the display (not shown) of the PC 50 in response to a user operation. The PC 50 displays, on the UI setting screen WD3, a pattern image EX1 showing the illumination pattern P1 corresponding to the first structured illumination LG1 and a pattern image EX1B showing the illumination pattern P2 corresponding to the second structured illumination LG2, which is obtained by reducing the pitch between any two dots in the pattern image EX1 by a factor of Δr. Where d1 is the pitch between any two dots in the pattern image EX1 and d2 is the corresponding reduced dot pitch in the pattern image EX1B, Δr = (d2/d1). The designation of Δr is confirmed, for example, by designating the pattern image EX1 by a user operation, entering the value (numerical value) of Δr into the designation field FLD2 by a user operation, and then pressing the OK button BT1. After this confirmation, the PC 50 sets the pattern image EX1 as the illumination pattern P1 corresponding to the first structured illumination LG1 and sets the pattern image EX1B as the illumination pattern P2 corresponding to the second structured illumination LG2. The first irradiation mode is a mode in which a pattern in which a plurality of dots (an example of an irradiation figure) having a first size are arranged (see the pattern image EX1 in FIG. 7) is irradiated. The second irradiation mode is a mode in which a pattern in which a plurality of dots (an example of an irradiation figure) having a second size smaller than the first size are arranged (see the pattern image EX1B in FIG. 7) is irradiated. As with the relationship between the pattern image EX1 and the pattern image EX2 or the pattern image EX1A, the pattern image EX1 and the pattern image EX1B are both patterns in which a plurality of white dots are irregularly arranged on a black background, but they differ in the positions and the number of the dots. Therefore, in the setting example of FIG. 7 as well, the object is irradiated with two different types of structured illumination each consisting of a plurality of white dots on a black background. As a result, even for an object with substantially no texture, such as a white vase placed on a white table, the images captured during the irradiation contain the structured illumination of the illumination patterns P1 and P2 set on the UI setting screen WD3 of FIG. 7 and therefore contain more image features than images captured without structured illumination, so the accuracy of the parallax calculation for the object improves and the accuracy of generating the distance image of the object also improves. Note that the pattern image EX1 of the illumination pattern P1 and the pattern image EX1B of the illumination pattern P2 may be provided in units of frames as shown in FIG. 7, in units of the lines constituting a frame, or in units of blocks each consisting of a plurality of pixels.
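A minimal sketch of the pitch-reduction variant follows, assuming Δr = d2/d1 is applied by scaling the dot coordinates toward the pattern centre, which is one way among several to shrink every inter-dot pitch by the same factor; the helper names and values are hypothetical.

```python
import numpy as np

def pitch_scaled_pattern(dot_centers, delta_r=0.5, frame_shape=(720, 1280), radius=2):
    """Hedged sketch: derive pattern image EX1B by moving every dot of EX1 toward
    the frame centre so that the pitch between any two dots shrinks by delta_r
    (delta_r = d2 / d1 < 1). dot_centers is an (N, 2) array of (y, x) positions."""
    centers = np.asarray(dot_centers, dtype=np.float64)
    frame_center = np.array([frame_shape[0] / 2.0, frame_shape[1] / 2.0])
    scaled = frame_center + delta_r * (centers - frame_center)

    pattern = np.zeros(frame_shape, dtype=np.uint8)
    for y, x in scaled.astype(int):
        y0, y1 = max(0, y - radius), min(frame_shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(frame_shape[1], x + radius + 1)
        pattern[y0:y1, x0:x1] = 255
    return pattern
```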
The set contents are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155. The CPU 152 or the ISP 153 sets, in the illumination control unit 16, the illumination pattern P1 corresponding to the first structured illumination LG1 and the illumination pattern P2 corresponding to the second structured illumination LG2. As a result, the illumination unit 17 can irradiate the first structured illumination LG1 having the illumination pattern P1 (see the pattern image EX1) and can irradiate the second structured illumination LG2 having the illumination pattern P2 (see the pattern image EX1B). When a pattern image is to be re-designated, the NG button BT2 is pressed by a user operation, for example.
The camera device 10 may also control the illumination pattern P2 corresponding to the second structured illumination LG2 so that it is changed dynamically based on the result of the processing performed by the AI processing unit 154 on the image captured while the object is being irradiated with the first structured illumination LG1. Examples of this control will be described with reference to the set of FIG. 8 and FIGS. 9A/9B, and the set of FIG. 10 and FIG. 11, respectively.
FIG. 8 is a diagram showing an example of an operation outline and an example of a UI setting screen in the case of controlling the illumination area using the processing result of the AI processing unit 154. FIG. 9A is a flowchart showing an example of a third operation procedure of the camera device 10. FIG. 9B is a flowchart showing another example of the third operation procedure of the camera device 10. In the description of FIGS. 9A and 9B, the illumination with the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the illumination with the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed in synchronization. In the description of FIG. 9B, processes that overlap with those described for FIG. 9A are assigned the same step numbers and their description is simplified or omitted, and only the differing contents are described.
Before structured illumination is irradiated from the illumination unit 17 of the camera device 10, the captured image IMG0 captured by either the first imaging module CAP1 or the second imaging module CAP2 (for example, the first imaging module CAP1) shows a background object BKG1 with substantially no texture and a workpiece OB1. Because the captured image IMG0 from the first imaging module CAP1 and the second imaging module CAP2 contains few image features, it is difficult to accurately grasp the distance from the camera device 10 to the workpiece OB1.
Therefore, as shown in FIG. 8, the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the first structured illumination LG1 (for example, the illumination pattern P1 corresponding to the pattern image EX1) is being irradiated from the illumination unit 17. This imaging yields the captured image IMG1. The AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform area determination processing for determining the area of the workpiece OB1 in the captured image IMG1. The CPU 152 or the ISP 153 of the camera device 10 sets the third irradiation mode in the illumination control unit 16. That is, the CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination processing by the AI processing unit 154, the irradiation area to be irradiated with the second structured illumination LG2 (an example of first light), and controls the illumination control unit 16 so that the determined irradiation area is irradiated with the second structured illumination LG2. Further, based on the result of the area determination processing by the AI processing unit 154, the CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the area not corresponding to the area of the workpiece OB1 in the captured image IMG1 is irradiated with second light (an example of light different from the first light). The illumination control unit 16 causes the illumination unit 17 to irradiate the second structured illumination LG2 with its irradiation range narrowed (restricted) to the determined irradiation area only, and causes the illumination unit 17 to irradiate the area other than that irradiation area with second light (for example, white light) that is not structured illumination.
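One hedged way to picture this third irradiation mode is to build the next projector frame from a segmentation mask: structured dots only inside the workpiece area, plain uniform light elsewhere. The mask, pattern, and intensity values below are assumptions for illustration only.

```python
import numpy as np

def area_limited_pattern(dot_pattern, workpiece_mask, white_level=128):
    """Hedged sketch of the third irradiation mode: keep the structured dot
    pattern only inside the AI-determined workpiece area and emit uniform
    (non-structured) light elsewhere. dot_pattern and workpiece_mask are
    same-sized arrays; workpiece_mask is a boolean segmentation result."""
    projector_frame = np.full(dot_pattern.shape, white_level, dtype=np.uint8)  # second light (e.g. white)
    projector_frame[workpiece_mask] = dot_pattern[workpiece_mask]              # LG2 restricted to the area
    return projector_frame
```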
The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the second structured illumination LG2 (for example, the part of the illumination pattern P1 corresponding to the pattern image EX1 that corresponds to the area of the irradiation area) is being irradiated from the illumination unit 17. This imaging yields the captured image IMG2. In the captured image IMG2, within the entire object in the angle of view, there are an image region CT1 of the portion irradiated with the second structured illumination LG2 and an image region CT0 of the portion not irradiated with the second structured illumination LG2. The CPU 152 or the ISP 153 of the camera device 10 combines one captured image IMG1 from the first imaging module CAP1 with one captured image IMG2 from the second imaging module CAP2 so that the workpiece OB1 overlaps, thereby generating one composite image, and combines the other captured image IMG1 from the first imaging module CAP1 with the other captured image IMG2 from the second imaging module CAP2 so that the workpiece OB1 overlaps, thereby generating the other composite image. Based on the two composite images, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance from the camera device 10 to the workpiece OB1. The CPU 152 or the ISP 153 of the camera device 10 may also recognize the workpiece OB1 (for example, a screw) from the two composite images, generate a recognition result screen WD4 in which the size and the distance of the workpiece OB1 (for example, a screw) are specified from the recognition result, and send it to the PC 50. The PC 50 displays the recognition result screen WD4 sent from the camera device 10.
The recognition result screen WD4 includes at least, for example, an image of the image region CT1 of the portion irradiated with the second structured illumination LG2 and a display area DTL1 for detailed information on the workpiece OB1 (for example, a screw). A frame WK1 indicating the shape of the workpiece OB1 (for example, a screw) is also shown in the image of the image region CT1. This frame WK1 is added by the CPU 152 or the ISP 153 of the camera device 10. The display area DTL1 shows, as the detailed information on the workpiece OB1 (for example, a screw), the size of the recognized workpiece OB1 (for example, 100 mm x 50 mm) and its distance from the camera device 10 (for example, 30 cm).
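The kind of size read-out shown in the display area DTL1 could, under a simple pinhole-camera assumption, be derived from the distance image and the pixel extent of the recognized frame WK1; the focal length value and helper function below are hypothetical and not part of this disclosure.

```python
import numpy as np

def workpiece_size_mm(depth_m, bbox, focal_px=1400.0):
    """Hedged sketch: estimate the physical width/height of the workpiece from
    its bounding box (x0, y0, x1, y1) in pixels and the median distance inside
    the box, using the pinhole relation size = pixels * Z / f."""
    x0, y0, x1, y1 = bbox
    region = depth_m[y0:y1, x0:x1]
    z = float(np.median(region[region > 0]))          # representative distance in metres
    width_mm = (x1 - x0) * z / focal_px * 1000.0
    height_mm = (y1 - y0) * z / focal_px * 1000.0
    return width_mm, height_mm, z * 100.0             # size in mm, distance in cm
```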
In FIG. 9A, the illumination control unit 16 of the camera device 10 switches to the third irradiation mode at, for example, time t1 (see FIG. 2) and causes the illumination unit 17 to irradiate the object with the illumination of the illumination pattern P1 (the first structured illumination LG1) (step St1A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is being irradiated from the illumination unit 17 in step St1A (step St1A). That is, in step St1A, two pieces of image data having a parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2. The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
The AI processing unit 154 of the camera device 10 also uses AI (for example, a trained model for segmentation) to perform area determination processing for determining the area of the workpiece OB1 in the captured image IMG1 (step St11). The CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination processing by the AI processing unit 154, the irradiation area to be irradiated with the second structured illumination LG2 (step St12).
The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the irradiation area determined in step St12 is irradiated with the second structured illumination LG2 and the area other than that irradiation area is irradiated with second light (for example, white light). The illumination control unit 16 switches to the second irradiation mode at, for example, time t2 (see FIG. 2), causes the illumination unit 17 to irradiate the second structured illumination LG2 with its irradiation range narrowed (restricted) to the irradiation area determined in step St12 only, and causes the illumination unit 17 to irradiate the area other than that irradiation area with second light (for example, white light) (step St4A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is being irradiated from the illumination unit 17 in step St4A (step St4A). That is, in step St4A, two pieces of image data having a parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
The CPU 152 or the ISP 153 of the camera device 10 combines one of the two parallax image data captured in step St1A (for example, the first left image P1L) with the corresponding one of the two parallax image data captured in step St4A (for example, the second left image P2L) (step St6). Similarly, the CPU 152 or the ISP 153 of the camera device 10 combines the other of the two parallax image data captured in step St1A (for example, the first right image P1R) with the other of the two parallax image data captured in step St4A (for example, the second right image P2R) (step St6). The CPU 152 or the ISP 153 of the camera device 10 then calculates the parallax between the two composite images obtained by these combinations (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and outputs it to the PC 50 (step St6).
The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
Note that in step St6, the CPU 152 or the ISP 153 of the camera device 10 may instead calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may then combine the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, thereby generating a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and may output it to the PC 50.
Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 of the camera device 10 may generate one distance image from the parallax based on the first left image P1L and the first right image P1R, and another distance image from the parallax based on the second left image P2L and the second right image P2R. The CPU 152 or the ISP 153 of the camera device 10 may then combine these two distance images to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and may output it to the PC 50.
As shown in FIG. 9B, the AI processing unit 154 of the camera device 10 may perform the area determination processing for determining the area of the workpiece OB1 in the captured image IMG1 by using AI (for example, a trained model for segmentation) that takes the distance image generated in step St6 as its input (step St11A). The CPU 152 or the ISP 153 of the camera device 10 may determine, based on the result of the area determination processing by the AI processing unit 154, the irradiation area to be irradiated with the second structured illumination LG2 (step St12A). After step St12A, the processing of the camera device 10 returns to step St1A.
FIG. 10 is a diagram showing an example of an operation outline and an example of a UI setting screen in the case of controlling the illumination intensity using the processing result of the AI processing unit 154. FIG. 11A is a flowchart showing an example of a fourth operation procedure of the camera device 10. FIG. 11B is a flowchart showing another example of the fourth operation procedure of the camera device 10. In the description of FIGS. 11A and 11B, the illumination with the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the illumination with the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed in synchronization. In the description of FIG. 11B, processes that overlap with those described for FIG. 11A are assigned the same step numbers and their description is simplified or omitted, and only the differing contents are described.
Before structured illumination is irradiated from the illumination unit 17 of the camera device 10, the captured image IMG0 captured by either the first imaging module CAP1 or the second imaging module CAP2 (for example, the first imaging module CAP1) shows a background object BKG1 with substantially no texture and a workpiece OB1. Because the captured image IMG0 from the first imaging module CAP1 and the second imaging module CAP2 contains few image features, it is difficult to accurately grasp the distance from the camera device 10 to the workpiece OB1.
Therefore, as shown in FIG. 10, the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the first structured illumination LG1 (for example, the illumination pattern P1 corresponding to the pattern image EX1) is being irradiated from the illumination unit 17. This imaging yields the captured image IMG1. The AI processing unit 154 of the camera device 10 uses AI (for example, a trained model for segmentation) to perform area determination processing for determining the area of the workpiece OB1 in the captured image IMG1. The CPU 152 or the ISP 153 of the camera device 10 sets the fourth irradiation mode in the illumination control unit 16. That is, the CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination processing by the AI processing unit 154 (for example, the luminance within the area of the workpiece OB1), a first irradiation intensity (for example, luminance) of the second structured illumination LG2 and the irradiation area to be irradiated with the second structured illumination LG2. The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the determined irradiation area is irradiated with the second structured illumination LG2 having the determined first irradiation intensity (for example, a first luminance). Further, based on the result of the area determination processing by the AI processing unit 154, the CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the area not corresponding to the area of the workpiece OB1 in the captured image IMG1 is irradiated with the second structured illumination LG2 at a second irradiation intensity (an example of an irradiation intensity different from the first irradiation intensity). The illumination control unit 16 causes the illumination unit 17 to irradiate the determined irradiation area of the workpiece OB1 with the second structured illumination LG2 at the first irradiation intensity, and causes the illumination unit 17 to irradiate the area other than that irradiation area with the second structured illumination LG2 at the second irradiation intensity (for example, a second luminance different from the first luminance). For example, when the background object has a color close to black, its reflectance is low, so the recognition accuracy of the workpiece may deteriorate if only the captured image IMG1 captured during irradiation with the first structured illumination LG1 is used. Therefore, the camera device 10 aims to improve the recognition accuracy of the workpiece by irradiating the second structured illumination LG2, whose irradiation intensity (for example, luminance) is raised relative to that of the first structured illumination LG1.
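As a hedged illustration only, the luminance-driven intensity decision of the fourth irradiation mode could look roughly like this: measure the mean brightness inside the workpiece area of IMG1 and raise the projector drive level for that area when the scene reflects little light. The target value, gain limits, and mapping below are assumptions.

```python
import numpy as np

def choose_irradiation_intensity(img1, workpiece_mask, base_level=0.5,
                                 target_mean=0.45, max_level=1.0):
    """Hedged sketch of the fourth irradiation mode: pick the first irradiation
    intensity for the workpiece area from its observed brightness in IMG1
    (pixel values in [0, 1]); the area outside keeps a different, fixed level."""
    mean_in_area = float(np.mean(img1[workpiece_mask]))
    # Darker (low-reflectance) workpiece area -> drive the pattern brighter.
    gain = target_mean / max(mean_in_area, 1e-3)
    first_intensity = float(np.clip(base_level * gain, 0.0, max_level))
    second_intensity = base_level            # intensity outside the workpiece area
    return first_intensity, second_intensity
```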
The first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 image the workpiece OB1 and the background object while the second structured illumination LG2 is being irradiated from the illumination unit 17. This imaging yields the captured image IMG3. The captured image IMG3 captures the object within the same angle of view as the captured image IMG1, and is an image captured under the second structured illumination LG2, which has an irradiation intensity (for example, luminance) different from that of the first structured illumination LG1. The CPU 152 or the ISP 153 of the camera device 10 combines one captured image IMG1 from the first imaging module CAP1 with one captured image IMG3 from the second imaging module CAP2 so that the workpiece OB1 overlaps, thereby generating one composite image, and combines the other captured image IMG1 from the first imaging module CAP1 with the other captured image IMG3 from the second imaging module CAP2 so that the workpiece OB1 overlaps, thereby generating the other composite image. Based on the two composite images, the CPU 152 or the ISP 153 of the camera device 10 generates a distance image for specifying the distance from the camera device 10 to the workpiece OB1. The CPU 152 or the ISP 153 of the camera device 10 may also generate a UI setting screen WD5 for setting the irradiation intensity (for example, luminance) of the second structured illumination LG2 by a user operation, and send it to the PC 50. The PC 50 displays the UI setting screen WD5 sent from the camera device 10.
The UI setting screen WD5 shown in FIG. 10 is displayed on the display (not shown) of the PC 50 when it is sent from the camera device 10. The PC 50 displays the UI setting screen WD5 including a designation field FLD3 for prompting the user to enter the relative ratio of the irradiation intensity (for example, luminance) of the second structured illumination LG2, an OK button BT1, and an NG button BT2. The designation of the relative ratio of the illumination intensity is confirmed by entering it into the designation field FLD3 by a user operation and pressing the OK button BT1. After this confirmation, the PC 50 sets parameter values respectively indicating the irradiation intensity (for example, luminance) of the first structured illumination LG1 and the irradiation intensity (for example, luminance) of the second structured illumination LG2.
The set parameter values are input from the PC 50 to the CPU 152 or the ISP 153 via the external interface 155. The CPU 152 or the ISP 153 sets, in the illumination control unit 16, the irradiation intensity (for example, luminance) of the first structured illumination LG1 and the pattern image corresponding to its illumination pattern, as well as the irradiation intensity (for example, luminance) of the second structured illumination. As a result, the illumination unit 17 can irradiate the first structured illumination LG1 having the illumination pattern P1 (see the pattern image EX1) and can irradiate the second structured illumination LG2 having the same illumination pattern P1 but a different irradiation intensity (for example, luminance). When the relative ratio of the illumination intensity is to be re-designated, the NG button BT2 is pressed by a user operation, for example.
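A minimal sketch of how the relative ratio entered in the designation field FLD3 might be turned into the two intensity parameters follows, assuming the ratio expresses LG2 relative to LG1 (the exact convention is not fixed in the text); all values are illustrative.

```python
def intensities_from_ratio(lg1_intensity=0.5, relative_ratio=1.5, max_level=1.0):
    """Hedged sketch: derive the LG2 irradiation intensity from the LG1 intensity
    and the relative ratio set on UI setting screen WD5, clamped to the
    projector's drive range. All values here are illustrative assumptions."""
    lg2_intensity = min(lg1_intensity * relative_ratio, max_level)
    return lg1_intensity, lg2_intensity
```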
In FIG. 11A, the illumination control unit 16 of the camera device 10 switches to the fourth irradiation mode at, for example, time t1 (see FIG. 2) and causes the illumination unit 17 to irradiate the object with the illumination of the illumination pattern P1 (the first structured illumination LG1) (step St1A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is being irradiated from the illumination unit 17 in step St1A (step St1A). That is, in step St1A, two pieces of image data having a parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2. The CPU 152 or the ISP 153 of the camera device 10 waits (stands by) for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
The AI processing unit 154 of the camera device 10 also uses AI (for example, a trained model for segmentation) to perform area determination processing for determining the area of the workpiece OB1 in the captured image IMG1 (step St11). The CPU 152 or the ISP 153 of the camera device 10 determines, based on the result of the area determination processing by the AI processing unit 154, the first irradiation intensity (for example, luminance) of the second structured illumination LG2 and the irradiation area to be irradiated with the second structured illumination LG2 (step St21). Further, the CPU 152 or the ISP 153 of the camera device 10 determines a composition coefficient indicating the composition ratio of the captured image IMG1 captured during irradiation with the first structured illumination LG1 and the image captured during irradiation with the second structured illumination LG2 (step St21). This composition coefficient may be one stored in the memory 151 in advance, or may be determined dynamically based on the luminance within the area of the workpiece OB1 in the captured image IMG1.
The CPU 152 or the ISP 153 of the camera device 10 controls the illumination control unit 16 so that the irradiation area determined in step St21 is irradiated with the second structured illumination LG2 having the determined first irradiation intensity (for example, luminance) and the area other than that irradiation area is irradiated with the second structured illumination LG2 at the second irradiation intensity (for example, luminance). The illumination control unit 16 causes the illumination unit 17 to irradiate the determined irradiation area with the second structured illumination LG2 at the determined first irradiation intensity (for example, luminance), and causes the illumination unit 17 to irradiate the area other than that irradiation area with the second structured illumination LG2 at the second irradiation intensity (for example, luminance) (step St4A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is being irradiated from the illumination unit 17 in step St4A (step St4A). That is, in step St4A, two pieces of image data having a parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
 Using the synthesis coefficient determined in step St21, the CPU 152 or the ISP 153 of the camera device 10 combines one of the two parallax images captured in step St1A (for example, the first left image P1L) with one of the two parallax images captured in step St4A (for example, the second left image P2L) (step St6). Similarly, using the synthesis coefficient determined in step St21, the CPU 152 or the ISP 153 combines the other of the two parallax images captured in step St1A (for example, the first right image P1R) with the other of the two parallax images captured in step St4A (for example, the second right image P2R) (step St6). The CPU 152 or the ISP 153 then calculates the parallax between the two composite images obtained by the composition (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and outputs it to the PC 50 (step St6).
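 One possible realization of step St6 is sketched below: the LG1 and LG2 captures for each eye are alpha-blended with the synthesis coefficient, a stereo disparity is computed on the composite pair, and the disparity is converted to a distance image with the pinhole relation Z = f·B/d. OpenCV block matching stands in here for the device's own disparity computation, and the focal length and baseline are assumed to be known from calibration; none of this is prescribed by the disclosure.

```python
import cv2
import numpy as np

def fuse_and_range(p1l, p2l, p1r, p2r, alpha, focal_px, baseline_m):
    """Sketch of step St6: blend the LG1/LG2 captures per eye, then estimate range.

    p1l, p2l : left-eye captures under illumination patterns P1 and P2 (grayscale uint8)
    p1r, p2r : right-eye captures under P1 and P2 (grayscale uint8)
    alpha    : synthesis coefficient decided in step St21
    focal_px : focal length in pixels; baseline_m : stereo baseline in metres
    """
    left = cv2.addWeighted(p1l, alpha, p2l, 1.0 - alpha, 0)    # composite left image P3L
    right = cv2.addWeighted(p1r, alpha, p2r, 1.0 - alpha, 0)   # composite right image P3R

    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disp = stereo.compute(left, right).astype(np.float32) / 16.0  # disparity in pixels

    depth = np.zeros_like(disp)
    valid = disp > 0
    depth[valid] = focal_px * baseline_m / disp[valid]            # distance image in metres
    return depth
```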
 The CPU 152 or the ISP 153 of the camera device 10 waits for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
 Note that, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 may then combine the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, thereby generating a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and output it to the PC 50.
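 A minimal sketch of this alternative, in which the two per-pattern disparity maps are fused rather than the images themselves, is shown below. The fusion rule (average where both maps are valid, otherwise take the valid one) is an assumption; the disclosure only states that the two parallaxes are combined.

```python
import numpy as np

def merge_disparities(disp_p1, disp_p2):
    """Fuse the disparity maps obtained under illumination patterns P1 and P2.

    disp_p1, disp_p2 : float disparity maps, 0 or negative marking invalid pixels.
    Pixels valid in both maps are averaged; pixels valid in only one map take that
    map's value; the rest remain 0 (unknown).
    """
    v1, v2 = disp_p1 > 0, disp_p2 > 0
    merged = np.zeros_like(disp_p1, dtype=np.float32)
    both = v1 & v2
    merged[both] = 0.5 * (disp_p1[both] + disp_p2[both])
    merged[v1 & ~v2] = disp_p1[v1 & ~v2]
    merged[v2 & ~v1] = disp_p2[v2 & ~v1]
    return merged
```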
 Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1, and similarly the parallax between the second left image P2L and the second right image P2R captured during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 may generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and a further distance image from the parallax based on the second left image P2L and the second right image P2R. The CPU 152 or the ISP 153 may then combine these two distance images to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and output it to the PC 50.
 Note that, as shown in FIG. 11B, the AI processing unit 154 of the camera device 10 may perform region determination processing to determine the region of the workpiece OB1 in the captured image IMG1 using AI (for example, a trained model for segmentation) that takes the distance image generated in step St6 as an input (step St11A). Based on the result of the region determination processing by the AI processing unit 154, the CPU 152 or the ISP 153 of the camera device 10 may determine the first irradiation intensity (for example, brightness) of the second structured illumination LG2 and the irradiation area to be illuminated with the second structured illumination LG2 (step St21A). After step St21A, the processing of the camera device 10 returns to step St1A.
 The CPU 152 or the ISP 153 of the camera device 10 may also set the fifth irradiation mode in the illumination control unit 16. That is, the CPU 152 or the ISP 153 may control the illumination pattern P2 corresponding to the second structured illumination LG2 so that it is changed dynamically based on the result of generating a distance image from a captured image taken while the object is irradiated with the first structured illumination LG1. An example of this control is described with reference to FIG. 12. FIG. 12 is a flowchart showing an example of the fifth operation procedure of the camera device 10. In the description of FIG. 12, the illumination with the first structured illumination LG1 and the imaging under the first structured illumination LG1, and the illumination with the second structured illumination LG2 and the imaging under the second structured illumination LG2, are each executed in synchronization.
 In FIG. 12, the illumination control unit 16 of the camera device 10 switches to the fifth irradiation mode at, for example, time t1 (see FIG. 2), and causes the illumination unit 17 to irradiate the object with the illumination pattern P1 (first structured illumination LG1) (step St1A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the first structured illumination LG1 is being irradiated from the illumination unit 17 in step St1A (step St1A). That is, in step St1A, two pieces of captured image data having parallax with each other (for example, the first left image P1L and the first right image P1R) are obtained in synchronization with the irradiation of the first structured illumination LG1 from the illumination unit 17. For example, the first left image P1L is obtained from the first imaging module CAP1, and the first right image P1R is obtained from the second imaging module CAP2. The CPU 152 or the ISP 153 of the camera device 10 waits for a minute time Δta to elapse from the execution timing of step St1A (step St3A).
 The CPU 152 or the ISP 153 of the camera device 10 also calculates the parallax based on the two parallax images captured during irradiation with the first structured illumination LG1 in step St1A (for example, the first left image P1L and the first right image P1R), and generates a distance image for specifying the distance to the object (step St31). The CPU 152 or the ISP 153 measures the distance from the camera device 10 to the object based on the distance image generated in step St31 (step St32). The accuracy of the distance measured in step St32 does not need to be high.
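 The coarse distance readout of steps St31 and St32 could, for example, be a robust statistic over the valid pixels of the LG1-only distance image, as in the sketch below. Using a median over a central window is an assumption made here for illustration; the text only requires that this first measurement need not be highly accurate.

```python
import numpy as np

def rough_object_distance(depth_img, roi=None):
    """Sketch of steps St31/St32: coarse distance readout from the LG1-only depth image.

    depth_img : depth map generated from the P1L/P1R pair (0 marks invalid pixels)
    roi       : optional (y0, y1, x0, x1) window; defaults to the central quarter of the image
    Returns the median depth over valid pixels, or None if no pixel is valid.
    """
    h, w = depth_img.shape
    y0, y1, x0, x1 = roi if roi is not None else (h // 4, 3 * h // 4, w // 4, 3 * w // 4)
    window = depth_img[y0:y1, x0:x1]
    valid = window[window > 0]
    return float(np.median(valid)) if valid.size else None
```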
 The CPU 152 or the ISP 153 of the camera device 10 determines the irradiation area for the second structured illumination LG2 based on the distance to the object measured in step St32. The CPU 152 or the ISP 153 controls the illumination control unit 16 to irradiate the irradiation area determined in step St12 with the second structured illumination LG2. The illumination control unit 16 causes the illumination unit 17 to irradiate the second structured illumination LG2 with the irradiation range narrowed (restricted) to only the determined irradiation area (step St4A). Each of the first imaging module CAP1 and the second imaging module CAP2 of the camera device 10 images the object while the second structured illumination LG2 is being irradiated from the illumination unit 17 in step St4A (step St4A). That is, in step St4A, two pieces of captured image data having parallax with each other (for example, the second left image P2L and the second right image P2R) are obtained in synchronization with the irradiation of the second structured illumination LG2 from the illumination unit 17. For example, the second left image P2L is obtained from the first imaging module CAP1, and the second right image P2R is obtained from the second imaging module CAP2.
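 The following sketch shows one way the measured distance could be turned into a restricted LG2 irradiation window: an assumed physical object size is projected to a pixel footprint via the pinhole model, and the pattern is masked outside that window. The object-size parameter and the centring on the image centre are assumptions; the disclosure does not specify how the area is derived from the distance.

```python
import numpy as np

def irradiation_window(distance_m, obj_size_m, focal_px, img_shape, center=None):
    """Sketch of restricting the LG2 pattern to where the object can be.

    distance_m : coarse distance from step St32 (metres)
    obj_size_m : assumed physical extent of the object (metres) -- a hypothetical parameter
    focal_px   : focal length in pixels; img_shape : (H, W) of the projector/pattern image
    Returns a boolean mask that keeps LG2 only inside the computed window.
    """
    h, w = img_shape
    cy, cx = center if center is not None else (h // 2, w // 2)
    half = int(0.5 * focal_px * obj_size_m / max(distance_m, 1e-6))  # pinhole footprint
    mask = np.zeros((h, w), dtype=bool)
    mask[max(cy - half, 0):min(cy + half, h), max(cx - half, 0):min(cx + half, w)] = True
    return mask
```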
 The CPU 152 or the ISP 153 of the camera device 10 combines one of the two parallax images captured in step St1A (for example, the first left image P1L) with one of the two parallax images captured in step St4A (for example, the second left image P2L) (step St6). Similarly, the CPU 152 or the ISP 153 combines the other of the two parallax images captured in step St1A (for example, the first right image P1R) with the other of the two parallax images captured in step St4A (for example, the second right image P2R) (step St6). The CPU 152 or the ISP 153 then calculates the parallax between the two composite images obtained by the composition (for example, the composite left image P3L and the composite right image P3R). Using the parallax calculation result, the CPU 152 or the ISP 153 generates a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and outputs it to the PC 50 (step St6).
 The CPU 152 or the ISP 153 of the camera device 10 waits for a minute time Δtb to elapse from the execution timing of step St4A (step St7A). After step St7A, the processing of the camera device 10 returns to step St1A.
 Note that, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1. Similarly, the CPU 152 or the ISP 153 may calculate the parallax between the second left image P2L captured by the first imaging module CAP1 and the second right image P2R captured by the second imaging module CAP2 during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 may then combine the parallax based on the first left image P1L and the first right image P1R with the parallax based on the second left image P2L and the second right image P2R, thereby generating a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and output it to the PC 50.
 Alternatively, in step St6, the CPU 152 or the ISP 153 of the camera device 10 may calculate the parallax between the first left image P1L captured by the first imaging module CAP1 and the first right image P1R captured by the second imaging module CAP2 during irradiation with the illumination pattern P1, and similarly the parallax between the second left image P2L and the second right image P2R captured during irradiation with the illumination pattern P2. The CPU 152 or the ISP 153 may generate a distance image from the parallax based on the first left image P1L and the first right image P1R, and a further distance image from the parallax based on the second left image P2L and the second right image P2R. The CPU 152 or the ISP 153 may then combine these two distance images to generate a distance image for specifying the distance between the reference point of the camera device 10 (see above) and the object, and output it to the PC 50.
 As described above, the camera device 10 according to Embodiment 1 includes: an illumination unit 17 that irradiates an object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode; an illumination control unit 16 that controls the illumination unit 17; an imaging unit (for example, the first imaging module CAP1 and the second imaging module CAP2) that generates a first captured image of the object irradiated with light in the first irradiation mode and a second captured image of the object irradiated with light in the second irradiation mode; and a generation unit (for example, the CPU 152 or the ISP 153) that generates, based on the first captured image and the second captured image, a distance image for specifying the distance between a reference point and the object. Since the camera device 10 uses two types of structured illumination with different illumination patterns, it can recognize the distance to the object with high accuracy.
 The imaging unit includes a first imaging module CAP1 that images the object from a first viewpoint and a second imaging module CAP2 that images the object from a second viewpoint different from the first viewpoint. The generation unit generates the distance image based on a first composite image (for example, the composite left image P3L) obtained by combining the first captured image taken from the first viewpoint and the second captured image taken from the first viewpoint, and a second composite image (for example, the composite right image P3R) obtained by combining the first captured image taken from the second viewpoint and the second captured image taken from the second viewpoint. Through this composition, the camera device 10 effectively increases the image features on the composite left image P3L and the composite right image P3R when calculating the parallax, and can therefore generate a distance image that enables the distance to the object to be recognized with high accuracy.
 The first irradiation mode is a mode of irradiating a pattern (for example, pattern image EX1) in which a plurality of irradiation figures are arranged at a first interval. The second irradiation mode is a mode of irradiating a pattern (for example, pattern image EX2) in which a plurality of irradiation figures are arranged at a second interval different from the first interval. By using the first structured illumination LG1 and the second structured illumination LG2, which have two illumination patterns with different dot arrangements, the camera device 10 can improve the accuracy of parallax calculation for the object and, in turn, the accuracy of generating the distance image of the object.
 The first irradiation mode may also be a mode of irradiating a pattern (for example, pattern image EX1) in which a plurality of irradiation figures are arranged at a third interval, and the second irradiation mode a mode of irradiating a pattern obtained by shifting, by a predetermined amount, the plurality of irradiation figures arranged at the third interval. With the simple method of shifting one illumination pattern by a predetermined amount in a predetermined direction, the camera device 10 can use the first structured illumination LG1 and the second structured illumination LG2 having two illumination patterns with different dot arrangements, improving the accuracy of parallax calculation for the object and the accuracy of generating the distance image of the object.
 The first irradiation mode may also be a mode of irradiating a pattern in which irradiation figures having a first size are arranged, and the second irradiation mode a mode of irradiating a pattern in which irradiation figures having a second size smaller than the first size are arranged. With the simple method of reducing the pitch between any two dots in one illumination pattern by a predetermined factor, the camera device 10 can use the first structured illumination LG1 and the second structured illumination LG2 having two illumination patterns with different dot arrangements, improving the accuracy of parallax calculation for the object and the accuracy of generating the distance image of the object.
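 To make the three pattern variants above concrete, the sketch below generates a simple dot pattern whose pitch (first versus second interval), dot radius (first versus second size), and global shift (the shifted-pattern mode) can each be varied. The generator and its parameter values are illustrative only and are not taken from the disclosure.

```python
import numpy as np

def dot_pattern(shape, pitch, radius, shift=(0, 0)):
    """Generate a structured-illumination dot pattern (illustrative only).

    shape  : (H, W) of the projector image
    pitch  : spacing between dot centres in pixels (first vs. second interval)
    radius : dot radius in pixels (first vs. second size)
    shift  : small non-negative (dy, dx) offset applied to every dot centre (shifted-pattern mode)
    Returns a float image in [0, 1]; 1 marks illuminated pixels.
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    img = np.zeros(shape, dtype=np.float32)
    for cy in range(pitch // 2 + shift[0], h, pitch):
        for cx in range(pitch // 2 + shift[1], w, pitch):
            img[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 1.0
    return img

# Example: a wide-pitch pattern for LG1 and a denser, smaller-dot variant for LG2.
# p1 = dot_pattern((480, 640), pitch=32, radius=3)
# p2 = dot_pattern((480, 640), pitch=16, radius=2)
```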
 The camera device 10 further includes an AI processing unit 154 that determines the region of the object in an image captured under the first structured illumination LG1. The illumination control unit 16 causes the illumination unit 17 to execute a third irradiation mode in which the region of the object is irradiated with first light and a region not corresponding to the object is irradiated with second light different from the first light. As a result, even when the object is a movable body, the camera device 10 can dynamically determine the region of the object in the image captured under the first structured illumination LG1 and irradiate the appropriate area with the second structured illumination LG2.
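 A minimal sketch of this third irradiation mode is shown below: the segmentation mask selects between two projector images, one used as the first light over the workpiece and another as the second light elsewhere. How the two lights actually differ (pattern, wavelength, or intensity) is left open by the disclosure, so they are simply two different pattern images here.

```python
import numpy as np

def third_mode_pattern(work_mask, pattern_a, pattern_b):
    """Compose the projector image for the third irradiation mode.

    work_mask : boolean mask of the object region from the AI segmentation
    pattern_a : projector image used as the "first light" over the object region
    pattern_b : projector image used as the "second light" over the remaining area
    Both patterns must have the same shape as work_mask.
    """
    return np.where(work_mask, pattern_a, pattern_b)
```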
 The camera device 10 further includes an AI processing unit 154 that determines the region of the object in an image captured under the first structured illumination LG1. The illumination control unit 16 causes the illumination unit 17 to execute a fourth irradiation mode in which the region of the object is irradiated at a first irradiation intensity determined based on the brightness of the region of the object in the image, and a region not corresponding to the object is irradiated at a second irradiation intensity different from the first irradiation intensity. As a result, even when a background object has a color close to black, for example, the camera device 10 can improve the recognition accuracy of the workpiece by irradiating the second structured illumination LG2 at a higher irradiation intensity (for example, brightness) than the first structured illumination LG1.
 The illumination control unit 16 also causes the illumination unit 17 to execute a fifth irradiation mode in which the distance between the reference point and the object is obtained from a first distance image based on the first captured image taken from the first viewpoint (for example, corresponding to the distance image generated in step St31 of FIG. 12), and illumination is applied to an irradiation area determined based on the distance between the reference point and the object. As a result, without using AI, the camera device 10 can measure the distance to the object from the above-described first distance image generated using only the irradiation of the first structured illumination LG1, and change the irradiation area of the second structured illumination LG2 using this measurement result. Therefore, even when the object is a movable body, the camera device 10 can roughly grasp the distance and position of the object from the imaging result under the first structured illumination LG1, and can appropriately irradiate the second structured illumination LG2 while narrowing it to the area where the object is likely to be present.
 The irradiation figure is either a circle or a polygon. This allows the camera device 10 to increase the image features of the image captured under the first structured illumination LG1 and of the image captured under the second structured illumination LG2, and to generate a highly accurate distance image for specifying the distance to the object. The elements included in the camera device 10 of the present embodiment may also be operated as a system. For example, the system 100 may be operated with the first imaging module CAP1 and the second imaging module CAP2 mounted in a device different from the one containing the elements other than the first imaging module CAP1 and the second imaging module CAP2.
 Although various embodiments have been described above with reference to the drawings, it goes without saying that the present disclosure is not limited to these examples. It is clear that those skilled in the art can conceive of various changes, modifications, substitutions, additions, deletions, and equivalents within the scope of the claims, and it is understood that these naturally fall within the technical scope of the present disclosure. The components in the various embodiments described above may also be combined as desired without departing from the spirit of the invention.
 This application is based on Japanese Patent Application No. 2022-059664 filed on March 31, 2022, the contents of which are incorporated herein by reference.
 The present disclosure is useful as a camera device, an image generation method, and a system that recognize the distance to an object with high accuracy.
10 Camera device
11 L lens
12 R lens
13 L image sensor
14 R image sensor
15 SoC
16 Illumination control unit
17 Illumination unit
50 PC
100 System
151 Memory
152 CPU
153 ISP
154 AI processing unit
155 External interface
CAP1 First imaging module
CAP2 Second imaging module

Claims (11)

  1.  A camera device comprising:
     an illumination unit that irradiates an object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode;
     an illumination control unit that controls the illumination unit;
     an imaging unit that generates a first captured image obtained by imaging the object irradiated with light in the first irradiation mode and a second captured image obtained by imaging the object irradiated with light in the second irradiation mode; and
     a generation unit that generates, based on the first captured image and the second captured image, a distance image for specifying a distance between a reference point and the object.
  2.  The camera device according to claim 1, wherein
     the imaging unit includes a first imaging module that images the object from a first viewpoint and a second imaging module that images the object from a second viewpoint different from the first viewpoint, and
     the generation unit generates the distance image based on a first composite image obtained by combining the first captured image captured from the first viewpoint and the second captured image captured from the first viewpoint, and a second composite image obtained by combining the first captured image captured from the second viewpoint and the second captured image captured from the second viewpoint.
  3.  The camera device according to claim 1, wherein
     the first irradiation mode is a mode of irradiating a pattern in which a plurality of irradiation figures are arranged at a first interval, and
     the second irradiation mode is a mode of irradiating a pattern in which a plurality of irradiation figures are arranged at a second interval different from the first interval.
  4.  The camera device according to claim 1, wherein
     the first irradiation mode is a mode of irradiating a pattern in which a plurality of irradiation figures are arranged at a third interval, and
     the second irradiation mode is a mode of irradiating a pattern in which the plurality of irradiation figures arranged at the third interval are shifted by a predetermined amount.
  5.  The camera device according to claim 1, wherein
     the first irradiation mode is a mode of irradiating a pattern in which irradiation figures having a first size are arranged, and
     the second irradiation mode is a mode of irradiating a pattern in which irradiation figures having a second size smaller than the first size are arranged.
  6.  The camera device according to claim 1, further comprising an AI processing unit that determines a region of the object in an image captured under the first irradiation mode, wherein
     the illumination control unit causes the illumination unit to execute a third irradiation mode in which the region of the object is irradiated with first light and a region not corresponding to the object is irradiated with second light different from the first light.
  7.  The camera device according to claim 1, further comprising an AI processing unit that determines a region of the object in an image captured under the first irradiation mode, wherein
     the illumination control unit causes the illumination unit to execute a fourth irradiation mode in which the region of the object is irradiated at a first irradiation intensity determined based on a brightness of the region of the object in the image, and a region not corresponding to the object is irradiated at a second irradiation intensity different from the first irradiation intensity.
  8.  The camera device according to claim 2, wherein
     the illumination control unit causes the illumination unit to execute a fifth irradiation mode in which a distance between the reference point and the object is obtained from a first distance image based on the first captured image captured from the first viewpoint, and illumination is applied to an irradiation area determined based on the distance between the reference point and the object.
  9.  The camera device according to any one of claims 3 to 5, wherein
     the irradiation figure is either a circle or a polygon.
  10.  An image generation method comprising:
     controlling an illumination unit that irradiates an object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode;
     generating a first captured image obtained by imaging the object irradiated with light in the first irradiation mode and a second captured image obtained by imaging the object irradiated with light in the second irradiation mode; and
     generating, based on the first captured image and the second captured image, a distance image for specifying a distance between a reference point and the object.
  11.  A system comprising:
     an illumination unit that irradiates an object with light in a first irradiation mode and in a second irradiation mode different from the first irradiation mode;
     an illumination control unit that controls the illumination unit;
     an imaging unit that generates a first captured image obtained by imaging the object irradiated with light in the first irradiation mode and a second captured image obtained by imaging the object irradiated with light in the second irradiation mode; and
     a generation unit that generates, based on the first captured image and the second captured image, a distance image for specifying a distance between a reference point and the object.
PCT/JP2023/007444 2022-03-31 2023-02-28 Camera device, image generation method, and system WO2023189125A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022059664 2022-03-31
JP2022-059664 2022-03-31

Publications (1)

Publication Number Publication Date
WO2023189125A1 true WO2023189125A1 (en) 2023-10-05

Family

ID=88200629

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/007444 WO2023189125A1 (en) 2022-03-31 2023-02-28 Camera device, image generation method, and system

Country Status (1)

Country Link
WO (1) WO2023189125A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001091232A (en) * 1999-09-24 2001-04-06 Sony Corp Three-dimensional shape measuring device and method, and recording medium
JP2008292432A (en) * 2007-05-28 2008-12-04 Panasonic Electric Works Co Ltd Three-dimensional measuring method and instrument by space encoding method
JP2011137753A (en) * 2009-12-28 2011-07-14 Canon Inc Measuring system, image correction method, and computer program

Similar Documents

Publication Publication Date Title
EP1491936B1 (en) Focus detecting method of an image pickup device and focus detection mechanism
JP6089436B2 (en) Image processing apparatus, method of operating image processing apparatus, and imaging apparatus
JP7090446B2 (en) Image processing equipment
WO2014174779A1 (en) Motion sensor apparatus having a plurality of light sources
JP2005003385A (en) Image measuring method and image measuring apparatus
JP2013257340A (en) Three-dimensional shape measuring apparatus and method
JP2006292385A (en) System and method for processing visual information
JP6279048B2 (en) Shape measuring device
US20190320886A1 (en) Endoscope apparatus
JP6698451B2 (en) Observation device
JP5593750B2 (en) Image processing method and image processing apparatus
JP2019168286A (en) Image processing device
WO2023189125A1 (en) Camera device, image generation method, and system
JP4158750B2 (en) Autofocus control method, autofocus control device, and image processing device
JP2002031513A (en) Three-dimensional measuring device
CN112437908A (en) Depth sensing robot hand-eye camera using structured light
JP2021158348A (en) Solid-state imaging device and manufacturing method of solid-state imaging device
JP2019158393A (en) Image inspection device
JPH0634345A (en) Optical inspection device
JP2011107573A (en) System, device and method for adjusting optical axis, and program
JP2016178600A (en) Image processing apparatus, image processing method and program
JP7247700B2 (en) Spectral camera device and inspection system
JP2017119146A (en) Image processing device, operation method of image processing device, and imaging device
JP2011039322A (en) Laser projector
JP2010079387A (en) Image processing method and image processor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23779155

Country of ref document: EP

Kind code of ref document: A1