US20230224450A1 - Imaging device for acquiring three-dimensional information of workpiece surface and two-dimensional image of workpiece


Info

Publication number
US20230224450A1
Authority
US
United States
Prior art keywords
workpiece
dimensional
image
conveyor
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/000,933
Other languages
English (en)
Inventor
Zeyuan SUN
Kentaro Koga
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Assigned to FANUC CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KOGA, KENTARO; SUN, ZEYUAN
Publication of US20230224450A1 publication Critical patent/US20230224450A1/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/286: Image signal generators having separate monoscopic and stereoscopic modes
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/023: Optical sensing devices including video camera means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00: Manipulators not otherwise provided for
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformation in the plane of the image
    • G06T3/20: Linear translation of a whole image or part thereof, e.g. panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/593: Depth or shape recovery from multiple images from stereo images
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518: Projection by scanning of the object
    • G01B11/2522: Projection by scanning of the object the position of the object changing and being recorded
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30164: Workpiece; Machine component

Definitions

  • the present invention relates to an imaging device for acquiring three-dimensional information of a surface of a workpiece and a two-dimensional image of the workpiece.
  • in the related art, there is known an apparatus that captures an image with a vision sensor and detects the shape of the surface of an object or the position of the object based on the obtained image. As a vision sensor, a two-dimensional sensor for capturing a two-dimensional image of a surface of a workpiece is known. Further, a three-dimensional sensor for measuring the distance from the vision sensor to the surface of a workpiece is known. The three-dimensional position of a specific part of a workpiece can be calculated based on the distance from the three-dimensional sensor to the surface of the workpiece and the position of the three-dimensional sensor.
  • a robot device including a robot and an operation tool is known in the related art. It is known that a vision sensor is arranged in the robot device in order to detect the position of a workpiece. For example, a robot device grasps a workpiece using a hand and conveys the workpiece. The robot device needs to detect the position and the orientation of a workpiece in order to grasp the workpiece.
  • a control is known for calculating the position and the orientation of a workpiece by processing a two-dimensional image captured by a two-dimensional camera and a distance image captured by a three-dimensional sensor (e.g., Japanese Unexamined Patent Publication No. 2013-36988A).
  • three-dimensional information acquired by the three-dimensional sensor and a two-dimensional image acquired by the two-dimensional sensor are used in some cases.
  • the workpiece conveyed by the conveyor is imaged by a three-dimensional sensor.
  • a three-dimensional map of measurement points is acquired from the output of the three-dimensional sensor.
  • the position and the orientation of the workpiece can be detected based on the three-dimensional map.
  • the position and the orientation of the robot when the robot takes out the workpiece can be calculated.
  • a two-dimensional image can be used in order to specify the position of the workpiece.
  • however, by the time the two-dimensional image is captured, the workpiece has been moved by the conveyor.
  • as a result, a deviation occurs between the position of the workpiece when the three-dimensional map is acquired and the position of the workpiece when the two-dimensional image is captured.
  • acquisition of three-dimensional information by the three-dimensional sensor and acquisition of a two-dimensional image by the two-dimensional sensor are preferably performed in a state where the workpiece is arranged at the same position.
  • imaging by the three-dimensional sensor and imaging by the two-dimensional sensor may be performed simultaneously.
  • however, when the three-dimensional sensor is a stereo camera including two cameras and a projector, the projector projects a reference pattern onto the workpiece, and the reference pattern appears in the two-dimensional image.
  • the three-dimensional information can be acquired by the three-dimensional sensor and the two-dimensional image can be acquired by the two-dimensional sensor in a state where the carrier machine is stopped.
  • the work efficiency is lowered because it is necessary to stop the carrier machine every time the three-dimensional information and the two-dimensional image are acquired by the vision sensors.
  • when the carrier machine is started or stopped, there is a possibility that the position of the workpiece on the carrier machine changes. That is, the position of the workpiece on the carrier machine may shift.
  • a sensor other than the stereo camera can be used as the three-dimensional sensor.
  • a Time of Flight (TOF) camera using the flight time of light can be used as the three-dimensional sensor.
  • with a TOF camera, there is a problem that it is necessary to further arrange a camera for acquiring a two-dimensional image.
  • an aspect of the present disclosure is an imaging device for imaging a workpiece conveyed in a predetermined direction by a carrier machine driven by a motor.
  • the imaging device includes a three-dimensional sensor for detecting three-dimensional information of the surface of a workpiece and a two-dimensional sensor for acquiring a two-dimensional image of the surface of the workpiece.
  • the imaging device includes a processing unit for processing output of the three-dimensional sensor and output of the two-dimensional sensor.
  • the processing unit includes a three-dimensional information generation unit for generating three-dimensional information based on the output of the three-dimensional sensor, and a two-dimensional image acquisition unit for acquiring a two-dimensional image from the two-dimensional sensor.
  • the processing unit includes a movement control unit for changing the relative position of the three-dimensional information with respect to the two-dimensional image in a predetermined coordinate system.
  • the carrier machine includes a position detector for detecting the position of the movement member moved by the motor.
  • the timing for acquiring the three-dimensional information is different from the timing for acquiring the two-dimensional image.
  • the movement control unit calculates the movement amount of the workpiece corresponding to the difference between the first position of the movement member when the three-dimensional information is acquired and the second position of the movement member when the two-dimensional image is acquired.
  • the movement control unit performs control for moving the three-dimensional information in the coordinate system so as to correspond to the movement amount of the workpiece, thereby moving the three-dimensional information of the workpiece into the region of the two-dimensional image of the workpiece.
  • an imaging device for acquiring three-dimensional information on the surface of the workpiece and a two-dimensional image of the workpiece can be provided.
  • FIG. 1 is a front view of a first robot device according to an embodiment.
  • FIG. 2 is a plan view of a first robot device.
  • FIG. 3 is a block diagram of a first robot device.
  • FIG. 4 is a schematic diagram of a vision sensor according to an embodiment.
  • FIG. 5 is a flowchart of a control of a first robot device.
  • FIG. 6 is a front view of a workpiece, a vision sensor, and a conveyor when a three-dimensional map of the workpiece is generated.
  • FIG. 7 is a plan view of a workpiece and a conveyor illustrating measurement points of a three-dimensional map acquired by the vision sensor.
  • FIG. 8 is a front view of a workpiece, a vision sensor, and a conveyor when a two-dimensional image of the workpiece is acquired.
  • FIG. 9 is an example of a two-dimensional image acquired with a vision sensor.
  • FIG. 10 is a plan view of a second robot device according to an embodiment.
  • the robot device includes a robot and an operation tool attached to the robot. Further, the robot device includes an imaging device for imaging a workpiece, and a carrier machine for conveying the workpiece.
  • FIG. 1 is a schematic front view of a first robot device according to the present embodiment.
  • FIG. 2 is a schematic plan view of the first robot device according to the present embodiment.
  • a first robot device 8 includes a hand 2 as an operation tool (end effector) and a robot 1 for moving the hand 2 .
  • the robot 1 according to the present embodiment is an articulated robot including a plurality of joints.
  • the robot 1 includes a base 14 fixed to an installation surface, and a turning base 13 rotating with respect to the base 14 .
  • the robot 1 includes a lower arm 12 rotatably supported by a turning base 13 and an upper arm 11 rotatably supported by the lower arm 12 . Further, the upper arm 11 rotates about a rotation axis parallel to the extending direction of the upper arm 11 .
  • the robot 1 includes a wrist 15 rotatably supported at the end of the upper arm 11 .
  • a rotatable flange 16 is disposed at the tip of the wrist 15 .
  • the robot 1 according to the present embodiment includes six drive axes, but is not limited to this configuration. Any robot that can move an operation tool can be employed.
  • a workpiece 81 according to the present embodiment is a rectangular parallelepiped box.
  • Hand 2 is an operation tool for grasping and releasing the workpiece 81 .
  • the hand 2 according to the present embodiment includes a plurality of absorption pads 2 a .
  • the hand 2 grasps the workpiece 81 by suction.
  • the hand 2 is fixed to the flange 16 of the wrist 15 .
  • the operation tool attached to the robot 1 is not limited to this configuration. Any operation tool can be employed depending on the task performed by the robot device. For example, in a robot device for arc welding, a welding torch can be arranged as the operation tool. Alternatively, in a robot device that applies a sealant to the surface of the workpiece, a dispenser can be arranged as the operation tool.
  • the first robot device 8 includes a conveyor 6 as a carrier machine that conveys the workpiece 81 in a predetermined direction.
  • the conveyor 6 according to the present embodiment rotates an annular belt 6 a .
  • the conveyor 6 moves the workpiece 81 horizontally, as indicated by an arrow 86 .
  • the conveyor 6 conveys the workpiece 81 to a position where the robot 1 changes its position and orientation so that the hand 2 can grasp the workpiece 81 .
  • the robot device 8 grasps the workpiece 81 conveyed by the conveyor 6 , and then moves the workpiece 81 to the target position. For example, the robot device 8 performs the task of stacking the workpiece 81 on the upper side of the pallet.
  • the robot device 8 includes an imaging device 3 for imaging the workpiece 81 conveyed by the conveyor 6 .
  • the imaging device 3 includes a vision sensor 30 as a three-dimensional sensor for detecting three-dimensional information on the surface of the workpiece 81 .
  • position information of three-dimensional measurement points (three-dimensional points) corresponding to the surface of the workpiece 81 as the object is generated from the output of the vision sensor 30 .
  • the vision sensor 30 according to the present embodiment also functions as a two-dimensional sensor for capturing a two-dimensional image of the surface of the workpiece 81 .
  • the vision sensor 30 is supported by a support member 37 .
  • the position of the vision sensor 30 according to the present embodiment is fixed.
  • the vision sensor 30 is disposed at a position where the vision sensor 30 can image the workpiece 81 conveyed by the conveyor 6 .
  • the vision sensor 30 is disposed upstream of the robot 1 in the direction where the workpiece 81 is conveyed.
  • the position and the orientation of the workpiece 81 are detected based on the three-dimensional information acquired from the output of the vision sensor 30 .
  • the position and the orientation of the robot 1 for grasping the workpiece 81 by the hand 2 are calculated based on the position and the orientation of the workpiece 81 .
  • the vision sensor 30 images the workpiece 81 without stopping the conveyor 6 . That is, the vision sensor 30 images the workpiece 81 while the conveyor 6 moves the workpiece 81 . Further, the robot 1 changes the position and the orientation so as to grasp the workpiece 81 by hand 2 while the conveyor 6 moves the workpiece 81 . The workpiece 81 is picked up from the conveyor 6 by the robot 1 changing the position and the orientation.
  • the robot device 8 is set with the world coordinate system 76 as a reference coordinate system.
  • the origin of the world coordinate system 76 is located in the base 14 of the robot 1 . Even when the position and the orientation of the robot 1 change, the position and the direction of the world coordinate system 76 do not change.
  • the world coordinate system 76 includes an X-axis, a Y-axis, and a Z-axis orthogonal to each other as coordinate axes.
  • the W-axis is set as a coordinate axis around the X-axis.
  • a P-axis is set as a coordinate axis around the Y-axis.
  • An R-axis is set as a coordinate axis around the Z-axis.
  • a tool coordinate system 77 , whose origin is set at an arbitrary position on the operation tool, is set.
  • the origin of the tool coordinate system 77 is set at the tool tip point of hand 2 .
  • when the position and the orientation of the robot 1 change, the position and the orientation of the tool coordinate system 77 change.
  • the position of the robot 1 corresponds to the position of the tool tip point.
  • the orientation of the robot 1 corresponds to the direction of tool coordinate system 77 with respect to the world coordinate system 76 .
  • a sensor coordinate system 78 is set corresponding to the vision sensor 30 .
  • the sensor coordinate system 78 is a coordinate system whose origin is fixed to the vision sensor 30 .
  • the sensor coordinate system 78 according to the present embodiment has a fixed position.
  • the coordinate values in the sensor coordinate system 78 can be converted to the coordinate values in the world coordinate system 76 based on the position and the orientation of the sensor coordinate system 78 with respect to the world coordinate system 76 .
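  • as an illustration of this conversion, the following Python sketch applies a fixed homogeneous transform representing the pose of the sensor coordinate system 78 in the world coordinate system 76 ; the rotation and translation values are assumptions for illustration only and are not part of the original disclosure:

```python
import numpy as np

# Pose of the sensor coordinate system in the world coordinate system.
# R is the rotation of the sensor frame, t is its origin in world coordinates.
# The numerical values below are placeholders, not values from the patent.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0]])   # camera looking straight down
t = np.array([600.0, 0.0, 900.0])  # sensor origin in the world frame [mm]

T_world_sensor = np.eye(4)
T_world_sensor[:3, :3] = R
T_world_sensor[:3, 3] = t

def sensor_to_world(p_sensor):
    """Convert a point (x, y, z) in the sensor frame to the world frame."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous coordinates
    return (T_world_sensor @ p)[:3]

print(sensor_to_world([10.0, 20.0, 850.0]))
```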
  • FIG. 3 is a block diagram of the first robot device according to the present embodiment.
  • the robot 1 includes a robot drive device for changing the position and the orientation of the robot 1 .
  • the robot drive device includes a robot drive motor 22 for driving a component such as an arm and a wrist.
  • the hand 2 includes a hand drive device for driving the hand 2 .
  • the hand drive device includes a pump 21 and a solenoid valve for depressurizing the interior of the absorption pads 2 a of the hand 2 .
  • the controller of the robot device 8 includes a robot controller 4 for controlling the robot 1 and the hand 2 .
  • the robot controller 4 includes an arithmetic processing device (computer) including a Central Processing Unit (CPU) as a processor.
  • the arithmetic processing device includes a Random Access Memory (RAM) and a Read Only Memory (ROM), or the like, connected to the CPU via a bus.
  • An operation program 41 created in advance in order to control the robot 1 , the hand 2 , and the conveyor 6 is input to the robot controller 4 .
  • the robot controller 4 includes a storage part 42 that stores information about the control of the robot 1 , the hand 2 , and the conveyor 6 .
  • the storage part 42 may be constituted by a storage medium capable of storing information, such as a volatile memory, a nonvolatile memory, or a hard disk.
  • the operation program 41 is stored in the storage part 42 .
  • the robot device 8 conveys the workpiece 81 based on the operation program 41 .
  • the robot device 8 can automatically convey the workpiece 81 to a predetermined target position.
  • the robot controller 4 includes an operation control unit 43 that transmits an operation command.
  • the operation control unit 43 corresponds to a processor driven according to the operation program 41 .
  • the processor reads the operation program 41 and performs the control specified in the operation program 41 , thereby functioning as the operation control unit 43 .
  • the operation control unit 43 transmits an operation command for driving the robot 1 to the robot drive part 45 based on the operation program 41 .
  • the robot drive part 45 includes an electrical circuit for driving the robot drive motor 22 .
  • the robot drive part 45 supplies electricity to the robot drive motor 22 based on the operation command.
  • the operation control unit 43 transmits an operation command for driving the hand 2 to the hand drive part 44 based on the operation program 41 .
  • the hand drive part 44 includes an electrical circuit for driving a pump 21 and the solenoid valve.
  • the hand drive part 44 supplies electricity to the pump 21 and the solenoid valve based on the operation command.
  • the operation control unit 43 sends a command, to the vision sensor 30 , for capturing an image based on the operation program 41 .
  • the robot controller 4 includes a display device 46 for displaying information about the control of the hand 2 , the robot 1 , and the conveyor 6 .
  • the display device 46 is composed of an arbitrary display panel such as a liquid crystal display panel.
  • the robot 1 includes a state detector for detecting the position and the orientation of the robot 1 .
  • the state detector according to the present embodiment includes a position detector 23 mounted on a robot drive motor 22 corresponding to a drive axis of a component of the robot 1 .
  • the position detector 23 according to the present embodiment is constituted by an encoder attached to the output shaft of the robot drive motor 22 . The position and the orientation of the robot 1 are detected based on the output of the position detector 23 .
  • the robot controller 4 includes an image processing unit 47 as a processing unit for processing the output of the three-dimensional sensor and the output of the two-dimensional sensor. That is, the robot controller 4 also functions as a device processing an image.
  • the image processing unit 47 includes a three-dimensional information generation unit 61 for generating three-dimensional information based on the output of the three-dimensional sensor.
  • the image processing unit 47 includes a two-dimensional image acquisition unit 62 for acquiring the two-dimensional image from the two-dimensional sensor. Further, the image processing unit 47 includes a movement control unit 63 for changing the relative position of the three-dimensional information with respect to the two-dimensional image, in a predetermined coordinate system.
  • the image processing unit 47 includes a feature portion detection unit 64 for detecting a predetermined feature portion of the workpiece 81 based on a previously created reference image and a two-dimensional image acquired from the two-dimensional sensor.
  • the image processing unit 47 includes a calculation unit 65 for calculating a position and an orientation of the workpiece 81 based on three-dimensional information in a feature portion of the workpiece 81 . Further, the calculation unit 65 calculates the position and the orientation of the robot 1 based on the position and the orientation of the workpiece 81 .
  • the position and the orientation of the robot 1 calculated in the calculation unit 65 are transmitted to the operation control unit 43 .
  • the operation control unit 43 controls the robot 1 and the hand 2 based on an operation command received from the calculation unit 65 .
  • each of the three-dimensional information generation unit 61 , the two-dimensional image acquisition unit 62 , the movement control unit 63 , the feature portion detection unit 64 , and the calculation unit 65 of the image processing unit 47 corresponds to a processor driven according to the operation program 41 .
  • the processor performs the control defined in the operation program 41 so as to function as each unit.
  • the controller of the robot device 8 includes a conveyor controller 5 for controlling a conveyor 6 .
  • the conveyor controller 5 includes an arithmetic processing device (computer) that includes a CPU as a processor, a ROM, and a RAM.
  • the conveyor controller 5 is configured to communicate with the robot controller 4 .
  • the conveyor controller 5 receives a command from the robot controller 4 and drives the conveyor 6 .
  • the conveyor 6 includes a conveyor drive device 26 for driving the conveyor 6 .
  • the conveyor drive device 26 includes a conveyor drive motor 24 as a motor for driving the conveyor 6 and a reduction gear for reducing the rotation speed of the conveyor drive motor 24 .
  • the workpiece 81 is placed on the surface of a belt 6 a as a movement member moved by the conveyor drive motor 24 .
  • the conveyor drive motor 24 rotates the belt 6 a .
  • the conveyor 6 includes a position detector 25 for detecting the position of the belt 6 a .
  • the position of the belt 6 a corresponds to the rotational position of the output shaft of the conveyor drive motor 24 .
  • the position detector 25 according to the present embodiment is attached to the output shaft of the conveyor drive motor 24 .
  • the position detector 25 is composed of an encoder for detecting the rotational position of the output shaft.
  • the output of the position detector 25 is input to the conveyor controller 5 .
  • the position detector of the carrier machine can be disposed at an arbitrary position so as to detect the position of the movement member of the carrier machine.
  • an encoder may be attached to a shaft supporting the belt of the conveyor.
  • a disk may be attached to the encoder and pressed against the belt such that the disk rotates by the movement of the conveyor belt. With this configuration, when the belt moves, the disk rotates, and the position of the belt can be detected by the output of the encoder.
  • the conveyor controller 5 includes a storage part 52 that stores information about the control of the conveyor 6 .
  • the storage part 52 may be constituted by a storage medium capable of storing information, such as a volatile memory, a nonvolatile memory, or a hard disk.
  • the conveyor controller 5 includes an operation control unit 53 that transmits an operation command for the conveyor 6 .
  • a processor of the arithmetic processing device functions as an operation control unit 53 .
  • the conveyor controller 5 includes a conveyor drive part 54 including an electrical circuit that supplies electricity to the conveyor drive motor 24 based on an operation command.
  • the controller of the robot device 8 includes the robot controller 4 for controlling the robot 1 and the hand 2 and the conveyor controller 5 for controlling the conveyor 6 , but is not limited to this configuration.
  • the robot device 8 may be configured to control the robot 1 , the hand 2 , and the conveyor 6 by one controller.
  • FIG. 4 is a schematic view of the vision sensor according to the present embodiment.
  • the vision sensor 30 according to the present embodiment is a stereo camera including a first camera 31 and a second camera 32 .
  • Each of the cameras 31 and 32 is a two-dimensional camera capable of capturing a two-dimensional image.
  • the cameras 31 and 32 may each be any camera including an image sensor such as a Charge-Coupled Device (CCD) sensor or a Complementary Metal-Oxide Semiconductor (CMOS) sensor.
  • the vision sensor 30 includes a projector 33 that projects pattern light, such as a stripe pattern, toward the workpiece 81 .
  • the cameras 31 , 32 and the projector 33 are arranged inside the housing 34 .
  • the projector 33 projects the pattern light, and the cameras 31 and 32 capture two-dimensional images.
  • the three-dimensional information generation unit 61 of the image processing unit 47 can generate three-dimensional information of the surface of the object in the form of a three-dimensional map by processing the images acquired by the vision sensor 30 .
  • the three-dimensional information includes information about the positions of a plurality of measurement points set on the surface of the object.
  • the three-dimensional map represents the position of the surface of the object by a set of coordinate values (x, y, z) of measurement points set on the surface of the object.
  • the image processing unit 47 is included in the robot controller 4 for controlling the robot 1 , but is not limited to this configuration.
  • the arithmetic processing device for processing the image acquired by the vision sensor 30 may be disposed separately from the robot controller.
  • the three-dimensional information generation unit 61 sets a plurality of measurement points on the surface of an object disposed within the imaging range 35 of the vision sensor 30 .
  • the measurement point can be set for each pixel of the two-dimensional image of the camera 31 or the camera 32 .
  • the three-dimensional information generation unit 61 calculates the distance from the vision sensor 30 to the measurement point based on the parallax of the two-dimensional images captured by the two cameras 31 and 32 .
  • the three-dimensional information generation unit 61 can calculate the coordinate value of the measurement point in the sensor coordinate system 78 based on the distance from the vision sensor 30 to the measurement point.
  • the coordinate values in the sensor coordinate system 78 may be converted to the coordinate values in the world coordinate system 76 based on the position and the orientation of the vision sensor 30 .
  • the three-dimensional information generation unit 61 can form a three-dimensional map including coordinate values of a plurality of measurement points.
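  • as a minimal sketch of this process, the following Python example recovers a depth per pixel from the parallax (disparity) of a rectified stereo pair and back-projects it into a set of measurement points; the calibration values, file names, and the OpenCV block-matching routine are illustrative assumptions, since the patent does not prescribe a specific matching method:

```python
import cv2
import numpy as np

# Illustrative calibration for a rectified stereo pair (not from the patent).
FOCAL_PX = 1400.0    # focal length in pixels
BASELINE_MM = 100.0  # distance between the two cameras

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # image from camera 31
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # image from camera 32

# Block matching over the projected stripe pattern yields a disparity per pixel.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> px

# One measurement point per pixel with a valid disparity:
# depth Z = f * B / d, then back-projection through the pinhole model.
v, u = np.nonzero(disparity > 0.0)
z = FOCAL_PX * BASELINE_MM / disparity[v, u]
cx, cy = left.shape[1] / 2.0, left.shape[0] / 2.0
x = (u - cx) * z / FOCAL_PX
y = (v - cy) * z / FOCAL_PX
three_d_map = np.column_stack([x, y, z])  # coordinate values in the sensor frame
print(three_d_map.shape)
```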
  • the vision sensor 30 has a function as a two-dimensional sensor. Since the first camera 31 and the second camera 32 are two-dimensional cameras, a two-dimensional image can be captured by either of cameras 31 or 32 . In the present embodiment, the two-dimensional image is acquired by the first camera 31 .
  • the three-dimensional sensor according to the present embodiment is a stereo camera capable of capturing a two-dimensional image.
  • a two-dimensional image can be acquired by one two-dimensional camera included in the stereo camera.
  • Both the position information of the three-dimensional measurement point and the two-dimensional image can be acquired by one sensor.
  • the configuration of the imaging device can be simplified.
  • the vision sensor 30 acquires a three-dimensional map including the measurement point information of the surface of the workpiece 81 . Thereafter, a two-dimensional image including the same workpiece 81 is captured by the first camera 31 of the vision sensor 30 .
  • the timing of imaging by the vision sensor 30 for acquiring the three-dimensional map and the timing of capturing the two-dimensional image are different from each other.
  • hence, the position of the workpiece 81 when the three-dimensional map is acquired and the position of the workpiece 81 when the two-dimensional image is captured are different from each other.
  • the movement control unit 63 obtains the output of the position detector 25 of the conveyor 6 .
  • the actual movement amount of the workpiece 81 between the time when the three-dimensional map is acquired and the time when the two-dimensional image is captured is calculated.
  • control is then performed to move the three-dimensional information in a predetermined coordinate system so as to correspond to the movement amount of the workpiece 81 and thereby match the two-dimensional image.
  • at least a part of the three-dimensional information of the workpiece and the two-dimensional image of the workpiece can thus be overlapped in this coordinate system.
  • in this way, three-dimensional information and a two-dimensional image equivalent to those acquired at the same time can be generated.
  • FIG. 5 is a flowchart of the control of the robot device according to the present embodiment.
  • FIG. 6 is a schematic front view of the vision sensor, the workpiece, and the conveyor when the three-dimensional map is acquired.
  • the conveyor controller 5 drives the conveyor 6 so as to move workpiece 81 as illustrated by an arrow 86 .
  • the conveyor controller 5 moves the workpiece 81 into the imaging range 35 of the vision sensor 30 .
  • a sensor for detecting the arrival of the workpiece 81 is arranged in the robot device 8 according to the present embodiment.
  • the position detector 25 detects the position at the time when the sensor reacts.
  • the operation control unit 53 calculates the rotational position when the workpiece 81 is placed inside the imaging range 35 based on the above-described position.
  • when the position detector 25 detects this rotational position, the workpiece 81 is located inside the imaging range 35 .
  • the vision sensor 30 images the workpiece 81 .
  • the three-dimensional information generation unit 61 generates three-dimensional information.
  • the three-dimensional information generation unit 61 generates a three-dimensional map by using the two-dimensional images captured by the cameras 31 and 32 .
  • the conveyor controller 5 acquires the first rotational position from the position detector 25 attached to the conveyor drive motor 24 .
  • the conveyor controller 5 acquires the first rotational position simultaneously with imaging the workpiece 81 by the vision sensor 30 .
  • the first rotational position corresponds to a first position of the belt 6 a when the three-dimensional information is acquired.
  • the storage part 52 of the conveyor controller 5 stores the first rotational position.
  • FIG. 7 illustrates a three-dimensional map generated by the three-dimensional information generation unit.
  • FIG. 7 illustrates a plan view of the conveyor 6 and the workpiece 81 .
  • the imaging range 35 of the vision sensor 30 determines the measurement region 71 in which the measurement points are set.
  • the three-dimensional map includes the coordinate values of the positions of a plurality of measurement points PX.
  • the measurement points PX are set on the surface of an object to be imaged inside the measurement region 71 .
  • the measurement region 71 corresponds, for example, to an image captured by the first camera 31 .
  • the measurement points PX are set on the surface of the workpiece 81 , the surface of the conveyor 6 , and the floor surface.
  • a coordinate value of the position of the measurement point PX in a predetermined coordinate system is calculated.
  • the coordinate value of the position of each measurement point PX is calculated in the sensor coordinate system 78 .
  • a plurality of measurement points PX set on the surface of the workpiece 81 are referred to as a group 71 a of the measurement points PX.
  • a plurality of measurement points PX set on the surface of the frame body of the conveyor 6 is called a group 71 b of the measurement points PX.
  • a plurality of measurement points PX set on the surface of the belt 6 a is called a group 71 c of measurement points PX.
  • a plurality of measurement points PX set on the floor surface is called a group 71 d of measurement points PX.
  • FIG. 8 illustrates a schematic front view of the vision sensor, the workpiece, and the conveyor when the two-dimensional image is captured.
  • the position of the workpiece 81 when the three-dimensional map is acquired is indicated by a broken line.
  • the workpiece 81 moves as indicated by the arrow 86 .
  • the first camera 31 of the vision sensor 30 captures a two-dimensional image.
  • the first camera 31 captures an image of the workpiece 81 when the workpiece 81 is arranged in the imaging range 35 .
  • the vision sensor 30 can capture a two-dimensional image immediately after performing imaging for acquiring a three-dimensional map.
  • alternatively, the vision sensor 30 can capture a two-dimensional image after a predetermined time elapses from the imaging for acquiring the three-dimensional map.
  • the two-dimensional image acquisition unit 62 acquires a two-dimensional image from the first camera 31 of the vision sensor 30 .
  • the two-dimensional image may be acquired by the second camera 32 .
  • the conveyor controller 5 acquires the second rotational position from the position detector 25 of the conveyor drive motor 24 .
  • the conveyor controller 5 acquires the second rotational position simultaneously with capturing the two-dimensional image of the workpiece 81 .
  • the second rotational position corresponds to the second position of the belt 6 a when the two-dimensional image is acquired.
  • the storage part 52 stores the second rotational position.
  • FIG. 9 illustrates an example of a two-dimensional image captured by the first camera of the vision sensor.
  • An image 72 is a two-dimensional image.
  • An image coordinate system 79 is set in the image 72 according to the present embodiment.
  • the image coordinate system 79 is a coordinate system whose origin is a predetermined point of the image 72 .
  • the image coordinate system 79 is composed of an X-axis and a Y-axis.
  • the image 72 includes an image 72 a corresponding to the top face of the workpiece 81 .
  • the image 72 includes an image 72 b corresponding to the frame of the conveyor 6 and an image 72 c corresponding to the belt 6 a .
  • in FIG. 9 , when the measurement region 71 is superimposed on the image 72 , the group 71 a of the measurement points corresponding to the top face of the workpiece 81 is illustrated by a broken line.
  • the three-dimensional map and the two-dimensional image are acquired while moving the workpiece 81 on the conveyor 6 .
  • the position of the workpiece 81 when the two-dimensional image is captured is different from the position of the workpiece 81 when the imaging for acquiring the three-dimensional map is performed.
  • the position of the image 72 a of the workpiece 81 in the image 72 is shifted from the position of the group 71 a of the measurement points on the top face of the workpiece 81 .
  • the movement control unit 63 of the image processing unit 47 performs control so as to move the positions of all measurement points of the three-dimensional map as indicated by an arrow 87 .
  • the positions of the measurement points are modified based on the direction in which the workpiece 81 has moved and the movement amount of the workpiece 81 .
  • the movement control unit 63 performs control for matching the position of the measurement point group 71 a set on the top face of the workpiece 81 to the position of the image 72 a.
  • the movement control unit 63 calculates the movement distance of the workpiece 81 on the conveyor 6 as the movement amount of the workpiece 81 .
  • the movement distance of the workpiece 81 corresponds to the length of movement of the belt 6 a .
  • the movement control unit 63 acquires the first rotational position and the second rotational position of the conveyor drive motor 24 from the storage part 52 of the conveyor controller 5 .
  • the movement control unit 63 calculates the movement distance of the workpiece 81 based on the difference between the first rotational position and the second rotational position.
  • the position detector 25 is configured to output a signal at each predetermined rotation angle.
  • the signal output from the position detector 25 corresponds to the rotational position.
  • the position detector 25 detects the first count CT1 as the first rotational position.
  • the position detector 25 detects the second count CT2 as the second rotational position.
  • the movement amount of the belt 6 a (movement amount of the workpiece) relative to the difference of counts is predetermined.
  • the coefficient SC [counts/mm], that is, the number of counts corresponding to a movement amount of 1 mm of the belt 6 a , is predetermined.
  • the movement distance X of the workpiece 81 at this time can be expressed by the following equation (1):

X = (CT2 − CT1) / SC  (1)
  • the movement control unit 63 calculates the movement distance of the workpiece 81 on the conveyor 6 based on the output of the position detector 25 .
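  • a small sketch of the count-to-distance conversion of equation (1); the coefficient and the count values are invented for illustration:

```python
# Equation (1): movement distance from encoder counts.
# SC is the predetermined number of encoder counts per 1 mm of belt travel.
SC = 40.0  # [counts/mm] -- illustrative value, not from the patent

def movement_distance_mm(ct1: int, ct2: int) -> float:
    """Belt (and workpiece) travel between the first and second rotational positions."""
    return (ct2 - ct1) / SC

# First count when the 3D map is acquired, second when the 2D image is captured.
print(movement_distance_mm(12000, 14000))  # -> 50.0 mm
```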
  • the movement control unit 63 modifies the three-dimensional map according to the moving direction of the workpiece 81 and the movement amount of the workpiece 81 .
  • the movement control unit 63 moves the positions of the measurement points included in the three-dimensional map in the moving direction of the workpiece 81 .
  • the moving direction of the workpiece 81 in the sensor coordinate system 78 is predetermined. In this example, the workpiece 81 moves in the direction of the X-axis of the sensor coordinate system 78 .
  • the positions of the measurement points in the measurement region 71 are detected by the coordinate values in the sensor coordinate system 78 .
  • the movement control unit 63 moves the positions of the measurement points included in the three-dimensional map by the movement amount of the workpiece 81 . Further, the movement control unit 63 parallel-translates the positions of the measurement points included in the three-dimensional map in the moving direction of the workpiece 81 .
  • the movement control unit 63 moves the position for each measurement point.
  • the three-dimensional map includes the coordinate values of the measurement point P 1 A located on the surface of the workpiece 81 .
  • the movement control unit 63 moves the measurement point P 1 A by the movement amount of the workpiece 81 in the direction in which the workpiece 81 moves.
  • the measurement point P 1 A moves to the measurement point P 1 B.
  • the X-axis coordinate value of the measurement point P 1 A in the sensor coordinate system 78 is modified by the movement amount of the workpiece 81 .
  • the three-dimensional information includes the coordinate values of the measurement point P 2 A set on the surface of the workpiece 81 .
  • the movement control unit 63 moves the measurement point P 2 A to the measurement point P 2 B in a similar manner.
  • the X-axis coordinate value of the measurement point P 2 A in the sensor coordinate system 78 is modified by the movement amount of the workpiece 81 .
  • the coordinate values of the measurement point can be changed so as to correspond to the movement of the workpiece 81 .
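  • a sketch of this parallel translation, taking the moving direction to be the X-axis of the sensor coordinate system as in the embodiment; the point values are illustrative:

```python
import numpy as np

def shift_three_d_map(points: np.ndarray, distance_mm: float) -> np.ndarray:
    """Parallel-translate every measurement point along the moving direction
    of the workpiece (here the X-axis of the sensor coordinate system)."""
    moved = points.copy()
    moved[:, 0] += distance_mm  # modify only the X-axis coordinate values
    return moved

# E.g., a measurement point P1A at (10, 5, 850) becomes P1B at (60, 5, 850)
# after a 50 mm movement of the belt.
pts = np.array([[10.0, 5.0, 850.0], [12.0, 7.0, 851.0]])
print(shift_three_d_map(pts, 50.0))
```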
  • all measurement points in the measurement region 71 are moved.
  • the moving direction of the measurement point groups 71 a to 71 d in the image 72 is preset.
  • the movement control unit 63 moves the measurement point groups 71 a to 71 d in the Y-axis positive direction in the image coordinate system 79 .
  • the position of the vision sensor 30 is fixed.
  • the ratio of the movement distance of the measurement point in the image coordinate system 79 to the actual movement distance of the workpiece 81 is preset.
  • the movement control unit 63 calculates the movement distance of the measurement point groups 71 a to 71 d in the image coordinate system 79 based on the movement distance of the workpiece 81 .
  • the movement control unit 63 moves the three-dimensional map as indicated by the arrow 87 . At least a part of the measurement points of the group 71 a moves into the region of the image 72 a that is a two-dimensional image of the workpiece 81 . As a result, at least some measurement points of the group 71 a are included in the region of the image 72 a.
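  • a sketch of the corresponding shift in the image coordinate system, assuming a preset ratio of pixels per millimetre of workpiece travel; the ratio is a hypothetical calibration value:

```python
# Preset ratio between movement in the image coordinate system and the
# actual movement of the workpiece; an assumed calibration value.
PX_PER_MM = 2.5

def shift_in_image(points_px, distance_mm):
    """Move projected measurement points in the image coordinate system
    (Y-axis positive direction here, matching the embodiment)."""
    return [(x, y + distance_mm * PX_PER_MM) for (x, y) in points_px]

group_71a = [(320.0, 120.0), (340.0, 125.0)]
print(shift_in_image(group_71a, 50.0))  # each point moves 125 px along Y
```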
  • the feature portion detection unit 64 of the image processing unit 47 detects the feature portion of the workpiece 81 in the image 72 which is a two-dimensional image.
  • the top face of the workpiece 81 is set as a feature portion.
  • the reference image of the top face of the workpiece 81 is previously stored in the storage part 42 .
  • the reference image is a two-dimensional image.
  • the reference image may be an image obtained by actually capturing an image of the top face of the workpiece 81 with the two-dimensional camera.
  • the reference image of the workpiece 81 may be generated based on the three-dimensional data of the workpiece 81 using a Computer Aided Design (CAD) apparatus.
  • CAD Computer Aided Design
  • the feature portion detection unit 64 detects the image 72 a on the top face of the workpiece 81 in the image 72 by a template matching method by using the reference image.
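  • a minimal template-matching sketch using OpenCV; the file names and the acceptance threshold are assumptions, the patent specifying only that a template matching method with a reference image is used:

```python
import cv2

image = cv2.imread("image_72.png", cv2.IMREAD_GRAYSCALE)      # two-dimensional image 72
template = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)  # reference image of the top face

# Normalized cross-correlation; the best match locates the feature portion.
result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:  # illustrative acceptance threshold
    h, w = template.shape
    top_left = max_loc
    bottom_right = (top_left[0] + w, top_left[1] + h)
    print("top face found in region:", top_left, bottom_right)
```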
  • the calculation unit 65 calculates the position and the orientation of the workpiece 81 .
  • as the position of the workpiece 81 , for example, the position of the center of gravity of the rectangle of the top face of the workpiece 81 can be adopted.
  • as the orientation of the workpiece 81 , the direction normal to the top face of the workpiece 81 can be adopted.
  • the calculation unit 65 extracts the measurement points arranged in a region overlapping the image 72 a . In this example, since the group 71 a of the measurement points overlaps the image 72 a , the calculation unit 65 extracts the plurality of measurement points included in the group 71 a .
  • the calculation unit 65 can calculate the position and the orientation of the workpiece 81 based on the coordinate values of the plurality of measurement points.
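  • one way to obtain such a position and orientation is a centroid plus a least-squares plane fit over the extracted measurement points, as sketched below; the patent does not prescribe a specific fitting method:

```python
import numpy as np

def workpiece_pose(points: np.ndarray):
    """Position as the centroid of the extracted measurement points,
    orientation as the normal of a least-squares plane fitted to them."""
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value of the centered
    # points is the normal of the best-fit plane.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal[2] < 0:  # orient the normal upward, away from the belt
        normal = -normal
    return centroid, normal

pts = np.array([[0.0, 0.0, 850.0], [100.0, 0.0, 850.5],
                [0.0, 60.0, 849.5], [100.0, 60.0, 850.0]])
position, orientation = workpiece_pose(pts)
print(position, orientation)
```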
  • the feature portion of the workpiece 81 can be detected in the two-dimensional image, and the position and the orientation of the workpiece 81 can be detected based on the three-dimensional information in the feature portion of the workpiece 81 .
  • in step 98 , the calculation unit 65 calculates the position and the orientation of the robot 1 based on the position and the orientation of the workpiece 81 .
  • the calculation unit 65 calculates the position and the orientation of the workpiece 81 at the time when the workpiece 81 has been moved by the conveyor 6 to the position where the workpiece 81 is held by the hand 2 .
  • the calculation unit 65 calculates the position and the orientation of the robot 1 based on the position and the orientation of the workpiece 81 at that time.
  • in step 99 , the calculation unit 65 transmits commands to the operation control unit 43 for driving the robot 1 and the hand 2 .
  • the operation control unit 43 drives the robot 1 and the hand 2 based on the commands from the calculation unit 65 .
  • the workpiece 81 is held by the hand 2 and conveyed to a target position.
  • the imaging device 3 can synchronize the three-dimensional information with the two-dimensional image. That is, the three-dimensional information can be modified as though it had been acquired at the same time as the two-dimensional image.
  • imaging the workpiece 81 and grasping the workpiece 81 by the hand 2 can be performed while moving the workpiece 81 by the conveyor 6 .
  • in the robot device 8 , it is not necessary to stop the conveyor 6 in order to acquire the three-dimensional information and the two-dimensional image, and the working time can be shortened. In addition, it is possible to prevent the position of the workpiece from shifting on the conveyor 6 when the conveyor 6 stops or starts.
  • the imaging device 3 described above captures a two-dimensional image after acquiring three-dimensional information, but is not limited to this configuration.
  • the imaging device may acquire three-dimensional information after capturing a two-dimensional image.
  • the movement control unit can perform control for moving the position of the measurement point in the three-dimensional information so as to match the position of the two-dimensional image.
  • the carrier machine of the first robot device 8 is a conveyor 6 , but is not limited to this configuration.
  • as the carrier machine, any device that conveys the workpiece 81 in a predetermined direction can be adopted.
  • FIG. 10 is a schematic plan view of a second robot device according to the present embodiment.
  • a carrier 7 as a carrier machine is arranged in the second robot device 9 .
  • the carrier 7 according to the present embodiment is an unmanned traveling carriage that automatically moves along the tape 39 .
  • a carrier controller is arranged in place of the conveyor controller 5 of the first robot device 8 .
  • the carrier controller controls the carrier 7 by radio communication, for example.
  • the carrier 7 conveys the workpiece 81 .
  • the carrier 7 includes a mounting table 7 a on which the workpiece 81 is placed.
  • the carrier 7 includes a sensor for detecting the tape 39 attached to a floor surface.
  • the carrier 7 is configured to move along the tape 39 while detecting the tape 39 with the sensor. That is, the moving direction of the carrier 7 is predetermined.
  • the carrier 7 moves the workpiece 81 horizontally.
  • a vision sensor 30 fixed to a support member 37 is arranged above a path where the carrier 7 moves.
  • the vision sensor 30 is arranged at a position where the workpiece 81 conveyed by the carrier 7 can be imaged.
  • the carrier 7 includes a drive motor for driving wheels.
  • a position detector is attached to the output shaft of the drive motor.
  • the mounting table 7 a corresponds to a movement member moved by the drive motor.
  • the position of the mounting table 7 a may be determined, for example, by a position of a set point set at an arbitrary position of the carrier 7 . Further, the position of the mounting table 7 a corresponds to the rotational position of the output shaft of the drive motor.
  • the position detector may be attached to the shaft of the carrier's wheel.
  • the vision sensor 30 performs imaging while the carrier 7 is moving.
  • the image processing unit 47 acquires the three-dimensional map and the two-dimensional image at different time points. Further, the image processing unit 47 acquires a first rotational position detected by the position detector when the three-dimensional map is acquired and a second rotational position detected by the position detector when the two-dimensional image is acquired.
  • the movement control unit 63 of the image processing unit 47 calculates the movement distance of the workpiece 81 based on the first rotational position and the second rotational position.
  • the movement control unit 63 modifies the positions of the measurement points of the three-dimensional map based on the movement amount of the workpiece 81 .
  • a movement control unit 63 changes the relative position of the three-dimensional information of the workpiece with respect to the two-dimensional image of the workpiece in the image coordinate system.
  • a movement control unit 63 performs control for moving the three-dimensional information of the workpiece into the region of the two-dimensional image of the workpiece.
  • the image processing unit 47 can then detect the position and the orientation of the workpiece 81 based on the three-dimensional map and the two-dimensional image.
  • the reference image is used in order to detect the feature portion of the workpiece, but the embodiment is not limited to this.
  • the reference image need not be used when the robot grasps the workpiece.
  • three-dimensional information such as a three-dimensional map is moved according to the movement amount of the workpiece.
  • the position of the center of gravity of the workpiece is detected based on the three-dimensional information.
  • the position and the orientation of the hand with respect to the position of the center of gravity are predetermined.
  • the position and the orientation of the hand can be calculated based on the position of the center of gravity of the workpiece.
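  • a sketch of this calculation, applying a predetermined offset to the detected center of gravity to obtain the hand position; the offset values are placeholders:

```python
import numpy as np

# Predetermined relation between the hand and the center of gravity of the
# workpiece (placeholder: grasp from 50 mm above the center of gravity).
HAND_OFFSET = np.array([0.0, 0.0, 50.0])

def hand_target_position(center_of_gravity: np.ndarray) -> np.ndarray:
    """Position of the hand calculated from the workpiece's center of gravity."""
    return center_of_gravity + HAND_OFFSET

print(hand_target_position(np.array([650.0, 120.0, 80.0])))
```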
  • the carrier machine conveys the workpiece along a straight path, but is not limited to this configuration.
  • the carrier machine may move the workpiece along a curved path.
  • the imaging device acquires three-dimensional information and two-dimensional images in order to detect the position and the orientation of the workpiece, but is not limited to this configuration.
  • the imaging device can acquire three-dimensional information and two-dimensional images for arbitrary control. Further, the imaging device can synchronize the two-dimensional image with the three-dimensional information by moving the three-dimensional information in a predetermined coordinate system.
US18/000,933 2020-06-18 2021-06-14 Imaging device for acquiring three-dimensional information of workpiece surface and two-dimensional image of workpiece Pending US20230224450A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-105456 2020-06-18
JP2020105456 2020-06-18
PCT/JP2021/022570 WO2021256437A1 (ja) 2021-06-14 Imaging device for acquiring three-dimensional information of workpiece surface and two-dimensional image of workpiece

Publications (1)

Publication Number Publication Date
US20230224450A1 (en) 2023-07-13

Family

ID=79268063

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/000,933 Pending US20230224450A1 (en) 2020-06-18 2021-06-14 Imaging device for acquiring three-dimensional information of workpiece surface and two-dimensional image of workpiece

Country Status (5)

Country Link
US (1) US20230224450A1 (en)
JP (1) JPWO2021256437A1 (ja)
CN (1) CN115803584A (zh)
DE (1) DE112021001323T5 (de)
WO (1) WO2021256437A1 (ja)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114669446B (zh) * 2022-04-06 2023-12-22 宁波九纵智能科技有限公司 Novel vision-guided glue-coating system for evaporators

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6420404B1 (ja) * 2017-04-26 2018-11-07 ファナック株式会社 Object recognition device

Also Published As

Publication number Publication date
CN115803584A (zh) 2023-03-14
WO2021256437A1 (ja) 2021-12-23
JPWO2021256437A1 (ja) 2021-12-23
DE112021001323T5 (de) 2023-02-16

Similar Documents

Publication Publication Date Title
CN109665307B Work system, method for performing work on an article, and robot
JP6823008B2 Robot system for picking up randomly stacked workpieces and method for controlling the robot system
CN109940662B Imaging device provided with a vision sensor for imaging a workpiece
JP5849403B2 Robot controller, robot, and robot system
CN107150032A Workpiece recognition and sorting device and method based on multiple image-acquisition devices
US11904483B2 Work robot system
JP5893695B1 Article conveyance system
CN108274469B Detection method for an anti-collision detection system of a vacuum manipulator based on multi-dimensional vision sensors
US10314220B2 Mounting machine and component detection method
JP5544320B2 Stereoscopic robot picking device
JP2016147330A Control device based on object recognition
JP2002211747A Conveyor device
JP7000361B2 Following robot and work robot system
US20230256615A1 Robot device controller for controlling position of robot
US10328582B2 Process system including robot that transfers workpiece to process machine
US20230224450A1 Imaging device for acquiring three-dimensional information of workpiece surface and two-dimensional image of workpiece
JP6849631B2 Work robot system and work robot
US11161239B2 Work robot system and work robot
CN109916346B Device and method for detecting workpiece flatness based on a vision system
JP5509859B2 Robot control device and method
WO2022075303A1 Robot system
CN204868885U Robot system for manipulating workpieces
JP6889216B2 Work system
JP4077553B2 Electronic component mounting method and electronic component mounting apparatus
US20220341726A1 (en) Height measurement device

Legal Events

Date Code Title Description
AS Assignment

Owner name: FANUC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUN, ZEYUAN;KOGA, KENTARO;REEL/FRAME:062003/0252

Effective date: 20220729

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION