WO2022244212A1 - Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor

Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor

Info

Publication number
WO2022244212A1
WO2022244212A1 (PCT/JP2021/019252)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
focus
imaging
image
workpiece
Prior art date
Application number
PCT/JP2021/019252
Other languages
English (en)
Japanese (ja)
Inventor
祐輝 高橋
Original Assignee
ファナック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ファナック株式会社
Priority to US18/552,899 (published as US20240185455A1)
Priority to JP2023522140A (published as JPWO2022244212A1)
Priority to DE112021007292.7T (published as DE112021007292T5)
Priority to CN202180098103.4A (published as CN117321382A)
Priority to PCT/JP2021/019252 (published as WO2022244212A1)
Priority to TW111115626 (published as TW202246872A)
Publication of WO2022244212A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/571Depth or shape recovery from multiple images from focus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/675Focus control based on electronic image sensor signals comprising setting of focusing regions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Definitions

  • the present invention relates to an imaging device that calculates a three-dimensional position based on an image captured by a visual sensor.
  • a device that detects the three-dimensional position of an object by processing an image obtained by imaging the object with a visual sensor.
  • a device that captures two-dimensional images of an object from two directions and calculates the three-dimensional position of a specific portion (for example, Japanese Unexamined Patent Application Publication No. 2016-706475).
  • a visual sensor called a stereo camera captures images simultaneously with two two-dimensional cameras and calculates the three-dimensional position of a feature point based on the parallax between the two images.
  • a device that calculates such a three-dimensional position can be attached to a robot for moving a work tool that performs a predetermined work.
  • a camera takes an image of the work placed at a predetermined position.
  • the three-dimensional position of the work is detected based on the image captured by the camera.
  • the position and posture of the robot are changed so that the workpiece can be gripped according to the position of the workpiece. With such control, the accurate position of the workpiece can be detected and the work can be reliably carried out.
  • a calculation model is used to convert the position in the image into a 3D position.
  • a computational model includes predetermined parameters such as coefficients and constants.
  • the parameters in the calculation model for calculating the three-dimensional position of the object depend on the situation in which the camera is installed, the characteristics of the lens, and the individual differences of the lens.
  • the parameters can be determined in advance by calculation, experiment, or the like. For example, parameters can be calculated in advance by actually capturing an image of the object after placing the camera at a predetermined position.
  • In a conventional combination of a camera body and a lens, the camera is fixed in place.
  • The position of the camera lens is fixed, so the parameters are calculated in advance and then used.
  • an object captured by a camera may have individual differences.
  • the position of the object may deviate from the desired position when the camera picks up the image.
  • the image may be blurred when imaging the workpiece.
  • the position of at least one of the work and the camera may change.
  • the position and attitude of the work placed on the workbench may change depending on how the work is transported.
  • the robot can change its position and orientation according to the position and orientation of the workpiece.
  • the robot may interfere with obstacles such as fences placed around the robot system.
  • the robot stroke may be limited. For this reason, it may be difficult to align the relative position of the camera with respect to the workpiece to a predetermined position.
  • An imaging device includes a visual sensor that captures an image of an object, and a focus position detection unit that detects a focus position when the visual sensor is in focus.
  • the imaging device includes a parameter setting unit that sets parameters for calculating a three-dimensional position corresponding to a specific position in an image captured by the visual sensor.
  • the imaging device includes a storage unit that stores setting information for setting parameters corresponding to focus positions.
  • the imaging device includes a feature detection unit that detects a predetermined feature portion in the image of the object, and a feature position calculation unit that calculates the three-dimensional position of the feature portion using the parameters set by the parameter setting unit.
  • the parameter setting unit sets parameters based on the focus position and setting information.
  • According to an aspect of the present disclosure, it is possible to provide an imaging device that accurately detects the three-dimensional position of a characteristic portion even when the focus position changes.
  • FIG. 1 is a schematic diagram of a first robot system in an embodiment.
  • FIG. 2 is a block diagram of the first robot system in the embodiment.
  • FIG. 3 is a plan view of a workpiece in the embodiment.
  • FIG. 4 is a schematic diagram explaining the positions on which a camera focuses and the field of view of the camera.
  • FIG. 5 is an example of images when the position to be focused is changed.
  • FIG. 6 is a schematic diagram of the image sensor, the lens, and a feature of the workpiece as the lens position is moved for focusing.
  • FIG. 7 is a flowchart for explaining control of the first robot system.
  • FIG. 8 is a first schematic diagram illustrating another control of the first robot system.
  • FIG. 9 is a second schematic diagram illustrating the other control of the first robot system.
  • FIG. 10 is a schematic diagram of a second robot system in the embodiment.
  • FIG. 11 is a schematic diagram of a transport system in the embodiment.
  • FIG. 12 is a block diagram of the transport system in the embodiment.
  • An imaging device according to an embodiment will be described with reference to FIGS. 1 to 12.
  • The imaging device of this embodiment functions as a three-dimensional position acquisition device that calculates the three-dimensional position of a specific position in an image based on the image captured by a visual sensor.
  • FIG. 1 is a schematic diagram of a first robot system equipped with an imaging device according to this embodiment.
  • FIG. 2 is a block diagram of the first robot system in this embodiment.
  • The robot system of the present embodiment detects the position of a workpiece as an object and conveys the workpiece.
  • the first robot system 3 includes a hand 5 as a working tool that grips a workpiece 38 and a robot 1 that moves the hand 5.
  • the robot system 3 includes a control device 2 that controls the robot system 3 .
  • the robot system 3 also includes a pedestal 95 on which the workpiece 38 is placed.
  • the hand 5 of the present embodiment is a work tool that grips and releases the workpiece 38.
  • the work tool attached to the robot 1 is not limited to this form, and any work tool suitable for the work performed by the robot system 3 can be adopted.
  • a work tool or the like that performs welding can be employed as the end effector.
  • the robot 1 of this embodiment is a multi-joint robot including a plurality of joints 18 .
  • Robot 1 includes an upper arm 11 and a lower arm 12 .
  • the lower arm 12 is supported by a swivel base 13 .
  • a swivel base 13 is supported by a base 14 .
  • Robot 1 includes a wrist 15 connected to the end of upper arm 11 .
  • Wrist 15 includes a flange 16 to which hand 5 is secured.
  • the components of robot 1 are formed to rotate about a predetermined drive axis.
  • the robot 1 is not limited to this form, and any robot capable of moving a working tool can be adopted.
  • the robot 1 of this embodiment includes a robot driving device 21 having a driving motor for driving constituent members such as the upper arm 11 .
  • the hand 5 includes a hand drive device 22 that drives the hand 5 .
  • the hand drive device 22 of this embodiment drives the hand 5 by air pressure.
  • the hand drive 22 includes an air pump and solenoid valve for supplying compressed air to the cylinders.
  • the control device 2 includes a control device main body 40 and a teaching operation panel 26 for operating the control device main body 40 by an operator.
  • the control device body 40 includes an arithmetic processing device (computer) having a CPU (Central Processing Unit) as a processor.
  • the arithmetic processing unit has a RAM (Random Access Memory), a ROM (Read Only Memory), etc., which are connected to the CPU via a bus.
  • the robot 1 is driven based on operation commands from the control device 2 .
  • the robot 1 automatically transports the workpiece 38 based on the operation program 61.
  • the robot driving device 21 and the hand driving device 22 are controlled by the control device 2 .
  • the control device main body 40 includes a storage section 42 that stores arbitrary information regarding the robot system 3 .
  • the storage unit 42 can be configured by a non-temporary storage medium capable of storing information.
  • the storage unit 42 can be configured with a storage medium such as a volatile memory, a nonvolatile memory, a magnetic storage medium, or an optical storage medium.
  • An operation program 61 that has been created in advance to operate the robot 1 is input to the control device 2 .
  • the operator can set the teaching point of the robot 1 by operating the teaching operation panel 26 to drive the robot 1 .
  • the control device 2 can generate the operation program 61 based on the teaching points.
  • the operating program 61 is stored in the storage unit 42 .
  • the motion control unit 43 sends a motion command for driving the robot 1 to the robot driving unit 44 based on the motion program 61 .
  • the robot drive unit 44 includes an electric circuit that drives the drive motor, and supplies electricity to the robot drive device 21 based on the operation command. Further, the motion control unit 43 sends an operation command for driving the hand drive device 22 to the hand drive unit 45 .
  • the hand driving unit 45 includes an electric circuit for driving an air pump or the like, and supplies electricity to the air pump or the like based on an operation command.
  • the operation control unit 43 corresponds to a processor driven according to the operation program 61.
  • the processor is formed so as to be able to read information stored in the storage unit 42 .
  • the processor functions as the operation control unit 43 by reading the operation program 61 and performing control defined in the operation program 61 .
  • the robot 1 includes a state detector for detecting the position and orientation of the robot 1.
  • the state detector in this embodiment includes a position detector 23 attached to the drive motor of each drive shaft of the robot drive device 21 .
  • the position detector 23 can be composed of, for example, an encoder that detects the rotational position of the output shaft of the drive motor. The position and orientation of the robot 1 are detected from the output of the position detector 23 .
  • a reference coordinate system 71 that does not move when the position and orientation of the robot 1 changes is set in the robot system 3 .
  • the origin of the reference coordinate system 71 is arranged on the base 14 of the robot 1 .
  • the reference coordinate system 71 is also called a world coordinate system.
  • the reference coordinate system 71 has, as coordinate axes, an X-axis, a Y-axis, and a Z-axis that are orthogonal to each other.
  • the W axis is set as a coordinate axis around the X axis.
  • a P-axis is set as a coordinate axis around the Y-axis.
  • An R-axis is set as a coordinate axis around the Z-axis.
  • the teaching operation panel 26 is connected to the control device body 40 via a communication device.
  • the teaching operation panel 26 includes an input section 27 for inputting information regarding the robot 1 and the hand 5 .
  • the input unit 27 is composed of input members such as a keyboard and dials.
  • the teaching operation panel 26 includes a display section 28 that displays information regarding the robot 1 and hand 5 .
  • the display unit 28 can be configured by an arbitrary display panel such as a liquid crystal display panel or an organic EL (Electro Luminescence) display panel.
  • When the teaching operation panel includes a touch-panel display, the display panel functions as both the input section and the display section.
  • the robot system 3 is set with a tool coordinate system having an origin set at an arbitrary position on the work tool.
  • the tool coordinate system changes position and orientation with the work tool.
  • the origin of the tool coordinate system is set at the tool tip point of the hand 5 .
  • the position of the robot 1 corresponds to the position of the tip point of the tool (the position of the origin of the tool coordinate system).
  • the posture of the robot 1 corresponds to the posture of the tool coordinate system with respect to the reference coordinate system 71 .
  • the robot system 3 in the present embodiment includes an imaging device that detects the position of the work 38.
  • the imaging device detects the position of the workpiece 38 on the pedestal 95 before the hand 5 grips the workpiece 38 .
  • the imaging device has a camera 6 as a visual sensor that captures an image of the workpiece 38 .
  • the camera 6 of this embodiment is a two-dimensional camera that captures a two-dimensional image.
  • a camera 6 is supported by the robot 1 .
  • Camera 6 is fixed to hand 5 via a support member.
  • the camera 6 can capture an image in the field of view 6a.
  • the camera 6 has a focus adjustment mechanism 24 for adjusting focus.
  • the focus adjustment mechanism 24 of this embodiment has a function of automatically adjusting the focus. That is, the camera 6 has an autofocus function.
  • the camera 6 is formed so as to automatically focus on the work 38 and take an image of the work 38 when the robot 1 changes its position and posture.
  • As the focus adjustment mechanism, a mechanism that performs focusing by any control method, such as a contrast detection method or a phase difference method, can be adopted.
  • a camera with a liquid lens can be employed as a visual sensor.
  • a mechanism for changing the shape of the liquid lens can be employed as the focus adjustment mechanism.
  • A mechanism for changing the voltage applied to the liquid lens, or a mechanism for moving a holding member of the liquid lens to change the pressure applied to the liquid lens, can be employed.
  • a camera coordinate system 72 is set for the camera 6 as a sensor coordinate system.
  • the camera coordinate system 72 changes position and orientation along with the camera 6 .
  • the origin of the camera coordinate system 72 is set at a predetermined position of the camera 6 such as the lens center or optical center of the camera 6 .
  • the camera coordinate system 72 has X-, Y-, and Z-axes that are orthogonal to each other.
  • the camera coordinate system 72 of this embodiment is set such that the Z axis is parallel to the optical axis of the lens of the camera 6 .
  • The imaging apparatus of the present embodiment includes a moving device that moves one of the workpiece 38 and the camera 6 in order to change the position of the one relative to the other.
  • the robot 1 functions as a mobile device.
  • When the position and orientation of the robot 1 change, the position and orientation of the camera 6 also change.
  • the imaging device has an image processing device that processes the image captured by the visual sensor.
  • the controller body 40 functions as an image processing device.
  • the control device main body 40 includes an image processing section 51 that processes images captured by the camera 6 .
  • the image processing unit 51 includes an image capturing control unit 58 that sends an image capturing command to the camera 6 .
  • the image processing unit 51 includes a focus position detection unit 52 that detects the focus position when the camera 6 is in focus.
  • the image processing unit 51 includes a parameter setting unit 53 that sets parameters for calculating a three-dimensional position corresponding to a specific position in the image captured by the camera 6 .
  • Image processing unit 51 includes a feature detection unit 54 that detects a predetermined feature portion in the image of workpiece 38 .
  • Image processing portion 51 includes a feature position calculation portion 55 that calculates the three-dimensional position of the feature portion using the parameters set by parameter setting portion 53 .
  • the image processing section 51 includes a distance calculation section 56 that calculates the distance from the camera 6 to the workpiece 38 .
  • the image processing unit 51 includes an action command generation unit 59 that generates action commands for the robot 1 and the hand 5 based on the result of image processing.
  • the image processing unit 51 corresponds to a processor driven according to the operation program 61.
  • Each of the focus position detection unit 52, the parameter setting unit 53, the feature detection unit 54, the feature position calculation unit 55, the distance calculation unit 56, the imaging control unit 58, and the operation command generation unit 59 corresponds to a processor driven according to the operation program 61.
  • the processors read the operation program 61 and perform control defined in the operation program 61, thereby functioning as respective units.
  • the workpiece 38 is arranged on the surface of the pedestal 95 by a predetermined method.
  • For example, an operator or another robot system places the workpiece 38 on the surface of the pedestal 95. The robot 1 then changes its position and posture and grips the workpiece 38 placed on the upper surface of the pedestal 95 with the hand 5.
  • the robot system 3 conveys the work 38 to a predetermined position by changing the position and posture of the robot 1 .
  • the position of the work 38 on the pedestal 95 may shift.
  • A position P38a determined during teaching of the position and orientation of the robot 1 is shown.
  • the position P38a is a position where it is desirable to place the work 38 and is a reference position for placing the work 38 .
  • When the workpiece 38 is actually placed on the upper surface of the pedestal 95, it may be placed at a position P38b deviated from the reference position P38a. Alternatively, there may be a dimensional error in the workpiece 38.
  • the camera 6 images the work 38 .
  • the image processing unit 51 calculates the three-dimensional position of the work 38 based on the image of the work 38 .
  • the image processing unit 51 detects the three-dimensional position of the characteristic portion of the workpiece 38 .
  • the image processing unit 51 calculates the position of the work 38 based on the three-dimensional position of the characteristic portion of the work 38 .
  • Such a position of the workpiece 38 can be calculated using the reference coordinate system 71 .
  • the image processing unit 51 controls the position and orientation of the robot 1 so as to correspond to the position of the workpiece 38 .
  • the hand 5 grips the work 38 and conveys it to a desired predetermined position.
  • FIG. 3 shows a plan view of the workpiece 38 in this embodiment.
  • workpiece 38 has a plate-like portion 38a and a plate-like portion 38b formed above plate-like portion 38a.
  • Each plate-like part 38a, 38b has a rectangular parallelepiped shape.
  • the plate-like portion 38b has an edge portion 38c on the outer peripheral portion of the upper surface.
  • the edge portion 38c is a portion corresponding to the corner formed in the plate-like portion 38b.
  • The edge 38c, which has a quadrangular shape in plan view, is a characteristic portion of the workpiece 38.
  • camera 6 is arranged above workpiece 38 in the vertical direction. At this time, the distance from the surface of the workpiece 38 on which the characteristic portion is formed to the camera 6 is predetermined.
  • the position and posture of the robot 1 are controlled such that the position of the upper surface of the plate-like portion 38b is at a predetermined Z-axis value of the camera coordinate system 72.
  • the posture of the camera 6 is adjusted so that the optical axis of the camera 6 is substantially perpendicular to the surface of the plate-like portion 38b having the characteristic portion of the workpiece 38.
  • the camera 6 takes an image of the work 38 by performing focusing.
  • the feature detection section 54 of the image processing section 51 detects the edge portion 38c as a feature portion of the work 38 by performing pattern matching.
  • a reference image for detecting the position of the edge portion 38 c is created in advance and stored in the storage section 42 .
  • the feature detection section 54 detects the edge portion 38c, which is a feature portion, in the image captured by the camera 6 using the reference image.
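  • As an illustration of this kind of pattern matching (a sketch, not the patented implementation; the file names and the detection threshold below are assumptions), normalized cross-correlation template matching such as OpenCV's matchTemplate could be used to locate the reference image of the edge portion in the captured image:

```python
import cv2

# Captured camera image and pre-registered reference image of the edge portion
# (file names are assumptions for this sketch).
image = cv2.imread("captured_image.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("reference_edge.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation is tolerant of moderate brightness changes.
result = cv2.matchTemplate(image, reference, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:  # assumed detection threshold
    # Center of the matched region in image coordinates (u, v).
    u = max_loc[0] + reference.shape[1] / 2.0
    v = max_loc[1] + reference.shape[0] / 2.0
    print(f"feature detected at (u, v) = ({u:.1f}, {v:.1f})")
else:
    print("feature not detected; the camera position may need to be changed")
```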
  • The feature position calculation unit 55 calculates the position of the workpiece in three-dimensional space based on the position of the feature portion in the image captured by the camera. As the workpiece position, the position of any set point defined for the workpiece can be calculated. The position of the workpiece 38 can be obtained in the reference coordinate system 71.
  • the action command generator 59 calculates the position and orientation of the robot 1 based on the position of the work 38 calculated by the feature position calculator 55 . Then, the position and attitude of the robot 1 for gripping the workpiece 38 are sent to the motion control section 43 .
  • the motion control unit 43 drives the robot 1 and the hand 5 to grip the workpiece 38 based on the motion command received from the motion command generation unit 59 .
  • The characteristic detection unit 54 detects characteristic portions, and the characteristic position calculation unit 55 accurately calculates the three-dimensional position of the workpiece based on the positions of the characteristic portions. Therefore, the robot system 3 can grip the workpiece 38 more reliably. Even if the position of the workpiece 38 on the pedestal 95 (its position in the reference coordinate system 71) differs from the reference position, or there is a dimensional error in the workpiece, the robot system 3 can reliably grip the workpiece 38.
  • The focus adjustment mechanism 24 may include a drive motor that moves the lens for focusing.
  • In this case, the rotational position of the output shaft of the drive motor when the camera is in focus can be used as the focus position.
  • Alternatively, the lens may have a focus ring for focusing, and the position of the focus ring can be used as the focus position.
  • a predetermined lens position can be set as the focus position.
  • the magnitude of the voltage applied to the liquid lens can be used as the focus position.
  • the rotational position of the output shaft of the motor included in the drive mechanism for the holding member that changes the pressure applied to the liquid lens can be used as the focus position.
  • Fig. 4 shows a schematic diagram explaining the field of view when the focus position changes.
  • the field of view of camera 6 corresponds to the angle of view or imaging range.
  • In FIG. 4, positions A and B on which the camera focuses are shown.
  • When the camera 6 focuses on position A, the imaging range of the camera 6 is the field of view A.
  • When the camera 6 focuses on position B, the imaging range of the camera 6 is the field of view B.
  • In this way, the size of the field of view changes when the focus position changes.
  • Fig. 5 shows an example of an image captured by the camera when the focus position is changed.
  • Image 66 is an image when the focus is adjusted to one position, which corresponds to position A in FIG. 4, for example.
  • An image 67 is an image when another position is focused, for example, an image corresponding to position B in FIG.
  • An image coordinate system 73 is set for the images 66 and 67 .
  • The image 66 includes a hole image 68a, and the image 67 includes a hole image 68b, as characteristic portions.
  • When the focus position changes, the position of the hole image 68a in the image 66 moves to the position of the hole image 68b in the image 67, as indicated by an arrow 101.
  • In this way, when the focus position changes, the positions of feature points in the image change.
  • changing the focus position is synonymous with changing the focal length.
  • FIG. 6 shows a schematic diagram for explaining the positions on the image sensor at which a feature point of the object appears.
  • a lens 37 is positioned between the work surface and the image sensor that produces the image for the camera.
  • The focal lengths f1 and f2 correspond to the distances from the image sensor to the lens center of the lens 37.
  • FIG. 6 shows a focus position A and a focus position B.
  • Position A corresponds to the lens 37 being placed at the focal length f1.
  • Position B corresponds to the lens 37 being placed at the focal length f2.
  • The distances z1 and z2 from the lens 37 to the workpiece change according to the focal lengths f1 and f2.
  • At position A, the characteristic portion 69, arranged at a distance X1 from the optical axis on the surface of the workpiece, is detected by the image sensor at a distance u1 from the optical axis.
  • At position B, the characteristic portion 69, arranged at a distance X2 equal to the distance X1, is detected by the image sensor at a distance u2.
  • the parameter setting unit 53 calculates the parameters in the calculation model for calculating the position of the feature part of the work from the image captured by the camera, based on the focus position.
  • the feature position calculator 55 calculates a three-dimensional position from a specific position in the image using a calculation model. The three-dimensional position of the characteristic portion on the surface of the work is calculated using the parameters set corresponding to the focus position.
  • a position in the image of the camera corresponding to an arbitrary position in space is generally represented by the following equation (1) using a pinhole camera model.
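  • The equation images of the original publication are not reproduced in this text. A standard pinhole-camera formulation consistent with the internal and external parameter matrices described below is:

$$
s\begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
=
\begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} R & t \end{pmatrix}
\begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}
\tag{1}
$$

  • Here s is a scale factor, the 3×3 matrix is the matrix of internal parameters, and (R t) is the 3×4 matrix of external parameters composed of a rotation R and a translation t.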
  • the coordinate values (X, Y, Z) of the three-dimensional position are expressed in the reference coordinate system 71, for example.
  • the coordinate values (u, v) of the position on the image are expressed in the image coordinate system 73, for example.
  • the extrinsic parameter matrix is a transformation matrix for transforming a three-dimensional position in space into coordinate values of the camera coordinate system 72 .
  • the matrix of internal parameters is a matrix for converting the coordinate values of the camera coordinate system 72 into the coordinate values of the image coordinate system 73 in the image.
  • the Z-axis value of the three-dimensional position or the z-axis coordinate value in the camera coordinate system 72 is predetermined according to the distance from the camera to the workpiece.
  • Equation (1) is an ideal model in which there is no lens distortion. In practice, changes in the parameters due to lens distortion and the like are taken into account.
  • The computation of the part of Equation (1) involving the three-dimensional position in space and the matrix of external parameters can be expressed by the following Equation (2).
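  • Reconstructed in the same standard notation (the original equation image is not reproduced here), Equation (2) is:

$$
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
=
R\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} + t
\tag{2}
$$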
  • the coordinate values (X, Y, Z) expressed in the reference coordinate system 71 can be converted into the coordinate values (x, y, z) expressed in the camera coordinate system 72 by equation (2).
  • To account for camera lens distortion, variables x' and y' are defined as shown in the following equations (3) and (4).
  • Variables x'' and y'' that take distortion into account are then calculated as shown in equations (5) and (6).
  • The relationship between the variables x', y', and r is as shown in equation (7).
  • In equations (5) and (6), the coefficients k1 to k6 are coefficients for radial lens distortion, and the coefficients p1 and p2 are coefficients for circumferential lens distortion.
  • Using the variables x'' and y'', the coordinate values (u, v) on the image in the image coordinate system 73 can be calculated by equations (8) and (9). Equations (8) and (9) correspond to the calculation using the matrix of internal parameters in Equation (1) above.
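  • Reconstructed from the definitions above (the original equation images are not reproduced here; this matches the widely used rational distortion model, e.g. in OpenCV, and is consistent with the coefficients named in the text):

$$x' = \frac{x}{z}, \qquad y' = \frac{y}{z} \tag{3, 4}$$

$$x'' = x'\,\frac{1+k_1 r^2+k_2 r^4+k_3 r^6}{1+k_4 r^2+k_5 r^4+k_6 r^6} + 2p_1 x'y' + p_2\left(r^2+2x'^2\right) \tag{5}$$

$$y'' = y'\,\frac{1+k_1 r^2+k_2 r^4+k_3 r^6}{1+k_4 r^2+k_5 r^4+k_6 r^6} + p_1\left(r^2+2y'^2\right) + 2p_2 x'y' \tag{6}$$

$$r^2 = x'^2 + y'^2 \tag{7}$$

$$u = f_x\,x'' + c_x, \qquad v = f_y\,y'' + c_y \tag{8, 9}$$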
  • By applying this calculation model inversely to a position in the image, the three-dimensional position (X, Y, Z) in space is calculated.
  • the distance z from the camera 6 to the workpiece 38 can be determined in advance and stored in the storage unit 42 .
  • the feature position calculator 55 calculates a three-dimensional position (X, Y, Z) in space from the coordinate values (u, v) of a specific position on the image based on the calculation model.
  • The calculation model for calculating the three-dimensional position from the position in the image requires, as parameters, the products fx and fy of the focal length and the effective pixel size, the image centers cx and cy, and the distortion coefficients k1 to k6, p1, and p2. These parameters change according to the focus position when the camera is in focus.
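  • A minimal sketch of this back-projection, assuming the OpenCV-style model reconstructed above (the numeric parameter values and the identity extrinsics are placeholders; in the embodiment, fx, fy, cx, cy and the distortion coefficients would come from the focus-position-dependent setting information):

```python
import cv2
import numpy as np

def image_to_3d(u, v, camera_matrix, dist_coeffs, z_cam, R, t):
    """Convert an image position (u, v) into a 3-D position in the reference
    frame, given the known distance z_cam from the camera to the surface."""
    pts = np.array([[[u, v]]], dtype=np.float64)
    # Undo lens distortion and normalize: yields (x', y') with x' = x/z.
    xp, yp = cv2.undistortPoints(pts, camera_matrix, dist_coeffs)[0, 0]
    # Scale by the known depth to obtain camera-frame coordinates.
    p_cam = np.array([xp * z_cam, yp * z_cam, z_cam])
    # Camera frame -> reference frame (inverse of x = R X + t).
    return R.T @ (p_cam - t)

# Placeholder parameters for illustration only.
K = np.array([[2800.0, 0.0, 640.0],
              [0.0, 2800.0, 480.0],
              [0.0, 0.0, 1.0]])
dist = np.array([0.1, -0.05, 0.001, 0.001, 0.0, 0.0, 0.0, 0.0])  # k1,k2,p1,p2,k3..k6
R, t = np.eye(3), np.zeros(3)  # camera frame taken equal to reference frame here
print(image_to_3d(700.0, 500.0, K, dist, z_cam=300.0, R=R, t=t))
```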
  • setting information 63 for setting parameters corresponding to focus positions is predetermined.
  • the setting information 63 is stored in the storage section 42 .
  • the parameter setting unit 53 uses the focus position and setting information 63 to set these parameters.
  • Table 1 shows a table of parameter values corresponding to the focus position pp as the setting information 63 .
  • The value of the product fx of the focal length and the effective pixel size is shown as an example of a parameter.
  • parameter values are predetermined for a plurality of discrete focus positions pp.
  • The parameter setting unit 53 sets the parameters in the calculation model based on the parameter values determined for each focus position. For example, if the focus position pp is 1.4 when the camera 6 captures an image, the parameter setting unit 53 can interpolate the value of the product fx and set it to 2.8.
  • Parameters can be set by any method using a table containing discrete parameter values. For example, an intermediate value of the two parameter values corresponding to the two nearest focus positions pp may be adopted, or the parameter value corresponding to the closer of the two focus positions pp may be adopted.
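  • A minimal sketch of such table-based interpolation (the table values are assumptions chosen so that pp = 1.4 interpolates to 2.8, matching the example above; a real table would be measured in advance for each camera and lens):

```python
import numpy as np

# Assumed setting information: discrete focus positions pp and the
# corresponding value of the product fx of focal length and effective pixel size.
pp_table = np.array([1.0, 2.0, 3.0, 4.0])
fx_table = np.array([2.6, 3.1, 3.5, 3.8])

def set_fx(pp: float) -> float:
    # Linear interpolation between the two nearest tabulated focus positions.
    return float(np.interp(pp, pp_table, fx_table))

print(set_fx(1.4))  # -> 2.8 with the assumed table above
```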
  • the parameter setting unit can set parameters according to arbitrary focus positions.
  • With this method, the parameter setting unit can set parameters by simple calculation. Moreover, even when it is difficult to determine the functions described below, parameters can still be set according to the focus position.
  • a parameter can be calculated by a mathematical expression including the focus position pp.
  • a function f(pp) for calculating the product f of the focal length and the effective size of the pixel for the focus position pp can be determined in advance.
  • a function k(pp) for calculating the distortion coefficient k with respect to the focus position pp can be determined in advance.
  • the parameter setting unit 53 can set each parameter such as a parameter related to distortion using a function.
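  • The concrete functional forms are not given in this text; as an assumed illustration, low-order polynomials in the focus position pp, with coefficients determined in advance by calibration, could serve:

$$ f(pp) = a_2\,pp^2 + a_1\,pp + a_0, \qquad k_1(pp) = b_2\,pp^2 + b_1\,pp + b_0 $$

  • Each remaining parameter (fy, cx, cy, and the other distortion coefficients) can be given its own function of pp in the same way.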
  • the characteristic position calculation section 55 can calculate the three-dimensional position of the characteristic portion based on the parameters set by the parameter setting section 53 .
  • FIG. 7 shows a flowchart of control in this embodiment.
  • the operator predefines setting information for calculating the parameters of the calculation model. Then, the operator causes the storage unit 42 to store the setting information 63 .
  • the operation control unit 43 moves the camera 6 to the imaging position for imaging the work 38 .
  • the camera 6 is arranged directly above the reference position P38a of the workpiece 38.
  • the posture of the camera 6 is adjusted so that the direction of the Z axis of the camera coordinate system 72 is parallel to the vertical direction.
  • the distance from the surface of the plate-like portion 38b of the workpiece 38 on which the characteristic portion is formed to the camera 6 is predetermined.
  • In step 81, the focus adjustment mechanism 24 of the camera 6 brings the camera 6 into focus.
  • the focus adjustment mechanism 24 of the present embodiment has an autofocus function, so it automatically focuses.
  • In step 82, the imaging control unit 58 captures an image with the camera 6. The image is captured in the focused state.
  • the focus position detection unit 52 detects the focus position when the image is captured.
  • the focus position detection unit 52 detects, for example, a predetermined variable corresponding to the position of the lens.
  • the parameter setting unit 53 sets parameters of a calculation model for calculating the three-dimensional position of the characteristic portion based on the focus position and setting information.
  • the feature detection unit 54 detects feature portions in the image by performing pattern matching.
  • the edge 38c in the image is detected by performing pattern matching using a reference image of the edge 38c of the plate-like portion 38b.
  • the feature position calculator 55 detects the position of the feature portion in the image.
  • When the characteristic portion cannot be detected in the image, control is performed to change the position of the camera 6 with respect to the workpiece 38 and capture an image again.
  • For example, light from the illumination may be reflected at the characteristic portion so that it appears washed out, making the characteristic portion unclear in the image.
  • by moving the position of the camera it may be possible to clearly image the characteristic portion.
  • In step 86, the image processing unit 51 determines whether or not the position of the characteristic portion has been detected. If the characteristic position calculator 55 cannot detect the position of the characteristic portion, control proceeds to step 87.
  • the action command generator 59 generates a command to change the position of the camera 6 .
  • the action command generator 59 generates a command to translate the camera 6 in a predetermined direction by a predetermined amount of movement.
  • the action command generator 59 generates a command to move the camera 6 in the X-axis direction of the camera coordinate system 72 .
  • the motion command generation unit 59 sends motion commands for the robot 1 to the motion control unit 43 .
  • the motion control section 43 changes the position and posture of the robot 1 . Control then returns to step 81 .
  • the image processing section 51 repeats the control from step 81 to step 86 .
  • In step 86, when the characteristic position calculation unit 55 has calculated the position of the characteristic portion, control proceeds to step 88. If the characteristic portion cannot be detected even after changing the position and posture of the robot a plurality of times, the control may be stopped.
  • the feature position calculator 55 calculates the three-dimensional position of the feature part based on the position of the feature part in the image. Based on the coordinate values of the image coordinate system 73 in the image, the coordinate values of the reference coordinate system 71 are calculated. A feature position calculator 55 calculates the position of the workpiece based on the three-dimensional position of the feature portion. The position of the workpiece can be calculated using the reference coordinate system 71, for example.
  • the motion command generator 59 calculates the position and orientation of the robot 1 based on the position of the workpiece. Then, at step 90 , the motion command generation unit 59 sends a motion command for driving the robot 1 to the motion control unit 43 .
  • the motion control unit 43 drives the robot 1 and the hand 5 based on motion commands.
  • the imaging apparatus sets the parameters of the calculation model for calculating the three-dimensional position corresponding to the specific position in the image captured by the visual sensor according to the focus position. Then, the three-dimensional position of the specific position is calculated based on parameters corresponding to the focus position.
  • the focus position is set to any position within a predetermined range.
  • the imaging device can set parameters corresponding to the focus position and detect the accurate position of the workpiece.
  • the visual sensor does not have to have the function of automatically focusing.
  • the operator can manually focus. For example, the operator may focus by operating the input section 27 while watching the image displayed on the display section 28 of the teaching operation panel 26 .
  • The robot system in this embodiment includes the robot 1 as a moving device that moves at least one of the workpiece and the visual sensor.
  • Even when the moving device changes the relative position of the visual sensor with respect to the workpiece and the focus position changes as a result, the imaging device can set parameters corresponding to the focus position and detect the accurate position of the workpiece.
  • the display section 28 of the teaching operation panel 26 of the present embodiment displays the parameter values set by the parameter setting section 53 .
  • the operator can see the parameters displayed on the display section 28 of the teaching operation panel 26 and confirm the parameter values.
  • the operator can check the parameter values set according to each focus position.
  • the distance calculation section 56 of the image processing section 51 in this embodiment can calculate the distance from the camera 6 to the work 38 based on the focus position detected by the focus position detection section 52 .
  • the focus position depends on the distance between camera 6 and workpiece 38 . Therefore, once the focal position is determined, the distance between the camera 6 and the workpiece 38 can be estimated.
  • the distance calculator 56 estimates the distance from the origin of the camera coordinate system 72 to the surface of the plate-like portion 38b of the workpiece 38 when the surface of the plate-like portion 38b is focused. For example, the operator can create in advance a function for calculating the z-axis coordinate value of the camera coordinate system 72 with the focus position pp as a variable. The z-axis coordinate value of the camera coordinate system 72 corresponds to the distance from the camera 6 to the workpiece 38 . The distance calculator 56 can calculate the coordinate value of the z-axis of the camera coordinate system 72 using the focus position pp and the function. Alternatively, the operator can determine the distance from the camera 6 to the work 38 for each of a plurality of discrete focus positions. The distance calculator 56 can calculate the distance from the camera 6 to the workpiece 38 by interpolation or other calculation based on the actually detected focus position pp.
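  • A minimal sketch of this distance estimation, with an assumed calibration table (real values would be measured in advance for the actual camera and lens):

```python
import numpy as np

# Assumed calibration: for several discrete focus positions pp, the distance
# from the camera to the in-focus surface was measured beforehand (in mm).
pp_table = np.array([1.0, 2.0, 3.0, 4.0])
z_table = np.array([250.0, 300.0, 360.0, 430.0])

def estimate_distance(pp: float) -> float:
    # Interpolate the camera-to-workpiece distance from the detected focus position.
    return float(np.interp(pp, pp_table, z_table))

print(estimate_distance(2.5))  # -> 330.0 mm with the assumed table
```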
  • The distance calculator 56 of the present embodiment can thus calculate the distance from the camera 6 to the object.
  • In the control described above, the distance from the camera to the object is determined in advance.
  • Alternatively, the distance from the camera 6 to the workpiece 38 can be calculated by the image processing section 51 including the distance calculation section 56.
  • In this case, the image processing unit 51 can calculate the three-dimensional position of the characteristic portion of the workpiece without the distance from the camera to the object being set in advance.
  • FIG. 8 shows a schematic diagram for explaining the first step of another control of the first robot system in this embodiment.
  • the image processing unit 51 detects the three-dimensional position of the work 38 based on an image captured by placing the camera 6 at the first imaging position. Based on the position of the workpiece 38, the image processing unit 51 calculates a second imaging position closer to the workpiece 38 than the first imaging position. The image processing unit 51 moves the position and posture of the robot 1 to the second imaging position as indicated by an arrow 102 .
  • the second imaging position is a position where the distance from the object to the visual sensor is smaller than that of the first imaging position. Also, the second imaging position is a position where the work 38 is arranged substantially in the center of the image.
  • the image processing unit 51 calculates the three-dimensional position of the work 38 based on the image captured at the second imaging position. Then, based on the position of the work 38, the robot 1 is driven and the work 38 is gripped.
  • the position of the work 38 may be largely misaligned. Therefore, the position and posture of the robot 1 can be determined in advance so that the camera 6 is arranged at the first imaging position away from the workpiece 38 .
  • At the first imaging position, control is performed to focus automatically by the autofocus function.
  • Because the camera is far from the workpiece, the workpiece 38 appears small in the image.
  • Even so, the position of the workpiece 38 can be detected by the feature detection section 54 and the feature position calculation section 55.
  • the action command generator 59 calculates the second imaging position of the camera 6 in order to image the work 38 at a position closer than the first imaging position.
  • the second imaging position is determined such that the workpiece is placed substantially in the center of the image. Also, the second imaging position is set at a position directly above the work 38 where the camera 6 approaches the work 38 .
  • Let z' be the z-axis coordinate value of the surface of the workpiece 38 in the camera coordinate system 72 at the first imaging position.
  • Let a be the ratio of the workpiece to the size of the image at the first imaging position.
  • As the ratio of the workpiece to the size of the image, for example, the ratio of the length of the workpiece to the length of the image in one direction of the image can be used.
  • the ratio of the workpiece to the size of the image can be detected by the feature position calculator 55 .
  • Let z'' be the z-axis coordinate value of the surface of the workpiece 38 in the camera coordinate system 72 at the second imaging position.
  • Let k be the desired ratio of the workpiece to the image size. This ratio can be predetermined by the operator.
  • Based on the calculated amounts of movement in the x-axis, y-axis, and z-axis directions of the camera coordinate system 72, the motion command generation unit 59 changes the position and posture of the robot 1 so as to place the camera 6 at the second imaging position, as indicated by the arrow 102.
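  • A sketch of this movement calculation, assuming the apparent size of the workpiece scales inversely with its distance from the camera (a thin-lens approximation; the relation z'' = z'·a/k below follows from that assumption and is not quoted from the original equations):

```python
import numpy as np

def move_to_second_position(z1, a, k, work_center_cam):
    """Camera-frame movement (dx, dy, dz) from the first to the second imaging
    position.

    z1:              z-axis value of the workpiece surface at the first position
    a:               ratio of the workpiece to the image size at the first position
    k:               desired ratio at the second position (set by the operator)
    work_center_cam: (x, y) of the workpiece center in camera coordinates,
                     used to place the workpiece at the center of the image
    """
    z2 = z1 * a / k                 # closer distance that yields the desired ratio
    dx, dy = work_center_cam        # lateral shift to center the workpiece
    dz = z1 - z2                    # approach along the optical axis
    return np.array([dx, dy, dz])

# Example with assumed values: the workpiece fills 20 % of the image at
# z' = 400 mm and the operator wants it to fill 60 %.
print(move_to_second_position(z1=400.0, a=0.2, k=0.6, work_center_cam=(15.0, -8.0)))
```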
  • FIG. 9 shows a schematic diagram explaining the second step of another control of the first robot system.
  • FIG. 9 shows a schematic diagram of the robotic system when the camera is placed in the second imaging position.
  • the second imaging position of the camera 6 is closer to the work 38 than the first imaging position.
  • the autofocus function of the camera 6 is used to automatically focus.
  • In the image captured at the first imaging position, the proportion of the image occupied by the workpiece 38 is small, so the position of the workpiece 38 may not be detected accurately.
  • In the image captured at the second imaging position, the workpiece 38 occupies a large proportion of the image, so the position of the workpiece 38 can be calculated accurately.
  • the action command generator 59 can calculate the second imaging position based on the three-dimensional position of the workpiece 38 in the image captured at the first imaging position. Then, the characteristic position calculator 55 calculates the three-dimensional position of the characteristic portion based on the image captured at the second imaging position. Even if the focus positions of the cameras are different at the first imaging position and the second imaging position, the three-dimensional position of the characteristic portion can be detected at each imaging position. In particular, by having the robot system perform work based on the image captured at the second imaging position, it is possible to move the work tool to an accurate position and perform highly accurate work.
  • the image processing unit 51 can detect the work 38 based on the captured image and inspect the work 38 .
  • the image processing unit 51 can measure the dimensions of the workpiece from the captured image. Then, the image processing unit 51 can inspect the dimension of the workpiece based on the predetermined dimension determination value. In this case, only the visual sensor may be attached to the robot 1 without attaching the working tool to the robot 1 .
  • Workpiece inspection is not limited to inspection of workpiece dimensions, and any inspection can be performed. For example, it is possible to inspect whether or not a predetermined component is arranged on the surface of the workpiece. Alternatively, an inspection can be carried out as to whether there are any flaws on the surface of the workpiece. In any case, since the position of the characteristic portion can be accurately detected according to the focus position, highly accurate inspection can be performed.
  • In the embodiments described above, the workpiece is stationary and the camera is moved by the moving device, but the configuration is not limited to this form.
  • the position of the camera may be fixed and the workpiece may be moved by the moving device.
  • the moving device may be configured to move both the camera and the workpiece.
  • FIG. 10 shows a schematic diagram of the second robot system in this embodiment.
  • the camera 6 is fixed to the pedestal 96 .
  • a workpiece 38 is supported by the robot 1 .
  • the second robot system 4 conveys the workpiece 38 placed on the pedestal 97 to the pedestal 98 as indicated by an arrow 103 .
  • the imaging device of the second robot system 4 detects positional deviation in the hand 5 when the work 38 is gripped by the hand 5 .
  • the control device 2 controls the position and attitude of the robot 1 so as to place the work 38 at a predetermined imaging position for detecting the three-dimensional position of the work 38 .
  • the image processing section 51 detects the three-dimensional position of the characteristic portion of the work 38 based on the image captured by the camera 6 . For example, the edge of the bottom surface of the workpiece 38 can be detected as a characteristic portion.
  • the image processing section 51 detects the position of the workpiece 38 .
  • a reference position of the workpiece 38 in the predetermined position and orientation of the robot 1 is stored in the storage unit 42 .
  • the image processing unit 51 can calculate the shift in gripping of the work 38 by the hand 5 based on the reference position of the work 38 .
  • the motion command generation unit 59 calculates the position and posture of the robot 1 so as to place the work 38 at the desired position P38e on the pedestal 98 based on the positional deviation of the work 38 within the hand 5 . Then, the motion control unit 43 drives the robot 1 to place the workpiece 38 at the position P38e.
  • the camera 6 performs focusing when the work 38 is placed at a predetermined imaging position.
  • the parameter setting unit 53 calculates parameters of the calculation model based on the focus position.
  • the characteristic position calculator 55 calculates the three-dimensional position of the characteristic portion based on the calculated parameters. Then, the position of the workpiece 38 is detected based on the position of the characteristic portion.
  • the imaging position of the work 38 may deviate from the desired position when imaging the work 38 .
  • the position of the workpiece 38 can be calculated accurately even if the camera 6 performs focusing.
  • the robot system 4 can transport the workpiece 38 to a desired position.
  • Also in the second robot system, the image processing unit 51 can capture an image of the workpiece at a first imaging position far from the workpiece to detect its rough position, and then calculate a second imaging position based on that rough position. Then, based on the image captured at the second imaging position, the deviation in gripping of the workpiece 38 may be calculated.
  • the second robot system may also perform inspections such as dimensional inspection of the work.
  • FIG. 11 shows a schematic diagram of the transport system in this embodiment.
  • FIG. 12 shows a block diagram of the transport system in this embodiment.
  • The transport system 9 includes an imaging device that captures an image of the workpiece 38 and detects the workpiece 38.
  • the transport system 9 includes a conveyor 7 as a moving device for moving the work 38 .
  • the transport system 9 has a configuration in which a conveyor 7 is arranged instead of the robot 1 of the second robot system 4 .
  • the work 38 moves in the direction indicated by the arrow 104 by driving the conveyor 7 . That is, the position of the workpiece 38 changes as the conveyor 7 is driven.
  • a camera 6 as a visual sensor is supported by a support member 99 .
  • the transport system 9 includes a control device 8 that controls the conveyor 7 and the camera 6.
  • the control device 8 is composed of an arithmetic processing device including a CPU and the like.
  • the controller 8 includes a conveyor drive 46 .
  • Conveyor 7 includes a conveyor drive 30 having a drive motor for driving the belt. Each drive motor is provided with a position detector 31 for detecting the rotational position of the drive motor.
  • the control device 8 includes an image processing section 51 that processes images captured by the camera 6 .
  • the control device 8 includes an operation panel 32.
  • the operating panel 32 has an input section 27 and a display section 28, similarly to the teaching operating panel 26.
  • the display unit 28 can display parameters and the like set by the parameter setting unit 53 .
  • Other configurations of the control device 8 are the same as those of the control device 2 of the robot system shown in FIG.
  • the camera 6 is fixed at a predetermined position where the workpiece 38 can be imaged.
  • the transport system 9 can detect the position of the work 38 or inspect the work 38 based on the image captured by the camera 6 .
  • The position at which the workpiece 38 conveyed by the conveyor 7 stops may vary. Alternatively, there may be individual differences in the dimensions of the workpiece 38.
  • focusing is performed when the work 38 is imaged. Then, based on the focus position, parameters of a calculation model for calculating a three-dimensional position corresponding to the specific position of the image are set. Based on the parameters, the positions of characteristic portions of the workpiece 38 are calculated. Based on the position of the characteristic portion of the workpiece 38, the position of the workpiece 38 can be calculated and the workpiece 38 can be inspected.
  • the position of camera 6 may be fixed.
  • any device that moves an object or a camera can be employed as the moving device.
  • Other configurations, actions, and effects of the transport system are the same as those of the first robot system and the second robot system described above, so description thereof will not be repeated here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

This imaging device comprises a focus position detection unit for detecting a focus position when a visual sensor is in focus. The imaging device comprises a parameter setting unit for setting a parameter for calculating a three-dimensional position corresponding to a specific position in an image captured by the visual sensor. The imaging device comprises a feature position calculation unit for calculating the three-dimensional position of a characteristic portion using the parameter set by the parameter setting unit. The parameter setting unit sets the parameter on the basis of the focus position.
PCT/JP2021/019252 2021-05-20 2021-05-20 Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor WO2022244212A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US18/552,899 US20240185455A1 (en) 2021-05-20 2021-05-20 Imaging device for calculating three-dimensional position on the basis of image captured by visual sensor
JP2023522140A JPWO2022244212A1 (fr) 2021-05-20 2021-05-20
DE112021007292.7T DE112021007292T5 (de) 2021-05-20 2021-05-20 Imaging device for calculating a three-dimensional position based on an image captured by an image sensor
CN202180098103.4A CN117321382A (zh) 2021-05-20 2021-05-20 Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor
PCT/JP2021/019252 WO2022244212A1 (fr) 2021-05-20 2021-05-20 Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor
TW111115626A TW202246872A (zh) 2021-05-20 2022-04-25 Imaging device for calculating a three-dimensional position from an image captured by a visual sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/019252 WO2022244212A1 (fr) 2021-05-20 2021-05-20 Dispositif d'imagerie pour calculer une position tridimensionnelle sur la base d'une image capturée par un capteur visuel

Publications (1)

Publication Number Publication Date
WO2022244212A1 (fr) 2022-11-24

Family

ID=84140201

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/019252 WO2022244212A1 (fr) 2021-05-20 2021-05-20 Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor

Country Status (6)

Country Link
US (1) US20240185455A1 (fr)
JP (1) JPWO2022244212A1 (fr)
CN (1) CN117321382A (fr)
DE (1) DE112021007292T5 (fr)
TW (1) TW202246872A (fr)
WO (1) WO2022244212A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000293695A (ja) * Image processing device
JP2001257932A (ja) * Imaging device
WO2011058876A1 (fr) * Distance measuring device, method, program, and system, and image capture device
JP2020185639A (ja) * Robot operation device, robot, and robot operation method

Also Published As

Publication number Publication date
JPWO2022244212A1 (fr) 2022-11-24
US20240185455A1 (en) 2024-06-06
TW202246872A (zh) 2022-12-01
DE112021007292T5 (de) 2024-01-25
CN117321382A (zh) 2023-12-29

Similar Documents

Publication Publication Date Title
KR102532072B1 (ko) System and method for automatic hand-eye calibration of a vision system for robot motion
US6816755B2 (en) Method and apparatus for single camera 3D vision guided robotics
JP6734253B2 (ja) Imaging device provided with a visual sensor that images a workpiece
US8095237B2 (en) Method and apparatus for single image 3D vision guided robotics
JP6430986B2 (ja) Positioning device using a robot
JP5815761B2 (ja) Data creation system and detection simulation system for a visual sensor
JP2013036988A (ja) Information processing apparatus and information processing method
US20190030722A1 (en) Control device, robot system, and control method
JP2023108062A (ja) Control device, robot device, control method, and program
JP2019049467A (ja) Distance measurement system and distance measurement method
Hefele et al. Robot pose correction using photogrammetric tracking
WO2022244212A1 (fr) Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor
WO2022249410A1 (fr) Imaging device for calculating a three-dimensional position based on an image captured by a visual sensor
Marny et al. Configuration and programming of the fanuc irvision vision system for applications in the dynamic environment of manipulated elements
JP2022530589A (ja) Robot-mounted moving device, system, and machine tool
JP2005186193A (ja) Robot calibration method and three-dimensional position measurement method
JP2010214546A (ja) Assembling device and assembling method
WO2023135764A1 (fr) Robot device provided with a three-dimensional sensor and method for controlling the robot device
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
WO2022163580A1 (fr) Processing method and processing device for generating a cross-sectional image from three-dimensional position information acquired by a visual sensor
CN112549052B (zh) Control device of a robot device for adjusting the position of a robot-supported component
US20230264352A1 (en) Robot device for detecting interference of constituent member of robot
KR100784734B1 (ko) Ellipse interpolation method for an industrial robot system
CN117565107A (zh) Method, system, medium, and device for spatial positioning of a robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21940826

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023522140

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 18552899

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 202180098103.4

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 112021007292

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21940826

Country of ref document: EP

Kind code of ref document: A1