WO2020026712A1 - Information processing device, control system, information processing method, and program - Google Patents

Information processing device, control system, information processing method, and program

Info

Publication number
WO2020026712A1
Authority
WO
WIPO (PCT)
Prior art keywords
orientation
frame
information processing
dimensional position
path
Prior art date
Application number
PCT/JP2019/026960
Other languages
English (en)
Japanese (ja)
Inventor
加藤 豊 (Yutaka Kato)
Original Assignee
オムロン株式会社 (OMRON Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社 (OMRON Corporation)
Publication of WO2020026712A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01R: ELECTRICALLY-CONDUCTIVE CONNECTIONS; STRUCTURAL ASSOCIATIONS OF A PLURALITY OF MUTUALLY-INSULATED ELECTRICAL CONNECTING ELEMENTS; COUPLING DEVICES; CURRENT COLLECTORS
    • H01R43/00: Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors
    • H01R43/26: Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors, for engaging or disengaging the two parts of a coupling device

Definitions

  • The present disclosure relates to an information processing device, a control system, an information processing method, and a program.
  • Patent Literature 1 discloses an information processing device that obtains three-dimensional position information on the movement path of a workpiece, based on time-series captured images of the workpiece moving while avoiding an obstacle. Based on the three-dimensional position information, the information processing device generates scenario information for causing a robot to move the workpiece along the movement path.
  • However, the information processing apparatus described in Patent Literature 1 determines the position and orientation of a moving workpiece so as to avoid an obstacle; it does not determine the position and orientation of a workpiece that is to be fitted with another workpiece.
  • The position and orientation that a workpiece should take to avoid an obstacle are determined by environmental conditions such as the position of the obstacle and the initial position of the workpiece, and have a certain margin. Therefore, even when the environmental conditions change slightly, the obstacle can still be avoided by moving the workpiece according to the previously obtained position and orientation.
  • In contrast, the fitting method differs depending on the type of workpiece, so the position and orientation to be taken by a workpiece also change with its type. Furthermore, the position and orientation to be taken by one workpiece differ depending on the state of the other of the two workpieces. Therefore, even if the three-dimensional position information on the movement path of a workpiece to be fitted with another workpiece is obtained using the technology described in Patent Literature 1, that information can no longer be used once working conditions such as the workpiece type or the state of the other workpiece change.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide an information processing apparatus, a control system, an information processing method, and a program capable of easily determining the position and orientation to be taken by an object that is to be fitted with another object.
  • An information processing device includes a measurement unit, a selection unit, and an information generation unit.
  • The measurement unit detects the first object and the second object from each frame of a moving image obtained by imaging the first object and the second object being fitted together, and measures the three-dimensional position and orientation of each of the detected first object and second object.
  • Based on the three-dimensional positions and orientations measured by the measurement unit, the selection unit selects, from the moving image, a first frame at which the first object and the second object start to contact each other and a second frame at which the fitting of the first object and the second object is completed.
  • The information generation unit generates relative change information based on the three-dimensional positions and orientations measured by the measurement unit for each frame from the first frame to the second frame.
  • The relative change information indicates the change in the relative three-dimensional position and orientation of the first object with respect to the second object during the period from when the first object comes into contact with the second object until the fitting of the first object and the second object is completed.
  • The information processing apparatus further includes a path generation unit that generates a path of the first object in a space in which the first object, the second object, and an obstacle are arranged.
  • The path generation unit generates a first path of the first object, from its initial position and orientation to the position and orientation at which it starts to contact the second object, such that the first object does not interfere with the second object or the obstacle. Further, based on the relative change information and the position and orientation of the second object in the space, the path generation unit generates a second path of the first object, from the position and orientation at which contact with the second object starts to the position and orientation at which the fitting with the second object is completed. According to this disclosure, by changing the position and orientation of the first object along the first path and the second path generated by the information processing device, the first object can be fitted with the second object without interfering with the second object or the obstacle.
  • The path generation unit generates the first path so as to connect smoothly to the second path.
  • According to this disclosure, the first object can be moved smoothly by changing its position and orientation along the first path and the second path generated by the information processing device.
  • The relative change information may indicate that the relative three-dimensional position of the first object with respect to the second object changes along a straight line. Further, the relative change information may indicate that the relative three-dimensional orientation of the first object with respect to the second object is constant. Alternatively, the relative change information may indicate that the relative three-dimensional orientation of the first object with respect to the second object rotates about that straight line as an axis. In this way, the information processing apparatus can generate relative change information that matches the manner of fitting between the first object and the second object.
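As a concrete illustration of these variants, the relative position can be interpolated along a straight line while the relative orientation either stays constant or rotates about the insertion axis. The following sketch is illustrative only; the function name and parameters are assumptions, not part of the disclosure:

```python
import numpy as np

def straight_line_fit_path(p_start, p_end, n_frames, twist_deg=0.0):
    """Sample relative positions along a straight line from contact start
    (p_start) to fitting completion (p_end), together with a rotation
    angle about the insertion axis (0 keeps the orientation constant;
    a non-zero value models a screw-like fitting)."""
    p_start = np.asarray(p_start, float)
    p_end = np.asarray(p_end, float)
    t = np.linspace(0.0, 1.0, n_frames)[:, None]
    positions = p_start + t * (p_end - p_start)        # straight-line change
    angles = np.linspace(0.0, twist_deg, n_frames)     # rotation about the axis
    return positions, angles
```

For a connector that simply slides straight in, `twist_deg` is 0 and the relative orientation stays constant across all sampled frames.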
  • A control system includes the information processing device described above, a robot for changing the position and orientation of the first object, and a control device for controlling the robot.
  • The control device controls the robot such that the position and orientation of the first object change according to the first path and the second path generated by the information processing device.
  • According to this disclosure, the first object can be fitted with the second object using the robot.
  • An information processing method includes first to third steps.
  • The first step is a step of detecting the first object and the second object from each frame of a moving image obtained by imaging the first object and the second object being fitted together, and measuring the three-dimensional position and orientation of each of the detected first object and second object.
  • The second step is a step of selecting, from the moving image and based on the measured three-dimensional positions and orientations, a first frame at which the first object and the second object start to contact each other and a second frame at which the fitting of the first object and the second object is completed.
  • The third step is a step of generating relative change information based on the three-dimensional positions and orientations measured for each frame from the first frame to the second frame.
  • A program causes a computer to execute the above information processing method. According to these disclosures as well, the position and orientation to be taken by an object that is to be fitted with another object can be easily determined.
  • FIG. 1 is a schematic diagram illustrating an outline of an information processing system according to an embodiment.
  • FIG. 2 is a diagram illustrating an example of a moving image output from an imaging device.
  • FIG. 3 is a schematic diagram illustrating a hardware configuration of the information processing apparatus according to the embodiment.
  • FIG. 4 is a block diagram illustrating a functional configuration of the information processing apparatus according to the embodiment.
  • FIG. 5 is a diagram showing an example of a 2D model image.
  • FIG. 6 is a diagram showing an example of relative change information.
  • FIG. 7 is a flowchart illustrating an example of a flow of a process performed by the information processing device.
  • FIG. 8 is a block diagram illustrating an internal configuration of an information processing device according to Modification 1.
  • FIG. 9 is a diagram illustrating a path generation method by a path generation unit.
  • FIG. 10 is a diagram showing another example of two objects to be fitted.
  • FIG. 11 is a diagram illustrating an example of a control system including an information processing device according to Modification 1.
  • FIG. 1 is a schematic diagram illustrating an outline of an information processing system according to an embodiment.
  • The information processing system 1 is applied to, for example, a production line in which a male connector 2 and a female connector 3 are fitted together.
  • The information processing system 1 includes an imaging device 10 and an information processing device 20.
  • The imaging device 10 captures a subject present in its field of view and generates moving image data (hereinafter also simply referred to as a "moving image").
  • The imaging device 10 is installed at a fixed position, and outputs a moving image obtained by imaging the male connector 2 and the female connector 3, as subjects, being fitted together.
  • The imaging device 10 may capture the male connector 2 and the female connector 3 being fitted together by a human hand, or being fitted together by a robot.
  • FIG. 2 is a diagram illustrating an example of a moving image output from the imaging device.
  • The moving image in the example shown in FIG. 2 includes frames 60 to 62.
  • The frame 60 shows an image at the time when the male connector 2 and the female connector 3 are separated.
  • The frame 61 shows an image at the time when the male connector 2 and the female connector 3 start to contact each other.
  • The frame 62 shows an image at the time when the fitting of the male connector 2 and the female connector 3 is completed.
  • The information processing device 20 performs the following processing using the moving image obtained by the imaging device 10.
  • The information processing device 20 detects the male connector 2 and the female connector 3 from each frame of the moving image obtained by imaging the male connector 2 and the female connector 3 being fitted together, and measures the three-dimensional position and orientation of each of the detected connectors.
  • The information processing device 20 selects, from the moving image, the frame 61 at which the male connector 2 and the female connector 3 start to contact each other and the frame 62 at which the fitting of the male connector 2 and the female connector 3 is completed.
  • For example, the information processing device 20 uses design data (3D-CAD (Three-Dimensional Computer-Aided Design) data) of the male connector 2 and the female connector 3 to determine whether a frame is the frame at which the male connector 2 and the female connector 3 start to contact each other.
  • Similarly, the information processing device 20 uses the 3D-CAD data of the male connector 2 and the female connector 3 to determine whether a frame is the frame at which the fitting of the male connector 2 and the female connector 3 is completed.
  • The information processing device 20 generates relative change information based on the three-dimensional position and orientation of each of the male connector 2 and the female connector 3 measured for each frame from the frame 61 to the frame 62.
  • The relative change information is information indicating the change in the relative three-dimensional position and orientation of the male connector 2 with respect to the female connector 3 during the period from when the male connector 2 and the female connector 3 start contacting until the fitting of the male connector 2 and the female connector 3 is completed.
  • By using the relative change information, the position and orientation to be taken by the male connector 2 to be fitted with the female connector 3 can be easily determined.
  • For example, an operation program of a robot can be easily designed using the relative change information.
  • FIG. 3 is a schematic diagram illustrating a hardware configuration of the information processing apparatus according to the embodiment.
  • The information processing device 20 has a structure according to a general-purpose computer architecture, and realizes the various kinds of processing described below by a processor executing a program installed in advance.
  • The information processing device 20 includes a processor 210 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 212, a display controller 214, a system controller 216, an input/output (I/O) controller 218, a hard disk 220, a camera interface 222, an input interface 224, a communication interface 228, and a memory card interface 230. These units are connected to each other so as to enable data communication, with the system controller 216 at the center.
  • The processor 210 exchanges programs (code) and the like with the system controller 216 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
  • The system controller 216 is connected to the processor 210, the RAM 212, the display controller 214, and the I/O controller 218 via buses, exchanges data with each of these units, and governs the processing of the entire information processing apparatus 20.
  • The RAM 212 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds programs read from the hard disk 220, processing results for moving images, work data, and the like.
  • The display controller 214 is connected to a display unit 232, and outputs signals for displaying various kinds of information to the display unit 232 according to internal commands from the system controller 216.
  • The I/O controller 218 controls data exchange with recording media and external devices connected to the information processing apparatus 20. More specifically, the I/O controller 218 is connected to the hard disk 220, the camera interface 222, the input interface 224, the communication interface 228, and the memory card interface 230.
  • The hard disk 220 is typically a non-volatile magnetic storage device, and stores various kinds of information in addition to the processing program 250 executed by the processor 210.
  • The processing program 250 to be installed on the hard disk 220 is distributed stored in a memory card 236 or the like.
  • Instead of the hard disk 220, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be employed.
  • The camera interface 222 corresponds to an input unit that receives moving image data from the imaging device 10, and mediates data transmission between the processor 210 and the imaging device 10.
  • The camera interface 222 includes an image buffer for temporarily storing the moving image data from the imaging device 10.
  • The input interface 224 mediates data transmission between the processor 210 and an input device 234 such as a keyboard, a mouse, a touch panel, or a dedicated console.
  • The communication interface 228 mediates data transmission between the processor 210 and another personal computer, a server device, or the like (not shown).
  • The communication interface 228 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • The memory card interface 230 mediates data transmission between the processor 210 and the memory card 236 serving as a recording medium.
  • The processing program 250 executed by the information processing apparatus 20 is distributed stored in the memory card 236, and the memory card interface 230 reads the processing program 250 from the memory card 236.
  • The memory card 236 is a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read-Only Memory). Alternatively, a program downloaded from a distribution server or the like may be installed in the information processing device 20 via the communication interface 228.
  • In addition to the application for providing the functions according to the present embodiment, an OS (Operating System) for providing the basic functions of the computer may be installed.
  • In this case, the processing program according to the present embodiment may execute processing by calling necessary modules, among the program modules provided as part of the OS, in a predetermined order and/or at a predetermined timing.
  • Furthermore, the processing program according to the present embodiment may be provided incorporated as a part of another program. In that case as well, the program itself does not include the modules of the other program with which it is combined, and the processing is executed in cooperation with that other program. That is, the processing program according to the present embodiment may take a form incorporated in such another program.
  • Alternatively, part or all of the functions provided by executing the processing program may be implemented as a dedicated hardware circuit.
  • FIG. 4 is a block diagram illustrating a functional configuration of the information processing apparatus according to the embodiment.
  • The information processing device 20 includes a design data storage unit 21, a model data storage unit 22, a measurement unit 23, a selection unit 24, and an information generation unit 25.
  • The design data storage unit 21 and the model data storage unit 22 are configured by the hard disk 220 and the RAM 212 shown in FIG. 3.
  • The measurement unit 23, the selection unit 24, and the information generation unit 25 are realized by the processor 210 shown in FIG. 3.
  • The design data storage unit 21 stores 3D-CAD data indicating the shape of each of the male connector 2 and the female connector 3.
  • The model data storage unit 22 stores a plurality of model data for each of the male connector 2 and the female connector 3. Each model data indicates the coordinates of a plurality of feature points of the object (either the male connector 2 or the female connector 3) in a two-dimensional model image of the object in a certain posture, and the feature amount of each feature point.
  • The model data storage unit 22 stores each of the plurality of model data in association with a posture parameter.
  • The posture parameter is a parameter indicating the posture of the object, and consists of the rotation angles in the pitch, yaw, and roll directions from a reference posture.
  • FIG. 5 is a diagram showing an example of a 2D model image.
  • The plurality of 2D model images are created based on the 3D-CAD data.
  • Specifically, the plurality of 2D model images are created by arranging the object represented by the 3D-CAD data in various postures in a virtual space in which a viewpoint is set at a fixed position.
  • A feature point is a point that characterizes a corner or a contour included in an image, and is, for example, an edge point.
  • The feature amount is, for example, luminance, luminance gradient direction, quantized gradient direction, HoG (Histogram of Oriented Gradients), Haar-like features, SIFT (Scale-Invariant Feature Transform), or the like.
  • The luminance gradient direction represents the direction (angle) of the luminance gradient in a local region centered on a feature point as a continuous value, whereas the quantized gradient direction represents the direction of the luminance gradient in a local region centered on a feature point as a discrete value (for example, eight directions are held in 1-byte information with values 0 to 7).
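As an illustration of the discretization just described, a gradient direction can be mapped to one of eight values as follows (a minimal sketch; the embodiment specifies only the 0 to 7 encoding, not the exact quantization rule):

```python
import math

def quantize_gradient_direction(gx, gy, n_bins=8):
    """Map a luminance-gradient direction (from horizontal and vertical
    gradients gx, gy) to one of n_bins discrete directions, 0..n_bins-1,
    so that eight directions fit in one byte of information."""
    angle = math.atan2(gy, gx) % (2 * math.pi)           # 0 .. 2*pi
    return int(round(angle / (2 * math.pi) * n_bins)) % n_bins
```

Rounding centers each bin on its nominal direction, so a gradient pointing along the +x axis maps to bin 0 and one along +y maps to bin 2.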
  • The measurement unit 23 measures the three-dimensional position and orientation of each of the male connector 2 and the female connector 3 from each frame of the moving image.
  • Specifically, the measurement unit 23 extracts a plurality of feature points and their feature amounts from each frame of the moving image obtained by the imaging device 10.
  • The measurement unit 23 detects the male connector 2 and the female connector 3 included in each frame by comparing the extracted feature points and feature amounts with the model data stored in the model data storage unit 22.
  • The measurement unit 23 then measures the three-dimensional position and orientation of each of the detected male connector 2 and female connector 3.
  • The three-dimensional position is indicated by the coordinates of three mutually orthogonal axes (the X, Y, and Z axes).
  • The three-dimensional posture is indicated by the rotation angles in the pitch, yaw, and roll directions from the reference posture.
  • For each object (either the male connector 2 or the female connector 3), the measurement unit 23 identifies, from among the plurality of model data, the model data having the highest similarity to the feature points and feature amounts extracted from each frame of the moving image. The measurement unit 23 determines the posture parameter corresponding to the identified model data as the three-dimensional posture of the object.
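The selection of the best-matching model data can be sketched as below. The similarity score here is a deliberately simple stand-in (fraction of matching quantized feature values); the patent does not specify the score actually used, and the dictionary keys are hypothetical:

```python
def similarity(features_a, features_b):
    """Toy similarity: fraction of positions at which the two feature
    sequences hold the same quantized value."""
    matches = sum(a == b for a, b in zip(features_a, features_b))
    return matches / max(len(features_a), len(features_b))

def estimate_posture(frame_features, model_database):
    """Return the posture parameter (pitch, yaw, roll) associated with
    the model data most similar to the features extracted from a frame."""
    best = max(model_database, key=lambda m: similarity(frame_features, m["features"]))
    return best["posture"]
```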
  • The measurement unit 23 measures the three-dimensional position of the object using calibration data, the position on the image of the object detected from the frame, and the size ratio between the object in the two-dimensional model image corresponding to the model data and the object detected from the frame.
  • The calibration data is data indicating the correspondence between the coordinate system of the image and the coordinate system of the real space.
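Under the additional assumption of a pinhole camera model (the patent states only that calibration data and the size ratio are used), this measurement can be sketched as follows; `f`, `cx`, `cy`, and `z_model` are hypothetical calibration parameters, not terms from the disclosure:

```python
def estimate_3d_position(u, v, size_ratio, f, cx, cy, z_model):
    """Estimate the 3-D position of a detected object under a pinhole
    camera model.

    size_ratio: apparent size of the detection divided by the object's
    size in the 2-D model image, which was rendered at distance z_model.
    (u, v): detected position on the image; (f, cx, cy): calibration data
    (focal length in pixels and principal point)."""
    z = z_model / size_ratio      # the object appears larger when closer
    x = (u - cx) * z / f          # back-project through the pinhole
    y = (v - cy) * z / f
    return x, y, z
```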
  • The measurement unit 23 may measure the three-dimensional position and orientation of each of the male connector 2 and the female connector 3 from each frame of the moving image using a method other than the two-dimensional matching method described above.
  • For example, the three-dimensional position and orientation of each of the male connector 2 and the female connector 3 in each frame may be measured using the factorization method described in Carlo Tomasi and Takeo Kanade, "Shape and Motion from Image Streams under Orthography: a Factorization Method", International Journal of Computer Vision, 1992, 9:2, p. 137-154 (Non-Patent Document 1).
  • The factorization method is a method of estimating the three-dimensional shape and motion of an object based on a moving image.
  • Based on the three-dimensional positions and orientations measured by the measurement unit 23, the selection unit 24 selects, from the moving image, the frame at which the male connector 2 and the female connector 3 start to contact each other (hereinafter referred to as the "contact start frame") and the frame at which the fitting of the male connector 2 and the female connector 3 is completed (hereinafter referred to as the "completion frame").
  • Specifically, the selection unit 24 obtains the coordinates of the center of gravity of the male connector 2 based on the three-dimensional position and orientation of the male connector 2 measured by the measurement unit 23. Similarly, the selection unit 24 obtains the coordinates of the center of gravity of the female connector 3 based on the three-dimensional position and orientation of the female connector 3 measured by the measurement unit 23.
  • The selection unit 24 selects, as the contact start frame, the frame at which the distance between the center of gravity of the male connector 2 and the center of gravity of the female connector 3 first reaches a first specified value.
  • The first specified value is the distance between the centers of gravity of the male connector 2 and the female connector 3 when the two connectors start to contact each other, and is set in advance based on the 3D-CAD data.
  • The selection unit 24 further selects, as the completion frame, the frame at which the distance between the center of gravity of the male connector 2 and the center of gravity of the female connector 3 first reaches a second specified value.
  • The second specified value is the distance between the centers of gravity of the male connector 2 and the female connector 3 when the fitting is completed, and is set in advance based on the 3D-CAD data.
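These two selections amount to finding the first frame at which the centroid distance falls to each specified value. A minimal sketch, with an assumed per-frame array layout and function name:

```python
import numpy as np

def select_frames(male_centroids, female_centroids, d_contact, d_complete):
    """Return (contact start frame, completion frame): the first frames
    at which the centroid distance reaches the first and second specified
    values, respectively. Centroid arrays have shape (n_frames, 3)."""
    d = np.linalg.norm(np.asarray(male_centroids, float)
                       - np.asarray(female_centroids, float), axis=1)
    if not (d <= d_contact).any() or not (d <= d_complete).any():
        raise ValueError("the specified distances are never reached")
    contact_frame = int(np.argmax(d <= d_contact))     # first True index
    completion_frame = int(np.argmax(d <= d_complete))
    return contact_frame, completion_frame
```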
  • The information generation unit 25 generates relative change information based on the three-dimensional positions and orientations measured by the measurement unit 23 for each frame from the contact start frame to the completion frame. As described above, the relative change information indicates the change (time change) of the relative three-dimensional position and orientation of the male connector 2 with respect to the female connector 3 during the period from when the male connector 2 starts to contact the female connector 3 until the fitting of the male connector 2 and the female connector 3 is completed.
  • The relative three-dimensional position of the male connector 2 with respect to the female connector 3 is indicated by subtracting the three-axis (X, Y, Z) coordinate values indicating the three-dimensional position of the female connector 3 from those indicating the three-dimensional position of the male connector 2.
  • The relative three-dimensional posture of the male connector 2 with respect to the female connector 3 is indicated by subtracting the rotation angles of the female connector 3 in three directions from the reference posture from the corresponding rotation angles of the male connector 2. The three directions are the pitch, yaw, and roll directions.
  • In other words, the relative three-dimensional position and orientation are indicated by six variables.
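The frame-by-frame subtraction of the six variables can be sketched as follows (the function name and the (n_frames, 6) array layout are assumptions):

```python
import numpy as np

def relative_change_info(male_poses, female_poses, contact_frame, completion_frame):
    """Compute relative change information: for each frame from the
    contact start frame to the completion frame, the six variables
    [X, Y, Z, pitch, yaw, roll] of the male connector minus those of
    the female connector."""
    male = np.asarray(male_poses, float)
    female = np.asarray(female_poses, float)
    seg = slice(contact_frame, completion_frame + 1)
    return male[seg] - female[seg]        # shape (n, 6), one row per frame
```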
  • FIG. 6 is a diagram showing an example of the relative change information.
  • FIG. 6 shows the change in the relative three-dimensional position of the male connector 2 with respect to the female connector 3 during the period from when the male connector 2 and the female connector 3 start to contact each other until their fitting is completed.
  • A point P1 indicates the relative three-dimensional position of the male connector 2 with respect to the female connector 3, specified based on the three-dimensional positions of the male connector 2 and the female connector 3 in the contact start frame.
  • A point P2 indicates the relative three-dimensional position of the male connector 2 with respect to the female connector 3, specified based on the three-dimensional positions of the male connector 2 and the female connector 3 in the completion frame.
  • As shown in FIG. 6, the relative three-dimensional position of the male connector 2 with respect to the female connector 3 changes along the straight line B1.
  • FIG. 7 is a flowchart illustrating an example of the flow of a process performed by the information processing device.
  • In step S1, the information processing device 20 acquires, from the imaging device 10, a moving image showing the male connector 2 and the female connector 3 being fitted together.
  • In step S2, the measurement unit 23 measures the three-dimensional position and orientation of the male connector 2 and the female connector 3 included in each frame of the moving image by comparing the frame with the model data stored in the model data storage unit 22.
  • In step S3, the selection unit 24 selects the contact start frame and the completion frame from the moving image based on the three-dimensional positions and orientations of the male connector 2 and the female connector 3 included in each frame.
  • In step S4, the information generation unit 25 generates the relative change information based on the three-dimensional positions and orientations measured by the measurement unit 23 for each frame from the contact start frame to the completion frame. After step S4, the process ends.
  • The information processing device may generate a path from the initial position and orientation of the male connector 2 to the three-dimensional position and orientation at which the fitting with the female connector 3 is completed.
  • The path indicates the time change of the three-dimensional position and orientation of the male connector 2.
  • Modification 1 will be described on the assumption that the three-dimensional positions and orientations of the female connector 3 and the obstacles do not change.
  • An obstacle is an object arranged around the male connector 2 and the female connector 3.
  • FIG. 8 is a block diagram showing the internal configuration of the information processing apparatus according to Modification 1. As illustrated in FIG. 8, the information processing apparatus 20A according to Modification 1 differs from the information processing apparatus 20 illustrated in FIG. 4 in that it further includes an arrangement information storage unit 26 and a path generation unit 27.
  • the arrangement information storage unit 26 stores arrangement information indicating an initial position and orientation of the male connector 2, the female connector 3, and the obstacle in the real space.
  • The path generation unit 27 generates a first path of the male connector 2 from the initial position and orientation to the three-dimensional position and orientation at which contact with the female connector 3 starts. The path generation unit 27 further generates a second path of the male connector 2 from the three-dimensional position and orientation at which contact with the female connector 3 starts to the three-dimensional position and orientation at which the fitting with the female connector 3 is completed.
  • FIG. 9 is a diagram illustrating a route generation method by the route generation unit.
  • FIG. 9 shows a male connector 2, a female connector 3, and obstacles 4a and 4b arranged in a real space according to the arrangement information.
  • Based on the relative change information, the path generation unit 27 obtains the second path A2 of the male connector 2 from the three-dimensional position and orientation P20 at which contact with the female connector 3 starts to the three-dimensional position and orientation P21 at which the fitting with the female connector 3 is completed.
  • The path generation unit 27 further obtains the first path A1 of the male connector 2 from the initial position and orientation P0 to the three-dimensional position and orientation P20 at which contact with the female connector 3 starts.
  • The route generation unit 27 obtains the first route A1, for example, according to the following procedure.
  • First, the path generation unit 27 obtains a plurality of three-dimensional position and orientation candidates that the male connector 2 can take in the real space (hereinafter referred to as “position and orientation candidates”). Any three-dimensional position and orientation that interferes with the female connector 3 or with either of the obstacles 4a and 4b is excluded from the position and orientation candidates.
  • FIG. 9 shows some position and orientation candidates P10 to P14 of the plurality of position and orientation candidates obtained by the route generation unit 27.
  • Next, the route generation unit 27 generates section candidates, each connecting any two of the initial position and orientation P0, the three-dimensional position and orientation P20, and the plurality of position and orientation candidates with a straight line.
  • FIG. 9 shows some section candidates C10 to C16 of the plurality of section candidates.
  • the route generation unit 27 determines whether each section candidate interferes with any of the female connector 3 and the obstacles 4a and 4b.
  • The route generation unit 27 excludes from the plurality of section candidates any section candidate that interferes with the female connector 3 or the obstacles 4a and 4b. In the example shown in FIG. 9, the section candidate C12, which interferes with the obstacle 4b, and the section candidate C13, which interferes with the obstacles 4a and 4b, are excluded.
  • the route generation unit 27 generates a plurality of route candidates having the initial position / posture P0 as a starting point and the three-dimensional position / posture P20 as an end point by sequentially connecting the section candidates selected from the remaining section candidates.
  • The route generation unit 27 deforms each of the plurality of route candidates into a higher-order (for example, third-order or higher) spline curve. As a result, each route candidate becomes smooth.
  • the path generation unit 27 determines whether the deformed path candidate (hereinafter, referred to as a “deformed path candidate”) interferes with any of the female connector 3 and the obstacles 4a and 4b.
  • the path generation unit 27 excludes, from the plurality of deformation path candidates, a deformation path candidate that interferes with one of the female connector 3 and the obstacles 4a and 4b.
  • the path generation unit 27 determines one of the deformation path candidates that does not interfere with any of the female connector 3 and the obstacles 4a and 4b as the first path A1.
  • the route generation unit 27 may determine the shortest deformation route candidate as the first route A1.
  • In the example shown in FIG. 9, the deformed route candidate obtained by deforming the route candidate in which the section candidates C10, C11, and C14 to C16 are sequentially connected is determined as the first route A1.
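The candidate-sampling procedure above can be illustrated with a simplified two-dimensional sketch. The circular obstacles, the fixed candidate set, and the use of Dijkstra search over collision-free straight segments are assumptions made for brevity; the spline-deformation step described in the text is omitted.

```python
# 2-D sketch of first-path generation: sample candidates, discard candidates
# and segments that interfere with obstacles, then search the remaining
# segments for the shortest route from the start P0 to the goal P20.
import heapq
import math

def seg_hits_circle(p, q, c, r):
    # Distance from circle centre c to segment p-q, compared with radius r.
    px, py = q[0] - p[0], q[1] - p[1]
    L2 = px * px + py * py
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((c[0]-p[0])*px + (c[1]-p[1])*py) / L2))
    cx, cy = p[0] + t * px, p[1] + t * py
    return math.hypot(c[0] - cx, c[1] - cy) < r

def first_path(start, goal, candidates, obstacles):
    # Exclude candidates that lie inside an obstacle (interference check).
    nodes = [start, goal] + [c for c in candidates
             if not any(math.hypot(c[0]-o[0], c[1]-o[1]) < r for o, r in obstacles)]
    def free(a, b):
        return not any(seg_hits_circle(a, b, o, r) for o, r in obstacles)
    # Dijkstra over collision-free straight section candidates.
    dist, prev, pq = {0: 0.0}, {}, [(0.0, 0)]
    while pq:
        d, i = heapq.heappop(pq)
        if i == 1:                       # goal reached with shortest distance
            break
        for j in range(len(nodes)):
            if j != i and free(nodes[i], nodes[j]):
                nd = d + math.dist(nodes[i], nodes[j])
                if nd < dist.get(j, float("inf")):
                    dist[j], prev[j] = nd, i
                    heapq.heappush(pq, (nd, j))
    path, i = [], 1
    while True:                          # walk back from goal to start
        path.append(nodes[i])
        if i == 0:
            return path[::-1]
        i = prev[i]

obstacles = [((5.0, 0.0), 2.0)]          # one circle blocking the direct line
route = first_path((0.0, 0.0), (10.0, 0.0),
                   [(5.0, 4.0), (5.0, -4.0), (2.0, 2.0), (8.0, 2.0)], obstacles)
print(route)
```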
  • the arrangement information stored in the arrangement information storage unit 26 may include information indicating the shape of the robot hand 30a (see FIG. 9) holding the male connector 2.
  • The route generation unit 27 also determines whether the hand 30a interferes with the female connector 3 or with either of the obstacles 4a and 4b. The path generation unit 27 then excludes any position and orientation candidate, section candidate, or deformation path candidate in which the hand 30a interferes with the female connector 3 or with either of the obstacles 4a and 4b.
  • The path generation unit 27 connects the first path A1 and the second path A2 to generate the path A of the male connector 2 from the initial position and orientation to the three-dimensional position and orientation at which the fitting with the female connector 3 is completed. At this time, it is preferable that the path generation unit 27 corrects the first path A1 so that it connects smoothly to the second path A2.
  • Here, “smooth” means that dX(s)/ds, dY(s)/ds, dZ(s)/ds, dRy(s)/ds, dRp(s)/ds, and dRr(s)/ds are continuous with respect to the distance s.
  • The distance s is the distance along the path A from the initial position and orientation P0.
  • dX(s)/ds, dY(s)/ds, and dZ(s)/ds are the first derivatives of the X, Y, and Z coordinates X(s), Y(s), and Z(s) at the point at distance s along the path A from the initial position and orientation P0.
  • dRy(s)/ds, dRp(s)/ds, and dRr(s)/ds are the first derivatives of the rotation angles Ry(s), Rp(s), and Rr(s) in the yaw, pitch, and roll directions from the reference orientation at that point.
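The smoothness condition can be checked numerically: along a path sampled at equal increments of the distance s, the finite-difference first derivative should show no jumps. The two coordinate profiles below are illustrative examples, not taken from the disclosure.

```python
# Numerical illustration of the smoothness condition: a polyline corner is
# continuous (C0) but its first derivative jumps; a sine-eased profile has a
# continuous first derivative.
import math

def max_derivative_jump(xs, ds):
    # Finite-difference first derivatives, then the largest jump between
    # consecutive derivative samples.
    d = [(b - a) / ds for a, b in zip(xs, xs[1:])]
    return max(abs(b - a) for a, b in zip(d, d[1:]))

n, ds = 200, 0.01
# Corner: X(s) rises linearly then stays flat -> dX/ds jumps from 1 to 0.
corner = [min(s * ds, 1.0) for s in range(n)]
# Sine easing: dX/ds tapers to 0 before the flat part -> no jump.
smooth = [math.sin(min(s * ds, 1.0) * math.pi / 2) for s in range(n)]
jump_corner = max_derivative_jump(corner, ds)
jump_smooth = max_derivative_jump(smooth, ds)
print(jump_corner, jump_smooth)
```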
  • Preferably, the path generation unit 27 sets a position and orientation candidate on a line obtained by extrapolating the second path A2 beyond the three-dimensional position and orientation P20, and generates the first path A1 by connecting a plurality of position and orientation candidates including that candidate.
  • In the example shown in FIG. 9, the position and orientation candidate P14 is set on the line obtained by extrapolating the second path A2 beyond the three-dimensional position and orientation P20. This makes it easier to generate a first route A1 that connects smoothly to the second route A2.
  • When the three-dimensional position and orientation of the female connector 3 changes over time (for example, when the female connector 3 is placed on a conveyor belt), the path generation unit 27 may generate the second path A2 using the relative change information while taking the time change of the three-dimensional position and orientation of the female connector 3 into account.
  • The change in the relative three-dimensional position and orientation of one of the two fitted objects with respect to the other is not limited to the example described above.
  • The relative three-dimensional position of one of the two objects with respect to the other may change along a path in which a plurality of straight lines are combined (for example, an L-shaped path).
  • The relative three-dimensional position of one of the two objects with respect to the other may change along a straight line while its relative three-dimensional orientation rotates.
  • The relative three-dimensional orientation of one of the two objects with respect to the other may rotate about a first axis and then rotate about a second axis different from the first axis.
  • FIG. 10 is a diagram showing another example of two fitted objects.
  • the two objects to be fitted may be a bolt 5 and a nut 6.
  • the information processing devices 20 and 20A may generate relative change information indicating a change in the relative three-dimensional position and orientation of the bolt 5 with respect to the nut 6, for example.
  • During the period from when the bolt 5 and the nut 6 start contacting to when the fitting (screwing) of the bolt 5 and the nut 6 is completed, the relative three-dimensional position of the bolt 5 with respect to the nut 6 changes along a straight line B2 indicating the axis of the screw hole of the nut 6.
  • The relative three-dimensional orientation of the bolt 5 with respect to the nut 6 rotates about the straight line B2.
  • The information processing device 20 can recognize the rotation angle of the bolt 5 about the straight line B2.
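The bolt/nut relationship can be written down directly: the advance along the axis B2 is proportional to the rotation angle about B2. The thread pitch and the sign convention below are illustrative assumptions, not values from the disclosure.

```python
# Sketch of the bolt/nut relative change of FIG. 10: the bolt advances along
# the screw axis B2 in proportion to its rotation about B2.
PITCH = 1.25  # mm of advance per full turn (assumed coarse metric thread)

def bolt_relative_pose(angle_deg):
    """Relative pose of the bolt w.r.t. the nut: rotation about B2 (degrees,
    wrapped to one turn) and translation along B2 (mm, negative = deeper)."""
    return angle_deg % 360.0, -PITCH * angle_deg / 360.0

rot, z = bolt_relative_pose(720.0)  # two full turns
print(rot, z)
```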
  • the two objects to be fitted may be a power tap and a power plug.
  • the information processing devices 20 and 20A may generate, for example, relative change information indicating a change in the three-dimensional position and orientation of the power plug relative to the power tap.
  • the relative three-dimensional position of the power plug with respect to the power tap changes along a straight line.
  • the power plug is rotated by a predetermined amount after the power plug is inserted into the power tap.
  • The relative three-dimensional orientation of the power plug with respect to the power tap does not change in the first half of the period from when the power tap and the power plug start contacting to when their fitting is completed, and rotates in the second half of the period.
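This two-phase behavior can be expressed as a simple function of normalized time; the insertion depth, rotation amount, and half-period split below are illustrative assumptions.

```python
# Sketch of the power-plug example: translation along a straight line over
# the whole fitting period, rotation only in the second half of the period.
def plug_relative_pose(t, depth=10.0, turn=45.0):
    """Relative pose (insertion depth in mm, rotation in degrees) at
    normalised time t in [0, 1] from contact start to fitting completion."""
    z = depth * t                              # straight-line translation
    rot = 0.0 if t < 0.5 else turn * (t - 0.5) / 0.5  # second-half rotation
    return z, rot

print(plug_relative_pose(0.25), plug_relative_pose(1.0))
```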
  • FIG. 11 is a diagram illustrating an example of the control system 9 including the information processing device according to the first modification.
  • the control system 9 includes an information processing device 20A, a robot 30, a robot controller 40, and a control device 50.
  • the robot 30 is a mechanism for changing the three-dimensional position and orientation of the male connector 2, and is, for example, a vertical articulated robot.
  • the robot 30 has a hand for gripping the male connector 2 at the tip, and changes the three-dimensional position and orientation of the hand with six degrees of freedom.
  • the six degrees of freedom include translational degrees of freedom in the X, Y, and Z directions, and rotational degrees of freedom in the pitch, yaw, and roll directions.
  • the robot 30 is not limited to a vertical articulated robot, and may be another robot (for example, a parallel link robot).
  • the robot 30 has a plurality of servo motors, and the three-dimensional position and orientation of the male connector 2 are changed by driving the servo motors.
  • An encoder is provided for each of the plurality of servomotors, and the position of the servomotor is measured.
  • the robot controller 40 controls the operation of the robot 30 in accordance with the control command received from the control device 50.
  • the robot controller 40 receives, from the control device 50, control commands for translational degrees of freedom in the X, Y, and Z directions and rotational degrees of freedom in the pitch, yaw, and roll directions.
  • The robot controller 40 performs feedback control on the robot 30 so that the translation amounts of the hand in the X, Y, and Z directions approach the control commands for the translational degrees of freedom in the X, Y, and Z directions, respectively.
  • The robot controller 40 also performs feedback control on the robot 30 so that the rotational movement amounts of the hand in the pitch, yaw, and roll directions approach the control commands for the rotational degrees of freedom in the pitch, yaw, and roll directions, respectively.
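The per-axis feedback described above can be sketched as a proportional loop driving each of the six degrees of freedom toward its control command. The gain, time step, and convergence tolerance are illustrative assumptions, not the controller actually used by the robot controller 40.

```python
# Minimal sketch of per-axis feedback control: each of the six degrees of
# freedom is driven toward its commanded value by a proportional loop.
GAIN, DT = 2.0, 0.05

def step(pose, command):
    # One control cycle: move each DOF toward its commanded value.
    return [p + GAIN * (c - p) * DT for p, c in zip(pose, command)]

pose = [0.0] * 6                          # X, Y, Z, yaw, pitch, roll
command = [10.0, 5.0, 3.0, 0.0, 90.0, 0.0]
for _ in range(100):
    pose = step(pose, command)
print([round(p, 2) for p in pose])        # converges toward the command
```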
  • the control device 50 controls the robot 30 by outputting a control command to the robot controller 40.
  • The control device 50 generates control commands so that the three-dimensional position and orientation of the male connector 2 change according to the path A (the first path A1 and the second path A2) of the male connector 2 (see FIG. 9) generated by the information processing device 20A.
  • the control device 50 outputs the generated control command to the robot controller 40.
  • the three-dimensional position and orientation of the male connector 2 can be changed along the path A generated by the information processing device 20A.
  • the male connector 2 can be fitted to the female connector 3 using the robot 30.
  • Non-Patent Document 2 discloses a technique for reproducing the motion of a robot using a given moving image. It is conceivable to use a CG (Computer Graphics) moving image as the moving image.
  • the CG moving image can be created by designating 3D-CAD data of two fitted objects and a change in the relative three-dimensional position and orientation of one of the two objects with respect to the other. Therefore, by using the relative change information generated by the information processing devices 20 and 20A, it is possible to easily create a CG moving image for reproducing the operation of the robot.
  • (Configuration 1) An information processing device (20, 20A) comprising: a measurement unit (23) that detects a first object (2, 5) and a second object (3, 6) from each frame of a moving image obtained by imaging a state in which the first object (2, 5) and the second object (3, 6) are fitted, and measures the three-dimensional position and orientation of each of the detected first object (2, 5) and second object (3, 6); a selection unit (24) that selects, from the moving image based on the measured three-dimensional positions and orientations, a first frame at which the first object (2, 5) and the second object (3, 6) start contacting and a second frame at which the fitting of the first object (2, 5) and the second object (3, 6) is completed; and an information generation unit (25) that generates, based on the three-dimensional positions and orientations measured for each frame from the first frame to the second frame, relative change information indicating a change in the relative three-dimensional position and orientation of the first object (2, 5) with respect to the second object (3, 6) during the period from when the first object (2, 5) and the second object (3, 6) start contacting to when their fitting is completed.
  • (Configuration 2) The information processing device (20A) according to configuration 1, further comprising a path generation unit (27) that generates a path of the first object (2, 5) in a space where the first object (2, 5), the second object (3, 6), and an obstacle (4a, 4b) are arranged, wherein the path generation unit (27) generates, so as not to interfere with the second object (3, 6) and the obstacle (4a, 4b), a first path of the first object (2, 5) from the initial position and orientation to the position and orientation when contact with the second object (3, 6) starts, and generates, based on the relative change information and the arrangement in the space, a second path of the first object (2, 5) from the position and orientation when contact with the second object (3, 6) starts to the position and orientation when the fitting with the second object (3, 6) is completed.
  • The information processing device (20, 20A) according to any one of the preceding configurations, wherein the relative change information indicates that the relative three-dimensional position of the first object (2, 5) with respect to the second object (3, 6) changes along a straight line.
  • (Configuration 7) A control system (9) comprising: the information processing device (20A) according to configuration 2; a robot (30) for changing the position and orientation of the first object (2, 5); and a control device (50) for controlling the robot (30), wherein the control device (50) controls the robot (30) so that the position and orientation of the first object (2, 5) change according to the first path and the second path generated by the information processing device (20A).
  • An information processing method comprising: detecting a first object (2, 5) and a second object (3, 6) from each frame of a moving image obtained by imaging a state in which the first object (2, 5) and the second object (3, 6) are fitted, and measuring the three-dimensional position and orientation of each of the detected first object (2, 5) and second object (3, 6); selecting, from the moving image based on the measured three-dimensional positions and orientations, a first frame at which the first object (2, 5) and the second object (3, 6) start contacting and a second frame at which the fitting of the first object (2, 5) and the second object (3, 6) is completed; and generating, based on the three-dimensional positions and orientations measured for each frame from the first frame to the second frame, relative change information indicating a change in the relative three-dimensional position and orientation of the first object (2, 5) with respect to the second object (3, 6) during the period from when the first object (2, 5) and the second object (3, 6) start contacting to when their fitting is completed.
  • A program causing a computer to execute: detecting a first object (2, 5) and a second object (3, 6) from each frame of a moving image obtained by imaging a state in which the first object (2, 5) and the second object (3, 6) are fitted, and measuring the three-dimensional position and orientation of each of the detected first object (2, 5) and second object (3, 6); selecting, from the moving image based on the measured three-dimensional positions and orientations, a first frame at which the first object (2, 5) and the second object (3, 6) start contacting and a second frame at which the fitting of the first object (2, 5) and the second object (3, 6) is completed; and generating, based on the three-dimensional positions and orientations measured for each frame from the first frame to the second frame, relative change information indicating a change in the relative three-dimensional position and orientation of the first object (2, 5) with respect to the second object (3, 6) during the period from when the first object (2, 5) and the second object (3, 6) start contacting to when their fitting is completed.
  • 1 information processing system, 2 male connector, 3 female connector, 4a, 4b obstacle, 5 bolt, 6 nut, 9 control system, 10 imaging device, 20, 20A information processing device, 21 design data storage unit, 22 model data storage unit, 23 measurement unit, 24 selection unit, 25 information generation unit, 26 arrangement information storage unit, 27 path generation unit, 30 robot, 30a hand, 40 robot controller, 50 control device, 210 processor, 212 RAM, 214 display controller, 216 system controller, 218 I/O controller, 220 hard disk, 222 camera interface, 224 input interface, 228 communication interface, 230 memory card interface, 232 display, 234 input unit, 236 memory card, 250 processing program, A path, A1 first path, A2 second path.

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Manufacturing Of Electrical Connectors (AREA)

Abstract

An information processing device measures the three-dimensional position and orientation of a first object and a second object in each frame of a moving image obtained by imaging a state in which the first object and the second object are fitted together. The information processing device selects a first frame at which the first object starts contacting the second object and a second frame at which the fitting of the first object and the second object is completed. The information processing device generates relative change information indicating a change in the relative three-dimensional position and orientation of the first object with respect to the second object during the period from when the first object contacts the second object to when the fitting is completed. Consequently, the position and orientation to be taken by an object fitted to another object can be determined easily.
PCT/JP2019/026960 2018-07-31 2019-07-08 Dispositif de traitement d'informations, système de commande, procédé de traitement d'informations, et programme WO2020026712A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-143469 2018-07-31
JP2018143469A JP6904315B2 (ja) Information processing device, control system, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2020026712A1 true WO2020026712A1 (fr) 2020-02-06

Family

ID=69231040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026960 WO2020026712A1 (fr) 2018-07-31 2019-07-08 Dispositif de traitement d'informations, système de commande, procédé de traitement d'informations, et programme

Country Status (2)

Country Link
JP (1) JP6904315B2 (fr)
WO (1) WO2020026712A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013146844A (ja) * 2012-01-23 2013-08-01 Seiko Epson Corp 教示画像生成装置、教示画像生成方法および教示画像生成プログラムならびにロボット制御装置、ロボット制御方法およびロボット制御プログラム
JP2015150636A (ja) * 2014-02-13 2015-08-24 ファナック株式会社 ビジュアルフィードバックを利用したロボットシステム
WO2015199086A1 (fr) * 2014-06-23 2015-12-30 Cyberdyne株式会社 Système de reproduction de mouvement et dispositif de reproduction de mouvement
JP2018015856A (ja) * 2016-07-29 2018-02-01 セイコーエプソン株式会社 ロボット、ロボット制御装置、及びロボットシステム

Also Published As

Publication number Publication date
JP6904315B2 (ja) 2021-07-14
JP2020019083A (ja) 2020-02-06

Similar Documents

Publication Publication Date Title
US20200096317A1 (en) Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
JP5812599B2 (ja) 情報処理方法及びその装置
US11498220B2 (en) Control system and control method
US20140018957A1 (en) Robot system, robot, robot control device, robot control method, and robot control program
US20120306874A1 (en) Method and system for single view image 3 d face synthesis
JP7064257B2 (ja) 画像深度確定方法及び生き物認識方法、回路、装置、記憶媒体
JP7111114B2 (ja) 情報処理装置、情報処理方法及び情報処理システム
WO2021218542A1 (fr) Procédé et appareil d'étalonnage spatial basé sur un dispositif de perception visuelle pour système de coordonnées de corps de robot, et support de stockage
Olesen et al. Real-time extraction of surface patches with associated uncertainties by means of kinect cameras
US10992879B2 (en) Imaging system with multiple wide-angle optical elements arranged on a straight line and movable along the straight line
CN111476841A (zh) 一种基于点云和图像的识别定位方法及系统
CN113744340A (zh) 用轴向视点偏移的非中心相机模型校准相机并计算点投影
CN114310901B (zh) 用于机器人的坐标系标定方法、装置、系统以及介质
Lambrecht Robust few-shot pose estimation of articulated robots using monocular cameras and deep-learning-based keypoint detection
CN113284192A (zh) 运动捕捉方法、装置、电子设备以及机械臂控制系统
JP2019098409A (ja) ロボットシステムおよびキャリブレーション方法
JP6936974B2 (ja) 位置姿勢推定装置、位置姿勢推定方法及びプログラム
JP2019158427A (ja) 制御装置、ロボット、ロボットシステム,及び、物体を認識する方法
JP2021086432A (ja) 情報処理装置、情報処理方法、コンピュータプログラム、計測装置、システムおよび物品の製造方法
JP2010184300A (ja) 姿勢変更システムおよび姿勢変更方法
WO2020026712A1 (fr) Dispositif de traitement d'informations, système de commande, procédé de traitement d'informations, et programme
JP7439410B2 (ja) 画像処理装置、画像処理方法およびプログラム
Zhang et al. Vision-based six-dimensional peg-in-hole for practical connector insertion
WO2020022040A1 (fr) Système de commande, procédé de commande et programme
Amamra Smooth head tracking for virtual reality applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19843483

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19843483

Country of ref document: EP

Kind code of ref document: A1