WO2019182084A1 - Method and device

Method and device

Info

Publication number
WO2019182084A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
coordinate system
predetermined
posture
tool
Application number
PCT/JP2019/011980
Other languages
French (fr)
Japanese (ja)
Inventor
陽 粟田
Original Assignee
アトラスコプコ株式会社
Application filed by アトラスコプコ株式会社 filed Critical アトラスコプコ株式会社
Publication of WO2019182084A1 publication Critical patent/WO2019182084A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B23: MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P: METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P21/00: Machines for assembling a multiplicity of different parts to compose units, with or without preceding or subsequent working of such parts, e.g. with programme control
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/26: Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present disclosure relates to a method and an apparatus for detecting, using a marker attached to an object such as a tool, whether work performed by the object or on the object has been carried out correctly.
  • A system is known in which a marker attached to a tool is imaged and the resulting image is analyzed to detect the position of the tool.
  • In such a system, a marker attached near the tip of the tool is imaged and the position of the tool is detected.
  • Even if a marker can be attached at the position where a driver engages a screw, the marker may be hidden by the operator's hand or by the driver itself, so that it cannot be identified.
  • Moreover, a marker cannot be attached to the screw hole of a part, which is exactly the position one most wants to know.
  • The present disclosure has been made in view of the above points, and concerns tools and other objects of various shapes and sizes whose most relevant position (for a tool, the position where it contacts a screw or the like) cannot readily carry a marker.
  • The object marker is therefore attached at a position different from the position where the object acts on, or is acted on by, a screw or the like.
  • One purpose is to obtain, using the object marker, the position of the detection point, that is, the position one most wants to know (for the tool above, the position where the tool contacts a screw or the like).
  • In addition, if the posture of the part on which the tool or the like acts is incorrect, the work may not be performed correctly.
  • For example, a screw held by the tool may enter the part obliquely, so that the screw cannot be inserted correctly.
  • One aspect of the present disclosure is a method that includes a step of acquiring the position and orientation of an object marker in a predetermined coordinate system from an image of the object marker attached to an object such as a tool;
  • a step of calculating the position of the detection point and the posture of the object in the predetermined coordinate system, based on the coordinates of the object's detection point relative to the object marker in the marker coordinate system (for the driver above, the coordinates of the place where it engages the screw) and the position and orientation of the object marker; and a step of determining whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
  • The predetermined position range is the range of positions the tool or the like may occupy for parts and the like to be assembled correctly.
  • The predetermined posture range is the range of postures the tool or the like may take for parts and the like to be assembled correctly.
  • Another aspect of the present disclosure is a device comprising a position/orientation detection unit that detects the position and orientation of an object marker in a predetermined coordinate system from an image of the object marker attached to an object such as a tool,
  • and that detects the position of the detection point and the posture of the object in the predetermined coordinate system based on the coordinates of the object's detection point relative to the object marker in the marker coordinate system and the position and orientation of the object marker;
  • and a position/orientation comparison unit that determines whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
  • FIG. 1A is a schematic configuration diagram of an offset setting system according to an embodiment of the present disclosure.
  • FIG. 1B shows the relationship between the coordinate systems, the object marker, and related elements according to an embodiment of the present disclosure.
  • FIG. 1C illustrates an example of a marker according to an embodiment of the present disclosure.
  • FIG. 2 illustrates a configuration example of detection point data according to an embodiment of the present disclosure.
  • FIGS. 3A to 3D illustrate examples of objects according to an embodiment of the present disclosure.
  • FIGS. 4A and 4B are schematic configuration diagrams of a control system according to a first embodiment of the present disclosure.
  • FIG. 5 is a flowchart illustrating an operation of the control system according to an embodiment of the present disclosure.
  • FIG. 6A shows the relationship between the coordinate systems, the object marker, and related elements according to an embodiment of the present disclosure.
  • FIG. 6B shows a configuration example of target coordinate posture data according to an embodiment of the present disclosure.
  • FIG. 7 shows a configuration example of a sequence table according to an embodiment of the present disclosure.
  • FIG. 8 shows the relationship between the coordinate systems, the object marker, and related elements according to an embodiment of the present disclosure.
  • FIGS. 9A to 9C show schematic configurations of control systems according to a second embodiment of the present disclosure.
  • FIG. 9D shows a schematic configuration of a control system according to a third embodiment of the present disclosure.
  • FIG. 10 shows a schematic configuration of a control system according to a fourth embodiment of the present disclosure.
  • FIG. 11 shows a schematic configuration of a control system according to a fifth embodiment of the present disclosure.
  • FIG. 12 shows an example of a setting tool according to an embodiment of the present disclosure.
  • One embodiment of the present invention has the following configuration.
  • (Item 1) A method comprising: a step of acquiring the position and orientation of an object marker in a predetermined coordinate system from an image of the object marker attached to an object; a step of calculating the position of the detection point and the posture of the object in the predetermined coordinate system based on the coordinates of the object's detection point relative to the object marker in the marker coordinate system and the position and orientation of the object marker; and a step of determining whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
  • The method according to item 1, wherein the predetermined coordinate system is a camera coordinate system.
  • the predetermined coordinate system is a reference marker coordinate system.
  • A device comprising: a position/orientation detection unit that detects the position and orientation of an object marker in a predetermined coordinate system from an image of the object marker attached to an object,
  • and that detects the position of the detection point and the posture of the object in the predetermined coordinate system based on the coordinates of the object's detection point relative to the object marker in the marker coordinate system and the position and orientation of the object marker;
  • and a position/orientation comparison unit that determines whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
  • (Item 8) The device further comprising a control signal generation unit that outputs a control signal to a control object when the position of the detection point is within the predetermined position range and the posture of the object is within the predetermined posture range.
  • (Item 9) The device according to item 8, wherein the control signal generation unit further acquires a response signal from the control object in accordance with the control signal, and the response signal is a signal that does not depend on the type of the control object.
  • the three-dimensional position of the object marker is calculated based on the image obtained by imaging the object marker attached to the object.
  • The object is anything whose position and posture are detected via an object marker; it includes tools, parts, labels, a worker's hands, and the like.
  • the tool is gripped and operated by an operator.
  • the part receives an action from an operator or a tool, for example, a part having a screw hole.
  • the label is moved or pasted by an operator.
  • The three-dimensional position of the detection point, which is located away from the object marker, is calculated and compared with a preset position range; from this comparison, the correctness of the worker's work is determined.
  • the control object includes a tool operated by an operator, an indicator light controlled by a computer, a switch, and the like.
  • a tool can be either an object or a controlled object.
  • the object marker only needs to be arranged at a position where it can be imaged from the camera, and is arranged at a position different from the detection point, usually on the object.
  • the positional relationship between the object marker and the detection point of the object can be arbitrarily set.
  • FIG. 1A is a schematic configuration diagram of an offset setting system 100 according to an embodiment of the present disclosure.
  • the offset setting system 100 includes a camera 110, an object 120 to which an object marker 122 is attached, a setting tool 130 to which a setting marker 132 is attached, and a computer 140.
  • the camera 110 and the computer 140 are installed in the same place and connected to each other wirelessly or by wire.
  • the camera 110 and the computer 140 are installed at remote locations and are connected to each other via a network (not shown) such as a LAN (Local Area Network), a WAN (Wide Area Network), or the Internet. May be.
  • Although the object 120 is shown as a tool in FIG. 1A, the same applies to other objects 120 such as parts and labels.
  • the camera 110 is a camera that can capture images continuously or at regular intervals, for example, a web camera.
  • the camera 110 is arranged at a position where the object marker 122 and the setting marker 132 can be imaged.
  • the object marker 122 and the setting marker 132 are so-called AR (Augmented Reality) markers.
  • The position of a marker in the camera coordinate system Σc can be obtained from the apparent size of the marker, and its posture in Σc from the perspective distortion of the marker.
  • the camera coordinate system ⁇ c is a coordinate system with the center of the camera lens as the origin.
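As an illustration of this principle, the following is a minimal sketch, assuming a calibrated camera and a known marker side length, of recovering a marker's position and posture in the camera coordinate system Σc from the pixel coordinates of its four detected corners. The intrinsic values and the 50 mm side length are assumptions for the example, not values from the disclosure.

```python
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],       # assumed camera intrinsics
              [0.0, 800.0, 240.0],       # (obtained by calibration in practice)
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)                        # assume negligible lens distortion
SIDE = 0.05                               # assumed marker side length, meters

# Marker corners in the marker's own coordinate system: origin at the center,
# X/Y in the marker plane, Z orthogonal to it (matching the text above).
OBJ_PTS = np.array([[-SIDE / 2,  SIDE / 2, 0.0],
                    [ SIDE / 2,  SIDE / 2, 0.0],
                    [ SIDE / 2, -SIDE / 2, 0.0],
                    [-SIDE / 2, -SIDE / 2, 0.0]], dtype=np.float32)

def marker_pose(corner_px):
    """corner_px: 4x2 pixel coordinates of the marker corners, ordered as in
    OBJ_PTS. Returns (R, t): the marker's posture as a 3x3 rotation matrix
    and its position in Σc. The apparent size fixes the distance; the
    perspective distortion of the square fixes the orientation."""
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS,
                                  np.asarray(corner_px, dtype=np.float32),
                                  K, DIST)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)            # rotation vector -> rotation matrix
    return R, tvec.reshape(3)
```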
  • FIG. 1C shows an example of the object marker 122 and the setting marker 132.
  • Each marker is a quadrangular two-dimensional image in which a white pattern is arranged on a black background.
  • Each marker is assigned a marker identification number and a different bit pattern in which an error correction code is encoded.
  • the AR marker shown in FIG. 1C is an example, and the shape and color of the marker are not limited to this.
  • the object marker 122 is attached to the object 120 at an arbitrary position away from the detection point 124 and capable of being imaged from the camera 110.
  • the object 120 may be a tool as shown in FIG. 1A, or may be other objects of various types as described later with reference to FIGS. 3A to 3C.
  • the detection point 124 is set to a position on the object 120 whose position is to be detected. For example, when the object 120 is a tool as shown in FIG. 1A, the detection point 124 is set to the tip of the tool.
  • The object marker 122 need only be arranged at a position where it can be imaged by the camera 110; it need not be provided on the object 120 itself, as long as its positional relationship with the object 120 is fixed.
  • the setting tool 130 is used to set a relative positional relationship between the object marker 122 and the detection point 124.
  • the setting tool 130 is provided with a setting marker 132 and a graphic indicating the reference position 134.
  • the reference position 134 is arranged at a predetermined distance from the setting marker 132, for example, at a position away from the center of the setting marker 132 by a distance L in a certain direction.
  • The setting tool 130 may be a plate as shown in FIG. 1A or, when the object 120 is a tool, a jig having a portion that fits the tip of the tool, as shown in FIG. 12.
  • the computer 140 includes a processor 142, a memory 144, and a display operation unit 146 as main components.
  • the processor 142 executes a series of instructions included in the program stored in the memory 144 based on a signal given to the computer 140 or when a predetermined condition is established.
  • the processor 142 is realized as a device such as a CPU (Central Processing Unit), an MPU (Micro Processor Unit), or an FPGA (Field-Programmable Gate Array).
  • the components included in the processor 142 are just one example of expressing functions performed by the processor 142 as specific modules. The functions of multiple components may be realized by a single component.
  • the processor 142 may be configured to perform the functions of all components.
  • the processor 142 includes a marker coordinate setting unit 150.
  • the marker coordinate setting unit 150 includes a position detection unit 152 and a coordinate conversion unit 154.
  • the memory 144 stores programs and data.
  • the program is loaded from the memory 144, for example.
  • the data includes data input to the computer 140 and data generated by the processor 142.
  • the memory 144 is realized as a volatile memory such as a RAM (Random Access Memory) or a non-volatile memory such as a ROM (Read Only Memory).
  • the memory 144 includes detection point data 162, as shown in FIG. 1A.
  • the display operation unit 146 displays the state of each unit and can accept operation input by an operator.
  • FIG. 1B shows the relationship among the object marker 122, the detection point 124, the reference position 134 set on the setting tool 130, the setting marker 132, and the three-dimensional orthogonal coordinate systems: the marker coordinate system Σm (Xm, Ym, Zm), the camera coordinate system Σc (Xc, Yc, Zc), and the setting marker coordinate system Σs (Xs, Ys, Zs).
  • In the offset setting system 100, the position of the detection point 124 is obtained in the marker coordinate system Σm, which takes the object marker 122 as its reference. The marker coordinate system Σm has its origin at the object marker coordinates 126 set on the object marker 122.
  • The posture of the marker coordinate system Σm is set based on the posture of the object marker 122; for example, the X and Y axes lie in the plane of the object marker 122 and the Z axis is orthogonal to that plane. The setting marker coordinate system Σs may be set similarly, with its origin at the setting marker coordinates set on the setting marker 132 and its posture based on the posture of the setting marker 132.
  • The operator overlaps the setting tool 130 with the object 120 in order to set the relative positional relationship between the object marker 122 and the detection point 124.
  • Specifically, the detection point 124 is made to coincide with the reference position 134, which lies at a known positional relationship away from the setting marker 132. The reference position 134 is then taken as the position of the detection point 124.
  • the processor 142 (position detection unit 152) analyzes an image including the object marker 122 and the setting marker 132 from the camera image when the detection point 124 and the reference position 134 are overlapped. Then, the position and orientation of each marker in the camera coordinate system ⁇ c are calculated. At this time, as shown in FIG. 1B, the position of the detection point 124 in the camera coordinate system ⁇ c is equal to the reference position 134 in the camera coordinate system ⁇ c.
  • the processor 142 calculates the reference position 134 in the camera coordinate system ⁇ c based on the position and orientation of the setting marker 132 and the distance L between the setting marker 132 and the reference position 134.
  • The processor 142 obtains a rotation matrix from the posture of the object marker 122 in the camera coordinate system Σc, and transforms the reference position 134 (that is, the coordinates of the detection point 124) in the camera coordinate system Σc into the marker coordinate system Σm, taking the object marker coordinates 126 as the reference.
  • a known three-dimensional coordinate conversion method is used, and detailed description thereof is omitted here.
  • the offset coordinates of the detection point 124 in the marker coordinate system ⁇ m are acquired by coordinate conversion.
  • the offset coordinates of the detection point 124 are the coordinates of the detection point 124 with respect to the origin 126 of the marker coordinate system ⁇ m.
  • the acquired offset coordinates of the detection point 124 are stored in the memory 144 as detection point data 162 for each identification number of the object marker 122.
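The transformation described above can be sketched as follows. This is a hedged illustration, not the patented implementation; the function name and the convention that the reference position 134 lies at offset `l_vec` from the setting marker in Σs are assumptions.

```python
import numpy as np

def offset_in_marker_frame(R_obj, t_obj, R_set, t_set, l_vec):
    """Compute the detection-point offset coordinates in the marker
    coordinate system Σm while the detection point 124 overlaps the
    reference position 134.

    R_obj, t_obj: posture (3x3 rotation) and position of the object marker 122 in Σc.
    R_set, t_set: posture and position of the setting marker 132 in Σc.
    l_vec: offset of the reference position 134 from the setting marker 132,
           expressed in Σs, e.g. [L, 0.0, 0.0] for distance L along Xs
           (the axis choice is an assumption)."""
    p_ref_cam = R_set @ np.asarray(l_vec) + t_set   # reference position 134 in Σc
    # The detection point currently coincides with the reference position,
    # so express that camera-frame point relative to the object marker:
    return R_obj.T @ (p_ref_cam - t_obj)            # offset coordinates in Σm
```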
  • the positional relationship between the object marker 122 and the detection point 124 is fixed, and the positional relationship does not change even when the object 120 moves.
  • FIG. 2 shows a configuration example of the detection point data 162 stored in the memory 144.
  • the detection point data 162 includes a marker identification number 202, a detection point offset coordinate 204, and a detection point type 206.
  • the marker identification number 202 is an identification number assigned to each object marker 122 obtained by decoding the image of the object marker 122.
  • the detection point type 206 indicates the type of object to which the object marker 122 is attached.
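A minimal sketch of one detection point data record, with field names assumed from the reference numerals in the text:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DetectionPointRecord:
    """One entry of the detection point data 162 (names are assumptions)."""
    marker_id: int                        # marker identification number 202
    offset_m: Tuple[float, float, float]  # detection point offset coordinates 204, in Σm
    point_type: str                       # detection point type 206, e.g. "tool", "glove", "label"
```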
  • the object marker 122 can be attached to various objects 120 in various ways.
  • FIGS. 3A to 3D illustrate the relationship among various types of objects 120, object markers 122 attached to the objects 120, detection points 124, and object marker coordinates 126.
  • FIG. 3A shows a case where the object 120 is a glove or a hand.
  • FIG. 3B shows a case where the object 120 is a part.
  • FIG. 3C shows a case where the object 120 is a label.
  • FIG. 3D illustrates the object 120 with a plurality of object markers 122 attached thereto.
  • A plurality of object markers 122a1 and 122a2 (hereinafter collectively, object markers 122a) are arranged on the same side of the object 120. Even if the camera 110 cannot capture one object marker 122a1 because of a part of the worker's body or some shield, the position of the detection point 124 can still be obtained as long as the camera 110 can capture the other object marker 122a2, as sketched below.
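A sketch of that fallback, assuming each marker stores its own offset to the shared detection point; the dictionary layout is an assumption.

```python
import numpy as np

def detection_point_from_any_marker(frame_poses, offsets):
    """frame_poses: {marker_id: (R, t)} for markers detected in this frame.
    offsets: {marker_id: detection-point offset in that marker's Σm}.
    Returns the detection point 124 in Σc from the first visible marker,
    or None if every marker is occluded in this frame."""
    for marker_id, (R, t) in frame_poses.items():
        if marker_id in offsets:
            return R @ np.asarray(offsets[marker_id]) + t
    return None
```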
  • a plurality of object markers 122a, 122b, 122c are arranged on different sides of the object 120.
  • the plurality of cameras 110 are arranged so that images can be taken from different angles of the object 120, respectively.
  • the camera 110a images the side of the object 120 where the object marker 122a is attached, and the side where the object markers 122b and 122c are attached becomes the blind spot of the camera 110a and cannot be imaged.
  • The camera 110b images the side where the object marker 122b is attached, which is a blind spot of the cameras 110a and 110c, and the camera 110c images the side where the object marker 122c is attached, which is a blind spot of the cameras 110a and 110b.
  • the offset coordinates of the detection point 124 with the object marker 122 as a reference are set.
  • the configuration of the control system 400 configured to support the worker's work using the set offset coordinates of the detection point 124 will be described.
  • FIG. 4A is a schematic configuration diagram of a control system 400 according to the first embodiment of the present disclosure.
  • The control system 400 compares the three-dimensional position of the detection point 124 in a predetermined coordinate system with a preset three-dimensional position of the detection point 124 to determine whether the operator's work is correct. Depending on whether the work is correct, the control system 400 also sends a control signal to the control object 460 based on a preset sequence and receives a response signal from the control object 460.
  • FIG. 4A shows a case where the object 120 and the control object 460 (tool 464) are the same (tool 470).
  • FIG. 4B is a schematic configuration diagram including functional blocks of the control system 400 according to the first embodiment of the present disclosure shown in FIG. 4A.
  • the control system 400 includes a camera 110, one or more tools 470 each having an object marker 122 attached thereto, and a computer 140.
  • the computer 140 includes a processor 142 and a memory 144 as main components.
  • the processor 142 includes a position / orientation detection unit 410, a position / orientation comparison unit 420, a sequence control unit 430, and a control signal generation unit 440.
  • the components included in the processor 142 are just one example of expressing functions performed by the processor 142 as specific modules.
  • the memory 144 includes detection point data 162, target coordinate posture data 452, and a sequence table 454.
  • the computer 140 is connected to one or more control objects 460.
  • the computer 140 outputs a control signal to the control object 460 and receives a response signal from the control object 460.
  • The computer 140 and the control object 460 may be connected directly, or may be connected via a wired communication interface such as a LAN (Local Area Network) or a wireless communication interface such as WiFi (Wireless Fidelity) or Bluetooth (registered trademark).
  • Each control object 460 includes a tool controller 462 and a tool 470.
  • the tool controller 462 receives a control signal from the computer 140 or sends a response signal to the computer 140.
  • the tool controller 462 validates the tool 470 (allows various operations of the tool 470) or sets various parameters of the tool 470.
  • The tool controller 462 receives various signals, such as a tightening torque, from the tool 470. Based on these signals, the tool controller 462 determines whether the work using the tool 470 has succeeded, and outputs a response signal to the computer 140, for example a signal indicating that the work has been completed.
  • FIG. 5 is a flowchart of a method 500 performed by the control system 400 according to the first embodiment.
  • the object 120 and the control object 460 are the same (the tool 470 in FIG. 4A).
  • a flowchart of the method 500 will be described with reference to FIG.
  • In step 504, the processor 142 (position/orientation detection unit 410) analyzes the image of the object marker 122 attached to the tool 470 (here, the tool 470 as an object) within the image captured by the camera 110,
  • and calculates the three-dimensional position and orientation of the object marker 122 in a predetermined coordinate system.
  • the predetermined coordinate system may be a camera coordinate system ⁇ c set in the camera 110, or a coordinate system ⁇ b (shown in FIG. 8) based on a separately provided reference marker to be described later.
  • the processor 142 calculates the posture of the tool 470 as an object from the posture of the object marker 122 (step 504).
  • The processor 142 may take the posture of the object marker 122 itself as the posture of the tool 470, or may take as the posture of the tool 470 the posture of the detection point 124 obtained by multiplying the posture of the object marker 122 by a rotation matrix. Further, the processor 142 calculates the position of the detection point 124 in the predetermined coordinate system based on the offset coordinates 204 of the detection point 124 in the detection point data 162 stored in the memory 144 and the three-dimensional position and orientation of the object marker 122 (step 504).
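Step 504 reduces to one rigid-body mapping; a hedged sketch with assumed names:

```python
import numpy as np

def detection_point_and_posture(R_obj, t_obj, offset_m, R_fix=None):
    """Step 504 sketch. R_obj, t_obj: current pose of the object marker 122 in
    the predetermined coordinate system. offset_m: stored offset coordinates
    204 of the detection point in Σm. R_fix: optional fixed rotation matrix
    relating marker posture to tool posture (None means the marker posture
    itself is used as the tool posture, as the text allows)."""
    p_det = R_obj @ np.asarray(offset_m) + t_obj         # detection point 124
    posture = R_obj if R_fix is None else R_obj @ R_fix  # posture of tool 470
    return p_det, posture
```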
  • the processor 142 (position / orientation comparison unit 420) reads target coordinate / orientation data 452 from the memory 144.
  • the target coordinate posture data 452 stores target coordinates 612 of the detection point 124 in a predetermined coordinate system and a target posture 614 of the tool 470 in the predetermined coordinate system.
  • the target coordinate 612 is the correct coordinate of the detection point 124 in a predetermined coordinate system
  • the target posture 614 is the correct posture of the tool 470 in the predetermined coordinate system.
  • the processor 142 compares the coordinates of the detection point 124 detected in step 504 with the target coordinates 612, and compares the posture of the tool 470 detected in step 504 with the target posture 614. A method for setting the target coordinates 612 and the target posture 614 will be described later.
  • The processor 142 (position/orientation comparison unit 420) determines whether the current coordinates of the detection point 124 in the predetermined coordinate system match the target coordinates 612, and whether the current posture of the tool 470 in the predetermined coordinate system matches the target posture 614.
  • Specifically, the processor 142 determines whether the difference between the current coordinates of the detection point 124 in the predetermined coordinate system and the target coordinates 612 is within the predetermined position range 604, and whether the difference between the current posture of the tool 470 in the predetermined coordinate system and the target posture 614 is within the predetermined posture range 606.
  • The position range 604 is the range within which the detection point 124 is determined to be in the correct position,
  • and the posture range 606 is the range within which the tool 470 is determined to be in the correct posture. Note that the state of the tool 470 may instead be compared with only one of the target coordinates 612 and the target posture 614 to determine a match.
  • The posture of the tool 470 is compared with the target posture 614 (posture range 606). Therefore, for example, in a sequence of tightening a bolt using the tool 470, the bolt can be tightened at the correct position and in the correct posture, and a problem caused by the bolt entering obliquely can be avoided.
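A minimal sketch of the step 506/508 determination, assuming the sphere-radius position range and an angular posture tolerance discussed a few paragraphs below; both tolerance shapes are assumptions consistent with the text.

```python
import numpy as np

def matches_target(p_det, posture, target_xyz, target_R, pos_radius, ang_tol_deg):
    """Return True when the detection point 124 lies within the position
    range 604 (sphere of radius pos_radius around the target coordinates 612)
    and the posture lies within the posture range 606 (maximum rotation angle
    from the target posture 614)."""
    pos_ok = np.linalg.norm(np.asarray(p_det) - np.asarray(target_xyz)) <= pos_radius
    R_rel = np.asarray(target_R).T @ np.asarray(posture)   # rotation from target to current
    cos_ang = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    ang_ok = np.degrees(np.arccos(cos_ang)) <= ang_tol_deg
    return pos_ok and ang_ok
```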
  • FIG. 6A illustrates the relationship among the object marker 122, the detection point 124, the target position 602, the position range 604, the marker coordinate system ⁇ m, and a predetermined coordinate system (here, the camera coordinate system ⁇ c).
  • the position range 604 is indicated by a sphere centered on the target position 602 in FIG. 6A.
  • the shape of the position range 604 is not limited to a sphere, and may be various shapes such as a hemisphere or a cube.
  • a reference marker coordinate system ⁇ b shown in FIG. 8) based on the reference marker may be used as a predetermined coordinate system.
  • FIG. 6B shows a configuration example of the target coordinate posture data 452 stored in the memory 144.
  • the target coordinate posture data 452 includes at least a sequence number 610, a target coordinate 612 of the target position 602, a target posture 614, and a marker identification number 202. These values are preset.
  • the sequence number 610 is a number assigned to each process defined in the sequence.
  • the target coordinate 612 is indicated by coordinates (xt, yt, zt) in a predetermined coordinate system
  • the target posture 614 is indicated by a vector of three orthogonal axes (P1, Q1, R1) in the predetermined coordinate system.
  • a set including a target coordinate 612, a target posture 614, and a marker identification number 202 corresponds to one sequence number.
  • a plurality of sets may be associated with one sequence number.
  • A plurality of different objects 120 (here, tools 470) may thus be involved in one sequence number.
  • the target coordinate posture data 452 may further include a preset position range 604 and a posture range 606 corresponding to each sequence number.
  • For example, the position range 604 is indicated by the radius of a sphere centered on the target coordinates 612,
  • and the posture range 606 is indicated by an angle about each axis of the three orthogonal vectors representing the target posture 614.
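One row of the target coordinate posture data 452 might then look like this; the field names and default tolerances are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TargetPostureRecord:
    """One entry of target coordinate posture data 452 (names are assumptions)."""
    sequence_no: int                        # sequence number 610
    target_xyz: Vec3                        # target coordinates 612: (xt, yt, zt)
    target_axes: Tuple[Vec3, Vec3, Vec3]    # target posture 614: three orthogonal axis vectors
    marker_id: int                          # marker identification number 202
    pos_radius: float = 0.01                # position range 604: sphere radius (assumed meters)
    posture_tol_deg: float = 5.0            # posture range 606: angle about each axis (assumed)
```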
  • a method for setting the target coordinates 612 and the target posture 614 will be described.
  • the operator operates the tool 470 to acquire a camera image when the detection point 124 of the tool 470 is set to the target position 602 with the correct position and posture. From the acquired camera image, a three-dimensional position and orientation of the object marker 122 in a predetermined coordinate system are calculated.
  • a target posture 614 is obtained from the calculated posture of the object marker 122.
  • the target posture 614 may be the posture of the object marker 122 itself, or may be the posture of the detection point 124 obtained by multiplying the posture of the object marker 122 by a rotation matrix.
  • the coordinates of the detection point 124 in a predetermined coordinate system are calculated.
  • the calculated coordinates of the detection point 124 are target coordinates 612.
  • If “Yes” in step 508, the processor 142 generates a coincidence signal and the process proceeds to step 510. If “No” in step 508, the process returns to step 504.
  • In step 510, the processor 142 (sequence control unit 430) refers to the sequence table 454.
  • The processor 142 determines whether the control object 460 (here, the tool 470 as the tool 464) needs to be controlled according to the contents set in the sequence table 454. If it determines that control is necessary (“Yes” in step 512), the process proceeds to step 514; if it determines that the tool 470 requires no control, the process proceeds to step 518.
  • In step 514, the processor 142 (control signal generation unit 440) generates a control signal for the control object 460 and outputs it to the control object 460.
  • the control signal is a digital binary signal as an example, and indicates that the sequence has started or that the sequence has ended.
  • Control object 460 executes a predetermined process according to the control signal.
  • the control object 460 (tool controller 462) enables the operation of the tool 470 as the tool 464.
  • the worker performs work using the enabled tool 464.
  • the tool controller 462 determines whether the work using the tool 470 is completed or successful based on a signal output from the tool 470 as a result of the work. When it is determined that the work using the tool 470 has been completed or succeeded, the tool controller 462 (control object 460) outputs a response signal to the control signal generation unit 440.
  • In step 516, when the processor 142 (control signal generation unit 440) receives the response signal from the control object 460, it outputs the response signal to the sequence control unit 430.
  • the response signal is a digital binary signal as an example, and indicates that the input terminal of the control object 460 is active or the input terminal is inactive.
  • the computer 140 can set the state of the input terminal of the control object 460 as a process start condition or an end condition in the sequence table 454.
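Because both signals are plain digital binary values, the type-independent signaling could be as simple as single on/off bytes. The TCP transport and one-byte framing below are assumptions; the disclosure only requires generic digital signals.

```python
import socket

def send_control(sock: socket.socket, active: bool) -> None:
    """Control signal: a digital binary value (e.g. sequence started/ended)."""
    sock.sendall(b"\x01" if active else b"\x00")

def read_response(sock: socket.socket) -> bool:
    """Response signal: whether the control object's input terminal is active.
    The same two-valued signal works for any type of control object."""
    return sock.recv(1) == b"\x01"
```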
  • In step 518, the processor 142 (sequence control unit 430) refers to the sequence table 454; if it determines that there is no next process, the process ends (step 520). If it determines in step 518 that there is a next process, the process returns to step 504.
  • the control signal generation unit 440 generates data including a program number n corresponding to the process to be executed by the tool 470 as a control signal and outputs the data to the control object 460.
  • The tool controller 462 reads the program corresponding to the acquired program number n, sets in the tool 470 the parameters specified in that program (for a tightening tool, for example, tightening torque, tightening angle, rotational speed, whether reverse rotation is allowed, and the number of retries on a tightening error), and enables the tool 470.
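On the controller side this amounts to a lookup keyed by the program number; the parameter names, values, and the tool API below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical tightening programs indexed by program number n.
PROGRAMS = {
    1: {"torque_nm": 12.0, "angle_deg": 90.0, "speed_rpm": 300,
        "reverse_allowed": False, "retries_on_error": 2},
}

def enable_tool(n: int, tool) -> None:
    """Sketch of the tool controller 462 handling control data containing
    program number n: set that program's parameters and enable the tool.
    `tool.configure` / `tool.enable` are a hypothetical tool API."""
    params = PROGRAMS[n]
    tool.configure(**params)
    tool.enable()
```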
  • When work is performed using the enabled tool 470, the tool 470 outputs various signals, for example the tightening torque actually applied, to the tool controller 462.
  • the tool controller 462 determines whether the signal acquired from the tool 470 matches the set parameter. If the tool controller 462 determines that the signal matches, the tool controller 462 generates a processing completion message and outputs it to the computer 140.
  • In step 516, when the computer 140 receives the processing completion message from the tool 470, it refers to the sequence table 454; if it determines that there is no next process, the process ends (step 520). If it determines in step 518 that there is a next process, the process returns to step 504.
  • the computer 140 of the present disclosure does not hold parameters unique to the tool 464 such as the tool 470, for example, tightening torque, and the tool controller 462 holds control parameters unique to the tool 464.
  • the computer 140 inputs and outputs general-purpose signals (control signals and response signals) that do not depend on the type of the tool 464 with the control object 460. Therefore, the computer 140 can be connected to various tools 464 (tools, signal lights, switches, etc.) or tools 464 manufactured by various manufacturers without changing the setting according to the type of the tool 464.
  • The computer 140 can control various types of tools 464 through the tool controller 462, can flexibly accommodate the process design of production sites that use various tools, and makes replacement of a tool 464 easy.
  • the control object 460 may be the same as or different from the object 120.
  • Specific examples of the object 120 when the control object 460 and the object 120 are different are a glove (FIG. 3A), a part (FIG. 3B), a label (FIG. 3C), and a helmet (FIG. 11); specific examples of the tool 464 in that case are an indicator light, a switch, and the like.
  • For example, when the object 120 is a glove and the control object 460 is an indicator light, the tool 470 as an object is replaced with the glove in steps S502 to S510 shown in FIG. 5, and the tool 470 as a tool is replaced with the indicator light in steps S512 to S519.
  • FIG. 7 shows an example of the sequence table 454.
  • the sequence table 454 includes a sequence number 610, target coordinate posture data 452, a start condition 702, an end condition 704, and a control target output 706.
  • the sequence table 454 is preset by the operator via the display operation unit 146 (FIG. 1A).
  • The start condition 702 is a condition for starting the process for the control object 460,
  • the end condition 704 is a condition for ending the process for the control object 460,
  • and the control target output 706 indicates the port to which the computer 140 outputs the generated signal.
  • Each of the start condition 702 and the end condition 704 includes one of “automatic”, “position and orientation”, “control input X”, and “time T”.
  • the condition “automatic” indicates that the sequence process is unconditionally started or ended when the sequence process is selected.
  • the condition “position / orientation” indicates that, when the sequence process is selected, the sequence process is started or ended by a coincidence signal from the position / orientation comparison unit 420.
  • the condition “control input X” indicates that, when the sequence process is selected, the sequence process is started or ended according to the response signal received from the control object 460.
  • the condition “time T” indicates that, when the sequence process is selected, the sequence process is started or ended after the set time T has elapsed.
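The four condition kinds can be modelled directly; a hedged sketch with assumed field names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SequenceStep:
    """One row of the sequence table 454 (field names are assumptions)."""
    sequence_no: int
    start_condition: str       # "automatic" | "position and orientation" | "control input X" | "time T"
    end_condition: str
    control_output: Optional[str]  # control target output 706 port, or None

def condition_met(cond: str, coincidence: bool, response: bool,
                  elapsed_s: float, t_limit_s: float = 0.0) -> bool:
    """Evaluate a start/end condition as described above."""
    if cond == "automatic":
        return True
    if cond == "position and orientation":
        return coincidence      # coincidence signal from position/orientation comparison unit 420
    if cond == "control input X":
        return response         # response signal received from the control object 460
    if cond == "time T":
        return elapsed_s >= t_limit_s
    return False
```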
  • As an example of a combination of the start condition 702, the end condition 704, and the control target output 706, consider the start condition “automatic”, the end condition “position and orientation”, and the control target output “indicator” (sequence number 001).
  • In this example, the object 120 is a hand, the tool 464 is an indicator light, and the object 120 and the tool 464 are different.
  • The process of sequence number 001 is entered unconditionally (start condition).
  • When the hand (object 120) is placed within the predetermined position range, the process proceeds to the next sequence process (end condition).
  • the computer 140 sends a control signal to the control object 460.
  • a specific indicator lamp (tool 464) is turned on based on the control signal. Then, the process proceeds to the next sequence process.
  • As another example of a combination of the start condition 702, the end condition 704, and the control target output 706, consider the start condition “position and orientation”, the end condition “external input 1”, and the control object 460 “none” (sequence number 002).
  • both the object 120 and the tool 464 are tools.
  • When a screw at a specific location is tightened at the correct position and orientation using a specific tool (object 120), the process of sequence number 002 is entered (start condition).
  • When external input 1 (a response signal) indicating completion of tightening is received from the specific tool, the process ends (end condition) and proceeds to the next sequence process.
  • The configuration of the control system 400 according to the first embodiment of the present disclosure has been described above. Next, the configurations of the control systems according to the second to fifth embodiments of the present disclosure will be described.
  • a reference marker coordinate system ⁇ b is used as a predetermined coordinate system.
  • a target coordinate 612 and a target posture 614 with respect to the reference marker coordinate system ⁇ b are fixed.
  • the reference marker coordinate system ⁇ b is a coordinate system having the reference marker coordinate 804 as the origin.
  • the reference marker 802 is fixed in position and orientation relative to the target position 602 even when the camera 110 moves or the target position 602 changes in the global coordinate system ⁇ g. Further, the reference marker 802 is disposed at an arbitrary position that can be imaged by the camera 110.
  • FIG. 9A shows a schematic configuration of the control system 900A when the parts 912 arranged on the conveyor 910 move
  • FIGS. 9B and 9C show schematic configurations of the control systems 900B and 900C when the camera 110 moves, respectively.
  • both the object 120 and the control object 460 are tools 470.
  • FIG. 9A shows a first reference marker 802a attached and fixed to a part 912 on the conveyor 910, and a second reference marker 802b attached and fixed on the conveyor 910.
  • both the first reference marker 802a and the second reference marker 802b move.
  • at least one of the first reference marker 802a or the second reference marker 802b is used.
  • the first reference marker 802a attached to the component 912 is used.
  • The computer 140 compares the coordinates of the detection point 124 in the reference marker coordinate system Σb (in FIG. 9A, referenced to the first reference marker 802a) with the target coordinates 612.
  • the tool controller 462 outputs a control signal to the tool 470.
  • the camera 110 is attached to the object 120, and the camera 110 moves together with the tool 470 as an object.
  • the camera 110 images the first reference marker 802a attached to and fixed to the component 912.
  • The computer 140 compares the coordinates of the detection point 124 in the reference marker coordinate system Σb with the target coordinates 612, and compares the posture of the tool 470 in the reference marker coordinate system Σb with the target posture 614. According to the present embodiment, even if the target position 602 is set at a location the operator cannot see, the camera 110 can capture the vicinity of the target position 602, so it can be determined whether the detection point 124 has approached the target position 602.
  • the camera 110 is attached to the helmet 932 worn by the worker 930, and the camera 110 moves as the head of the worker 930 moves.
  • the camera 110 captures an image of at least one of the first reference marker 802a and the third reference marker 802c attached to and fixed to the component 912.
  • the first reference marker 802a is attached and fixed to the component 912 on the work table 940
  • the third reference marker 802c is attached to the work table 940 fixed in the global coordinate system ⁇ g.
  • The computer 140 uses at least one of the first reference marker 802a and the third reference marker 802c to compare the coordinates of the detection point 124 in the reference marker coordinate system Σb (in FIG. 9C, referenced to the third reference marker 802c)
  • with the target coordinates 612 of the target position 602, and compares the posture of the tool 470 as an object in the reference marker coordinate system Σb with the target posture 614.
  • the target coordinates 612 and the target posture 614 change with respect to the camera coordinate system ⁇ c (Xc, Yc, Zc).
  • the target coordinate 612 is corrected to the target coordinate in the camera coordinate system ⁇ c using the position detection device 920 without using the reference marker 802.
  • both the object 120 and the control object 460 are tools 470.
  • FIG. 9D schematically illustrates a configuration of a control system 900E according to the third embodiment of the present disclosure.
  • the component 912 is arranged on the conveyor 910 to which the position sensor 914 is attached.
  • a control system 900E according to the third embodiment includes a camera 110, an object 120, a computer 140, a position sensor 914, and a position sensor PLC (Programmable Logic Controller) 820.
  • the position sensor 914 detects the movement of the conveyor 910 and detects the current position Cp of the conveyor.
  • the position sensor PLC 820 converts the current conveyor position Cp from the starting point of the conveyor into a physical length (m) and outputs it to the computer 140.
  • the computer 140 calculates the current position of the conveyor in one process when the length on the conveyor in one process is Lp (m).
  • the current position of the conveyor within one process is a remainder obtained by dividing the current position Cp (m) from the starting point of the conveyor by Lp (m).
  • the computer 140 holds a three-dimensional unit vector V (Ux, Uy, Uz) indicating the traveling direction of the conveyor in the camera coordinate system ⁇ c.
  • The computer 140 uses the current conveyor position within one process and the three-dimensional unit vector V indicating the conveyor travel direction in the camera coordinate system Σc
  • to correct the target coordinates 612 stored in the memory 144 into target coordinates in the camera coordinate system Σc.
  • Letting P be the target coordinates 612 of the target position 602 stored in the memory 144, expressed in the camera coordinate system Σc, the corrected target coordinates are P + V·(Cp mod Lp), where mod denotes the remainder operation.
  • The target coordinates corrected in this way are used for comparison with the coordinates of the detection point 124 in the camera coordinate system Σc. The unit vector V can be acquired, for example, by temporarily placing a reference marker 802 on the conveyor 910 and photographing it at regular time intervals with an installed camera.
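That correction is a one-liner; a sketch with assumed names, units in meters:

```python
import numpy as np

def corrected_target(P, V, Cp, Lp):
    """Shift the stored target coordinates 612 along the conveyor direction.
    P:  stored target coordinates of the target position 602 in Σc.
    V:  three-dimensional unit vector of conveyor travel in Σc.
    Cp: current conveyor position from the start point (m).
    Lp: conveyor length of one process (m)."""
    return np.asarray(P) + np.asarray(V) * (Cp % Lp)
```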
  • FIG. 10 is a diagram schematically illustrating a configuration of a control system 1000 according to the fourth embodiment of the present disclosure.
  • it is determined whether or not a plurality of different types of objects 120 (workers' hands and tools) are in the correct positions or postures using images captured by the fixed camera 110.
  • the control object 460 (tool 464) is used to assist the operator's work.
  • The camera 110 images an operator's hand 120a with an object marker 122a, a tool 464c (object 120b) with an object marker 122b, a bolt bucket 1002 on which target positions 602ta and 602tb are set,
  • and the component 912 on which target positions 602tc1 to 602tc4 are set.
  • the object marker 122a and the object marker 122b have different identification numbers, and the detection point 124a corresponds to the object marker 122a, and the detection point 124b corresponds to the object marker 122b.
  • the operator is identified by the identification number of the object marker 122a, and the tool 464 is identified by the identification number of the object marker 122b.
  • The control system 1000 assists a sequence in which a specific worker takes a bolt (not shown) from a specific bolt bucket 1002 (sequence process 1), the worker places the bolt on the part 912 (sequence process 2),
  • and the worker tightens the placed bolt using a specific tool (sequence process 3).
  • The contents of sequence processes 1 to 3 are described in detail below.
  • Sequence Process 1 Bolt Extraction
  • In this process, the object 120 is the operator's hand, and the control object 460 is the indicator lamps 464a and 464b.
  • The computer 140 activates (lights) the tool (indicator light) 464a, prompting the specific operator identified by the object marker 122a attached to the glove to collect a bolt from the specific bolt bucket 1002 on which the target position 602ta is set.
  • the computer 140 deactivates the tool 464a (turns off the indicator light) and proceeds to the next process.
  • the tools 464a and 464b are connected to the computer 140, but the tools 464a and 464b may be connected to the tool controller 462. In this case, the tools 464a and 464b are turned on or off in response to a control signal from the tool controller 462.
  • Sequence process 2 Bolt arrangement
  • In this process, the object 120 is the operator's hand, and the control object 460 is likewise the operator's hand.
  • When the computer 140 determines that the collected bolt has been placed at the correct target position 602tc1, the process proceeds to the next process.
  • Sequence process 3 bolt tightening
  • both the object 120 and the controlled object 460 are tools 470.
  • the computer 140 outputs a control signal to the tool controller 462.
  • the tool controller 462 sets the tightening program 1 based on the control signal.
  • The tool 470 sends a tightening completion signal to the tool controller 462.
  • When the tool controller 462 detects the tightening completion signal, it outputs a response signal to the computer 140 and the process proceeds to the next step.
  • Sequence process 4: bolt extraction (second). The computer 140 performs the same process as sequence process 1, with the tool 464b in place of the tool 464a and the target position 602tb in place of the target position 602ta.
  • Sequence process 5: bolt arrangement (second). The computer 140 performs the same process as sequence process 2, with the target position 602tc2 in place of the target position 602tc1.
  • Sequence process 6: bolt tightening (second). The computer 140 performs the same process as sequence process 3, with the target position 602tc2 in place of the target position 602tc1.
  • control system 1000 can support the worker's work so that the worker performs various processes defined in the sequence process using the specific tool 464 or the specific object 120.
  • the contents of the sequence processes 1 to 6 are examples and are not limited thereto, and various changes can be made according to the type of the object, the type of the control target, and the like.
  • For example, when the object 120 is a part (FIG. 3B) and the control object 460 is an indicator lamp, the control system 1000 can assist the operator in placing a specific part identified by the object marker 122 at the specific part position where a preset indicator lamp is lit.
  • When the object 120 is a label (FIG. 3C) and the control object 460 is an indicator lamp,
  • the control system 1000 can assist the operator in pasting a specific label identified by the object marker 122 at the specified position where a preset indicator lamp is lit.
  • FIG. 11 is a diagram schematically illustrating a configuration of a control system 1100 according to the fifth embodiment of the present disclosure.
  • The control system 1100 includes a first computer 140a connected to the first camera 110a, a second computer 140b connected to the second camera 110b, a tool controller 462 connected to the first and second computers, a first object 120a (helmet 932), and a second object 120b (tool 470).
  • the first computer 140a and the second computer 140b are connected to the tool controller 462 via an arithmetic unit 950 that outputs a logical product of outputs from these computers.
  • the control system 1100 uses a combination of the first camera 110a whose position is fixed in the global coordinate system ⁇ g and the second camera 110b whose position is not fixed.
  • the control system 1100 according to the fifth embodiment is different from the embodiment shown in FIG. 9C in that no reference marker is attached to the component 912 or the workbench 940.
  • two-step coordinate conversion is performed using the first camera 110a that is in a fixed positional relationship with the part 912.
  • the positional relationship between the first camera 110a and the component 912 is fixed, and the first object marker 122a attached to the helmet 932 and the target position 602ta in the three-dimensional space are imaged.
  • the first computer 140a calculates the coordinates of the first detection point 124a set on the helmet 932 as an object from the position and orientation of the first object marker 122a. Then, the first detection point 124a is compared with a predetermined position range of the target position 602ta, and the posture of the helmet 932 is compared with a predetermined posture range to determine coincidence.
  • the second camera 110b is fixed to the helmet 932 worn by the worker 930, and images the object marker 122b and the part 912 in which the target position 602 is set.
  • the second computer 140b calculates the coordinates of the second detection point 124b set on the tool 470 from the position and orientation of the second object marker 122b attached to the tool 470 as an object. Then, the second computer 140b compares the second detection point 124b with the position range of the target position 602tb, compares the posture of the tool 470 with the position range of the target posture, and determines coincidence.
  • The arithmetic unit 950 outputs the logical product of the coincidence signals from the first computer 140a and the second computer 140b to the tool controller 462.
  • By combining the coincidence determination by the first camera 110a and the first computer 140a with the coincidence determination by the second camera 110b and the second computer 140b,
  • the position and orientation of the target position 602tb in the coordinate system fixed to the part 912 can be detected, and coincidence can be determined, without the fixed first camera 110a photographing the part 912 directly.
  • Performing the processing on the first computer 140a and the processing on the second computer 140b, and then taking the logical product of the two results, amounts to a two-step coordinate transformation.
  • For example, even in a complicated work environment where the first camera 110a can photograph the first object marker 122a but cannot photograph the component 912 directly, while the second camera 110b can photograph the component 912, the target position can still be detected.
  • the fixed first camera 110a can image the object marker 122a that moves together with the non-fixed second camera 110b, and calculate the position and orientation of the first object marker 122a.
  • For example, the worker 930 may work from outside a vehicle while inserting a hand and a tool into the vehicle. In such a case, the head of the worker 930 outside the vehicle can be photographed by the fixed first camera 110a, but the worker's hand and the tool 120 inserted into the vehicle sometimes cannot be photographed by the fixed first camera 110a. They can instead be photographed by the non-fixed second camera 110b attached to the worker's helmet 932.
  • the fifth embodiment is useful when a range that cannot be captured by a fixed camera can be captured by a non-fixed camera.
  • 430 ... Sequence control unit, 440 ... Control signal generation unit, 452 ... Target coordinate posture data, 454 ... Sequence table, 460 ... Control object, 462 ... Tool controller, 464 ... Tool, 470 ... Tool, 602 ... Target position, 604 ... Position range, 612 ... Target coordinates, 614 ... Target posture, 616 ... Marker identification number, 802 ... Reference marker, 804 ... Reference marker coordinates, 910 ... Conveyor, 912 ... Part, 914 ... Position sensor, 920 ... Position detection device

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Automatic Assembly (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In the present invention, a marker attached to an object is used to assist an operation performed by an operator. Provided is a method comprising: a step of acquiring, from an image in which an object marker attached to an object is captured, the position and orientation of the object marker in a prescribed coordinate system; a step of calculating the position of a detection point of the object and the orientation of the object in the prescribed coordinate system, on the basis of the coordinates of the detection point relative to the object marker in a marker coordinate system and the position and orientation of the object marker; and a step of assessing whether the position of the detection point is within a prescribed position range in the prescribed coordinate system and whether the orientation of the object is within a prescribed orientation range in the prescribed coordinate system.

Description

Method and apparatus
The present disclosure relates to a method and an apparatus that use a marker attached to an object, such as a tool, to detect whether work performed by the object, or work performed on the object, has been carried out correctly.
In order to detect the position of a tool used to assemble parts at a production site, systems are known in which a marker attached to the tool is imaged and the captured image is analyzed to detect the position of the tool. For example, in Patent Document 1, a marker attached near the tip of a tool is imaged and the position of the tool is detected.
Patent Document 1: Patent No. 5930708
However, in the conventional systems described above, tightening a screw with a tool requires a tool with a special shape that allows a marker to be attached at the position of greatest interest, i.e., the position where the tool acts on the screw or the like. For example, Patent Document 1 requires a torque wrench to which a marker can be attached at a position close to the screw. Depending on the shape and size of the tool, however, it is difficult to attach a marker at the position where the tool acts on a screw or the like. For example, when the tool is a screwdriver, the position where the driver engages the screw is narrow, and it is difficult to attach a marker there. Even if a marker could be attached at that position, it may be hidden by the hand or the driver, so that it cannot be identified. Furthermore, when assembling a part with screws, no marker can be attached to the screw hole of the part, which is the position of greatest interest. The present disclosure has been made in view of the above points: an object marker is attached at a position different from the position of greatest interest on a tool or other object of various shapes and sizes (that is, in the case of a tool, the position where the tool contacts a screw or the like; in the case of a part, the position where the part is acted on by a screw or the like). One purpose is then to obtain, using the object marker, the position of a detection point at the position of greatest interest (for the tool above, the position where the tool contacts the screw or the like).
Also, in the conventional systems described above, depending on the posture of the tool, the posture of the part on which the tool acts may be incorrect, and the work may not be performed correctly. For example, when the tool is a screwdriver and its posture is wrong, the screw held by the tool enters the part obliquely and cannot be driven in correctly. Another purpose of the present disclosure, made in view of this point, is to support the worker so that a tool or the like can be used in the correct position and posture.
In order to solve the above problems, one aspect of the present disclosure is a method comprising: a step of acquiring, from an image of an object marker attached to an object such as a tool, the position and orientation of the object marker in a predetermined coordinate system; a step of calculating the position of a detection point of the object and the posture of the object in the predetermined coordinate system, on the basis of the coordinates of the detection point relative to the object marker in a marker coordinate system (for the screwdriver above, the coordinates of the point that engages the screw) and the position and orientation of the object marker; and a step of determining whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system. The predetermined position range is the range of positions within which a tool or the like may move so that parts are assembled correctly. The predetermined posture range is the range of postures within which a tool or the like may move so that parts are assembled correctly.
Another aspect of the present invention is an apparatus comprising: a position and orientation detection unit that detects, from an image of an object marker attached to an object such as a tool, the position and orientation of the object marker in a predetermined coordinate system, and detects the position of a detection point of the object and the posture of the object in the predetermined coordinate system on the basis of the coordinates of the detection point relative to the object marker in a marker coordinate system and the position and orientation of the object marker; and a position and orientation comparison unit that determines whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
Brief description of the drawings:
  • A schematic configuration diagram of an offset setting system according to an embodiment of the present disclosure.
  • A diagram showing the relationship between each coordinate system and an object marker and the like according to an embodiment of the present disclosure.
  • Examples of a marker according to an embodiment of the present disclosure.
  • A configuration example of detection point data according to an embodiment of the present disclosure.
  • An example of an object according to an embodiment of the present disclosure.
  • An example of an object according to an embodiment of the present disclosure.
  • An example of an object according to an embodiment of the present disclosure.
  • An example of an object according to an embodiment of the present disclosure.
  • A schematic configuration diagram of a control system according to the first embodiment of the present disclosure.
  • A schematic configuration diagram, including functional blocks, of the control system according to the first embodiment of the present disclosure.
  • A flowchart showing the operation of a control system according to an embodiment of the present disclosure.
  • A diagram showing the relationship between each coordinate system and an object marker and the like according to an embodiment of the present disclosure.
  • A configuration example of target coordinate posture data according to an embodiment of the present disclosure.
  • A configuration example of a sequence table according to an embodiment of the present disclosure.
  • A diagram showing the relationship between each coordinate system and an object marker and the like according to an embodiment of the present disclosure.
  • A schematic configuration of a control system according to the second embodiment of the present disclosure.
  • A schematic configuration of a control system according to the second embodiment of the present disclosure.
  • A schematic configuration of a control system according to the second embodiment of the present disclosure.
  • A schematic configuration of a control system according to the third embodiment of the present disclosure.
  • A schematic configuration of a control system according to the fourth embodiment of the present disclosure.
  • A schematic configuration of a control system according to the fifth embodiment of the present disclosure.
  • A diagram showing an example of a setting tool according to an embodiment of the present disclosure.
[Description of Embodiments of the Present Invention]
First, the contents of embodiments of the present invention will be listed and described. An embodiment of the present invention has the following configurations.
(Item 1) A method comprising: a step of acquiring, from an image in which an object marker attached to an object is captured, the position and orientation of the object marker in a predetermined coordinate system; a step of calculating the position of a detection point of the object and the posture of the object in the predetermined coordinate system, on the basis of the coordinates of the detection point relative to the object marker in a marker coordinate system and the position and orientation of the object marker; and a step of determining whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is within a predetermined posture range in the predetermined coordinate system.
(Item 2) The method according to Item 1, further comprising a step of outputting a control signal to a control object when the position of the detection point is within the predetermined position range and the posture of the object is within the predetermined posture range, wherein the control signal is a signal that does not depend on the type of the control object.
(Item 3) The method according to Item 2, further comprising a step of acquiring a response signal from the control object in response to the control signal, wherein the response signal is a signal that does not depend on the type of the control object.
(Item 4) The method according to Item 1, wherein, when the predetermined position range and the predetermined posture range do not change with respect to the camera coordinate system of the camera that images the object marker, the predetermined coordinate system is the camera coordinate system.
(Item 5) The method according to Item 1, wherein, when the predetermined position range and the predetermined posture range change with respect to the camera coordinate system of the camera that images the object marker, the predetermined coordinate system is a reference marker coordinate system.
(Item 6) The method according to Item 1, wherein, when the predetermined position range and the predetermined posture range change with respect to the camera coordinate system of the camera that images the object marker, the target coordinates of the predetermined position range and the target posture of the predetermined posture range are corrected to target coordinates and a target posture in the camera coordinate system, respectively.
(Item 7) An apparatus comprising: a position and orientation detection unit that detects, from an image in which an object marker attached to an object is captured, the position and orientation of the object marker in a predetermined coordinate system, and detects the position of a detection point of the object and the posture of the object in the predetermined coordinate system on the basis of the coordinates of the detection point relative to the object marker in a marker coordinate system and the position and orientation of the object marker; and a position and orientation comparison unit that determines whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and whether the posture of the object is in a predetermined target posture in the predetermined coordinate system.
(Item 8) The apparatus according to Item 7, further comprising a control signal generation unit that outputs a control signal to a control object when the position of the detection point is within the predetermined position range and the posture of the object is within the predetermined posture range, wherein the control signal is a signal that does not depend on the type of the control object.
(Item 9) The apparatus according to Item 8, wherein the control signal generation unit further acquires a response signal from the control object in response to the control signal, and the response signal is a signal that does not depend on the type of the control object.
(Item 10) A method comprising: a step of calculating the position and orientation, in a predetermined coordinate system, of an object marker attached to an object when a detection point of the object overlaps a reference position of a setting tool; and a step of setting the coordinates of the detection point of the object relative to the object marker in a marker coordinate system, on the basis of the reference position and the position and orientation of the object marker.
(Item 11) The method according to Item 10, wherein the reference position is arranged at a position away from a setting marker attached to the setting tool.
[Details of Embodiments of the Present Invention]
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. In the drawings, the same or similar elements are denoted by the same or similar reference numerals, and redundant description of the same or similar elements may be omitted in the description of each embodiment. Features shown in one embodiment are applicable to other embodiments as long as they do not contradict each other. However, embodiments of the present disclosure are not necessarily limited to these aspects. It will be apparent to those skilled in the art that the embodiments of the present disclosure can take various forms that fall within the scope defined by the claims.
According to the present disclosure, the three-dimensional position of an object marker is calculated from an image obtained by imaging the object marker attached to an object. The object is anything whose position and posture are detected via the object marker, and includes tools, parts, labels, a worker's hand, and the like. A tool is gripped and operated by a worker. A part receives an action from a worker or a tool, for example a part having a screw hole. A label is moved, affixed, or the like by a worker. According to the present disclosure, from the three-dimensional position of the object marker, the three-dimensional position of a detection point located away from the object marker is calculated, and the three-dimensional position of the detection point is compared with a predetermined position range set in advance to determine whether the worker's work is correct. When the worker's work is correct, or when it is incorrect, a signal is sent to a control object based on a preset sequence. The control object includes a tool operated by the worker, and an indicator light, a switch, or the like controlled by a computer. A tool can be both an object and a control object.
The object marker only needs to be arranged at a position where it can be imaged by the camera, and is arranged at a position different from the detection point, typically on the object. The positional relationship between the object marker and the detection point of the object can be set arbitrarily.
First, a configuration for setting the relative positional relationship between an arbitrarily placed object marker and a detection point will be described with reference to FIGS. 1A and 1B.
FIG. 1A is a schematic configuration diagram of an offset setting system 100 according to an embodiment of the present disclosure. The offset setting system 100 includes a camera 110, an object 120 to which an object marker 122 is attached, a setting tool 130 to which a setting marker 132 is attached, and a computer 140. The camera 110 and the computer 140 are installed in the same place and connected to each other wirelessly or by wire. Alternatively, the camera 110 and the computer 140 may be installed at separate locations and connected so as to communicate with each other via a network (not shown) such as a LAN (local area network), a WAN (wide area network), or the Internet. Although the object 120 is shown as a tool in FIG. 1A, the description applies equally to objects 120 other than tools, such as parts and labels.
The camera 110 is a camera capable of capturing images continuously or at regular intervals, for example a web camera. The camera 110 is arranged at a position where it can image the object marker 122 and the setting marker 132. The object marker 122 and the setting marker 132 are so-called AR (augmented reality) markers. By analyzing a marker image captured by the camera 110, the position of the marker in the camera coordinate system Σc can be obtained from the size of the marker, and the posture of the marker in the camera coordinate system Σc can be obtained from the way the marker is distorted. The camera coordinate system Σc is a coordinate system whose origin is the center of the camera lens. FIG. 1C shows examples of the object marker 122 and the setting marker 132. Each marker is a square two-dimensional image in which a white pattern is arranged on a black background. Each marker is assigned a marker identification number and a distinct bit pattern encoded with an error correction code. The AR markers shown in FIG. 1C are examples; the shape and color of the markers are not limited to these.
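As an illustration of this pose-from-image step, the following minimal sketch uses the legacy ArUco API of opencv-contrib-python (pre-4.7) as a stand-in for the AR markers described here; the camera intrinsics and marker size are placeholder assumptions, not values from the patent.

```python
# A minimal sketch (not the patent's implementation): estimating a marker's
# position and posture in the camera coordinate system Σc with OpenCV's ArUco
# module, used as a stand-in for the AR markers described above.
import cv2
import numpy as np

# Camera intrinsics would come from a prior calibration; values are placeholders.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)          # assume no lens distortion
marker_side_m = 0.04               # physical marker edge length in meters

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_poses(image):
    """Return {marker_id: (rvec, tvec)} in the camera coordinate system Σc."""
    corners, ids, _ = cv2.aruco.detectMarkers(image, dictionary)
    poses = {}
    if ids is not None:
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, marker_side_m, camera_matrix, dist_coeffs)
        for marker_id, rvec, tvec in zip(ids.flatten(), rvecs, tvecs):
            poses[int(marker_id)] = (rvec.reshape(3), tvec.reshape(3))
    return poses
```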
The object marker 122 is attached to the object 120 at an arbitrary position that is away from the detection point 124 and can be imaged by the camera 110. As an example, the object 120 may be a tool as shown in FIG. 1A, or may be any of various other kinds of objects, as described later with reference to FIGS. 3A to 3C. The detection point 124 is set at the position on the object 120 whose position is to be detected; for example, when the object 120 is a tool as shown in FIG. 1A, the detection point 124 is set at the tip of the tool. Note that the object marker 122 need not be attached on the object 120 itself, as long as it can be imaged by the camera 110 and its positional relationship with the object 120 is fixed.
The setting tool 130 is used to set the relative positional relationship between the object marker 122 and the detection point 124. The setting tool 130 bears a setting marker 132 and a graphic indicating a reference position 134. The reference position 134 is arranged at a predetermined distance from the setting marker 132, for example at a position separated by a distance L in a certain direction from the center of the setting marker 132. The setting tool 130 may be a plate as shown in FIG. 1A, or, when the object 120 is a tool, a jig having a portion into which the tip of the tool can be fitted, as shown in FIG. 12.
The computer 140 includes, as main components, a processor 142, a memory 144, and a display operation unit 146.
The processor 142 executes a series of instructions included in a program stored in the memory 144, based on signals given to the computer 140 or on predetermined conditions being satisfied. In one aspect, the processor 142 is realized as a device such as a CPU (central processing unit), an MPU (micro processor unit), or an FPGA (field-programmable gate array). The components shown inside the processor 142 are merely one example of expressing the functions executed by the processor 142 as concrete modules. The functions of multiple components may be realized by a single component, and the processor 142 may be configured to execute the functions of all components. In one example, as shown in FIG. 1A, the processor 142 includes a marker coordinate setting unit 150, which includes a position detection unit 152 and a coordinate conversion unit 154.
The memory 144 stores programs and data. A program is loaded from the memory 144, for example. The data includes data input to the computer 140 and data generated by the processor 142. In one aspect, the memory 144 is realized as a volatile memory such as a RAM (random access memory) or a non-volatile memory such as a ROM (read only memory). In one example, as shown in FIG. 1A, the memory 144 holds detection point data 162.
The display operation unit 146 displays the state of each unit and can accept operation input from the worker.
FIG. 1B shows the relationship among the object marker 122, the detection point 124, the reference position 134 set on the setting tool 130, the setting marker 132, and the three-dimensional orthogonal coordinate systems: the marker coordinate system Σm (Xm, Ym, Zm), the camera coordinate system Σc (Xc, Yc, Zc), and the setting marker coordinate system Σs (Xs, Ys, Zs). The offset setting system 100 obtains the position of the detection point 124 in the marker coordinate system Σm, which takes the object marker 122 as its reference. The marker coordinate system Σm has its origin at the object marker coordinates 126 set on the object marker 122. The posture of the marker coordinate system Σm is set based on the posture of the object marker 122; for example, the X and Y axes are set in the plane on which the object marker 122 is arranged, and the Z axis in the direction orthogonal to that plane. The setting marker coordinate system Σs has its origin at the setting marker coordinates set on the setting marker 132; its posture can be set in the same way as that of the marker coordinate system Σm.
Hereinafter, a method of obtaining the position of the detection point 124 relative to the object marker 122 will be described with reference to FIGS. 1A and 1B. Since the positional relationship between the object marker 122 and the detection point 124 is unknown, the position of the detection point 124 relative to the object marker 122 is obtained using the setting marker 132 provided on the setting tool, whose positional relationships are defined, and the reference position 134 provided at a position away from the setting marker 132.
First, in order to set the relative positional relationship between the object marker 122 and the detection point 124, the worker places the setting tool 130 and the object 120 against each other. If, at this time, the detection point 124 were placed directly on the setting marker 132, then depending on the shape and size of the object 120, the setting marker 132 would be hidden by the object 120 and the camera 110 could no longer image it. Therefore, in the present disclosure, the detection point 124 is placed on the reference position 134, which is away from the setting marker 132 and whose positional relationship with the setting marker 132 is defined. The reference position 134 is then detected as the position of the detection point 124.
The processor 142 (position detection unit 152) analyzes the image containing the object marker 122 and the setting marker 132 from the camera image taken when the detection point 124 and the reference position 134 are overlapped, and calculates the position and posture of each marker in the camera coordinate system Σc. At this time, as shown in FIG. 1B, the position of the detection point 124 in the camera coordinate system Σc is equal to the reference position 134 in the camera coordinate system Σc. The processor 142 calculates the reference position 134 in the camera coordinate system Σc based on the position and posture of the setting marker 132 and the distance L between the setting marker 132 and the reference position 134.
The processor 142 (coordinate conversion unit 154) then obtains a rotation matrix from the posture of the object marker 122 in the camera coordinate system Σc, and transforms the reference position 134 in the camera coordinate system Σc (i.e., the coordinates of the detection point 124) into the marker coordinate system Σm, whose reference is the object marker coordinates 126 of the object marker 122. A known three-dimensional coordinate transformation method is used to convert the coordinates of the detection point 124 into coordinates in the marker coordinate system Σm, and a detailed description is omitted here. The coordinate transformation yields the offset coordinates of the detection point 124 in the marker coordinate system Σm, i.e., the coordinates of the detection point 124 relative to the origin 126 of the marker coordinate system Σm. The obtained offset coordinates of the detection point 124 are stored in the memory 144 as detection point data 162 for each identification number of the object marker 122. The positional relationship between the object marker 122 and the detection point 124 is fixed, and it does not change even when the object 120 moves.
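A minimal sketch of this offset computation, under assumed variable names (the patent relies on a known three-dimensional coordinate transformation and gives no code):

```python
# Computing the offset coordinates of the detection point 124 in the marker
# coordinate system Σm from the two marker poses observed in one frame.
import numpy as np

def detection_point_offset(R_obj, t_obj, R_set, t_set, offset_on_tool):
    """R_obj, t_obj: pose of the object marker 122 in the camera system Σc;
    R_set, t_set: pose of the setting marker 132 in Σc;
    offset_on_tool: reference position 134 in the setting marker system Σs,
    e.g. np.array([L, 0.0, 0.0]) for a distance L along Xs."""
    # Reference position 134 in Σc; it coincides with the detection point 124
    # while the object is held against the setting tool 130.
    p_ref_cam = t_set + R_set @ offset_on_tool
    # Transform into the marker coordinate system Σm of the object marker
    # (R_obj is orthonormal, so its inverse is its transpose).
    return R_obj.T @ (p_ref_cam - t_obj)
```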
FIG. 2 shows a configuration example of the detection point data 162 stored in the memory 144. The detection point data 162 comprises a marker identification number 202, offset coordinates 204 of the detection point, and a detection point type 206. The marker identification number 202 is the identification number assigned to each object marker 122, obtained by decoding the image of the object marker 122. The detection point type 206 indicates the type of object to which the object marker 122 is attached.
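For illustration, the detection point data 162 could be held in memory as follows; the field names are assumptions based on FIG. 2, not the patent's own identifiers.

```python
# A possible in-memory form of the detection point data 162 (illustrative).
from dataclasses import dataclass

@dataclass
class DetectionPointRecord:
    marker_id: int      # marker identification number 202
    offset_m: tuple     # detection point offset coordinates 204, in Σm
    kind: str           # detection point type 206, e.g. "tool", "part", "label"

detection_point_data = {
    1: DetectionPointRecord(marker_id=1, offset_m=(0.00, 0.12, 0.03), kind="tool"),
}
```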
Object markers 122 can be attached to various objects 120 in various ways. FIGS. 3A to 3D illustrate the relationship among various types of object 120, the object marker 122 attached to the object 120, the detection point 124, and the object marker coordinates 126. FIG. 3A shows a case where the object 120 is a glove or a hand. FIG. 3B shows a case where the object 120 is a part. FIG. 3C shows a case where the object 120 is a label.
FIG. 3D illustrates an object 120 to which a plurality of object markers 122 are attached. As one example, a plurality of object markers 122a1 and 122a2 (hereinafter collectively called object markers 122a) are arranged on the same side of the object 120. Even if the camera 110 can no longer image one object marker 122a1 because of a part of the worker's body or an obstruction, the position of the detection point 124 can still be determined as long as the other object marker 122a2 can be imaged by the camera 110. As another example, a plurality of object markers 122a, 122b, and 122c are arranged on different sides of the object 120, and a plurality of cameras 110 are arranged so as to image the object 120 from different angles. The camera 110a images the side of the object 120 bearing the object marker 122a; the sides bearing the object markers 122b and 122c are in the blind spot of the camera 110a and cannot be imaged by it. The camera 110b images the side bearing the object marker 122b, which is in the blind spot of the cameras 110a and 110c, and the camera 110c images the side bearing the object marker 122c, which is in the blind spot of the cameras 110a and 110b. By arranging a plurality of cameras 110 at different angles around the object 120, blind spots in which no object marker 122 can be imaged can be reduced.
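A minimal sketch of this redundancy (assuming the record type sketched above): whichever of an object's markers is visible in the current frame can supply the detection point, since each marker identification number has its own stored offset.

```python
import numpy as np

def detection_point_from_any_marker(poses, records):
    """poses: {marker_id: (R, t)} detected in the current frame (any camera);
    records: {marker_id: DetectionPointRecord} as sketched above.
    Returns the detection point 124 from the first visible marker, or None."""
    for marker_id, (R, t) in poses.items():
        rec = records.get(marker_id)
        if rec is not None:
            return t + R @ np.array(rec.offset_m)
    return None
```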
In the embodiment above, the offset coordinates of the detection point 124 relative to the object marker 122 were set. Next, the configuration of a control system 400 configured to support the worker's work using the set offset coordinates of the detection point 124 will be described.
<First Embodiment>
FIG. 4A is a schematic configuration diagram of a control system 400 according to the first embodiment of the present disclosure. In this embodiment, the control system 400 compares the three-dimensional position of the detection point 124 in a predetermined coordinate system with a preset three-dimensional position of the detection point 124 to determine, among other things, whether the worker's work is correct. In addition, when the worker's work is correct, or when it is incorrect, the control system 400 sends a control signal to a control object 460 and receives a response signal from the control object 460, based on a preset sequence. FIG. 4A shows the case where the object 120 and the control object 460 (tool 464) are one and the same (the tool 470).
FIG. 4B is a schematic configuration diagram, including functional blocks, of the control system 400 according to the first embodiment of the present disclosure shown in FIG. 4A. The control system 400 includes the camera 110, one or more tools 470 each bearing an object marker 122, and the computer 140.
The computer 140 includes a processor 142 and a memory 144 as main components. The processor 142 includes a position and orientation detection unit 410, a position and orientation comparison unit 420, a sequence control unit 430, and a control signal generation unit 440. The components shown inside the processor 142 are merely one example of expressing the functions executed by the processor 142 as concrete modules. The memory 144 holds the detection point data 162, target coordinate posture data 452, and a sequence table 454.
The computer 140 is connected to one or more control objects 460. The computer 140 outputs control signals to the control objects 460 and receives response signals from them. The computer 140 and a control object 460 may be connected directly, via a wired communication interface such as a LAN (local area network), or via a wireless communication interface such as WiFi (Wireless Fidelity) or Bluetooth (registered trademark).
Each control object 460 includes a tool controller 462 and a tool 470. The tool controller 462 receives control signals from the computer 140 and sends response signals to the computer 140. Upon receiving a control signal from the computer 140, the tool controller 462, as one example, enables the tool 470 (making its various operations available) or sets various parameters of the tool 470. When work is performed using the tool 470, the tool controller 462 receives various signals from the tool 470, for example the tightening torque. Based on these signals, the tool controller 462 determines, for example, whether the work using the tool 470 has succeeded, and outputs a response signal to the computer 140, for example a signal indicating that the work has been completed.
FIG. 5 is a flowchart of a method 500 performed by the control system 400 according to the first embodiment. In this embodiment, the case where the object 120 and the control object 460 are the same (the tool 470 in FIG. 4A) will be described. The flowchart of the method 500 is described below with reference to FIG. 5.
The process starts at step 502. In step 504, the processor 142 (position and orientation detection unit 410) analyzes, in the image captured by the camera 110, the image of the object marker 122 attached to the tool 470 (here, the tool 470 as the object), and calculates the three-dimensional position and posture of the object marker 122 in a predetermined coordinate system. The predetermined coordinate system may be the camera coordinate system Σc set for the camera 110, or a coordinate system Σb (shown in FIG. 8) based on a separately provided reference marker, described later. The processor 142 then calculates the posture of the tool 470 as the object from the posture of the object marker 122 (step 504). The processor 142 may take the posture of the object marker 122 itself as the posture of the tool 470, or may take as the posture of the tool 470 the posture of the detection point 124 obtained by multiplying the posture of the object marker 122 by a rotation matrix. Furthermore, the processor 142 calculates the position of the detection point 124 in the predetermined coordinate system, based on the offset coordinates 204 of the detection point 124 in the detection point data 162 stored in the memory 144 and on the three-dimensional position of the object marker 122 (step 504).
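A minimal sketch of the step 504 computation (assumed names; the marker posture is used directly as the tool posture here, which is one of the two options described above):

```python
import numpy as np

def detection_point_in_world(R_marker, t_marker, offset_m):
    """R_marker (3x3) and t_marker (3,): pose of the object marker 122 in the
    predetermined coordinate system; offset_m (3,): offset coordinates 204 of
    the detection point 124 in the marker coordinate system Σm."""
    position = t_marker + R_marker @ offset_m   # detection point 124
    posture = R_marker                          # marker posture as tool posture
    return position, posture
```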
The process proceeds to step 506. The processor 142 (position and orientation comparison unit 420) reads the target coordinate posture data 452 from the memory 144. The target coordinate posture data 452 stores target coordinates 612 of the detection point 124 in the predetermined coordinate system and a target posture 614 of the tool 470 in the predetermined coordinate system. The target coordinates 612 are the correct coordinates of the detection point 124 in the predetermined coordinate system, and the target posture 614 is the correct posture of the tool 470 in the predetermined coordinate system. The processor 142 compares the coordinates of the detection point 124 detected in step 504 with the target coordinates 612, and the posture of the tool 470 detected in step 504 with the target posture 614. How the target coordinates 612 and the target posture 614 are set is described later.
The process proceeds to step 508. The processor 142 (position and orientation comparison unit 420) determines whether the current coordinates of the detection point 124 in the predetermined coordinate system match the target coordinates 612 and the current posture of the tool 470 in the predetermined coordinate system matches the target posture 614. Alternatively, as another example, the processor 142 (position and orientation comparison unit 420) determines whether the difference between the current coordinates of the detection point 124 and the target coordinates 612 is within a predetermined position range 604, and the difference between the current posture of the tool 470 and the target posture 614 is within a predetermined posture range 606. The position range 604 is the range within which the detection point 124 is judged to be in the correct position, and the posture range 606 is the range within which the detection point 124 is judged to be in the correct posture. Note that the state of the tool 470 may also be compared with only one of the target coordinates 612 and the target posture 614, rather than both, to determine coincidence.
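A minimal sketch of the step 508 test, assuming the position range 604 is a sphere of radius r around the target coordinates 612 and reducing the posture range 606 to a single maximum relative-rotation angle for brevity (FIG. 6B instead stores per-axis angles):

```python
import numpy as np

def within_ranges(p, R, p_target, R_target, r_pos, max_angle_rad):
    """p, R: current detection point position and object posture;
    p_target, R_target: target coordinates 612 and target posture 614."""
    pos_ok = np.linalg.norm(p - p_target) <= r_pos           # position range 604
    # Angle of the rotation taking the target posture to the current posture.
    R_rel = R_target.T @ R
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return pos_ok and np.arccos(cos_angle) <= max_angle_rad  # posture range 606
```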
According to the present disclosure, in addition to comparing the coordinates of the detection point 124 with the target coordinates 612 (position range 604), the posture of the tool 470 is compared with the target posture 614 (posture range 606). Therefore, for example, in a sequence in which a bolt is tightened using the tool 470, the bolt can be tightened at the correct position and in the correct posture, avoiding the defects that arise when the bolt goes in obliquely.
FIG. 6A illustrates the relationship among the object marker 122, the detection point 124, the target position 602, the position range 604, the marker coordinate system Σm, and the predetermined coordinate system (here, the camera coordinate system Σc). In FIG. 6A, the position range 604 is shown as a sphere centered on the target position 602. The shape of the position range 604 is not limited to a sphere and may take various shapes, for example a hemisphere or a cube. Instead of the camera coordinate system Σc, a reference marker coordinate system Σb based on a reference marker (shown in FIG. 8) may be used as the predetermined coordinate system.
FIG. 6B shows a configuration example of the target coordinate posture data 452 stored in the memory 144. The target coordinate posture data 452 comprises at least a sequence number 610, target coordinates 612 of the target position 602, a target posture 614, and a marker identification number 202; these values are set in advance. The sequence number 610 is a number assigned to each process defined in the sequence. The target coordinates 612 are expressed as coordinates (xt, yt, zt) in the predetermined coordinate system, and the target posture 614 as a vector triple (P1, Q1, R1) of three orthogonal axes in the predetermined coordinate system. One sequence number corresponds to one set consisting of target coordinates 612, a target posture 614, and a marker identification number 202. A plurality of such sets may correspond to one sequence number; in that case, a plurality of different objects 120 (here, tools 470) can each be compared simultaneously with their corresponding target positions and target postures. The target coordinate posture data 452 may further include a preset position range 604 and posture range 606 corresponding to each sequence number. In FIG. 6B, as one example, the position range 604 is expressed as the radius of a sphere centered on the target coordinates 612, and the posture range 606 as angles around each axis of the orthogonal vector triple representing the target posture 614.
A method of setting the target coordinates 612 and the target posture 614 will now be described. The worker operates the tool 470, and a camera image is acquired when the detection point 124 of the tool 470 is aligned with the target position 602 in the correct position and posture. From the acquired camera image, the three-dimensional position and posture of the object marker 122 in the predetermined coordinate system are calculated, and the target posture 614 is obtained from the calculated posture of the object marker 122. The target posture 614 may be the posture of the object marker 122 itself, or the posture of the detection point 124 obtained by multiplying the posture of the object marker 122 by a rotation matrix. Then, based on the offset coordinates 204 of the detection point relative to the object marker 122 and the three-dimensional position and posture of the object marker 122, the coordinates of the detection point 124 in the predetermined coordinate system are calculated; these calculated coordinates are the target coordinates 612.
Returning to FIG. 5: if the result in step 508 is Yes, the processor 142 generates a coincidence signal and the process proceeds to step 510. If the result in step 508 is No, the process returns to step 504.
When the coincidence signal is generated, in step 510 the processor 142 (sequence control unit 430) refers to the sequence table 454. According to the contents set in the sequence table 454, the processor 142 determines whether the control object 460 (here, the tool 470 as the tool 464) needs to be controlled. If it determines that control is necessary (Yes in step 512), the process proceeds to step 514; if it determines that no control of the tool 470 is necessary, the process proceeds to step 518.
In step 514, the processor 142 (control signal generation unit 440) generates a control signal for the control object 460 and outputs it to the control object 460. The control signal is, as one example, a binary digital signal indicating that a sequence has started or that a sequence has ended.
The control object 460 executes predetermined processing in response to the control signal. As one example, the control object 460 (tool controller 462) enables the operation of the tool 470 as the tool 464, and the worker performs work using the enabled tool 464. Based on the signals output from the tool 470 as a result of the work, the tool controller 462 determines whether the work using the tool 470 has finished or succeeded. When it determines that the work has finished or succeeded, the tool controller 462 (control object 460) outputs a response signal to the control signal generation unit 440.
In step 516, upon receiving the response signal from the control object 460, the processor 142 (control signal generation unit 440) outputs it to the sequence control unit 430. The response signal is, as one example, a binary digital signal indicating that an input terminal of the control object 460 is active or inactive. The computer 140 can set the state of an input terminal of the control object 460 as a start condition or an end condition of a process in the sequence table 454.
The process proceeds to step 518. The processor 142 (sequence control unit 430) refers to the sequence table 454, and if it determines that there is no next process, the process ends (step 520). If it determines in step 518 that there is a next process, the process returns to step 504.
As another example, in step 514 the control signal generation unit 440 generates, as the control signal, data containing a program number n corresponding to the process the tool 470 is to execute, and outputs it to the control object 460. The tool controller 462 reads the program corresponding to the acquired program number n, sets in the tool 470 the parameters specified by that program (for example, when the tool 470 is a tightening tool: tightening torque, tightening angle, rotational speed, whether reverse rotation is permitted, and the number of retries after a tightening error), and enables the tool 470. When work is performed with the enabled tool 470, the tool 470 outputs various signals to the tool controller 462, for example the tightening torque measured when the tool actually operated. The tool controller 462 determines whether the signals acquired from the tool 470 are consistent with the set parameters, and if so, generates a processing completion message and outputs it to the computer 140. Then, in step 516, when the computer 140 receives the processing completion message from the tool 470, it refers to the sequence table 454, and if it determines that there is no next process, the process ends (step 520). If it determines in step 518 that there is a next process, the process returns to step 504.
 The computer 140 of the present disclosure does not hold parameters specific to a given tool 464 such as the power tool 470 (for example, tightening torque); instead, the tool controller 462 holds the control parameters specific to the tool 464. The computer 140 exchanges only general-purpose signals (control signals and response signals) with the control object 460, independent of the type of tool 464. The computer 140 can therefore be connected to various tools 464 (power tools, signal lights, switches, and so on), or to tools 464 from different manufacturers, without changing its settings to match the tool type. By relying on the tool controller 462, the computer 140 can control many kinds of tools 464, flexibly accommodate process designs at production sites that use a wide variety of tools, and make replacement of a tool 464 easy.
 As shown in FIG. 4B, the control object 460 (tool 464) may be the same as the object 120 or different from it. When they differ, specific examples of the object 120 are a glove (FIG. 3A), a part (FIG. 3B), a label (FIG. 3C), and a helmet (FIG. 11), and specific examples of the tool 464 are an indicator light, a switch, and the like. For example, when the object 120 is a glove and the control object 460 is an indicator light, the power tool 470 acting as the object is replaced with the glove in steps S502 to S510 of FIG. 5, and the power tool 470 acting as the tool is replaced with the indicator light in steps S512 to S519.
 FIG. 7 shows an example of the sequence table 454. The sequence table 454 contains a sequence number 610, target coordinate and posture data 452, a start condition 702, an end condition 704, and a control target output 706, and is set in advance by the operator via the display operation unit 146 (FIG. 1A). The start condition 702 is the condition for starting a process on the control object 460, the end condition 704 is the condition for ending it, and the control target output 706 indicates the port to which the signal generated by the computer 140 is output. The start condition 702 and end condition 704 each take one of the values "automatic", "position and posture", "control input X", or "time T". "Automatic" means that the sequence process starts or ends unconditionally once it is selected. "Position and posture" means that the process starts or ends on a match signal from the position and posture comparison unit 420. "Control input X" means that the process starts or ends on a response signal received from the control object 460. "Time T" means that the process starts or ends after the set time T has elapsed.
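 Conceptually, each row of the sequence table pairs a target pose with a start condition, an end condition, and an output port. A minimal sketch of such a table and its condition check follows; the field names, the `SequenceRow` type, and the condition vocabulary are assumptions made for illustration, not the patent's actual data format.

```python
# Hypothetical sketch of the sequence table (FIG. 7). Field names
# are illustrative assumptions based on the description above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SequenceRow:
    number: int                  # sequence number 610
    target_pose: Optional[dict]  # target coordinate/posture data 452
    start: str                   # "auto" | "pose" | "input:X" | "time:T"
    end: str                     # same vocabulary as `start`
    output_port: Optional[str]   # control target output 706

def condition_met(cond, pose_matched, inputs, elapsed_s):
    """Evaluate a start/end condition against the current state."""
    if cond == "auto":
        return True
    if cond == "pose":
        return pose_matched                 # match signal from unit 420
    if cond.startswith("input:"):
        return inputs.get(cond[6:], False)  # response from control object 460
    if cond.startswith("time:"):
        return elapsed_s >= float(cond[5:])
    return False
```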
 As one example of a combination of start condition 702, end condition 704, and control target output 706, consider start condition "automatic", end condition "position and posture", and control target output "indicator light" (sequence number 001). In the process of sequence number 001 the object 120 is a hand, the tool 464 is an indicator light, and the object 120 and the tool 464 differ. The process of sequence number 001 is entered unconditionally (start condition). When the hand (object 120) comes within the predetermined position range, the process advances to the next sequence process (end condition): the computer 140 sends a control signal to the control object 460, the corresponding indicator light (tool 464) turns on based on that signal, and processing proceeds to the next sequence process.
 As another example of this combination, consider start condition "position and posture", end condition "external input 1", and control object "none" (sequence number 002). In the process of sequence number 002 both the object 120 and the tool 464 are power tools. First, a screw at a specific location is tightened with a specific power tool (object 120) in the correct position and posture, which enters the process of sequence number 002 (start condition). When external input 1 (a response signal) indicating completion of tightening is received from that tool, the process ends (end condition) and the next sequence process begins.
 The configuration of the control system 400 according to the first embodiment of the present disclosure has been described above. The configurations of the control systems according to the second to fifth embodiments of the present disclosure are described next.
<Second Embodiment>
 When the camera 110 moves in the global coordinate system Σg, or when the target position 602 changes, the target coordinates 612 and target posture 614 relative to the camera coordinate system Σc change. In the second embodiment, as shown in FIG. 8, a reference marker coordinate system Σb is used as the predetermined coordinate system. In the reference marker coordinate system Σb, the target coordinates 612 and the target posture 614 are fixed.
 FIG. 8 illustrates the positional relationships among the object marker 122, the detection point 124, the target position 602, the reference marker 802, the reference marker coordinate system Σb (Xb, Yb, Zb), and the marker coordinate system Σm (Xm, Ym, Zm). The reference marker coordinate system Σb has the reference marker coordinates 804 as its origin. The position and posture of the reference marker 802 relative to the target position 602 remain fixed even when the camera 110 moves in the global coordinate system Σg or the target position 602 changes. The reference marker 802 can be placed at any position that the camera 110 can image.
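 Working in Σb amounts to re-expressing camera-frame measurements relative to the reference marker. A minimal sketch of that change of frame is given below, using homogeneous transforms; the pose-estimation step itself (obtaining each marker's camera-frame pose from the image) is assumed to be available, for example from a fiducial-marker library, and the function names here are illustrative assumptions.

```python
# Hypothetical sketch: express the detection point in the reference
# marker frame Σb, given camera-frame poses of both markers.
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def point_in_reference_frame(T_cam_ref, T_cam_marker, p_marker):
    """T_cam_ref: pose of reference marker 802 in camera frame Σc.
    T_cam_marker: pose of object marker 122 in Σc.
    p_marker: detection point 124 in the marker frame Σm (offset coordinates).
    Returns the detection point expressed in Σb."""
    p_m = np.append(p_marker, 1.0)            # homogeneous coordinates
    p_cam = T_cam_marker @ p_m                # Σm -> Σc
    p_ref = np.linalg.inv(T_cam_ref) @ p_cam  # Σc -> Σb
    return p_ref[:3]

# Identity-rotation demo: reference marker 0.5 m in front of the camera,
# object marker 0.2 m to its right, detection point 0.1 m along Xm.
I = np.eye(3)
T_cam_ref = pose_to_matrix(I, np.array([0.0, 0.0, 0.5]))
T_cam_marker = pose_to_matrix(I, np.array([0.2, 0.0, 0.5]))
print(point_in_reference_frame(T_cam_ref, T_cam_marker, np.array([0.1, 0.0, 0.0])))
# -> [0.3 0.  0. ]: the detection point expressed in Σb
```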
 Various modifications of the second embodiment of the present disclosure are described below with reference to FIGS. 9A to 9C. FIG. 9A shows the schematic configuration of a control system 900A in which a part 912 placed on a conveyor 910 moves; FIGS. 9B and 9C show the schematic configurations of control systems 900B and 900C, respectively, in which the camera 110 moves. In the second embodiment, the object 120 and the control object 460 (tool 464) are both the power tool 470.
 FIG. 9A shows a first reference marker 802a attached and fixed to the part 912 on the conveyor 910, and a second reference marker 802b attached and fixed to the conveyor 910 itself. When the conveyor 910 moves, the first reference marker 802a and the second reference marker 802b move together. If the positional relationship between the conveyor 910 and the part 912 is fixed, at least one of the first reference marker 802a and the second reference marker 802b is used; if that relationship changes, the first reference marker 802a attached to the part 912 is used. The computer 140 compares the coordinates of the detection point 124 in the reference marker coordinate system Σb (in FIG. 9A, referenced to the first reference marker 802a) with the target coordinates 612 of the target position 602, and compares the posture of the power tool 470, as the object, in Σb with the target posture 614. When the computer 140 determines that the power tool 470 is in the correct position and posture, the tool controller 462 outputs a control signal to the power tool 470.
 In FIG. 9B, unlike the fixed-camera example of FIG. 9A, the camera 110 is attached to the object 120 and moves together with the power tool 470 serving as the object. The camera 110 images the first reference marker 802a attached and fixed to the part 912. The computer 140 compares the coordinates of the detection point 124 in the reference marker coordinate system Σb with the target coordinates 612, and the posture of the power tool 470 in Σb with the target posture 614. According to this embodiment, even when the target position 602 is set at a location the worker cannot see, the camera 110 can image the vicinity of the target position 602, so it can be determined whether the detection point 124 has approached the target position 602.
 In FIG. 9C, again unlike the fixed-camera example of FIG. 9A, the camera 110 is attached to a helmet 932 worn by a worker 930 and moves with the worker's head. The camera 110 images at least one of the first reference marker 802a and a third reference marker 802c. The first reference marker 802a is attached and fixed to the part 912 on a workbench 940, and the third reference marker 802c is attached to the workbench 940, which is fixed in the global coordinate system Σg. Using at least one of these markers, the computer 140 compares the coordinates of the detection point 124 in the reference marker coordinate system Σb (in FIG. 9C, referenced to the third reference marker 802c) with the target coordinates 612 of the target position 602, and compares the posture of the power tool 470, as the object, in Σb with the target posture 614.
<Third Embodiment>
 In the third embodiment, as in the second embodiment, the target coordinates 612 and the target posture 614 change relative to the camera coordinate system Σc (Xc, Yc, Zc). Unlike the second embodiment, however, the third embodiment uses a position detection device 920 instead of the reference marker 802 to correct the target coordinates 612 into target coordinates in the camera coordinate system Σc. In the third embodiment, the object 120 and the control object 460 (tool 464) are both the power tool 470.
 FIG. 9D schematically shows the configuration of a control system 900E according to the third embodiment of the present disclosure. In FIG. 9D, the part 912 is placed on a conveyor 910 fitted with a position sensor 914. The control system 900E according to the third embodiment comprises the camera 110, the object 120, the computer 140, the position sensor 914, and a PLC (programmable logic controller) 820 for the position sensor. The position sensor 914 detects the movement of the conveyor 910 and thereby the conveyor's current position Cp. The PLC 820 converts the current conveyor position Cp, measured from the conveyor's starting point, into a physical length in meters and outputs it to the computer 140. The computer 140 then computes the current conveyor position within one process step, where Lp (m) is the conveyor length of one process step: this in-step position is the remainder of dividing Cp (m) by Lp (m). The computer 140 holds a three-dimensional unit vector V = (Ux, Uy, Uz) indicating the conveyor's direction of travel in the camera coordinate system Σc. Using the in-step position and the unit vector V, the computer 140 corrects the target coordinates 612 stored in the memory 144 into target coordinates in Σc. The corrected target coordinates are Q = (Cp mod Lp) × V + P, where mod denotes the remainder and P is the target coordinates 612 of the target position 602 stored in the memory 144, expressed in Σc. The corrected target coordinates are then used for the comparison with the coordinates of the detection point 124 in Σc. The unit vector V can be obtained, for example, by temporarily attaching a reference marker 802 to the conveyor 910 and imaging it at fixed time intervals with the installed camera.
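 The correction Q = (Cp mod Lp) × V + P is simple enough to state directly in code. The sketch below assumes Cp and Lp arrive in meters and that V is already expressed in the camera frame; the function name and the example values are illustrative assumptions.

```python
# Hypothetical sketch of the third embodiment's target-coordinate
# correction: Q = (Cp mod Lp) * V + P (all quantities in meters, Σc).
import numpy as np

def corrected_target(Cp, Lp, V, P):
    """Cp: conveyor position from its start; Lp: conveyor length of one
    process step; V: unit vector of conveyor travel in Σc; P: stored
    target coordinates 612 in Σc."""
    in_step = Cp % Lp  # position within the current process step
    return in_step * np.asarray(V) + np.asarray(P)

# Example: conveyor 7.3 m from the start, 2.0 m per step, moving along +X.
Q = corrected_target(7.3, 2.0, V=[1.0, 0.0, 0.0], P=[0.10, 0.25, 0.80])
print(Q)  # ≈ [1.4, 0.25, 0.8]: the stored target shifted 1.3 m along X
```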
<Fourth Embodiment>
 FIG. 10 schematically shows the configuration of a control system 1000 according to the fourth embodiment of the present disclosure. In the fourth embodiment, images captured by a fixed camera 110 are used to determine whether each of several different kinds of objects 120 (a worker's hand and a power tool) is in the correct position or posture, and when it is, the control object 460 (tool 464) is used to assist the worker's work.
 The camera 110 images the worker's hand 120a, to which an object marker 122a is attached; a tool 464c (object 120b), to which an object marker 122b is attached; a bolt bucket 1002, on which target positions 602ta and 602tb are set; and the part 912, on which target positions 602tc1 to 602tc4 are set. The object markers 122a and 122b carry different identification numbers, with detection point 124a corresponding to object marker 122a and detection point 124b to object marker 122b. The worker is identified by the identification number of object marker 122a, and the tool 464 by that of object marker 122b.
 As one example, the control system 1000 assists the work so that a specific worker picks a bolt (not shown) from a specific bolt bucket 1002 (sequence process 1), places the bolt at a specific position set on the part 912 (sequence process 2), and then tightens the placed bolt using a specific tool (sequence process 3). The contents of sequence processes 1 to 3 are described in detail below.
1. Sequence process 1: Bolt picking
 In sequence process 1, the object 120 is the worker's hand and the control objects 460 are indicator lights 464a and 464b. The computer 140 activates (turns on) the tool (indicator light) 464a, prompting the specific worker identified by the object marker 122a on the glove to pick a bolt from the specific bolt bucket 1002 on which the target position 602ta is set. When the detection point 124a set on the glove approaches the target position 602ta of the bolt bucket 1002, the computer 140 deactivates the tool 464a (turns off the indicator light) and proceeds to the next process. Although the tools 464a and 464b (indicator lights) are connected to the computer 140 in FIG. 10, they may instead be connected to the tool controller 462, in which case they turn on or off in response to control signals from the tool controller 462.
2. Sequence process 2: Bolt placement
 In sequence process 2, both the object 120 and the control object 460 are the worker's hand. When the detection point 124a set on the glove approaches the target position 602tc1 on the part 912, the computer 140 determines that the picked bolt has been placed at the correct target position 602tc1 and proceeds to the next process.
3. Sequence process 3: Bolt tightening
 In sequence process 3, both the object 120 and the control object 460 are the power tool 470. When the computer 140 detects that the detection point 124b of the specific power tool 470 gripped by the worker approaches the target position 602tc1 in the target posture, it outputs a control signal to the tool controller 462. Based on the control signal, the tool controller 462 sets tightening program 1. When the bolt has been tightened with a torque equal to the tightening torque set in tightening program 1, the power tool 470 sends a tightening completion signal to the tool controller 462. On detecting the tightening completion signal, the tool controller 462 outputs a response signal to the computer 140 and the system proceeds to the next step.
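 Using the hypothetical `SequenceRow` structure sketched under FIG. 7 above, the three processes described so far could be encoded as table rows. The values below are illustrative assumptions chosen to mirror the narrative, not the patent's actual data.

```python
# Illustrative encoding of sequence processes 1-3 as rows of the
# hypothetical SequenceRow table sketched under FIG. 7.
rows = [
    SequenceRow(number=1, target_pose={"target": "602ta"},
                start="auto", end="pose", output_port="indicator_464a"),
    SequenceRow(number=2, target_pose={"target": "602tc1"},
                start="auto", end="pose", output_port=None),
    SequenceRow(number=3, target_pose={"target": "602tc1"},
                start="pose", end="input:1", output_port="tool_controller"),
]
```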
4. Sequence process 4: Bolt picking (second bolt)
 The computer 140 performs the same process as sequence process 1, with the tool 464b in place of the tool 464a and the target position 602tb in place of the target position 602ta.
5. Sequence process 5: Bolt placement (second bolt)
 The computer 140 performs the same process as sequence process 2, with the target position 602tc2 in place of the target position 602tc1.
6. Sequence process 6: Bolt tightening (second bolt)
 The computer 140 performs the same process as sequence process 3, with the target position 602tc2 in place of the target position 602tc1.
 In this way, the control system 1000 can assist the worker in carrying out the various processes defined in the sequence, using the specified tool 464 or the specified object 120.
 The contents of sequence processes 1 to 6 above are illustrative, not limiting, and can be varied in many ways according to the kind of object, the kind of control object, and so on. As one example, when the object 120 is a part (FIG. 3B) and the control object 460 is an indicator light, the control system 1000 can assist the worker in placing the specific part 120 identified by the object marker 122 at the specific part position whose preset indicator light is lit. As another example, when the object 120 is a label (FIG. 3C) and the control object 460 is an indicator light, the control system 1000 can assist the worker in affixing the specific label identified by the object marker 122 at the specific position whose preset indicator light is lit.
<Fifth Embodiment>
 FIG. 11 schematically shows the configuration of a control system 1100 according to the fifth embodiment of the present disclosure. The control system 1100 comprises a first computer 140a connected to a first camera 110a, a second computer 140b connected to a second camera 110b, a tool controller 462 connected to the first and second computers, a first object 120a (a helmet 932), and a second object 120b (the power tool 470). The first computer 140a and the second computer 140b are connected to the tool controller 462 via an arithmetic unit 950 that outputs the logical AND of their outputs. In the fifth embodiment, the control system 1100 uses in combination the first camera 110a, whose position is fixed in the global coordinate system Σg, and the second camera 110b, whose position is not fixed.
 Unlike the embodiment shown in FIG. 9C, the control system 1100 according to the fifth embodiment attaches no reference marker to the part 912 or the workbench 940. Instead of placing a reference marker on the part 912 or the workbench 940, the fifth embodiment performs a two-stage coordinate conversion using the first camera 110a, which is in a fixed positional relationship with the part 912.
 The first camera 110a, whose positional relationship with the part 912 is fixed, images the first object marker 122a attached to the helmet 932 and the target position 602ta in three-dimensional space. From the position and posture of the first object marker 122a, the first computer 140a calculates the coordinates of a first detection point 124a set on the helmet 932 as an object. It then compares the first detection point 124a with the predetermined position range of the target position 602ta, compares the posture of the helmet 932 with the predetermined posture range, and determines whether they match.
 The second camera 110b is fixed to the helmet 932 worn by the worker 930 and images the object marker 122b and the part 912 on which the target position 602tb is set. From the position and posture of the second object marker 122b attached to the power tool 470 as an object, the second computer 140b calculates the coordinates of a second detection point 124b set on the power tool 470. The second computer 140b then compares the second detection point 124b with the position range of the target position 602tb, compares the posture of the power tool 470 with the target posture range, and determines whether they match. The arithmetic unit 950 outputs the logical AND of the match signals from the first computer 140a and the second computer 140b to the tool controller 462. Through this series of processes, the match determination by the first camera 110a and first computer 140a is combined with the match determination by the second camera 110b and second computer 140b, so that the position and posture of the target position 602tb in the coordinate system fixed to the part 912 can be detected and matched without the fixed first camera 110a ever imaging the part 912 directly. In the fifth embodiment, the processing by the first computer 140a and the processing by the second computer 140b are performed and the logical AND of their results is taken, which amounts to a two-stage coordinate conversion. In this embodiment, the first camera 110a can image the first object marker 122a but cannot image the part 912 directly, while the second camera 110b can image the part 912; even in such an intricately arranged work environment, the target position remains detectable.
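 The decision logic of the fifth embodiment reduces to combining two independent position/posture match checks with a logical AND before enabling the tool. A minimal sketch follows; the function names, the spherical position range, and the angle-based posture check are illustrative assumptions, not the patent's specified criteria.

```python
# Hypothetical sketch of the fifth embodiment's AND-combination of
# two independent position/posture match checks (arithmetic unit 950).
import numpy as np

def within_range(point, target, radius):
    """True when `point` lies inside a spherical position range."""
    return np.linalg.norm(np.asarray(point) - np.asarray(target)) <= radius

def match_signal(point, target, radius, angle_err_deg, angle_tol_deg):
    """One computer's match: position range AND posture range."""
    return within_range(point, target, radius) and angle_err_deg <= angle_tol_deg

# Computer 140a: helmet detection point 124a vs target 602ta.
m1 = match_signal([0.02, 0.01, 0.00], [0, 0, 0], radius=0.05,
                  angle_err_deg=3.0, angle_tol_deg=10.0)
# Computer 140b: tool detection point 124b vs target 602tb.
m2 = match_signal([0.10, 0.00, 0.01], [0.10, 0, 0], radius=0.02,
                  angle_err_deg=5.0, angle_tol_deg=8.0)

enable_tool = m1 and m2  # output of the arithmetic unit to the controller
```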
 According to the fifth embodiment, the fixed first camera 110a images the object marker 122a that moves together with the non-fixed second camera 110b, and can calculate the position and posture of the first object marker 122a. As one example, when assembling an automobile, the worker 930 may stay outside the vehicle while reaching into it with a hand and a tool. In such a case, the head of the worker 930 outside the vehicle can be imaged by the fixed first camera 110a, but the hand and tool 120 reached into the vehicle sometimes cannot be; they can, however, be imaged by the non-fixed second camera 110b attached to the worker's helmet 932. The fifth embodiment is thus useful when a region that a fixed camera cannot capture can be captured by a non-fixed camera.
 The embodiments of the present disclosure described above are intended to make the present disclosure easier to understand and do not limit it. The present disclosure can be modified and improved without departing from its spirit, and naturally includes equivalents. Any combination of the embodiments and modifications is possible within the scope in which at least part of the problems described above can be solved, or at least part of the effects can be achieved, and any combination or omission of the components described in the claims and the specification is possible.
DESCRIPTION OF SYMBOLS
100…Offset setting system
110…Camera
112…Object marker
120…Object
122…Object marker
124…Detection point
126…Object marker coordinates
130…Setting tool
132…Setting marker
134…Reference position
140…Computer
142…Processor
144…Memory
150…Marker coordinate setting unit
152…Position detection unit
154…Coordinate conversion unit
162…Detection point data
202…Marker identification number
204…Offset coordinates
400, 900A, 900B, 900C, 1000, 1100…Control system
410…Position and posture detection unit
420…Position and posture comparison unit
430…Sequence control unit
440…Control signal generation unit
452…Target coordinate and posture data
454…Sequence table
460…Control object
462…Tool controller
464…Tool
470…Power tool
602…Target position
604…Position range
612…Target coordinates
614…Target posture
616…Marker identification number
802…Reference marker
804…Reference marker coordinates
910…Conveyor
912…Part
914…Position sensor
920…Position detection device

Claims (11)

  1.  A method comprising:
      obtaining, from an image capturing an object marker attached to an object, a position and a posture of the object marker in a predetermined coordinate system;
      calculating a position of a detection point of the object and a posture of the object in the predetermined coordinate system, based on coordinates of the detection point relative to the object marker in a marker coordinate system and on the position and posture of the object marker; and
      determining whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and the posture of the object is within a predetermined posture range in the predetermined coordinate system.
  2.  The method according to claim 1, further comprising outputting a control signal to a control object when the position of the detection point is within the predetermined position range and the posture of the object is within the predetermined posture range, wherein the control signal is a signal that does not depend on the type of the control object.
  3.  The method according to claim 2, further comprising obtaining a response signal from the control object in response to the control signal, wherein the response signal is a signal that does not depend on the type of the control object.
  4.  The method according to claim 1, wherein the predetermined coordinate system is a camera coordinate system when the predetermined position range and the predetermined posture range do not change with respect to the camera coordinate system of a camera that images the object marker.
  5.  The method according to claim 1, wherein the predetermined coordinate system is a reference marker coordinate system when the predetermined position range and the predetermined posture range change with respect to the camera coordinate system of a camera that images the object marker.
  6.  The method according to claim 1, wherein, when the predetermined position range and the predetermined posture range change with respect to the camera coordinate system of a camera that images the object marker, target coordinates of the predetermined position range and a target posture of the predetermined posture range are corrected to target coordinates and a target posture in the camera coordinate system, respectively.
  7.  A device comprising:
      a position and posture detection unit that detects, from an image capturing an object marker attached to an object, a position and a posture of the object marker in a predetermined coordinate system, and detects a position of a detection point of the object and a posture of the object in the predetermined coordinate system based on coordinates of the detection point relative to the object marker in a marker coordinate system and on the position and posture of the object marker; and
      a position and posture comparison unit that determines whether the position of the detection point is within a predetermined position range in the predetermined coordinate system and the posture of the object is in a predetermined target posture in the predetermined coordinate system.
  8.  The device according to claim 7, further comprising a control signal generation unit that outputs a control signal to a control object when the position of the detection point is within the predetermined position range and the posture of the object is within the predetermined posture range, wherein the control signal is a signal that does not depend on the type of the control object.
  9.  The device according to claim 8, wherein the control signal generation unit further obtains a response signal from the control object in response to the control signal, and the response signal is a signal that does not depend on the type of the control object.
  10.  A method comprising:
      calculating a position and a posture of an object marker in a predetermined coordinate system at the moment when a detection point of an object to which the object marker is attached coincides with a reference position of a setting tool; and
      setting coordinates of the detection point of the object, referenced to the object marker in a marker coordinate system, based on the reference position and on the position and posture of the object marker.
  11.  The method according to claim 10, wherein the reference position is located away from a setting marker attached to the setting tool.
PCT/JP2019/011980 2018-03-23 2019-03-22 Method and device WO2019182084A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-055637 2018-03-23
JP2018055637A JP6596530B2 (en) 2018-03-23 2018-03-23 Method and apparatus

Publications (1)

Publication Number Publication Date
WO2019182084A1 true WO2019182084A1 (en) 2019-09-26

Family

ID=67986505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/011980 WO2019182084A1 (en) 2018-03-23 2019-03-22 Method and device

Country Status (2)

Country Link
JP (1) JP6596530B2 (en)
WO (1) WO2019182084A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113576459A (en) * 2020-04-30 2021-11-02 本田技研工业株式会社 Analysis device, analysis method, storage medium storing program, and calibration method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002013910A (en) * 2000-06-30 2002-01-18 Shibuya Kogyo Co Ltd Method and device for monitoring laser beam
JP2013132736A (en) * 2011-12-27 2013-07-08 Mitsubishi Electric Engineering Co Ltd Work management device and work management system
JP2014052393A (en) * 2012-09-04 2014-03-20 Ricoh Co Ltd Image projection system, operation method of image projection system, image projector, remote control device of image projection system, and television conference system using image projection system
JP2017049658A (en) * 2015-08-31 2017-03-09 Kddi株式会社 AR information display device
JP2017198563A (en) * 2016-04-27 2017-11-02 株式会社キーエンス Three-dimensional coordinate measuring instrument

Also Published As

Publication number Publication date
JP6596530B2 (en) 2019-10-23
JP2019168304A (en) 2019-10-03

Similar Documents

Publication Publication Date Title
CN108297096B (en) Calibration device, calibration method, and computer-readable medium
EP1555508B1 (en) Measuring system
JP4191080B2 (en) Measuring device
US11173609B2 (en) Hand-eye calibration method and system
EP1607194B1 (en) Robot system comprising a plurality of robots provided with means for calibrating their relative position
WO2013099373A1 (en) Work management apparatus and work management system
CN111225143B (en) Image processing apparatus, control method thereof, and program storage medium
JP2016078195A (en) Robot system, robot, control device and control method of robot
KR102400416B1 (en) Detection of the robot axial angles and selection of a robot by means of a camera
JP2019025572A (en) Control device of robot, the robot, robot system, and method of checking abnormality of the robot
JP6596530B2 (en) Method and apparatus
JP2016182648A (en) Robot, robot control device and robot system
JP7439410B2 (en) Image processing device, image processing method and program
CN113858214B (en) Positioning method and control system for robot operation
CN116867619A (en) Teaching device
US11267129B2 (en) Automatic positioning method and automatic control device
JP2020197806A (en) Detection system
JP3541980B2 (en) Calibration method for robot with visual sensor
EP4279227A1 (en) Robot system and robot control method
WO2021200743A1 (en) Device for correcting robot teaching position, teaching device, robot system, teaching position correction method, and computer program
JP2012022600A (en) Mask image creation system
WO2021210540A1 (en) Coordinate system setting system and position/orientation measurement system
CN117620812A (en) Weld joint polishing head adjusting method and device, electronic equipment and storage medium
US20190061152A1 (en) Measuring method, program, measuring apparatus and method of manufacturing article
JP2020059064A (en) Deviation detection device and detection method of robot tool center position

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19771289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19771289

Country of ref document: EP

Kind code of ref document: A1