CN115516274A - Marker, device, system, and measurement method for measuring position and orientation of object - Google Patents

Marker, device, system, and measurement method for measuring position and orientation of object

Info

Publication number
CN115516274A
CN115516274A (application CN202180033490.3A)
Authority
CN
China
Prior art keywords
feature point
mark
unit
orientation
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180033490.3A
Other languages
Chinese (zh)
Inventor
栗田恒雄
笠岛永吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Institute of Advanced Industrial Science and Technology AIST
Original Assignee
National Institute of Advanced Industrial Science and Technology AIST
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Institute of Advanced Industrial Science and Technology AIST filed Critical National Institute of Advanced Industrial Science and Technology AIST
Publication of CN115516274A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/26 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B 19/401 Numerical control [NC] characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B 19/402 Numerical control [NC] characterised by control arrangements for positioning, e.g. centring a tool relative to a hole in the workpiece, additional detection means to correct position
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/37 Measurements
    • G05B 2219/37067 Calibrate work surface, reference markings on object, work surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/37 Measurements
    • G05B 2219/37555 Camera detects orientation, position workpiece, points of workpiece

Abstract

In the present disclosure, there is provided a marker (16) that is provided on a surface of an object and that makes it possible to measure the position and posture of the object, the marker including: a first surface (15a) having a first feature point (Ht₁); and a second surface (16b) formed below the first surface with a predetermined height difference (D) from the first feature point and a predetermined relative posture with respect to the first surface. A second feature point (Kb₁) corresponding to the first feature point is obtained from the shadow of the first surface projected onto the second surface in an image of the marker captured by an imaging unit (19), and the position and orientation of the object can be measured based on the first feature point and the second feature point.

Description

Marker, device, system, and measurement method for measuring position and orientation of object
Technical Field
The present invention relates to a technique for measuring the position and orientation of an object with high accuracy by an imaging device using a marker.
Background
At manufacturing sites, workpieces are machined by combining various machine tools and industrial robots. A known technique for coordinating a machine tool with a robot forms a calibration mark on the machine tool and images it with the robot's camera, thereby performing coordinate conversion between the coordinate system of the robot and that of the machine tool (see, for example, patent document 1).
In such a method using a calibration mark, the movable range is narrowed when the object must be moved and delivered to a machine tool or robot that is installed separately.
A technique of forming a posture mark on a workpiece holder to specify a 3-dimensional position of the workpiece is known (for example, see patent document 2).
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open No. 2016-101640
Patent document 2: Japanese Patent Laid-Open No. 2017-144534
Disclosure of Invention
Problems to be solved by the invention
An object of the present invention is to provide a mark, a device, and a system capable of measuring the position and posture of an object with high accuracy.
Means for solving the problems
According to one aspect of the present invention, there is provided a marker that is provided on a surface of an object and that makes it possible to measure the position and posture of the object, the marker including: a first surface having a first feature point; and a second surface formed below the first surface with a predetermined height difference from the first feature point and a predetermined relative posture with respect to the first surface, wherein a second feature point corresponding to the first feature point is obtained from a shadow of the first surface projected onto the second surface in an image of the marker captured by an imaging means, and the position and posture of the object can be measured based on the first feature point and the second feature point.
According to the above aspect, it is possible to provide a marker that includes a first surface and a second surface formed below the first surface with a predetermined step, and the position and orientation of an object provided with the marker can be measured with high accuracy based on a first feature point of the first surface and a second feature point obtained from the shadow of the first surface projected onto the second surface in an image captured by an imaging unit.
According to another aspect of the present invention, there is provided an apparatus including: a light irradiation unit that can irradiate the marker of the above aspect, provided on a surface of an object, with light so as to form a shadow of the first surface on the second surface; an imaging unit that images the marker; and a measurement unit that obtains a second feature point corresponding to the first feature point from the shadow of the first surface projected onto the second surface in the image of the marker captured by the imaging unit, and obtains the position and posture of the object based on the first feature point and the second feature point.
According to the above other aspect, the imaging unit images the marker having the first surface with the first feature point and the second surface formed below it with a predetermined step, and the measurement unit measures and calculates, in the image, the first feature point of the first surface and the second feature point obtained from the shadow of the first surface projected onto the second surface by the light irradiation unit, whereby the position and orientation of the object provided with the marker can be measured with high accuracy.
According to another aspect of the present invention, there is provided a method of measuring the position and orientation of an object, including: a first measurement step of imaging the marker of the above aspect with an imaging unit and measuring, from first image data acquired by a measurement unit, coordinate components of the first feature point and of a third feature point on the second surface located at the predetermined height difference from the first feature point; a second measurement step of irradiating light with a light irradiation unit, imaging the marker with the imaging unit, and measuring, from second image data acquired by the measurement unit, a coordinate component of the second feature point corresponding to the first feature point based on the shadow of the first surface projected onto the second surface; and a step of determining information on the position and orientation of the marker by calculation based on the coordinate components of the first to third feature points acquired by the measurement unit in the first and second measurement steps.
According to the above other aspect, the marker including the first surface having the first feature point and the second surface formed below it with a predetermined step is imaged, the coordinate components of the first feature point and of the third feature point on the second surface at the predetermined step from the first feature point are measured from the acquired first image data, the coordinate component of the second feature point is measured from the shadow of the first surface projected onto the second surface by the light irradiation of the light irradiation unit, and information on the position and orientation of the marker is determined by calculation from the coordinate components of the first to third feature points, whereby the position and orientation of the object provided with the marker can be measured with high accuracy.
Drawings
Fig. 1 is a schematic configuration diagram of a machine tool including a position and orientation measurement system according to an embodiment of the present invention.
FIG. 2 is a top view and a cross-sectional view of a marker according to an embodiment of the present invention.
Fig. 3 is a schematic configuration diagram of a position and orientation measurement system according to an embodiment of the present invention.
Fig. 4 is an explanatory diagram (part 1) of the principle of the position and orientation measurement system according to an embodiment of the present invention.
Fig. 5 is an explanatory diagram (part 2) of the principle of the position and orientation measurement system according to an embodiment of the present invention.
Fig. 6 is a flowchart showing a position and orientation measurement method according to an embodiment of the present invention.
Fig. 7 shows a modification of the mark according to the embodiment of the present invention.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Note that elements common to a plurality of drawings are given the same reference numerals, and overlapping detailed description of the elements is omitted.
Fig. 1 is a schematic configuration diagram of a machine tool including a position and orientation measurement system according to an embodiment of the present invention. Referring to fig. 1, a machine tool 10 including the position and orientation measurement system includes: a machine tool frame 11, a spindle 12, a tool 13, a movable table 14, a workpiece 15, a mark 16 formed on a surface of the workpiece 15, a light irradiation unit 18 that irradiates the mark 16 with light, a camera 19 that captures an image of the mark 16, and a measurement unit 20 that receives image data from the camera 19 and measures the position and orientation of the workpiece 15. In the machine tool 10, the camera 19 captures an image of the mark 16 formed on the workpiece 15, the measurement unit 20 measures the position and orientation of the mark 16 from the image data and determines the position and orientation of the workpiece 15 from this information together with information on the relative position between the mark 16 and the workpiece 15, and the spindle 12 and the movable table 14 are controlled so that the workpiece 15 is machined with the tool 13.
Fig. 2 is a schematic diagram of a mark according to an embodiment of the present invention, in which (a) is a plan view of the mark and (b) is a cross-sectional view taken along line A-A in (a). Referring to (a) and (b) of fig. 2, the mark 16 is a cavity (recess) provided in the surface 15a of the workpiece 15. The mark 16 has an opening 16a that is polygonal, for example square, in plan view on the surface 15a of the workpiece 15. The length of one side of the opening 16a is selected appropriately according to the size of the workpiece 15, and is, for example, 10 mm; from the viewpoint of imaging the opening 16a and its entire interior with the camera 19, it is preferably set to 0.01 mm to 1000 mm.
The opening 16a has four vertices Ht₁ to Ht₄. A feature point (hereinafter also referred to as a "surface feature point"), for example Ht₁, is selected from the four vertices Ht₁ to Ht₄.
The bottom surface 16b is formed in a predetermined relative posture with respect to the surface 15a of the workpiece 15, for example parallel to the surface 15a. The side wall surface 16c extends perpendicularly from each side of the quadrangular opening 16a down to the bottom surface 16b by a predetermined length (height difference) D. The height difference D is, for example, 8 mm, and is preferably set to 0.01 mm to 1000 mm from the viewpoint of capturing the shadow projected onto the bottom surface 16b with the camera 19. The vertices Hb₁ to Hb₄ of the bottom surface 16b correspond to the vertices Ht₁ to Ht₄ of the quadrangular opening 16a. Among the vertices of the bottom surface 16b, the point corresponding to the surface feature point Ht₁ (hereinafter also referred to as the "bottom surface feature point") is Hb₁.
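For concreteness, the stated dimensions can be captured in a small data structure. The sketch below is only an illustration; the class name MarkerSpec and the range check are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class MarkerSpec:
    """Dimensions of the square-opening mark 16 (all values in mm)."""
    side_length: float = 10.0   # length of one side of the opening 16a
    height_diff: float = 8.0    # height difference D between surface 15a and bottom surface 16b

    def __post_init__(self):
        # 0.01 mm to 1000 mm is the range suggested above so that the opening,
        # its interior, and the shadow on the bottom surface can be imaged.
        for name in ("side_length", "height_diff"):
            value = getattr(self, name)
            if not 0.01 <= value <= 1000.0:
                raise ValueError(f"{name} = {value} mm is outside the suggested 0.01-1000 mm range")

spec = MarkerSpec()   # the example dimensions given in the text
```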
Fig. 3 is a schematic configuration diagram of a position and orientation measurement system according to an embodiment of the present invention. Referring to figs. 1, 2, and 3, a position and orientation measurement system 17 includes the mark 16 and a position and orientation measuring device 21. The position and orientation measuring device 21 includes: a light irradiation unit 18 that irradiates the mark 16 with light; a camera 19 that images the mark 16; and a measurement unit 20 that processes the image data of the mark 16 captured by the camera 19. The measurement unit 20 includes a measuring unit 22, a calculation unit 23, and a control unit 24.
The light irradiation unit 18 has at least one light source, for example an LED light source, disposed around the lens of the camera 19; for example, two light sources may be arranged on both sides of the lens, or four light sources at the four corners around the lens. The light irradiation unit 18 is disposed so that the shadow of the feature point on the surface 15a of the mark 16 can be formed on the bottom surface 16b. The light irradiation unit 18 is switched on and off by the control unit 24, and each light source can be switched individually. A point light source is preferable for making the outline of the shadow (the boundary between light and dark) sharp. The light sources are arranged obliquely above the mark 16. From the viewpoint of readily forming a shadow on the bottom surface 16b, each light source is preferably disposed at an angle of 1 to 89 degrees, more preferably 10 to 80 degrees, and particularly preferably 30 to 45 degrees with respect to the virtual plane formed by the opening 16a. When the height difference D is small (shallow), it is preferable to reduce this angle, both to form a shadow on the bottom surface 16b more easily and to improve the measurement resolution in the depth direction.
The camera 19 is composed of a lens, an imaging element, a camera control unit, and the like; for example, a digital video camera or a digital still camera can be used. The lens 19a may be a single-focus lens or a zoom lens. With a zoom lens, imaging can first be performed at a wide angle on the short-focus side to determine the position of the mark 16 formed on the workpiece 15, after which the zoom lens is set to the long-focus side to obtain an enlarged image of the mark 16. This makes it easy to locate the mark 16 on the short-focus side and to obtain the position of the shadow outline formed in the mark 16 more accurately on the long-focus side.
The camera 19 is preferably a pinhole camera in order to capture an image in which both the opening 16a and the bottom surface 16b of the mark 16 are in focus. The pinhole camera may be formed by mounting a pinhole lens on the camera body or by mounting a pinhole filter in front of the lens. The camera 19 may also be configured with a monochromatic transmission filter that transmits only green, blue, red, or the like attached to the lens 19a. This emphasizes the contrast between light and dark and makes it possible to measure the shadow contour with high accuracy.
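One way to prototype the contour measurement described above is with off-the-shelf image processing. The following sketch is a hypothetical example, not the patent's procedure: it assumes OpenCV, an 8-bit grayscale crop of the marker interior, and the function name shadow_boundary_pixels, and it only extracts the light/dark boundary from which the projected edges would then be fitted.

```python
import cv2
import numpy as np

def shadow_boundary_pixels(gray_roi: np.ndarray) -> np.ndarray:
    """Return (x, y) pixel coordinates of the light/dark boundary inside the marker.

    `gray_roi` is assumed to be an 8-bit grayscale crop showing only the marker
    interior. Otsu's threshold separates the bright part BS from the dark part DS;
    the contour of the bright region contains the shadow boundary, from which the
    projected edges (e.g. Kb1-Kb2 and Kb4-Kb1) can then be fitted and intersected.
    """
    _, bright_mask = cv2.threshold(gray_roi, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(bright_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        raise ValueError("no bright region found; check exposure or thresholding")
    largest = max(contours, key=cv2.contourArea)
    return largest.reshape(-1, 2)

# Synthetic demonstration: a bright bottom surface with a dark triangular shadow.
demo = np.full((100, 100), 200, np.uint8)
cv2.fillConvexPoly(demo, np.array([[0, 0], [60, 0], [0, 60]], dtype=np.int32), 40)
print(shadow_boundary_pixels(demo).shape)   # (N, 2) boundary pixels
```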
The measuring unit 22 is provided in the measurement unit 20 and measures, from the image data of the mark 16 captured by the camera 19, the coordinate components of the surface feature point Ht₁ selected on the surface 15a, of the projected feature point formed on the bottom surface 16b from the shadow of the surface feature point Ht₁, and of the bottom surface feature point Hb₁, which lies on the bottom surface 16b vertically below the surface feature point by the predetermined height difference D. The coordinate components measured here are the components in a plane parallel to the imaging element of the camera 19 and include, for example, the x and y coordinates described later.
The calculation unit 23 calculates, from the coordinate components (for example, the x and y coordinate components) of the surface feature point Ht₁, the projected feature point, and the bottom surface feature point Hb₁ measured by the measuring unit 22, the remaining coordinate component (for example, the z coordinate) of each of these points and the slope of the mark 16, and from these obtains the position and/or slope of the workpiece 15.
The control unit 24 controls the light irradiation unit 18, the camera 19, the measuring unit 22, the calculation unit 23, and the movable table 14. Specifically, the control unit 24 controls the timing of turning the light irradiation unit 18 on and off, the timing of image capture by the camera 19, the timing at which the measuring unit 22 receives the image data, and the like.
The measurement unit 20 may be constituted by a personal computer and software cooperating with it, the measuring unit 22, the calculation unit 23, and the control unit 24 may be implemented as dedicated circuits, or a combination of both may be used. The measurement unit 20 may also include a user interface such as a display and a keyboard (not shown).
Fig. 4 is a diagram (part 1) illustrating the principle of the position and orientation measurement system according to the embodiment of the present invention, in which (a) shows the positional relationship between the camera, the light irradiation unit, and the mark, and (b) is a plan view of the mark onto which a shadow is projected.
Referring to fig. 4 (a), for convenience of explanation, the vertical direction is taken as the z-axis and the horizontal direction as the x-axis, one light source 18₁ of the light irradiation unit 18, the apex (surface feature point) Ht₁ of the surface 15a of the mark, and the center of the lens of the camera 19 (in this example the center 25a of the pinhole lens 25) are disposed on the xz plane, and the surface 15a of the mark is inclined at an angle θx with respect to the x-axis. An image of the mark 16 passing through the pinhole lens 25 is formed on the imaging element 26, converted into image data, and sent to the measurement unit 20. The light source 18₁ of the light irradiation unit 18, located at the upper left of the mark 16, is turned on in response to a control signal from the control unit 24. The shadow of the surface 15a of the mark is thereby formed on the bottom surface 16b.
Referring to fig. 4 (b), the feature point (surface feature point) of the surface 15a of the mark 16 is Ht₁. The surface feature point Ht₁ is the intersection of the sides Ht₁-Ht₂ and Ht₄-Ht₁. A shadow falls on the bottom surface 16b and the side wall surface 16c of the cavity of the mark 16; the shaded portion is referred to as the dark part DS (hatched in the figure), and the portion directly irradiated with light as the bright part BS. On the boundary between the dark part DS and the bright part BS on the bottom surface 16b, the side Kb₁-Kb₂ is the projection of the side Ht₁-Ht₂, and the side Kb₄-Kb₁ is the projection of the side Ht₄-Ht₁. The intersection of the sides Kb₁-Kb₂ and Kb₄-Kb₁ is taken as the feature point (projected feature point) Kb₁. The projected feature point Kb₁ is the point onto which the surface feature point Ht₁ is projected; that is, the intersection with the bottom surface 16b of the straight line connecting the light source 18₁ and the surface feature point Ht₁ and extended beyond the surface feature point Ht₁ (indicated by a one-dot chain line in fig. 4 (a)) is the projected feature point Kb₁. The vertex Hb₁ of the bottom surface 16b, reached from the surface feature point Ht₁ in the direction perpendicular to the surface 15a, is set as the bottom surface feature point.
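The projected feature point defined above can be computed directly from this geometry as a ray-plane intersection. The sketch below is an illustrative assumption (NumPy and the function name are not from the patent): it intersects the line from the light source through the surface feature point with the plane of the bottom surface.

```python
import numpy as np

def projected_feature_point(light_src, ht1, plane_point, plane_normal):
    """Intersection of the line through light_src and ht1 with the bottom-surface plane.

    light_src    : 3-vector, position of light source 18_1
    ht1          : 3-vector, surface feature point Ht1
    plane_point  : 3-vector, any point on the bottom surface 16b (e.g. Hb1)
    plane_normal : 3-vector, normal of the bottom surface
    Returns the projected feature point Kb1 as a 3-vector.
    """
    light_src, ht1 = np.asarray(light_src, float), np.asarray(ht1, float)
    plane_point, plane_normal = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    direction = ht1 - light_src                      # ray from the light source through Ht1
    denom = direction @ plane_normal
    if abs(denom) < 1e-12:
        raise ValueError("light ray is parallel to the bottom surface")
    t = ((plane_point - light_src) @ plane_normal) / denom
    return light_src + t * direction                 # = Kb1

# Example with the bottom surface parallel to the marker surface (z = -D):
D = 8.0
kb1 = projected_feature_point(light_src=[-30.0, 0.0, 40.0],
                              ht1=[0.0, 0.0, 0.0],
                              plane_point=[0.0, 0.0, -D],
                              plane_normal=[0.0, 0.0, 1.0])
print(kb1)   # shadow of Ht1 on the bottom surface, here [6. 0. -8.]
```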
Fig. 5 is a diagram (part 2) illustrating the principle of the position and orientation measurement system according to the embodiment of the present invention and shows the same configuration as fig. 4. Since the constituent elements and the feature points are located on the xz plane, the y coordinate is omitted and coordinates are written as (x coordinate, z coordinate).
Referring to fig. 5, the measuring unit 22 obtains the x coordinates of the surface feature point Ht₁, the bottom surface feature point Hb₁, and the projected feature point Kb₁ from the image data. The coordinates of the surface feature point Ht₁ of the mark 16 are written (Ht₁x, Ht₁z), those of the bottom surface feature point Hb₁ are written (Hb₁x, Hb₁z), and those of the projected feature point Kb₁ are written (Kb₁x, Kb₁z). When observed from the lens center 25a of the pinhole lens 25, the bottom surface feature point Hb₁ is seen at the intersection point P_Hb with the plane z = Ht₁z that passes through the surface feature point Ht₁ and is parallel to the x-axis, so that in the image data its coordinates can be expressed as (Hb₁x, Ht₁z). Similarly, the projected feature point Kb₁ is seen at the intersection point P_Kb with the plane z = Ht₁z, so that in the image data its coordinates can be expressed as (Kb₁x, Ht₁z). In this way, the measuring unit 22 obtains from the image data the x coordinate Ht₁x of the surface feature point Ht₁, the x coordinate Hb₁x of the bottom surface feature point Hb₁, and the x coordinate Kb₁x of the projected feature point Kb₁.
The calculation unit 23 uses the x coordinates Ht₁x, Hb₁x, and Kb₁x of the surface feature point Ht₁, the bottom surface feature point Hb₁, and the projected feature point Kb₁ obtained by the measuring unit 22, the predetermined coordinates (S₁x, S₁z) of the light source 18₁, and the height difference D of the mark, and obtains from simultaneous equations derived from the geometric relationship (equations (1) to (3) below) the z coordinate Ht₁z of the surface feature point Ht₁, the slope θx of the mark, and the length <Kb₁x> of the shadow on the xz plane, thereby obtaining the position and slope of the mark 16.
(Ht₁z - S₁z)(<Kb₁x>·cos θx - V₁x - D·cos θx) = D·cos θx·(S₁x - Ht₁x) … (1)
Ht₁z·(<Kb₁x>·cos θx + V'₁x) = (Hb₁x - Kb₁x)(Ht₁z + D·cos θx) … (2)
(Ht₁x - Kb₁x)·<Kb₁x>·sin θx = (V₁x + V'₁x)(<Kb₁x>·sin θx + D·cos θx) … (3)
Equation (1) is derived from the relation of similar triangles formed by the light source 18₁, the surface feature point Ht₁, and the bottom surface feature point Hb₁. Equation (2) is derived from the relation of similar triangles formed by the lens center 25a of the pinhole lens 25 and the bottom surface feature point Hb₁. Equation (3) is derived from the relation of similar triangles formed by the projected feature point Kb₁ and the surface feature point Ht₁.
V₁x and V'₁x appearing in equations (1) to (3) are as follows. When the surface feature point Ht₁ is projected by the light source 18₁ onto the plane z = Hb₁z, which passes through the bottom surface feature point Hb₁ and is parallel to the x-axis, the resulting point P_A is offset by V₁x with respect to the x coordinate Kb₁x of the projected feature point Kb₁. Further, when viewed from the lens center 25a of the pinhole lens 25, the projected feature point Kb₁ is seen on the plane z = Hb₁z at a point P_B that is offset by V'₁x with respect to the x coordinate Kb₁x of the projected feature point Kb₁. These offsets V₁x and V'₁x can be expressed from the geometric relationship by the following equations (4) and (5).
[Equations (4) and (5), which give the offsets V₁x and V'₁x from the geometric relationship, appear only as images in the original publication.]
The computing unit 23 solves the simultaneous equations (1) to (3) by numerical calculation, for example by Newton's method, and thereby obtains the z coordinate Ht₁z of the surface feature point Ht₁, the slope θx of the mark, and the length <Kb₁x> of the shadow on the xz plane.
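Equations (1) to (3) depend on the offsets defined in equations (4) and (5), which are not reproduced here. As a hedged illustration of the same recovery problem, the sketch below poses it in the xz plane as a forward pinhole-and-shadow model solved by numerical root finding rather than by the patent's closed-form equations; the focal length, light-source position, and all function names are assumptions.

```python
import numpy as np
from scipy.optimize import fsolve

# A minimal xz-plane sketch: recover the marker position and tilt by numerical
# root finding on a forward pinhole-and-shadow model. All constants below are
# assumed values for illustration only.
F = 25.0                      # assumed focal length of the pinhole camera [mm]
D = 8.0                       # height difference between surface 15a and bottom 16b [mm]
S = np.array([-60.0, 20.0])   # assumed light-source position (x, z); pinhole at the origin

def image_u(p):
    """1D pinhole projection: image coordinate of point p = (x, z), z > 0."""
    return F * p[0] / p[1]

def forward(tx, tz, theta):
    """Predict image coordinates of Ht1, Hb1 and Kb1 for a candidate marker pose."""
    ht1 = np.array([tx, tz])
    normal = np.array([-np.sin(theta), np.cos(theta)])  # into the material (away from the camera)
    hb1 = ht1 + D * normal                              # bottom feature point, depth D below Ht1
    # Shadow point Kb1: ray from the light source through Ht1, intersected with
    # the bottom-surface plane through Hb1.
    d = ht1 - S
    t = ((hb1 - S) @ normal) / (d @ normal)
    kb1 = S + t * d
    return image_u(ht1), image_u(hb1), image_u(kb1)

def residual(unknowns, measured):
    return np.array(forward(*unknowns)) - np.array(measured)

# Generate synthetic "measurements" from a ground-truth pose, then recover it.
true_pose = (5.0, 120.0, np.deg2rad(7.0))              # (Ht1x, Ht1z, tilt theta_x)
measured = forward(*true_pose)
estimate = fsolve(residual, x0=(0.0, 100.0, 0.0), args=(measured,))
print("recovered Ht1x, Ht1z, theta_x [deg]:", estimate[0], estimate[1], np.degrees(estimate[2]))
```

Three measured image coordinates determine the three unknowns (Ht1x, Ht1z, θx), which is the same counting argument that underlies equations (1) to (3).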
In addition, when the mark 16 is also inclined with respect to the y-axis, the measuring unit 22 measures the surface feature point Ht₁, the bottom surface feature point Hb₁, and the projected feature point Kb₁ from the image data, and the calculation unit 23 obtains the position and slope of the mark 16 by using equations (1) to (3) extended to take the y coordinate into account.
From the above, the position and orientation of the mark 16 can be measured. Further, by acquiring information on the positional relationship between the mark 16 and the workpiece 15 in advance with a 3-dimensional measuring machine or the like and setting it in the measurement unit 20, the position and orientation of the workpiece 15 can be determined with high accuracy.
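Determining the workpiece pose from the measured marker pose and the relationship acquired in advance amounts to composing two rigid transforms. The sketch below (restricted to the xz plane for simplicity) illustrates this composition; the homogeneous-matrix helper and the numerical values are placeholders, not data from the patent.

```python
import numpy as np

def pose_matrix(x, z, theta):
    """Homogeneous 2D rigid transform: rotation by theta plus translation in the xz plane."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, z],
                     [0., 0., 1.]])

# Pose of the mark 16 in the machine coordinate system, as obtained from the
# measurement (placeholder values).
T_machine_mark = pose_matrix(x=5.0, z=120.0, theta=np.deg2rad(7.0))

# Relative pose of the workpiece 15 with respect to the mark, measured in advance
# with a 3-dimensional measuring machine and stored in the measurement unit 20
# (placeholder values).
T_mark_workpiece = pose_matrix(x=40.0, z=0.0, theta=0.0)

# Composition gives the workpiece pose in the machine coordinate system.
T_machine_workpiece = T_machine_mark @ T_mark_workpiece
print(T_machine_workpiece)
```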
According to the mark 16 of the present embodiment, the mark 16 has an opening 16a formed in the surface 15a of the workpiece 15 and a bottom surface 16b at a predetermined height difference D from the surface 15a. From an image of the mark, the position and orientation of the mark 16 can be measured and calculated with high accuracy from the surface feature point Ht₁ on the surface 15a and the projected feature point Kb₁, which corresponds to the surface feature point Ht₁ in the shadow of the surface 15a projected onto the bottom surface 16b, and the position and orientation of the workpiece 15 can thereby be determined with high accuracy.
According to the position and orientation measuring device 21 of the present embodiment, the camera 19 images the surface feature point Ht₁ of the mark 16 formed on the surface 15a of the workpiece 15 and the projected feature point Kb₁ corresponding to the surface feature point Ht₁ in the shadow of the surface 15a projected onto the bottom surface 16b, and the measurement unit 20 measures and calculates, in the obtained image, the coordinate components of the surface feature point Ht₁ and of the projected feature point Kb₁ formed by the shadow cast onto the bottom surface 16b by the light from the light irradiation unit; the position and orientation of the workpiece 15 provided with the mark 16 can thereby be measured with high accuracy.
Fig. 6 is a flowchart showing a position and orientation measurement method according to an embodiment of the present invention. A position and orientation measurement method according to an embodiment of the present invention will be described with reference to fig. 6, 1, 2, 4, and 5.
First, in S100, coordinate data of the camera 19 and the light source of the light irradiation unit 18 are set.
Specifically, the measurement unit 20 sets the coordinates of the lens center 25a of the pinhole lens 25 of the camera 19 and of the light sources 18₁ to 18₄ of the light irradiation unit 18, and stores them in, for example, a memory (not shown). In this example, the x and z coordinate components of the light source 18₁ are set with the lens center 25a as the origin. The set data are stored, for example, in the memory of the measurement unit 20.
Next, in S110, a height difference between the surface and the bottom surface of the mark is set. Specifically, the measurement unit 20 sets a height difference D from the surface 15a to the bottom surface 16b of the mark 16, and stores the height difference D in a memory (not shown), for example.
In S120, the mark is imaged by the camera, and the coordinate components of the surface feature point and the bottom surface feature point are measured from the image data. Specifically, the control unit 24 causes the camera 19 to image the mark 16. The control unit 24 may turn on some or all of the light sources S₁ to S₄. The measuring unit 22 acquires, from the captured image data, the x coordinate Ht₁x of the surface feature point Ht₁ of the mark 16 and the x coordinate Hb₁x of the bottom surface feature point Hb₁ as measurement data, where the bottom surface feature point Hb₁ is the point reached by descending the height difference D from the surface feature point toward the bottom surface 16b.
In S130, a light source is turned on, the mark is imaged by the camera, and the coordinate component of the projected feature point is measured from the image data based on the shadow of the surface shape of the mark projected onto the bottom surface. Specifically, the control unit 24 turns on the light source S₁ so that the shadow of the surface 15a is projected onto the bottom surface of the mark 16, and the mark 16 is imaged by the camera 19. The x coordinate Kb₁x of the projected feature point Kb₁ is acquired from the captured image data as measurement data.
In S140, information on the position and orientation of the mark is obtained by calculation from the set data and the measurement data. Specifically, the calculation unit 23 uses the coordinates (S₁x, S₁z) of the light source 18₁ and the height difference D of the mark set in S100 and S110, and the x coordinates Ht₁x, Hb₁x, and Kb₁x of the surface feature point Ht₁, the bottom surface feature point Hb₁, and the projected feature point Kb₁ measured in S120 and S130, and numerically solves the simultaneous equations derived from the geometric relationship (equations (1) to (3) above), for example by Newton's method, to obtain the z coordinate Ht₁z of the surface feature point Ht₁, the slope θx of the mark, and the length <Kb₁x> of the shadow on the xz plane. The coordinates of the surface feature point Ht₁ of the mark 16 and the posture θx are thereby determined.
In S150, the position and orientation of the workpiece are determined from the information on the position and orientation of the mark. Specifically, the calculation unit 23 determines the position and posture of the workpiece 15 from the coordinates of the surface feature point Ht₁ and the posture θx of the mark 16 determined in S140, together with information on the positional relationship between the mark 16 and the workpiece 15. The information on the positional relationship between the mark and the workpiece is set in the measurement unit 20 in advance.
From S100 to S150, the position and orientation of the workpiece are measured using the mark formed on the workpiece. S100 and S110 may be performed simultaneously, or S110 may be performed first. S120 and S130 may be performed simultaneously, or S130 may be performed first.
According to the position and orientation measurement method of the present embodiment, the mark 16 is imaged, the coordinate components of the surface feature point Ht₁ and of the bottom surface feature point Hb₁ on the bottom surface 16b at the predetermined height difference D from the surface feature point Ht₁ are measured from the acquired image data, the coordinate component of the projected feature point Kb₁, formed by the shadow of the surface 15a projected onto the bottom surface 16b by the light of the light source S₁, is measured, and information on the position and orientation of the mark 16 is determined by calculation from the coordinate components of the surface feature point Ht₁, the bottom surface feature point Hb₁, and the projected feature point Kb₁; the position and orientation of the workpiece 15 provided with the mark 16 can thereby be measured with high accuracy.
Fig. 7 shows a modification of the mark according to the embodiment of the present invention, in which (a) is a plan view and (b) is a view with a light source lit so that a shadow is projected onto the bottom surface. For convenience, only the points that differ from the mark 16 shown in fig. 2 are described. Referring to fig. 7 (a), the mark 116 has, in plan view on the surface 15a of the workpiece 15, an opening 116a of substantially polygonal, for example square, shape with arc-shaped corners Rt₁ to Rt₄ and sides St₁ to St₄. The bottom surface 116b of the mark 116 has the same shape as the opening 116a, with arc-shaped corners Rb₁ to Rb₄ and sides Sb₁ to Sb₄.
At a corner of the opening 116a, for example Rt₁, a virtual vertex H't₁ can be defined as the intersection of the extensions of the sides St₁ and St₄ that sandwich the corner Rt₁. Similarly, a virtual vertex H'b₁ can be defined on the bottom surface 116b as the intersection of the extensions of the sides Sb₁ and Sb₄ that sandwich the corner Rb₁. The virtual vertices H't₁ and H'b₁ found in this manner may be set as the surface feature point H't₁ and the bottom surface feature point H'b₁.
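Such a virtual vertex is simply the intersection of the two extended sides. The following sketch is an illustrative assumption (the function name and NumPy usage are not from the patent); each side is specified by two points on its straight portion.

```python
import numpy as np

def virtual_vertex(p1, p2, q1, q2):
    """Intersection of line p1-p2 with line q1-q2 (e.g. extended sides St1 and St4).

    Each argument is a 2D point on the straight portion of one side; the returned
    point is the virtual vertex (e.g. H't1) where the extended sides would meet.
    """
    p1, p2, q1, q2 = (np.asarray(v, float) for v in (p1, p2, q1, q2))
    dp, dq = p2 - p1, q2 - q1
    a = np.column_stack([dp, -dq])
    if abs(np.linalg.det(a)) < 1e-12:
        raise ValueError("sides are parallel; no unique intersection")
    t, _ = np.linalg.solve(a, q1 - p1)
    return p1 + t * dp

# Example: two perpendicular sides whose straight portions stop short of the corner.
print(virtual_vertex((1.0, 0.0), (9.0, 0.0), (10.0, 1.0), (10.0, 9.0)))  # -> [10.  0.]
```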
Referring to fig. 7 (b) and (a), when the light source 18₁ is lit and the mark 116 is irradiated with light, the shadow of the edges of the surface (dark part DS, hatched) is projected onto the bottom surface 116b. The boundary between the dark part DS and the bright part BS, i.e., the outline of the shadow, has sides Ks₁ and Ks₄ and an arc portion; the arc is the projection of the corner Rt₁. The intersection of the extensions of the sides Ks₁ and Ks₄ is taken as K'b₁. The intersection K'b₁ is the point obtained by virtually projecting the surface feature point H't₁ with the light source 18₁, and can therefore be set as a virtual projected feature point K'b₁.
Therefore, the mark 116 functions in the same manner as the mark 16 and can serve as a mark of the position and orientation measurement system according to the embodiment of the present invention, and it can also be used as the mark in the position and orientation measurement method according to the embodiment of the present invention.
In the above description, the bottom surfaces 16b and 116b of the marks 16 and 116 have a quadrangular shape similar to that of the openings 16a and 116a in the surface 15a. The bottom surfaces 16b and 116b may instead have a shape different from that of the openings 16a and 116a. In that case, since the height difference D from the surface 15a to the bottom surfaces 16b and 116b is the same, equations (1) to (5) can be applied to the projected feature point Kb₁ in the same way as when the bottom surfaces 16b and 116b are square.
In the above description, the bottom surfaces 16b and 116b of the marks 16 and 116 are parallel to the surface 15a. The bottom surfaces 16b and 116b may instead be in a relative posture that is not parallel to the surface 15a. The projected feature point Kb₁ then differs from the parallel case only by an amount corresponding to the deviation of the relative posture of the bottom surface, and this can be compensated in equations (1) to (5) by correcting <Kb₁x> by the deviation. The amount of deviation may be measured in advance with a 3-dimensional measuring machine and set in the measurement unit 20.
In the above description, the marks 16 and 116 are cavities (recesses) provided in the surface 15a of the workpiece 15. A mark may instead be a protrusion (convex portion) provided on the surface 15a of the workpiece 15. The top surface of the protrusion has the same shape as the openings 16a and 116a of the marks 16 and 116. In this case, the surface feature point Ht₁ is selected from the vertices, or virtual vertices, of the top surface of the protrusion. The shadow of the surface feature point is projected onto the surface 15a of the workpiece 15, so that the projected feature point Kb₁ is formed on the surface 15a; the surface 15a of the workpiece 15 extends far enough for the projected feature point Kb₁ to be formed on it. Accordingly, by applying equations (1) to (5), the protruding mark has the same function as the marks 16 and 116 and can serve as a mark of the position and orientation measurement system according to the embodiment of the present invention.
The marks 16 and 116 may display, on the surface 15a, the bottom surface 16b, or the top surface, identification information, attribute information, and the like of the workpiece 15 on which they are formed, for example a specification number or a manufacturing lot number. These are imaged by the camera 19 and recognized by the measurement unit 20, and the workpiece 15 is managed accordingly. This makes it possible to acquire the position and orientation of the workpiece 15 and its identification and attribute information at the same time, which facilitates production management.
While the preferred embodiments of the present invention have been described in detail, the present invention is not limited to these specific embodiments, and various modifications and changes can be made within the scope of the present invention described in the claims. For example, the marks 16 and 116 may be formed on a jig that fixes the workpiece 15. By setting information on the relative position and relative orientation between the workpiece 15 and the jig in advance, the position and orientation of the workpiece can then be determined, via this information, from the position and orientation of the mark 16 by the position and orientation measuring device 21 and the position and orientation measurement method according to the present embodiment.
In addition, since the position and posture of the workpiece 15 or of a jig provided with the marks 16 and 116 according to the present embodiment can be determined by means of the marks 16 and 116, the workpiece 15 can be transferred between a machine tool and a robot.
The marks 16 and 116, the position and orientation measuring device 21, and the position and orientation measurement method according to the present embodiment can also be applied to a case where the workpiece 15 is placed on the movable table 14 shown in fig. 1 or on an automatic guided vehicle (AGV) and is machined by a plurality of machine tools in a factory.
Description of the reference numerals
10. Machine tool
14. Movable table
15. Workpiece
16, 116. Mark
17. Position and orientation measurement system
18. Light irradiation unit
19. Camera
20. Measurement unit
21. Position and orientation measuring device
22. Measuring unit
23. Calculation unit
24. Control unit

Claims (13)

1. A marker that is provided on a surface of an object and that is capable of measuring a position and a posture of the object, the marker comprising:
a first face having a first feature point; and
a second surface formed below the first surface with a predetermined height difference from the first feature point and a predetermined relative posture with respect to the first surface,
the position and orientation of the object can be measured based on the first feature point and the second feature point by obtaining the second feature point corresponding to the first feature point based on the shadow of the first surface projected onto the second surface in the image obtained by imaging the mark by the imaging means.
2. The tag of claim 1,
the mark is a recess having an opening on a surface of the object, the first surface is the surface, and the second surface is a bottom surface of the recess.
3. The tag of claim 2,
the opening is a polygon, and the first feature point is a vertex of the polygon.
4. The tag of claim 2,
the opening portion is a polygon having an arc shape, and the first characteristic point is a virtual vertex of the polygon.
5. The tag of claim 1,
the mark is a protrusion formed on a surface of the object, the first surface is a top surface of the protrusion, and the second surface is the surface.
6. The tag of claim 5,
the shape of the top surface is a polygon, and the first characteristic point is a vertex of the polygon.
7. The tag of claim 5,
the shape of the top surface is a polygon with an arc shape, and the first characteristic point is a virtual vertex of the polygon.
8. An apparatus, comprising:
a light irradiation unit that can irradiate the mark according to any one of claims 1 to 7, provided on a surface of an object, with light so as to form a shadow of the first surface on the second surface;
a photographing unit that photographs the mark; and
and a measurement unit that obtains a second feature point corresponding to the first feature point based on a shadow of the first surface projected onto the second surface in the image of the marker captured by the imaging unit, and obtains a position and a posture of the object based on the first feature point and the second feature point.
9. The apparatus of claim 8,
the measurement unit has:
a measuring unit that measures, from the captured image of the mark, coordinate components on the image of the first feature point, of a second feature point corresponding to the first feature point based on the shadow of the first surface projected onto the second surface, and of a third feature point on the second surface located at the predetermined height difference from the first feature point; and
a calculation unit that obtains the position and orientation of the object by calculating the other coordinate components of the first to third feature points based on the coordinate components of the first to third feature points measured by the measuring unit.
10. The apparatus of claim 8 or 9,
the imaging unit may detect a position of the mark by performing imaging at a wide angle separately from the object, and perform imaging by enlarging the mark.
11. A system is characterized by comprising:
the mark according to any one of claims 1 to 7 formed on the surface of an object; and
the device of any one of claims 8 to 10.
12. A method for measuring a position and an orientation of an object by the system according to claim 11, the method comprising the steps of:
a first measurement step of imaging the mark according to any one of claims 1 to 7 with an imaging unit and measuring, from first image data acquired by a measurement unit, coordinate components of the first feature point and of a third feature point on the second surface located at the predetermined height difference from the first feature point;
a second measurement step of irradiating light by a light irradiation unit to image the mark by the imaging unit, and measuring a coordinate component of the second feature point corresponding to the first feature point based on a shadow of the first surface projected onto the second surface from second image data acquired by the measurement unit; and
a step of determining information on the position and orientation of the mark by calculation based on the coordinate components of the first to third feature points acquired by the measurement unit in the first and second measurement steps.
13. The method of claim 12,
in the step determined by the measurement unit through the calculation, the calculation is performed using the predetermined height difference and a coordinate component of the light irradiation unit.
CN202180033490.3A 2020-05-29 2021-03-01 Marker, device, system, and measurement method for measuring position and orientation of object Pending CN115516274A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020094365A JP7386531B2 (en) 2020-05-29 2020-05-29 Markers, devices and systems for measuring the position and orientation of objects
JP2020-094365 2020-05-29
PCT/JP2021/007710 WO2021240934A1 (en) 2020-05-29 2021-03-01 Marker for measuring position and orientation of subject, device, system, and measurement method

Publications (1)

Publication Number Publication Date
CN115516274A true CN115516274A (en) 2022-12-23

Family

ID=78723304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180033490.3A Pending CN115516274A (en) 2020-05-29 2021-03-01 Marker, device, system, and measurement method for measuring position and orientation of object

Country Status (4)

Country Link
JP (1) JP7386531B2 (en)
CN (1) CN115516274A (en)
DE (1) DE112021003059T5 (en)
WO (1) WO2021240934A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03131710A (en) * 1989-10-18 1991-06-05 Nec Corp Positioning mark
JPH05312521A (en) * 1992-05-13 1993-11-22 Nec Corp Target mark
US20070073133A1 (en) 2005-09-15 2007-03-29 Schoenefeld Ryan J Virtual mouse for use in surgical navigation
JP6126067B2 (en) 2014-11-28 2017-05-10 ファナック株式会社 Collaborative system with machine tool and robot
JP6694643B2 (en) 2016-02-19 2020-05-20 国立研究開発法人産業技術総合研究所 Workpiece processing method
JP6630881B2 (en) 2016-09-15 2020-01-15 株式会社五合 Information processing apparatus, camera, moving object, moving object system, information processing method and program

Also Published As

Publication number Publication date
JP2021189033A (en) 2021-12-13
WO2021240934A1 (en) 2021-12-02
DE112021003059T5 (en) 2023-06-22
JP7386531B2 (en) 2023-11-27

Similar Documents

Publication Publication Date Title
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
US8564655B2 (en) Three-dimensional measurement method and three-dimensional measurement apparatus
KR100948161B1 (en) Camera corrector
US20090128648A1 (en) Image processing device and image processing method for performing three dimensional measurements
JP5438475B2 (en) Gap step measurement device, gap step measurement method, and program thereof
KR101379787B1 (en) An apparatus and a method for calibration of camera and laser range finder using a structure with a triangular hole
JPH11166818A (en) Calibrating method and device for three-dimensional shape measuring device
JP5001330B2 (en) Curved member measurement system and method
KR20180090316A (en) Deformation processing support system and deformation processing support method
JP7353757B2 (en) Methods for measuring artifacts
JP2006308500A (en) Three dimensional workpiece measuring method
JP6064871B2 (en) Thickness measurement method
KR102235999B1 (en) Deformation processing support system and deformation processing support method
JP3696336B2 (en) How to calibrate the camera
CN108050934B (en) Visual vertical positioning method for workpiece with chamfer
CN115516274A (en) Marker, device, system, and measurement method for measuring position and orientation of object
JP2017007026A (en) Position correcting system
JP2624557B2 (en) Angle measuring device for bending machine
JPH09329440A (en) Coordinating method for measuring points on plural images
CN108195319B (en) Visual oblique positioning method for workpiece with chamfer
JPH03259705A (en) Angle measuring instrument for bending machine
CN105425724A (en) High-precision motion positioning method and apparatus based on machine vision scanning imaging
JP2523420B2 (en) Image processing method in optical measuring device
JP3589512B2 (en) Inspection point marking method for microfabricated products, automatic dimension inspection method and automatic dimension inspection device
TW202239546A (en) Image processing system and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination