WO2020008936A1 - Image processing device, image processing method, system, and method for manufacturing article
- Publication number
- WO2020008936A1 (PCT/JP2019/024987)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- orientation
- image processing
- measurement
- measurement target
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
Definitions
- the present invention relates to measurement technology.
- Before capturing an image of an object with an imaging device, the positional relationship between the object and the imaging device must be set in advance. This is because the field of view of the imaging device is determined by various factors, including the focal length of the lens of the imaging device and the size of its image sensor such as a CCD, and the target object must lie within that region. There is a technique for making this setting on a simulation device.
- Patent Literature 1 discloses a technique that aims to let a user know in advance, by simulation, what area can be photographed before a camera is installed in a real space.
- In that technique, an area image of the prospective camera installation area is displayed, and a camera indicator is arranged and displayed on the area image.
- An imaging range and blind spots are then presented based on information such as the camera type and the angle of view.
- Patent Literature 2 addresses the problem that, when video is viewed at a location remote from a video camera's shooting site, it is unclear where the video camera is looking from, making it difficult to grasp the shooting site from the video and to give instructions to the video camera.
- In Patent Literature 2, a model of the real world, including the video camera's position and viewable area, is prepared in advance in a virtual space, and the problem is solved by displaying the position, line of sight, and angle of view of the real-world video camera three-dimensionally in the virtual space in real time.
- The purpose of the technology disclosed in Patent Literatures 1 and 2 is to know where in a real space an imaging device should be installed so that a desired target or a desired area falls within its field of view.
- In a measuring instrument, however, a predetermined area suitable for measurement is defined separately from the field-of-view area, and this area is naturally invisible in real space. There may even be a portion of this predetermined area with still higher accuracy, and it is difficult to adjust the measurement target so that it comes to that area or to a specific portion within it.
- Conventionally, the user moves the measuring instrument to an approximate position at which the measurement target enters the area suitable for measurement. To know the actual relative positional relationship between the instrument and the target, distances and the like are then measured with a ruler or similar tool; the instrument is moved based on the result, and the result is checked again with the ruler. Thereafter, the image captured at that instrument position is checked, and if the target is not within the angle of view, for example, the adjustment and checking are repeated.
- the above-described three-dimensional measuring device may be used, for example, fixed to a robot or the like.
- In that case, the user does not control the position and orientation of the measuring device directly, but controls them by operating, with a controller or the like, the position and orientation of the robot on which the measuring device is mounted. That is, even if the desired position and orientation of the measuring instrument are known, the robot's position and orientation must be controlled so that the instrument actually assumes them. Further, the robot is often operated from outside a safety fence, and checking with a ruler after each robot operation requires entering and exiting the fence, which may reduce work efficiency.
- For these reasons, the techniques described in the background art cannot provide a sufficient effect.
- the present invention provides a technique for quickly and easily setting the position and orientation of a measuring instrument for measuring a measurement target.
- One embodiment of the present invention comprises: display means for displaying, on a display screen, a first model representing an object to be measured by a measuring instrument and a second model representing a measurement range of the measuring instrument; change means for changing the position and orientation of the second model; and output means for outputting, when the position and orientation of the second model are changed, the position and orientation of a part whose position and orientation relative to the measuring instrument are constant.
- FIG. 1 is a diagram illustrating a configuration example of a system.
- FIGS. 2 to 15 are views each showing a display example of a GUI.
- FIG. 16 is a flowchart of a process performed by the image processing apparatus 1.
- FIG. 17 is a diagram illustrating a control system including a measuring device and a robot arm.
- The measuring device 2 is a three-dimensional shape measuring device for measuring three-dimensional information, such as the position, posture, and shape, of the measurement target (target object) 3 mounted on the installation table 10.
- Some measuring devices 2 are of an active type, including a projection unit that projects pattern light onto the measurement target 3 and an imaging unit that captures the measurement target 3 onto which the pattern light is projected.
- Others are of a passive type, having a plurality of imaging devices that image the measurement target 3 from different positions and orientations.
- Various objects can be applied as the measurement target 3. If the measurement target 3 is relatively small, targets may be placed in a bulk pile; if relatively large, a single individual may be placed. When measurement targets 3 are placed in bulk, they are often stored in a container such as a pallet. In the present embodiment, the term measurement target 3 covers not only the object measured by the measuring device 2 but also a container storing that object. In the present embodiment, it is assumed that the measurement targets 3 are small workpieces piled in bulk on a pallet. As shown in FIG. 1, the measurement target 3 is mounted on the installation table 10 and is installed obliquely with respect to the direction of gravity.
- the measuring instrument 2 has a measuring range 4 defined by its performance and specifications.
- a desired measurement cannot be performed (a desired measurement result cannot be obtained) unless at least a part of the measurement target 3 is included in the visual field of the imaging device.
- However, even then, the relative position and orientation between the measuring device 2 and the measurement target 3 are not always good (that is, the measurement accuracy of the measurement target 3 by the measuring device 2 is not always sufficient). In other words, even if the measurement target 3 is in the field of view of the imaging device, there are positions and orientations of the imaging device that are more suitable for measurement from the viewpoint of defocus and aberration.
- Likewise, there are good positions and orientations of the measurement target 3 that follow from the performance and specifications of the optical system that projects the pattern. Accordingly, an area having good recognition performance and good accuracy is often defined.
- For example, a truncated quadrangular pyramid, with a 500 × 500 mm area at a distance of 500 mm from the measuring device 2 as its bottom surface and a 300 × 300 mm area at a distance of 300 mm as its top surface, is defined as the measurement range 4.
- The measurement range 4 is an area in which the measuring instrument 2 can achieve a certain measurement performance. That is, the relative position and orientation (position and orientation relationship) between the measuring device 2 and the measurement range 4 are constant unless parameters of the measuring device 2 (such as the focal length) are changed.
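To make the geometry concrete, the following is a minimal sketch (not from the patent) of a point-in-measurement-range test for the truncated-pyramid example above: the square cross-section's half-width grows linearly from 150 mm at a depth of 300 mm to 250 mm at a depth of 500 mm, in the instrument's own frame.

```python
def in_measurement_range(x, y, z,
                         z_near=300.0, z_far=500.0,
                         half_near=150.0, half_far=250.0):
    """Return True if point (x, y, z) [mm], in the instrument frame,
    lies inside the truncated quadrangular pyramid."""
    if not (z_near <= z <= z_far):
        return False
    # The half-width of the square cross-section grows linearly with depth.
    t = (z - z_near) / (z_far - z_near)
    half = half_near + t * (half_far - half_near)
    return abs(x) <= half and abs(y) <= half

# Example: a workpiece point 400 mm from the instrument, 100 mm off-axis.
print(in_measurement_range(100.0, 0.0, 400.0))  # True
```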
- the user performs adjustment so that the measurement target 3 matches a specific portion of the measurement range 4.
- Various states are assumed for the bulk-piled workpieces 3 to be measured, and the height of the workpieces on the surface of the pile varies in many patterns.
- The user therefore tries to make the bottom surface of the pallet containing the measurement target 3 coincide with the far end of the measurement range 4, that is, with the above-described 500 × 500 mm area.
- In this way, the workpieces piled on the pallet enter the truncated-pyramid-shaped area, and even if the pile is high, the workpieces on its surface remain farther than 300 mm from the measuring instrument 2. This allows the workpieces to be contained in the measurement range 4.
- the measuring device 2 is attached (fixed) to the robot arm 12 via the mounter 11 and the like.
- the robot arm 12 is fixed to a robot base 13, and the position and posture of a specific portion of the robot arm 12 such as the movable part 5 are controlled by a robot controller 14.
- a command related to the control is transmitted from, for example, the programming pendant 15.
- When the user operates the programming pendant 15 to input an instruction to change the position or posture of the movable unit 5, the change instruction is transmitted to the robot controller 14, which changes the position or posture of the movable unit 5 in accordance with the instruction. Unless otherwise specified, the positions and orientations handled below are in a coordinate system (reference coordinate system) based on the position and orientation of the robot base 13.
- The reference coordinate system is, for example, a coordinate system in which the position of the robot base 13 is set as the origin and three mutually orthogonal axes through the origin are set as the X axis, the Y axis, and the Z axis.
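Because the GUIs described below express poses as X/Y/Z positions in millimeters plus Rx/Ry/Rz angles in degrees, the following hedged sketch shows one way to turn such a pose into a 4 × 4 homogeneous matrix. The patent does not state a rotation convention; fixed-axis X-Y-Z rotations are assumed here, and numpy is used for the matrix algebra.

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    """Pose (mm, degrees) -> 4x4 homogeneous transform in the reference frame."""
    rx, ry, rz = np.radians([rx, ry, rz])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(rx), -np.sin(rx)],
                   [0.0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0.0, np.sin(ry)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(ry), 0.0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0.0],
                   [np.sin(rz), np.cos(rz), 0.0],
                   [0.0, 0.0, 1.0]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # assumed fixed-axis X-Y-Z convention
    T[:3, 3] = [x, y, z]
    return T

# Example: the measurement target position shown later in FIG. 3, no rotation.
print(pose_to_matrix(0.0, 300.0, 100.0, 0.0, 0.0, 0.0))
```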
- An example of the movable portion 5 is a flange portion at the tip of the robot arm 12.
- When a tool for gripping the measurement target 3 is connected to the flange, the tip of the tool may serve as the movable portion 5; in the present embodiment, the tip of the tool is the movable part 5.
- The relative position and orientation between the measuring instrument 2 and the movable section 5 are constant, independent of the position and orientation of the robot arm 12.
- the “relative position and orientation between the measuring device 2 and the movable unit 5” can be acquired based on design information of the measuring device 2, the mounter 11, the tool, and the like.
- A method is also known in which a marker or the like is photographed from a plurality of viewpoints with the measuring device 2, and the relative position and orientation are obtained by arithmetic processing.
- the image processing apparatus 1 further includes a calculation unit 9, a display unit 7, and an input unit 8.
- The calculation unit 9 includes one or more CPUs, processors, arithmetic processing circuits, and the like.
- The calculation unit 9 controls the overall operation of the image processing apparatus 1 by executing processing using computer programs and data stored in the storage unit 6, and executes or controls the various processes described below as being performed by the image processing apparatus 1.
- the calculation unit 9 executes or controls display control of a GUI (graphical user interface) described below and processes in accordance with an operation input to the GUI.
- The storage unit 6 stores, in addition to the “relative position and orientation between the measuring device 2 and the movable unit 5”, information that the image processing apparatus 1 handles as known information in the following description.
- the storage unit 6 stores computer programs and data for causing the arithmetic unit 9 to execute or control each process described below, which is performed by the image processing apparatus 1.
- The input unit 8 is configured by a user interface such as a keyboard, a mouse, or a touch panel screen, and the user can input various instructions to the arithmetic unit 9 by operating it.
- the display unit 7 is configured by a liquid crystal screen, a touch panel screen, or the like, and can display a processing result of the calculation unit 9 as an image, characters, or the like.
- the display unit 7 can display a GUI described below.
- Processing performed by the image processing apparatus 1 to obtain a position and orientation of the movable unit 5 such that the measurement target 3 falls within a desired area of the measurement range 4 of the measuring device 2 will be described with reference to the flowchart of FIG. 16.
- the display area 17 is an area for displaying an image of a “virtual space defined by a coordinate system (virtual coordinate system) that matches the above-described reference coordinate system” viewed from a viewpoint having a specified position and orientation.
- an icon 18 representing each axis (X axis, Y axis, Z axis) of the virtual coordinate system is displayed.
- In step S1601, the calculation unit 9 determines whether the “measurement target display” button 19 has been operated (instructed) by the user via the input unit 8 or the display screen of the display unit 7. If the “measurement target display” button 19 has been instructed, the process proceeds to step S1602; if not, the process proceeds to step S1613.
- In step S1602, the calculation unit 9 generates a measurement target model, which is a three-dimensional model imitating the geometric shape of the measurement target 3, and arranges the generated measurement target model in the virtual space at the measurement target position and orientation.
- A prescribed position and orientation are set as the initial measurement target position and orientation, and the measurement target position and orientation are updated in accordance with user operations in step S1605 described below.
- the data of the measurement target model is created in advance and registered in the storage unit 6, and the calculation unit 9 generates the measurement target model based on the data of the measurement target model.
- step S1603 the calculation unit 9 generates an image of the “measurement target model arranged in step S1602” viewed from a viewpoint having a specified position and orientation, and displays the image in the display area 17.
- FIG. 3 shows a display example of the GUI in step S1603.
- the display area 17 displays an image of the measurement target model 23 viewed from a viewpoint having a specified position and orientation.
- the measurement target position / posture is displayed in the measurement target position / posture column 21.
- an X coordinate position “0”, a Y coordinate position “300.0”, and a Z coordinate position “100.0” are displayed as measurement target positions (the unit is “mm”).
- When there are a plurality of measurement targets 3, information on each measurement target 3 may be displayed in a list on the GUI, and a selection of the measurement target 3 whose measurement target model is to be displayed may be accepted. That is, the measurement target model, and the measurement target position and orientation, corresponding to the item that the user selects from the displayed list via the input unit 8 or the display screen of the display unit 7 may be displayed on the GUI as illustrated in FIG. 3.
- In step S1604, the calculation unit 9 determines whether a change instruction for changing the measurement target position or the measurement target posture has been input.
- The user can input an instruction to change the measurement target position or posture by operating the input unit 8 to press a key for changing the position or orientation, or by performing a touch or drag operation on the display area 17.
- the user may operate the input unit 8 or the display screen of the display unit 7 to directly change the numerical value in the column 21 of the measurement target position and orientation to change the position and orientation of the measurement target model.
- The “approximate position and orientation of the measurement target 3 in the reference coordinate system in the real space” are assumed to be obtained by measuring with a ruler or the like in the real space, or to be estimated from the position and orientation of the installation table 10.
- the user operates the display screen of the input unit 8 or the display unit 7 to set “approximate position and orientation of the measurement target 3 in the reference coordinate system in the real space” as the measurement target position and orientation.
- If the result of this determination is that an instruction to change the measurement target position or measurement target posture has been input, the process proceeds to step S1605; otherwise, the process proceeds to step S1606.
- In step S1605, the calculation unit 9 updates the measurement target position and the measurement target posture in accordance with the change instruction.
- When the position and orientation of the measurement target model 23 are changed in the GUI of FIG. 3, an image of the measurement target model 23 arranged at the changed position and orientation is generated and displayed in the display area 17, as shown in FIG. 4.
- A prescribed position and orientation are set as the initial position and orientation of the “viewpoint”, and the user can change the position and orientation of the viewpoint by operating the input unit 8 or by performing a touch operation, drag operation, or the like on the display area 17.
- When this is done, the relative position and orientation of the viewpoint with respect to the virtual coordinate system change, so the icon 18 is updated accordingly. That is, an icon 18 in which the direction of each axis of the virtual coordinate system is updated is generated and displayed so as to indicate the relative posture between the viewpoint and the virtual coordinate system.
- In step S1606, the arithmetic unit 9 determines whether the “measuring device display” button 20 has been operated (instructed) by the user via the input unit 8 or the display screen of the display unit 7. If the button has been instructed, the process proceeds to step S1607; if not, the process proceeds to step S1613.
- In step S1607, the calculation unit 9 generates a measuring instrument model, which is a three-dimensional model imitating the geometric shape of the measuring instrument 2, and arranges the generated measuring instrument model in the virtual space at the measuring instrument position and orientation.
- A prescribed position and orientation are set as the initial measuring instrument position and orientation, and the measuring instrument position and orientation are updated in accordance with user operations in step S1610 described below.
- the data of the measuring instrument model is created in advance and registered in the storage unit 6, and the calculating unit 9 generates the measuring instrument model based on the data of the measuring instrument model.
- the calculation unit 9 generates a measurement range model that is a three-dimensional model representing the measurement range 4 of the measuring device 2.
- The arithmetic unit 9 arranges the generated measurement range model in the virtual space at the measurement range position and orientation obtained by adding “the relative position and orientation of the measurement range 4 with respect to the measuring device 2” to the measuring device position and orientation. That is, whatever the position and orientation of the measuring instrument model, the relative position and orientation between the measuring instrument model and the measurement range model do not change. This “relative position and orientation of the measurement range 4 with respect to the measuring device 2” is assumed to be registered in the storage unit 6 in advance. The data of the measurement range model is created in advance and registered in the storage unit 6, and the calculation unit 9 generates the measurement range model based on this data. In addition, the arithmetic unit 9 obtains the position and orientation of the movable unit 5 by adding the “relative position and orientation between the measuring device 2 and the movable unit 5” to the position and orientation of the measuring device 2.
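The pose bookkeeping in step S1607 amounts to composing homogeneous transforms: "adding" a relative position and orientation is matrix multiplication. The sketch below illustrates this; all numeric values, and the direction in which each relative transform is stored, are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform with translation only (rotations omitted
    here for brevity)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Fixed relative transforms, assumed pre-registered in the storage unit 6:
T_instr_range = translation(0.0, 0.0, 400.0)      # instrument -> measurement range
T_instr_movable = translation(0.0, -80.0, -50.0)  # instrument -> movable part

# Current measuring instrument pose in the reference coordinate system:
T_base_instr = translation(87.2, 250.2, 230.0)

# "Adding" a relative pose is composition of homogeneous transforms:
T_base_range = T_base_instr @ T_instr_range       # measurement range pose
T_base_movable = T_base_instr @ T_instr_movable   # movable unit pose (column 22)
```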
- In step S1608, the arithmetic unit 9 generates images of the measuring instrument model and the measurement range model arranged in step S1607, as viewed from the above viewpoint, and displays each generated image in the display area 17.
- FIG. 5 shows a display example of the GUI in step S1608. As shown in FIG. 5, in the display area 17, an image of the measuring instrument model 24 and an image of the measurement range model 25 are displayed in addition to the image of the measurement target model 23 viewed from the viewpoint. That is, an image (virtual space image) of the virtual space in which the measurement target model, the measuring instrument model, and the measurement range model are arranged is generated from the above viewpoint and displayed in the display area 17.
- the position / posture of the movable unit 5 obtained in step S1607 is displayed in the column 22 of the movable unit position / posture.
- the X-coordinate position “87.2”, the Y-coordinate position “250.2”, and the Z-coordinate position “230.0” are displayed as the positions of the movable part 5 (the unit is “mm”).
- As the posture of the movable portion 5 (the unit is “degrees”), Rx (the angle around the X axis in the virtual coordinate system) “178.0”, Ry (the angle around the Y axis in the virtual coordinate system) “1.2”, and Rz (the angle around the Z axis in the virtual coordinate system) “0.4” are displayed.
- information on each measuring instrument 2 may be displayed in a list on the GUI, and selection of the measuring instrument 2 for displaying the measuring instrument model may be accepted.
- That is, the measuring instrument model of the measuring device 2 corresponding to the item that the user selects from the displayed list via the input unit 8 or the display screen of the display unit 7, and the measurement range model of the measurement range 4 of that measuring device 2, may be displayed on the GUI as illustrated in FIG. 5.
- In step S1609, the calculation unit 9 determines whether a change instruction for changing the position or orientation of the measuring instrument model or the measurement range model has been input by the user operating the input unit 8 or the display screen of the display unit 7.
- The user moves (adjusts) the measuring instrument model and the measurement range model so that part or all of the measurement target model fits inside the measurement range model. For example, suppose that the pallet bottom surface of the measurement target 3 is to be aligned with the far end of the measurement range 4. In this case, the measuring instrument model and the measurement range model are first moved so that the bottom surface of the truncated pyramid of the measurement range model coincides with the pallet bottom surface. The truncated pyramid of the measurement range model is then moved in a direction parallel to the pallet bottom surface and adjusted so that, for example, the center of its bottom surface approximately coincides with the center of the pallet bottom surface.
- the operation for changing the position and orientation of the measuring instrument model and the measurement range model is the same as the above-described operation for changing the position and orientation of the measurement target model.
- The measurement range model is displayed as a translucent model, a wireframe model, or another form in which the measurement target model is not hidden by the measurement range model even when the two overlap.
- the display form of each model is not limited to a specific display form as long as the same purpose can be achieved.
- If the result of this determination is that an instruction to change the position or orientation of the measuring instrument model or the measurement range model has been input, the process proceeds to step S1610; otherwise, the process proceeds to step S1613.
- In step S1610, if there is an instruction to change the measuring instrument position or the measuring instrument posture, the arithmetic unit 9 updates the measuring instrument position or posture in accordance with that instruction; likewise, if there is an instruction to change the measurement range position or the measurement range posture, it updates the measurement range position or posture in accordance with that instruction.
- When the measuring instrument position and posture are updated, the arithmetic unit 9 updates the measurement range position and posture to those obtained by adding “the relative position and orientation of the measurement range 4 with respect to the measuring device 2” to the updated measuring instrument position and posture.
- Conversely, when the measurement range position and posture are updated, the arithmetic unit 9 updates the measuring instrument position and posture to those obtained by subtracting “the relative position and orientation of the measurement range 4 with respect to the measuring device 2” from the updated measurement range position and posture.
- the relative position and orientation between the measuring instrument model and the measurement range model are maintained even if the position and orientation of the measuring instrument model and the measurement range model are changed.
- the calculation unit 9 updates the other position and orientation so that the relative position and orientation are maintained.
- the measuring instrument model is arranged in the virtual space using the updated measuring instrument position and orientation.
- the measurement range model is arranged in the virtual space with the updated measurement range position and orientation.
- Images of the measuring instrument model 24 and the measurement range model 25 arranged at the changed position and orientation are generated and displayed in the display area 17.
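The invariant maintained in step S1610 can be sketched as follows: whichever model the user moves, the other pose is re-derived from the fixed instrument-to-range transform, "subtracting" a relative pose being composition with its inverse. This is a sketch under assumed matrix conventions, not the patent's implementation.

```python
import numpy as np

def range_from_instrument(T_base_instr, T_instr_range):
    # The user moved the measuring instrument model: the range model follows.
    return T_base_instr @ T_instr_range

def instrument_from_range(T_base_range, T_instr_range):
    # The user moved the measurement range model: the instrument model
    # follows; "subtracting" the relative pose composes with its inverse.
    return T_base_range @ np.linalg.inv(T_instr_range)
```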
- the measurement target model 23 is included in the measurement range model 25.
- The user operates the input unit 8 or the display screen of the display unit 7 to move the measuring instrument model or the measurement range model until the inclusion state of the measurement target model within the measurement range model becomes a desired inclusion state (for example, until the measurement target model is contained in a desired portion of the measurement range model).
- Then, using the programming pendant 15, the user inputs the position and orientation displayed at this time in the movable unit position and orientation column 22, and issues an instruction to move the movable unit 5 to that position and orientation.
- the robot controller 14 operates the robot arm 12 to move the position and orientation of the movable unit 5 to the position and orientation input using the programming pendant 15.
- the measuring device 2 can be moved so that the measurement target 3 has a desired position and orientation with respect to the measurement range 4.
- A slight shift may occur due to various factors, such as an error between the position and orientation of the measurement target model set in the virtual space and the position and orientation of the measurement target 3 in the real space, or an error in the “relative position and orientation between the measuring device 2 and the movable unit 5” stored in the storage unit 6.
- In that case, photographing or measurement may be performed with the measuring device 2 to check whether the measuring device 2 and the measurement target 3 have the desired relative position and orientation relationship, and the movable section 5 may be moved to finely adjust the position and orientation of the measuring device 2 to an optimum position.
- In step S1613, the calculation unit 9 determines whether the user has operated the input unit 8 or the display screen of the display unit 7 to input an end instruction. If an end instruction has been input, the process according to the flowchart of FIG. 16 ends; otherwise, the process returns to step S1601.
- As described above, according to the present embodiment, the relative position and orientation of the measuring instrument 2 and the measurement target 3 can be set to desired values with a simpler operation and less labor than in the past.
- Note that the movable part 5 is merely an example of a specific part of the robot arm 12; the position and orientation of another part of the robot arm 12 (a part whose relative position and orientation with respect to the measuring device 2 are constant) may be obtained and displayed instead of those of the movable part 5.
- An axis corresponding to the direction of the measuring instrument 2 in the measuring instrument model (for example, if the measuring instrument 2 has an imaging section, its optical axis) or an axis corresponding to the direction of the measurement range model may also be displayed on the GUI.
- In the above example, the user performed an operation of moving the measuring instrument model 24 and the measurement range model 25, but the operation is not limited to this; the object operated may be anything whose relative position and orientation with respect to the measuring instrument 2 and the measurement range 4 are fixed or predetermined. For example, the position and orientation of the movable unit 5 with respect to the measuring instrument model 24 and the measurement range model 25 are fixed. Accordingly, as shown in FIG. 6, a coordinate system 22c representing the position and orientation of the movable unit 5, whose relative position and orientation with respect to the measurement range model 25 are fixed, may be displayed on the display unit 7, and the user may operate to change the position and orientation of the coordinate system 22c. Further, for example, the display unit 7 may display a robot model, which is a three-dimensional model imitating the geometric shapes of the mounter 11 fixing the measuring device 2 and of the movable unit 5, and the user may change the position and orientation while viewing the robot model.
- The GUI of FIG. 7 is displayed on the display screen of the display unit 7 instead of the GUI of FIG. 2. In the display area 17, an image captured by the measuring instrument 2 is displayed.
- Before performing various inputs and operations, the user sets the measurement target 3 at a desired position in the real space with a desired posture.
- the measurement target 3 is a relatively large rectangular solid.
- The user operates the programming pendant 15 to instruct movement of the movable unit 5 so that the measurement target 3 falls approximately within the measurable area of the measuring device 2 (at least within the minimum measurable range). Since the image captured by the measuring device 2 is displayed in the display area 17, the user moves the movable unit 5 while observing the display area 17, and operates the programming pendant 15 to stop the movement upon confirming that the measurement target 3 is captured in the display area 17 in a measurable state.
- When the user confirms that the measurement target 3 is approximately within the measurable area of the measuring device 2, the user inputs the position and orientation of the movable unit 5 at this time into the movable unit position and orientation column 22.
- For example, the user may input the position and orientation of the movable unit 5 displayed on the programming pendant 15 into the column 22 by operating the input unit 8 or the display screen of the display unit 7. Alternatively, the user may operate the programming pendant 15 to transfer the position and orientation displayed on the programming pendant 15 to the image processing apparatus 1 and display them in the column 22.
- When the “work measurement” button 26 is pressed, the arithmetic section 9 obtains (measures), based on the captured image displayed in the display area 17, the position and orientation of the measurement target 3 in the image.
- For example, the calculation unit 9 matches the result of performing image processing on the captured image against the “3D matching model of the measurement target 3” registered in the storage unit 6 in advance, and thereby obtains the position and orientation of the measurement target 3.
- the position and orientation of the measurement target 3 obtained from the captured image by the calculation unit 9 is a position and orientation in a coordinate system based on the measuring device 2.
- After calculating the position and orientation of the measurement target 3 in the captured image, the calculation unit 9 displays the GUI illustrated in FIG. 8, in which the measurement target model 23 arranged at the obtained position and orientation is displayed in the display area 17 superimposed on the captured image.
- The calculation unit 9 then obtains the position and orientation of the measurement target 3 in the reference coordinate system based on the position and orientation of the measurement target 3 obtained from the captured image, the “relative position and orientation between the measuring device 2 and the movable unit 5” registered in the storage unit 6 in advance, and the position and orientation displayed in the column 22. For example, the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system) are obtained by adding “the relative position and orientation between the measuring device 2 and the movable unit 5” and “the position and orientation of the measurement target 3 obtained from the captured image” to the position and orientation displayed in the column 22.
- the calculation unit 9 displays the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system) in the measurement target position and orientation column 21.
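This chain can be sketched as three composed transforms: the movable-part pose from the column 22, the fixed movable-part-to-instrument transform, and the target pose measured in the instrument (camera) frame. The values and transform directions below are illustrative assumptions.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform, translation only for brevity."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

T_base_movable = translation(87.2, 250.2, 230.0)  # from the column 22
T_movable_instr = translation(0.0, 80.0, 50.0)    # fixed, pre-registered
T_instr_target = translation(5.0, -3.0, 420.0)    # measured from the image

# Measurement target pose in the reference (virtual) coordinate system:
T_base_target = T_base_movable @ T_movable_instr @ T_instr_target
```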
- Then, the calculation unit 9 displays the GUI illustrated in FIG. 9.
- the display area 17 displays an image of the measurement target model 23, the measurement device model 24, and the measurement range model 25 as viewed from the above viewpoint, and the icon 18.
- The position and orientation of the measurement target model 23 are “the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system)” obtained by the process performed when the “work measurement” button 26 is pressed.
- The position and orientation of the measuring instrument model 24 are obtained by adding “the relative position and orientation between the measuring instrument 2 and the movable unit 5” to the position and orientation displayed in the column 22.
- the position and orientation of the measurement range model 25 can be obtained by adding “the relative position and orientation of the measurement range 4 with respect to the measurement device 2” to the position and orientation of the measurement device model in the reference coordinate system.
- FIG. 10 shows a display example of the GUI in a state where the operation is completed.
- the surface portion of the measurement target model 23 is substantially aligned with the center of the measurement range model 25.
- the position and orientation of the movable unit 5 at that time are calculated by the calculation unit 9 in the same manner as in the first embodiment, and are displayed in the column 22. This allows the user to know the position and orientation of the movable part for performing measurement under desired conditions.
- According to this embodiment, the approximate position and orientation of the measurement target 3 in the reference coordinate system, which the first embodiment presupposed, need not be known in advance. That is, acquiring the position and orientation with a ruler or the like in the real space, or estimating them from the position and orientation of the installation table 10, is not required, and the work is further simplified.
- Moreover, since the arrangement relationship of the measurement target model 23, the measuring instrument model 24, and the measurement range model 25 corresponding to the real space at that time can be displayed in the virtual space, the setting load on the user is reduced.
- An image (virtual space image) of the virtual space within the measurement range (view volume) indicated by the measurement range model, as observed from a viewpoint having the position and orientation of the measuring instrument model, may also be generated and displayed on the GUI. Since a technique for generating an image of a space within a prescribed view volume, viewed from a viewpoint having a prescribed position and orientation, is well known, a detailed description thereof is omitted.
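For concreteness, a minimal pinhole-projection sketch of such a "view from the instrument" is shown below; the focal length and image center are illustrative assumptions, not parameters from the patent.

```python
def project(point_mm, focal_px=600.0, cx=320.0, cy=240.0):
    """Project a point in the instrument frame onto an assumed image plane."""
    x, y, z = point_mm
    if z <= 0.0:
        return None  # behind the viewpoint; not drawn
    return (focal_px * x / z + cx, focal_px * y / z + cy)

# A point 400 mm in front of the instrument, 100 mm off-axis:
print(project((100.0, 0.0, 400.0)))  # (470.0, 240.0)
```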
- In this case, the GUI of FIG. 11 is displayed instead of the GUI of FIG. 2.
- an image 29 that is visible when the space in the measurement range model 25 is observed from the position and orientation of the measurement device model 24 is displayed.
- the display mode of the image 29 is not limited to the above example, and various display modes can be considered.
- the image 29 is displayed in the display area 17, but may be displayed in a different place.
- In another configuration, the GUI of FIG. 12 is displayed instead of the GUI of FIG. 2.
- the relative position column 30 displays a relative position between a specified position in the measurement range model 25 and a specified position in the measurement target model 23.
- For example, the X coordinate position “3.2”, the Y coordinate position “2.1”, and the Z coordinate position “1.2” are displayed.
- both the “specified position in the measurement range model 25” and the “specified position in the measurement target model 23” are specified in advance, and the specified positions are registered in the storage unit 6.
- the position of the measurement range model corresponding to the center position of the bottom surface of the truncated pyramid represented by the measurement range 4 and the position of the measurement target model corresponding to the center position of the bottom surface of the pallet of the measurement target 3 are specified in advance.
- the calculation unit 9 obtains a relative position between the specified position in the measurement range model 25 and the specified position in the measurement target model 23, and displays the obtained position in the column 30.
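The column 30 readout then reduces to a vector difference between the two designated points, both expressed in the reference frame. Below is a sketch with coordinates chosen, for illustration only, to reproduce the displayed values.

```python
import numpy as np

# Designated points in the reference frame (illustrative coordinates):
p_range_far_center = np.array([90.4, 252.3, 731.2])  # frustum far-face center
p_pallet_bottom = np.array([93.6, 254.4, 732.4])     # pallet bottom center

relative = p_pallet_bottom - p_range_far_center
print(relative)  # approximately [3.2 2.1 1.2]; adjust until near zero
```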
- the user moves the measuring instrument model and the measurement range model while watching the numerical value displayed in the column 30 so that the numerical value approaches zero.
- In this way, the measuring instrument model and the measurement range model can be moved so that the far end of the measurement range 4 and the pallet bottom surface match.
- As information indicating the relative positional relationship between the measurement target model and the measurement range model, other information may be displayed in addition to, or in place of, the relative position.
- a relative posture may be displayed in addition to the relative position.
- the position of the measurement range model corresponding to the center coordinates of the far end plane of the measurement range 4 and the position of the measurement target model corresponding to the center coordinates of the bottom surface of the pallet of the measurement target 3 are specified.
- the position of the measurement range model corresponding to the barycenter coordinates of the measurement range 4 and the position of the measurement target model corresponding to the barycenter coordinates of the measurement target 3 may be specified.
- Alternatively, the position of the measuring instrument model corresponding to the center coordinates of the bottom surface of the measuring instrument 2 and the position of the measurement target model corresponding to the center coordinates of the pallet bottom surface of the measurement target 3 may be specified, the distance between them may be calculated, and that distance may be displayed in the column 30 instead.
- In the above example, the relative relationship between the measurement target model 23 and the measurement range model 25 is indicated by numerical values; alternatively, the position of the measurement target within the measurement range may be indicated by color information of the model.
- In that case, the GUI of FIG. 13 is displayed instead of the GUI of FIG. 2.
- A measurement target model 31 is displayed that indicates, by color, in which region of the measurement range model 25 each part of the measurement target model 23 is located.
- The calculation unit 9 determines, for each element obtained by dividing the measurement target model 31 into prescribed units (for example, units of a certain volume), at which position in the measurement range model 25 the element is located, and colors the element according to the determined position.
- For example, among the elements constituting the measurement target model 31, elements near the far end of the measurement range model 25 (far as viewed from the measuring instrument model 24) are drawn in a dark color, and elements at positions intermediate between the near end and the far end are drawn in a light color.
- Here, the purpose of the adjustment is to match the far end of the measurement range 4 with the pallet bottom surface, so the purpose is achieved by adjusting until the color of the bottom surface of the measurement target model 31 becomes dark, as shown in FIG. 13.
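One depth-to-shade mapping consistent with this description is sketched below: an element's shade is driven by where it sits between the near end (300 mm) and the far end (500 mm) of the measurement range. The linear mapping is an assumption made here for illustration.

```python
import numpy as np

def shade(z_mm, z_near=300.0, z_far=500.0):
    """Return 0.0 (light, near end) .. 1.0 (dark, far end) for an element
    at depth z_mm in the instrument frame."""
    return float(np.clip((z_mm - z_near) / (z_far - z_near), 0.0, 1.0))

element_depths = [310.0, 400.0, 495.0]     # illustrative element positions
print([shade(z) for z in element_depths])  # [0.05, 0.5, 0.975]
```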
- Alternatively, a measurement target model indicating by color how far each part of the measurement target model 23 is from a specified position in the measuring instrument model 24 may be displayed.
- There are various modes of coloring: positions may be represented by gradations of light and shade as in the above example, or by gradations of different colors. Further, for example, only elements of the measurement target model at a specific position, such as the far-end plane of the predetermined area, may be colored. Coloring may also be used to emphasize elements of the measurement target model existing outside the measurement range model 25.
- the relative relationship between the measurement target model and the measuring instrument model or the measurement range model can be intuitively grasped by numerical values or coloring.
- the adjustment can be performed efficiently and accurately.
- In this embodiment, the GUI of FIG. 14 is displayed instead of the GUI of FIG. 2.
- the movable section 5 has a movable range and a posture restriction in the real space.
- condition information that defines the range of positions and postures that the movable unit 5 can take is created in advance and registered in the storage unit 6.
- The calculation unit 9 determines whether the position and posture of the movable unit 5 displayed in the column 22 are within the range of positions and postures indicated by the condition information, and if not, displays a warning on the display screen of the display unit 7.
- In the GUI of FIG. 14, a message 32 “out of robot posture restriction range” is displayed as a warning, notifying the user that the posture of the movable unit 5 displayed in the column 22 exceeds the posture restriction range. Thus, the user can know whether the arrangement relationship in the virtual space can be realized in the real space without actually trying to move the movable unit 5 in the real space.
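The check behind this warning can be sketched as a per-axis range test of the column 22 pose against the pre-registered condition information. The limit values below are illustrative, not from the patent.

```python
# Assumed condition information: allowed ranges per pose component.
LIMITS = {
    "x": (-500.0, 500.0), "y": (-500.0, 500.0), "z": (0.0, 800.0),     # mm
    "rx": (-180.0, 180.0), "ry": (-90.0, 90.0), "rz": (-180.0, 180.0)  # deg
}

def violated_axes(pose):
    """Return the pose components that fall outside the condition info."""
    return [axis for axis, (lo, hi) in LIMITS.items()
            if not lo <= pose[axis] <= hi]

pose = {"x": 87.2, "y": 250.2, "z": 230.0, "rx": 178.0, "ry": 95.0, "rz": 0.4}
if violated_axes(pose):
    print("out of robot posture restriction range")  # cf. message 32
```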
- Alternatively, the region (prohibited area) for the measuring instrument model and the measurement range model that corresponds to the range of positions and orientations the movable section 5 cannot take may be displayed semi-transparently, thereby notifying the user of an area into which the models should not be moved.
- a message may be displayed on the display screen of the display unit 7 when the measuring instrument model or the measurement range model moves to the prohibited area.
- the calculation unit 9 determines interference between the measuring instrument model and the measurement target model, and displays a warning (for example, a message indicating that there is interference) on the display screen of the display unit 7 if there is interference.
- When such another model is displayed, the calculation unit 9 likewise performs an interference determination between that model and the measurement target model, and if there is interference, displays a warning (for example, a message indicating that there is interference) on the display screen of the display unit 7.
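As a minimal illustration of such an interference determination, the sketch below reduces each model to an axis-aligned bounding box and tests for overlap. A real check would use the full meshes; the boxes and their coordinates are simplifying assumptions.

```python
import numpy as np

def boxes_intersect(min_a, max_a, min_b, max_b):
    """Axis-aligned bounding boxes overlap iff they overlap on every axis."""
    return bool(np.all(np.asarray(max_a) >= np.asarray(min_b)) and
                np.all(np.asarray(max_b) >= np.asarray(min_a)))

instrument_box = ([50.0, 200.0, 600.0], [130.0, 300.0, 700.0])  # illustrative
target_box = ([60.0, 220.0, 680.0], [300.0, 500.0, 900.0])

if boxes_intersect(*instrument_box, *target_box):
    print("warning: interference between models")
```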
- the display form of the warning is not limited to a specific display form.
- For example, when the position is not within the range of positions indicated by the condition information, the position displayed in the column 22 may be highlighted, such as by being displayed in red; when the posture is not within the range of postures indicated by the condition information, the posture displayed in the column 22 may be highlighted, for example, by being displayed in red.
- In this way, the user can determine, without moving the movable unit 5 in the real space, whether a movement can be realized in the real space and whether it would cause interference, which contributes to setting efficiency and to preventing physical damage to the devices.
- the position and orientation of the movable unit 5 with respect to the plurality of measurement targets 3 are obtained and displayed.
- In this embodiment, the GUI shown in FIG. 15 is displayed instead of the GUI shown in FIG. 2.
- The measurement target models 23a, 23b, and 23c corresponding to the three measurement targets 3 (target 1, target 2, target 3) are displayed.
- the user sets the position and orientation of the measuring instrument model 24 and the measurement range model 25 with respect to one of the measurement target models 23a, 23b, and 23c (here, as an example, the measurement target model 23a) as in the first embodiment.
- the user operates the input unit 8 and the display screen of the display unit 7 to input the number and arrangement information of the measurement targets 3 in the measurement target arrangement column 33.
- The arrangement information includes the number of measurement targets 3 (3 in FIG. 15), the distance between adjacent measurement targets 3 in the X-axis direction (30 mm in FIG. 15), the distance in the Y-axis direction (−200 mm in FIG. 15), and the distance in the Z-axis direction (0 mm in FIG. 15).
- the measurement target model 23b is located at a position shifted by 30 mm in the X axis direction, ⁇ 200 mm in the Y axis direction, and 0 mm in the Z axis direction from the position of the measurement target model 23a.
- the measurement target model 23c is located at a position shifted by 30 mm in the X-axis direction, ⁇ 200 mm in the Y-axis direction, and 0 mm in the Z-axis direction from the position of the measurement target model 23b.
- the posture is assumed to be the same for all measurement target models.
- The calculation unit 9 obtains the relative position and posture between the position and orientation of the movable unit 5 obtained for the measurement target model 23a and the position and orientation of the measurement target model 23a; this relative position and posture is referred to as the relative position and posture T.
- the calculation unit 9 obtains a position and orientation such that the relative position and orientation with respect to the position and orientation of the measurement target model 23b becomes the relative position and orientation T, as “the position and orientation of the movable unit 5 corresponding to the measurement target model 23b”.
- the calculation unit 9 obtains a position and orientation such that the relative position and orientation relative to the position and orientation of the measurement target model 23c is the relative position and orientation T, as "the position and orientation of the movable unit 5 corresponding to the measurement target model 23c". Since the position and orientation of the movable unit 5 corresponding to each of the target 1, the target 2, and the target 3 are obtained in this manner, the calculation unit 9 displays the obtained position and orientation of the movable unit 5 in the column 22. This makes it possible to obtain the position and orientation of the movable part 5 when measuring each individual model without adjusting the relative relationship between the measuring instrument model 24 and the measurement range model 25 with respect to the individual measurement target models.
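The bookkeeping for the additional targets can be sketched as follows: the adjustment made for target 1 fixes the relative pose T, which is then applied to each shifted target pose. Matrix conventions and all values are illustrative assumptions.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform, translation only for brevity."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

T_base_target1 = translation(0.0, 300.0, 100.0)    # adjusted by the user
T_base_movable1 = translation(87.2, 250.2, 230.0)  # from the column 22

# Relative pose T between target 1 and the movable unit:
T_rel = np.linalg.inv(T_base_target1) @ T_base_movable1

# Targets 2 and 3 are offset by the spacing entered in the column 33:
spacing = translation(30.0, -200.0, 0.0)
T_base_target2 = T_base_target1 @ spacing
T_base_movable2 = T_base_target2 @ T_rel  # movable unit pose for target 2
```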
- the position and orientation of the movable unit 5 corresponding to the measurement target models 23a, 23b, and 23c in this order are obtained, but the order is not limited to this order.
- the procedure for obtaining the position and orientation of the movable unit 5 corresponding to the plurality of measurement targets 3 is not limited to a specific procedure.
- For example, the number and arrangement intervals of the measurement target models may first be input in the column 33 to determine the positions and orientations of the other measurement target models, and the position and orientation of the measuring instrument model 24 and the measurement range model 25 with respect to any one measurement target model may be adjusted afterward.
- In the above example, the information entered in the column 33 is only the number of measurement target models and their arrangement intervals, but other information may be input.
- information specifying the posture component of each measurement target model may be input.
- the column 21 may display the position and orientation of each measurement target model.
- Although the respective measurement targets 3 have been described as being the same type of object, they may be different types of objects.
- the measuring device 2 is a three-dimensional shape measuring device, but may be another measuring device.
- For example, the measuring device 2 may be a measuring device that calculates the distance from the measuring device 2 to a target, or a device that extracts a contour from a captured image and inspects the measurement target 3. That is, various forms of the measuring device 2 can be assumed, each having a measurement range defined by its performance and specifications.
- the measurement target 3 is a small work stacked in bulk or a pallet for storing the small work, but is not limited thereto.
- When there is no pallet, it is also possible to treat the approximate entire space in which the piled small workpieces exist as the measurement target 3.
- A specific range may also be set as the measurement target 3. That is, the technique does not depend on the size, material, or the like of the measurement target 3.
- the measurement range 4 is a truncated quadrangular pyramid-shaped region in which the measurement device 2 can achieve a constant measurement performance, but is not limited thereto.
- For example, the measurement range may indicate a working-distance region that is generally desirable as an imaging range of the measuring instrument, without indicating measurement performance.
- Various shapes can be considered for the shape of the measurement range 4.
- For example, a truncated cone, a pyramid such as a quadrangular pyramid, a cone, a rectangular parallelepiped, or a cube may be considered.
- the size of the measurement range 4 can be variously changed depending on various conditions such as the performance of the measurement device 2.
- the movable portion 5 is the tip of the tool fixedly connected to the flange portion of the robot arm 12 for gripping the measurement target 3, but is not limited thereto.
- For example, a part of the robot arm, such as the flange portion of the robot arm 12, may be the movable portion 5.
- the movable part 5 may be any part having a specific relative relationship with the measuring instrument 2 irrespective of a change in position and orientation due to its movement, and various parts can be used within the definition.
- the configuration of the image processing apparatus 1 is not limited to the configuration shown in FIG.
- the storage unit 6 is provided inside the image processing apparatus 1, but may be provided outside the image processing apparatus 1.
- For example, the storage unit 6 may be incorporated in the robot controller 14, or in an external device that can communicate with the image processing apparatus 1. The same applies to the arithmetic unit 9.
- Various GUI configurations have been described above, but these are merely examples, and various other configurations are possible.
- For example, the position and orientation of the movable part 5 calculated by the arithmetic unit 9 are displayed as numerical values, but the position and orientation may instead be represented visually.
- a three-dimensional model of the mounter to which the measuring device 2 and the movable part 5 are fixed and the robot arm 12 are also displayed in the display area 17, and the positions and orientations of the models are operated with the operation of the measuring device model 24 and the measurement range model 25. May change.
- interference between the three-dimensional model of the mounter or the robot arm 12 and the measurement target model 23 or the measuring instrument model 24 may be determined, and if there is interference, a warning (for example, a message indicating that there is interference) may be displayed on the display screen of the display unit 7; a minimal interference test is sketched below.
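A minimal sketch of such an interference test, under the simplifying assumption that each three-dimensional model is approximated by an axis-aligned bounding box (AABB) in the reference coordinate system; the names and the warning text are illustrative:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    min_xyz: tuple  # (x, y, z) minimum corner, in mm
    max_xyz: tuple  # (x, y, z) maximum corner, in mm

def overlap(a: AABB, b: AABB) -> bool:
    # Two boxes interfere only if their intervals overlap on all three axes.
    return all(a.min_xyz[i] <= b.max_xyz[i] and b.min_xyz[i] <= a.max_xyz[i]
               for i in range(3))

def warn_on_interference(mounter_box: AABB, target_box: AABB) -> None:
    if overlap(mounter_box, target_box):
        print("Warning: interference between the mounter model and the measurement target model.")
```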
- the display of the measuring instrument model 24 can be omitted.
- the display area 17 shows the measurement target model 23 and the measurement range model 25, and the relative relationship between them is adjusted.
- even in this case, the relative position and orientation between the measurement device 2 and the measurement range 4 are constant, so it is possible to calculate the position and orientation of the movable unit 5 as in the first embodiment.
- an imaging button or a display area for confirmation may be added to the GUI.
- after operating the programming pendant 15 to move the movable unit 5 in the real space, the user operates the input unit 8 or the display screen of the display unit 7 to press the imaging button. The measuring device 2 then performs imaging and sends the captured image to the image processing apparatus 1, and the arithmetic unit 9 displays the captured image in the confirmation display area. Thereby, without switching to another screen, the user can confirm whether the adjustment has been performed so that the image captured by the measuring device 2 becomes a desired captured image (the measurement target 3 is captured in a desired state).
- instead of displaying the position and orientation of the movable unit 5 in the column 22, the arithmetic unit 9 may output them to the robot controller 14 or the programming pendant 15.
- the position and orientation of the measuring instrument model 24 and the measurement range model 25 may be displayed on the GUI. In that case, the user may change the position and orientation of the measuring instrument model 24 and the measurement range model 25 by operating the input unit 8 or the display screen of the display unit 7 to change the displayed values.
- the method of expressing the position and orientation displayed on the GUI is not limited to a specific expression method; various expression methods such as a Euler angle expression and a rotation angle expression can be used.
- an example in which the six degrees of freedom of the position and orientation are displayed in individual boxes has been presented; alternatively, all six values may be displayed in one box, separated by commas, as sketched below.
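As an illustration of reading such a single comma-separated box, a minimal parsing sketch (the exact text format shown is an assumption):

```python
def parse_pose(text: str):
    # e.g. "87.2, 250.2, 230.0, 178.0, 1.2, 0.4" -> (x, y, z, rx, ry, rz)
    values = [float(v) for v in text.split(",")]
    if len(values) != 6:
        raise ValueError("expected six comma-separated values")
    return tuple(values)
```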
- the method of selecting a model to be operated on the display screen of the display unit 7 is not limited to a specific method.
- the model may be selected by directly specifying the display area of the model to be operated, or a radio button corresponding to each model may be provided outside the display area 17 and the model corresponding to the radio button selected by the user may be selected as the operation target model.
- the image of the virtual space displayed in the display area 17 may be enlarged or reduced by rotating a mouse wheel when a mouse is used as the input unit 8.
- the image may be simply enlarged or reduced, or the range included in the image of the virtual space may be changed by moving the viewpoint in the virtual space along the line-of-sight direction or in the opposite direction, as sketched below.
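The difference between the two behaviors is that a simple zoom scales the rendered image, whereas moving the viewpoint changes what the virtual-space image contains. A sketch of the latter, assuming the viewpoint position and gaze direction are available as vectors (the step gain is an assumption):

```python
import numpy as np

def dolly_viewpoint(position, gaze_dir, wheel_steps, step_mm=20.0):
    # Move the viewpoint along its line of sight: positive wheel steps move
    # forward, negative steps move backward.
    d = np.asarray(gaze_dir, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(position, dtype=float) + wheel_steps * step_mm * d
```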
- the viewpoint may be moved in the virtual space by moving the mouse while right-clicking.
- the display unit 7 may be a display device included in another device.
- the display unit 7 may be a display device included in the programming pendant 15.
- the display unit 7 may be a device separate from the image processing device 1 or a device integrated with the image processing device 1.
- the information input form is not limited to the above input form.
- information that would be input using a mouse or a keyboard may instead be supplied by, for example, command communication from the robot controller 14; the input form is arbitrary.
- the order of operations performed by the user is shown specifically, but this order is merely an example, and the present invention is not limited to it.
- the display of the measurement target model 23 and the display of the measuring instrument model 24 may be in the reverse order instead of the order described in the above example.
- the calculation unit 9 calculates the position and orientation of the movable unit 5 and displays the calculated position and orientation in the column 22.
- the calculation and display of the position and orientation of the movable unit 5 may be triggered by the user inputting an instruction indicating completion of the adjustment by operating the input unit 8 or the display screen of the display unit 7.
- when the position and orientation are determined in advance, such as when the measurement target 3 is installed on a jig in the real space, the position and orientation may be stored in the storage unit 6 or the like beforehand, and it is then unnecessary to set the position and orientation of the measurement target model. In this case, the measurement target model 23 is displayed in the display area 17 based on the stored position and orientation.
- the model imitates the geometric shape of the original real object, but the present invention is not limited to this as long as the original real object can be identified; the model is not necessarily limited to one imitating the geometric shape. If a model has been created in advance by CAD or the like, the previously created model may be obtained and used.
- the configurations and operation methods of the GUIs described in each of the above embodiments and modifications are merely examples, and are not intended to limit the configurations and operation methods.
- a measuring device including the above-described image processing apparatus 1 can be used while being supported by a certain support member.
- as an example, a control system provided in and used with a robot arm 5300 (gripping device) as shown in FIG. 17 will be described.
- the measuring device 5100 projects pattern light onto the test object 5210 placed on the support base 5350 and captures an image of the test object to obtain image data.
- the control unit of the measuring device 5100, or the control unit 5310 that has acquired the image data from the control unit of the measuring device 5100, obtains the position and orientation of the test object 5210, and the control unit 5310 acquires the information on the obtained position and orientation.
- the control unit 5310 controls the robot arm 5300 by sending a drive command to the robot arm 5300 based on the information on the position and the posture.
- the robot arm 5300 holds the test object 5210 with a robot hand or the like (grasping portion) at its tip and moves it by, for example, translation or rotation. Further, by attaching the test object 5210 to another component with the robot arm 5300, an article composed of a plurality of components, for example an electronic circuit board or a machine, can be manufactured. In addition, an article can be manufactured by processing the moved test object 5210.
- the control unit 5310 includes an arithmetic device such as a CPU and a storage device such as a memory. Note that a control unit for controlling the robot may be provided outside the control unit 5310.
- the measurement data measured by the measurement device 5100 and the obtained image may be displayed on a display unit 5320 such as a display.
- the present invention can also be realized by processing in which a program that realizes one or more functions of the above-described embodiments is supplied to a system or an apparatus via a network or a storage medium, and one or more processors in a computer of the system or the apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that realizes one or more functions.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
In the present invention, a first model indicating a target of measurement by a measurement instrument and a second model indicating a measurement range of the measurement instrument are displayed on a display screen. The position and orientation of the second model are changed in accordance with a user operation. The position and orientation of a region having a fixed position and orientation relative to the measurement instrument in the state where the position and orientation of the second model have been changed are obtained and output.
Description
The present invention relates to measurement technology.
Before capturing an image of an object using an imaging device, it is necessary to set the positional relationship between the object and the imaging device in advance. This is because the field of view of the imaging device is defined by various factors, including the focal length of the lens of the imaging device and the size of the image sensor such as a CCD, and the target object needs to be within that region. There is a technique for making this setting on a simulation device.
Patent Literature 1 discloses a technique that aims to know in advance by simulation what kind of area can be photographed before installing a camera in a real space. In this technique, an area image of an area to be a camera installation area is displayed, and a camera indicator is arranged and displayed on the area image. Then, an imaging range and a blind spot are presented based on information such as the type of camera and the angle of view.
According to Patent Literature 2, when viewing a video at a location distant from the shooting site of a video camera, it is unclear from where and toward where the video camera is looking, which makes it difficult to grasp the shooting site from the video and to give instructions to the video camera. To address this, in Patent Literature 2, a real world including the position of the video camera and its viewable area is prepared in advance in a virtual space, and the position, line of sight, and angle-of-view area of the real-world video camera are displayed three-dimensionally in real time in the virtual space, thereby solving this problem.
The purpose of the technology disclosed in Patent Literatures 1 and 2 is to know where in a real space an imaging device is to be installed to place a desired target or a desired area in a visual field.
On the other hand, for example, in a three-dimensional measuring device having an imaging device, there are situations in which a sufficient effect cannot be obtained by the technology described in the background art. In general, in a measuring instrument, a predetermined area suitable for measurement is defined separately from the visual field area, and that area is naturally invisible in real space. There may even be a part of this predetermined area with still higher accuracy, and it is difficult to adjust the setup so that the measurement target comes to that area or to a specific part within it.
An example of the actual adjustment procedure is as follows. First, the user moves the measuring instrument to an approximate position where the measurement target enters an area suitable for measurement. Then, in order to know the actual relative positional relationship between the measuring instrument and the measurement target, the distance and the like are measured with a scale or the like. The measuring instrument is then moved based on the result, and the check with the scale is performed again. Thereafter, the image photographed at that position of the measuring instrument is checked, and if, for example, the target is not within the angle of view, adjustment and checking are performed yet again.
The above-described three-dimensional measuring device may be used, for example, fixed to a robot or the like, in which case there are further difficulties and labor. First, the user does not control the position and orientation of the measuring device itself, but controls it by operating, with a controller or the like, the position and orientation of the robot on which the measuring device is mounted. That is, even if the position and orientation of the measuring instrument are known, the position and orientation of the robot must be controlled so that the measuring instrument itself attains that position and orientation. Further, the operation of the robot is often performed outside a safety fence, and checking with a scale after operating the robot requires entering and leaving the safety fence, which may also lower work efficiency. As described above, in setting the positional relationship between the measuring instrument and the measurement target, the technique described in the background art cannot provide a sufficient effect. The present invention provides a technique for quickly and easily setting the position and orientation of a measuring instrument for measuring a measurement target.
One aspect of the present invention includes a display unit that displays, on a display screen, a first model representing an object to be measured by a measuring instrument and a second model representing a measurement range of the measuring instrument; a changing unit that changes the position and orientation of the second model in accordance with a user operation; and an output unit that outputs the position and orientation of a part whose relative position and orientation with respect to the measuring instrument are constant, in a state where the position and orientation of the second model have been changed.
According to the configuration of the present invention, it is possible to quickly and easily set the position and orientation of the measuring instrument for measuring the measurement target.
Other features and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings. In the accompanying drawings, the same or similar components are denoted by the same reference numerals.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are included in and constitute a part of the specification, illustrate embodiments of the invention, and, together with the description, serve to explain the principles of the invention.
FIG. 1 is a diagram illustrating a configuration example of a system. FIGS. 2 to 15 are views each showing a display example of a GUI. FIG. 16 is a flowchart of a process performed by the image processing apparatus 1. FIG. 17 is a diagram illustrating a control system including a measurement device and a robot arm.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. The embodiments described below show examples of concretely implementing the present invention, and each is one specific embodiment of the configurations described in the claims.
[First Embodiment]
First, a configuration example of a system according to the present embodiment will be described with reference to FIG. 1. The measuring device 2 is a three-dimensional shape measuring device for measuring three-dimensional information relating to the measurement target (target object) 3 mounted on the mounting table 10, such as its position, posture, and shape. The measuring device 2 may be, for example, of an active type that includes a projection unit that projects pattern light onto the measurement target 3 and an imaging unit that captures the measurement target 3 onto which the pattern light is projected. The measuring device 2 may also be of a passive type having a plurality of imaging devices that image the measurement target 3 from different positions and orientations.
Various objects (objects of various sizes and materials) can serve as the measurement target 3. If the measurement target 3 is relatively small, it may be placed in a bulk-stacked state; if relatively large, a single object may be placed alone. When the measurement targets 3 are placed in bulk, in many cases they are stored in a container such as a pallet. In the present embodiment, the measurement target 3 includes not only the object to be measured by the measuring device 2 but also the container in which that object is stored. In the present embodiment, the measurement target 3 is assumed to be small works stacked in bulk on a pallet. As shown in FIG. 1, it is assumed that the measurement target 3 is mounted on the installation table 10 and is installed obliquely with respect to the direction of gravity.
The measuring instrument 2 has a measurement range 4 defined by its performance and specifications. For example, when the measuring device 2 includes an imaging device, the desired measurement cannot be performed (the desired measurement result cannot be obtained) unless at least a part of the measurement target 3 is included in the field of view of the imaging device. Moreover, even if the measurement target 3 is within the field of view of the imaging device, the relative position and orientation between the measuring device 2 and the measurement target 3 are not necessarily good (that is, the measurement accuracy of the measuring device 2 for the measurement target 3 is not necessarily sufficient). In other words, even when the measurement target 3 is within the field of view, there is a position and orientation of the imaging device that is suitable for measurement from the viewpoint of defocus and aberration. For example, in the above-mentioned active-type three-dimensional shape measuring device, there is also a favorable position and orientation of the measurement target 3 determined by the performance and specifications of the optical system that projects the pattern. In such three-dimensional shape measuring instruments, a region that is favorable in terms of recognition performance and accuracy is often defined. In the present embodiment, a truncated quadrangular pyramid whose bottom surface is a 500 × 500 mm area at a distance of 500 mm from the measuring device 2 and whose top surface is a 300 × 300 mm area at a distance of 300 mm from the measuring device 2 is defined as the measurement range 4. In the present embodiment, the measurement range 4 is assumed to be an area in which the measuring instrument 2 can achieve a certain measurement performance. That is, the relative position and orientation (positional relationship) between the measuring device 2 and the measurement range 4 is constant unless the parameters of the measuring device 2 (such as the focal length) are changed.
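For illustration, the following sketch tests whether a point lies inside this truncated-pyramid measurement range, assuming a coordinate frame fixed to the measuring instrument with the z axis along its viewing direction (this frame choice is an assumption):

```python
def in_measurement_range(p, near=300.0, far=500.0, near_half=150.0, far_half=250.0):
    # p = (x, y, z) in the instrument frame, in mm; the cross-section is
    # 300 x 300 mm at z = 300 mm and 500 x 500 mm at z = 500 mm.
    x, y, z = p
    if not (near <= z <= far):
        return False
    # The half-width of the square cross-section grows linearly with depth.
    t = (z - near) / (far - near)
    half = near_half + t * (far_half - near_half)
    return abs(x) <= half and abs(y) <= half
```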
In order to measure the measurement target 3 with high accuracy, the user adjusts the setup so that the measurement target 3 matches a specific portion of the measurement range 4. Various bulk-stacked states of the workpieces of the measurement target 3 are conceivable, and the height of the topmost stacked workpieces also varies. In that case, the user tries, for example, to make the bottom surface of the pallet containing the measurement target 3 coincide with the far end of the measurement range 4, that is, the above-described 500 × 500 mm area. With such a setting, the workpieces stacked on the pallet enter the truncated quadrangular pyramid-shaped region, and even when the stack is high, the workpieces on its surface remain within the measurement range 4 as long as they are located more than 300 mm away from the measuring instrument 2.
In the present embodiment, as shown in FIG. 1, the measuring device 2 is attached (fixed) to the robot arm 12 via the mounter 11 or the like. The robot arm 12 is fixed to a robot base 13, and the positions and orientations of specific parts of the robot arm 12, such as the movable part 5, are controlled by a robot controller 14. Commands related to this control are transmitted from, for example, the programming pendant 15. When the user operates the programming pendant 15 to input an instruction to change the position or orientation of the movable unit 5, the instruction is transmitted to the robot controller 14, which changes the position or orientation of the movable unit 5 accordingly. Unless otherwise specified, the positions and orientations handled below are those in a coordinate system (reference coordinate system) based on the position and orientation of the robot base 13. The reference coordinate system is, for example, a coordinate system whose origin is the position of the robot base 13 and whose X, Y, and Z axes are three mutually orthogonal axes at that origin. An example of the movable part 5 is the flange portion at the tip of the robot arm 12. Alternatively, when a gripping mechanism is fixedly connected to the flange portion as a tool for gripping the measurement target 3 as shown in FIG. 1, the tip of that tool may serve as the movable part 5. In the present embodiment, the tool tip is the movable part 5.
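Poses in the reference coordinate system are handled below as a position (X, Y, Z) plus rotation angles (Rx, Ry, Rz). A minimal sketch converting such a pose into a 4 × 4 homogeneous transform; the rotation order Rz·Ry·Rx is an assumption, since the disclosure does not fix an Euler convention:

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    # Translation in mm, rotations about the X, Y, Z axes in degrees.
    rx, ry, rz = np.radians([rx, ry, rz])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # assumed rotation order
    T[:3, 3] = [x, y, z]
    return T
```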
Here, since the measuring instrument 2 and the movable part 5 are both fixedly attached to the robot arm 12, the relative position and orientation between the measuring instrument 2 and the movable part 5 are constant irrespective of the position and orientation of the robot arm 12. The "relative position and orientation between the measuring instrument 2 and the movable part 5" can be obtained based on design information of the measuring instrument 2, the mounter 11, the tool, and the like. A method is also known in which a marker or the like is photographed with the measuring device 2 from a plurality of viewpoints and the relative position and orientation are obtained by arithmetic processing.
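A sketch of how this constant relative transform could be computed once from known poses, assuming both are available as 4 × 4 homogeneous matrices in the robot base frame:

```python
import numpy as np

def relative_transform(T_base_instrument, T_base_tool):
    # Fixed transform from the measuring instrument to the movable part (tool tip):
    # T_instrument_tool = inv(T_base_instrument) @ T_base_tool.
    # Because both are rigidly mounted on the arm, this matrix does not change
    # as the arm moves, so it can be stored once (e.g. in the storage unit 6).
    return np.linalg.inv(np.asarray(T_base_instrument)) @ np.asarray(T_base_tool)
```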
It is assumed that the “relative position and orientation between the measuring device 2 and the movable unit 5” is stored in the storage unit 6 of the image processing apparatus 1. The image processing apparatus 1 further includes a calculation unit 9, a display unit 7, and an input unit 8.
The arithmetic unit 9 includes one or more CPUs, processors, arithmetic processing circuits, and the like. The arithmetic unit 9 controls the overall operation of the image processing apparatus 1 by executing processing using computer programs and data stored in the storage unit 6, and executes or controls each process described below as being performed by the image processing apparatus 1. For example, the arithmetic unit 9 executes or controls display control of the GUI (graphical user interface) described below and the processes corresponding to operation inputs to the GUI.
The storage unit 6 stores, in addition to the above "relative position and orientation between the measuring device 2 and the movable unit 5", information that the image processing apparatus 1 handles as known information in the following description. The storage unit 6 also stores computer programs and data for causing the arithmetic unit 9 to execute or control each process described below as being performed by the image processing apparatus 1.
The input unit 8 is configured by a user interface such as a keyboard, a mouse, or a touch panel screen, and various instructions can be input to the arithmetic unit 9 by user operation.
The display unit 7 is configured by a liquid crystal screen, a touch panel screen, or the like, and can display a processing result of the calculation unit 9 as an image, characters, or the like. For example, the display unit 7 can display a GUI described below.
Next, the processing performed by the image processing apparatus 1 to obtain a position and orientation of the movable unit 5 such that the measurement target 3 falls within a desired area of the measurement range 4 of the measuring device 2 will be described with reference to the flowchart of FIG. 16.
Here, it is assumed that the GUI illustrated in FIG. 2 is displayed on the display screen of the display unit 7 when the processing according to the flowchart in FIG. 16 is started. The display area 17 is an area for displaying an image of a “virtual space defined by a coordinate system (virtual coordinate system) that matches the above-described reference coordinate system” viewed from a viewpoint having a specified position and orientation. In the display area 17, an icon 18 representing each axis (X axis, Y axis, Z axis) of the virtual coordinate system is displayed.
In step S1601, the calculation unit 9 determines whether or not the “measurement target display” button 19 has been instructed by the user operating the display screen of the input unit 8 or the display unit 7 (already instructed). As a result of this determination, if the “measurement object display” button 19 has been instructed, the process proceeds to step S1602. If the “measurement object display” button 19 has not been instructed, the process proceeds to step S1613.
In step S1602, the calculation unit 9 generates a measurement target model, which is a three-dimensional model imitating the geometric shape of the measurement target 3, and arranges the generated measurement target model in the virtual space with the measurement target position and orientation set as the position and orientation of that model. A prescribed position and orientation is set as the initial value of the measurement target position and orientation, which is updated in accordance with user operations in step S1605 described below. The data of the measurement target model is created in advance and registered in the storage unit 6, and the calculation unit 9 generates the measurement target model based on this data.
In step S1603, the calculation unit 9 generates an image of the “measurement target model arranged in step S1602” viewed from a viewpoint having a specified position and orientation, and displays the image in the display area 17. FIG. 3 shows a display example of the GUI in step S1603. As shown in FIG. 3, the display area 17 displays an image of the measurement target model 23 viewed from a viewpoint having a specified position and orientation. The measurement target position / posture is displayed in the measurement target position / posture column 21. In FIG. 3, an X coordinate position “0”, a Y coordinate position “300.0”, and a Z coordinate position “100.0” are displayed as measurement target positions (the unit is “mm”). Further, as measurement target postures (units are “degrees”), Rx (angle around the X axis in the virtual coordinate system) “0”, Ry (angle around the Y axis in the virtual coordinate system) “0”, Rz (virtual coordinates) (Angle around the Z axis in the system) “0” is displayed.
When there are a plurality of measurement targets 3, information on each measurement target 3 may be displayed as a list on the GUI, and the selection of the measurement target 3 whose model is to be displayed may be accepted. That is, the measurement target model of the measurement target 3 corresponding to the information selected by the user from the list via the input unit 8 or the display screen of the display unit 7, together with its measurement target position and orientation, may be displayed on the GUI as illustrated in FIG. 3.
In step S1604, the calculation unit 9 determines whether a change instruction for changing the measurement target position or orientation has been input. The user can input an instruction to change the measurement target position or orientation by operating the input unit 8 to press a key for changing the position or orientation, or by performing a touch operation, a drag operation, or the like on the display area 17. The user may also change the position and orientation of the measurement target model by operating the input unit 8 or the display screen of the display unit 7 to directly change the numerical values in the measurement target position and orientation column 21.
It is assumed that the user has previously obtained the approximate position and orientation of the measurement target 3 in the reference coordinate system in the real space. The "approximate position and orientation of the measurement target 3 in the reference coordinate system in the real space" may be obtained by measuring with a tape measure or the like in the real space, or estimated from the position and orientation of the installation table 10. Accordingly, the user operates, for example, the input unit 8 or the display screen of the display unit 7 to set this approximate position and orientation as the measurement target position and orientation.
If the result of this determination is that an instruction to change the measurement target position or measurement target orientation has been input, the process proceeds to step S1605; otherwise, the process proceeds to step S1606.
In step S1605, the calculation unit 9 updates the measurement target position and orientation in accordance with the change instruction. When the position and orientation of the measurement target model 23 are changed in the GUI of FIG. 3, an image of the measurement target model 23 arranged with the changed position and orientation is generated and displayed in the display area 17, as shown in FIG. 4.
A prescribed position and orientation is set as the initial value of the position and orientation of the above "viewpoint", and the user can change the position and orientation of the viewpoint by operating the input unit 8 or by performing a touch operation, a drag operation, or the like on the display area 17. When the position and orientation of the viewpoint are changed, the relative position and orientation of the viewpoint with respect to the virtual coordinate system change, so the icon 18 is updated accordingly. That is, the icon 18 is regenerated and displayed with the directions of the axes of the virtual coordinate system updated so as to represent the relative posture between the viewpoint and the virtual coordinate system.
In step S1606, the arithmetic unit 9 determines whether the “measurement device display” button 20 has been instructed by the user operating the display screen of the input unit 8 or the display unit 7 (already instructed). As a result of this determination, when the “measurement device display” button 20 is instructed, the process proceeds to step S1607, and when the “measurement device display” button 20 is not instructed, the process proceeds to step S1613.
In step S1607, the calculation unit 9 generates a measuring instrument model, which is a three-dimensional model imitating the geometric shape of the measuring instrument 2, and arranges the generated measuring instrument model in the virtual space with the measuring instrument position and orientation set as the position and orientation of that model. A prescribed position and orientation is set as the initial value of the measuring instrument position and orientation, which is updated in accordance with user operations in step S1610 described below. The data of the measuring instrument model is created in advance and registered in the storage unit 6, and the calculating unit 9 generates the measuring instrument model based on this data. The calculation unit 9 also generates a measurement range model, which is a three-dimensional model representing the measurement range 4 of the measuring device 2, and arranges it in the virtual space with the measurement range position and orientation obtained by adding "the relative position and orientation of the measurement range 4 with respect to the measuring device 2" to the measuring instrument position and orientation. That is, whatever the position and orientation of the measuring instrument model, the relative position and orientation between the measuring instrument model and the measurement range model do not change. This "relative position and orientation of the measurement range 4 with respect to the measuring device 2" is registered in the storage unit 6 in advance. The data of the measurement range model is likewise created in advance and registered in the storage unit 6, and the calculation unit 9 generates the measurement range model based on this data. Further, the calculation unit 9 obtains the position and orientation of the movable part 5 by adding the above "relative position and orientation between the measuring device 2 and the movable part 5" to the position and orientation of the measuring device 2.
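The pose bookkeeping of step S1607 amounts to composing rigid transforms. A sketch with all poses as 4 × 4 homogeneous matrices in the reference coordinate system (the matrix representation is an assumption; the disclosure only states that the poses are "added"):

```python
import numpy as np

def derived_poses(T_instrument, T_rel_range, T_rel_tool):
    # T_rel_range: stored constant "measurement range 4 relative to measuring device 2".
    # T_rel_tool:  stored constant "movable part 5 relative to measuring device 2".
    T_range = T_instrument @ T_rel_range  # pose of the measurement range model
    T_tool = T_instrument @ T_rel_tool    # pose of the movable part, shown in column 22
    return T_range, T_tool
```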
In step S1608, the arithmetic unit 9 generates an image of the measuring instrument model arranged in step S1607 as viewed from the above viewpoint and an image of the measurement range model arranged in step S1607 as viewed from the same viewpoint, and displays the generated images in the display area 17. FIG. 5 shows a display example of the GUI in step S1608. As shown in FIG. 5, in addition to the image of the measurement target model 23 viewed from the viewpoint, an image of the measuring instrument model 24 and an image of the measurement range model 25 are displayed in the display area 17. That is, an image of the virtual space in which the measurement target model, the measuring instrument model, and the measurement range model are arranged is generated as seen from the above viewpoint and displayed in the display area 17.
The position and orientation of the movable unit 5 obtained in step S1607 are displayed in the movable part position and orientation column 22. In FIG. 5, the X-coordinate position "87.2", the Y-coordinate position "250.2", and the Z-coordinate position "230.0" are displayed as the position of the movable part 5 (in mm). As the orientation of the movable part 5 (in degrees), Rx (the angle around the X axis in the virtual coordinate system) "178.0", Ry (the angle around the Y axis) "1.2", and Rz (the angle around the Z axis) "0.4" are displayed.
In the case where there are a plurality of measuring instruments 2, information on each measuring instrument 2 may be displayed as a list on the GUI, and the selection of the measuring instrument 2 whose model is to be displayed may be accepted. In other words, the measuring instrument model of the measuring instrument 2 corresponding to the information selected by the user from the list via the input unit 8 or the display screen of the display unit 7, and the measurement range model of its measurement range 4, may be displayed on the GUI as illustrated in FIG. 5.
In step S1609, the calculation unit 9 determines whether a change instruction for changing the position or orientation of the measuring instrument model or the measurement range model has been input by the user operating the input unit 8 or the display screen of the display unit 7.
When the measurement target model, the measuring instrument model, and the measurement range model are displayed, the user performs an operation (adjustment) of moving the measuring instrument model, the measurement range model, or the hand model so that part or all of the measurement target model fits inside the measurement range model. For example, suppose the user wants to align the pallet bottom of the measurement target 3 with the far end of the measurement range 4. In this case, the measuring instrument model and the measurement range model are first moved so that the bottom surface of the truncated pyramid of the measurement range model coincides with the pallet bottom surface. Next, the truncated quadrangular pyramid of the measurement range model is moved parallel to the pallet bottom surface and adjusted so that, for example, the center of the bottom surface of the pyramid approximately coincides with the center of the pallet bottom surface. The operations for changing the position and orientation of the measuring instrument model and the measurement range model are the same as those for changing the position and orientation of the measurement target model. For these operations, the measurement range model is desirably a model, such as a translucent model or a wireframe model, that does not hide the measurement target model even when the two overlap. The display form of each model is not limited to a specific one as long as the same purpose can be achieved.
If the result of this determination is that an instruction to change the position or orientation of the measuring instrument model or the measurement range model has been input, the process proceeds to step S1610; otherwise, the process proceeds to step S1613.
In step S1610, if there is an instruction to change the measuring instrument position or orientation, the arithmetic unit 9 updates the measuring instrument position or orientation in accordance with that instruction; if there is an instruction to change the measurement range position or orientation, it updates the measurement range position or orientation in accordance with that instruction.
When the measuring instrument position and orientation are updated, the arithmetic unit 9 updates the measurement range position and orientation to the value obtained by adding "the relative position and orientation of the measurement range 4 with respect to the measuring device 2" to the updated measuring instrument position and orientation.
Conversely, when the measurement range position and orientation are updated, the arithmetic unit 9 updates the measuring instrument position and orientation to the value obtained by subtracting "the relative position and orientation of the measurement range 4 with respect to the measuring device 2" from the updated measurement range position and orientation.
That is, as described above, the relative position and orientation between the measuring instrument model and the measurement range model are maintained no matter which of the two is moved. Accordingly, when the position and orientation of one model are changed, the calculation unit 9 also updates the position and orientation of the other so that the relative position and orientation are maintained.
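A sketch of this mutual update in step S1610, again treating poses as homogeneous matrices, where T_rel is the constant transform of the measurement range 4 relative to the measuring device 2 (the matrix form is an assumption):

```python
import numpy as np

def on_instrument_moved(T_instrument, T_rel):
    # The measurement range model follows the measuring instrument model.
    return T_instrument @ T_rel

def on_range_moved(T_range, T_rel):
    # The measuring instrument model follows the measurement range model
    # ("subtracting" the relative pose).
    return T_range @ np.linalg.inv(T_rel)
```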
If the measuring instrument position and orientation have been updated, in the next step S1607, the measuring instrument model is arranged in the virtual space using the updated measuring instrument position and orientation. Similarly, when the measurement range position and orientation are updated, in the next step S1607, the measurement range model is arranged in the virtual space with the updated measurement range position and orientation.
For example, when the positions and orientations of the measuring instrument model 24 and the measurement range model 25 are changed in the GUI of FIG. 5, images of the measuring instrument model 24 and the measurement range model 25 arranged with the changed positions and orientations are generated and displayed in the display area 17, as shown in FIG. 6. In FIG. 6, the measurement target model 23 is contained within the measurement range model 25.
Suppose that the user operates the input unit 8 or the display screen of the display unit 7 to move the measuring instrument model or the measurement range model, and the inclusion state of the measurement target model with respect to the measurement range model becomes a desired one (for example, all of the measurement target model fits inside the measurement range model). In this case, the user inputs, using the programming pendant 15, the position and orientation displayed at this point in the movable part position and orientation column 22, and inputs an instruction to move the movable unit 5 to that position and orientation. The robot controller 14 then operates the robot arm 12 to move the movable unit 5 to the position and orientation input via the programming pendant 15. Thereby, the measuring device 2 can be moved so that the measurement target 3 assumes the desired position and orientation with respect to the measurement range 4.
In practice, a slight deviation may occur due to various factors, such as an error between the position and orientation of the measurement target model set in the virtual space and those of the measurement target 3 in the real space, or an error in the "relative position and orientation between the measuring device 2 and the movable unit 5" stored in the storage unit 6. In that case, after the movement, photographing or measurement is performed using the measuring device 2 to check whether the measuring device 2 and the measurement target 3 are in the desired relative position and orientation relationship, and an optimal position can be set by moving the movable unit 5 to finely adjust the position and orientation of the measuring device 2.
In step S1613, the calculation unit 9 determines whether the user has operated the input unit 8 or the display screen of the display unit 7 to input an end instruction. If the end instruction has been input, the processing according to the flowchart of FIG. 16 ends; otherwise, the processing returns to step S1601.
As described above, the present embodiment does not require repeatedly moving the movable unit 5 and the measuring device 2 in the real space and checking the relative position and orientation between the measuring device 2 and the measurement target 3. According to the present embodiment, the relative position and orientation of the measuring instrument 2 and the measurement target 3 can be set to the desired values with less labor and simpler operations than before.
The movable part 5 is merely an example of a specific part of the robot arm 12; the position and orientation of a part of the robot arm 12 other than the movable part 5 (a part whose relative position and orientation with respect to the measuring device 2 are constant) may be obtained and displayed instead of the position and orientation of the movable part 5.
<Modification>
In order to guide the moving direction of the measuring instrument model or the measurement range model, for example, an axis corresponding to the direction of the measuring instrument 2 in the measuring instrument model (if the measuring instrument 2 has an imaging section, the visual axis of that imaging section) or an axis corresponding to the direction of the measurement range model may be displayed on the GUI.
Further, in the above example, the user performed operations of moving the measuring instrument model 24 and the measurement range model 25, but the object to be operated is not limited to these; it may be anything whose relative position and orientation with respect to the measuring instrument 2 or the measurement range 4 are fixed or predetermined. For example, the position and orientation of the movable part 5 with respect to the measuring instrument model 24 and the measurement range model 25 are fixed. Therefore, as shown in FIG. 6, a coordinate system 22c representing the position and orientation of the movable part 5, whose relative position and orientation with respect to the measurement range model 25 are fixed, may be displayed on the display unit 7, and the user may operate so as to change the position and orientation of the coordinate system 22c. Alternatively, for example, a robot model, which is a three-dimensional model imitating the geometric shapes of the mounter 11 fixing the measuring device 2 and of the movable part 5, may be displayed on the display unit 7, and the user may operate so as to change its position and orientation while viewing the robot model.
[Second Embodiment]
In each of the following embodiments and modifications, including the present embodiment, differences from the first embodiment are described; unless otherwise noted below, they are the same as the first embodiment. The present embodiment provides a GUI that makes input and setting by the user easier than in the first embodiment.
In the present embodiment, the GUI of FIG. 7 is displayed on the display screen of the display unit 7 instead of the GUI of FIG. 2. An image captured by the measuring device 2 is displayed in the display area 17.
Before performing various inputs and operations, the user places the measurement target 3 at a desired position in the real space with a desired orientation. In the present embodiment, the measurement target 3 is a single, relatively large rectangular parallelepiped. Next, the user operates the programming pendant 15 to instruct movement of the movable unit 5 so that the measurement target 3 roughly enters the measurable region of the measuring device 2 (so that the measurement target 3 at least falls within the measurable range). Since the image captured by the measuring device 2 is displayed in the display area 17, the user moves the movable unit 5 while observing the display area 17, and upon confirming that the measurement target 3 appears in the display area 17 in a measurable state, operates the programming pendant 15 to stop the movement.
When the user confirms that the measurement target 3 has roughly entered the measurable region of the measuring device 2, the user inputs the position and orientation of the movable unit 5 at that time into the movable unit position/orientation column 22. There are various methods for inputting the position and orientation of the movable unit 5 into the column 22, and the method is not limited to any particular one. For example, the user may input the position and orientation of the movable unit 5 displayed on the programming pendant 15 into the column 22 by operating the input unit 8 or the display screen of the display unit 7. Alternatively, the user may operate the programming pendant 15 to transfer the position and orientation displayed on the programming pendant 15 to the image processing apparatus 1 and display them in the column 22.
Next, when the user operates the input unit 8 or the display screen of the display unit 7 to designate the "work measurement" button 26, the calculation unit 9 obtains (measures) the position and orientation of the measurement target 3 in the captured image, based on the captured image displayed in the display area 17. There are various well-known methods for obtaining (measuring) the position and orientation of an object in a captured image, and any method may be employed. For example, the calculation unit 9 obtains the position and orientation of the measurement target 3 by matching the result of image processing applied to the captured image against a "three-dimensional model for matching of the measurement target 3" registered in the storage unit 6 in advance. Here, the position and orientation of the measurement target 3 that the calculation unit 9 obtains from the captured image are a position and orientation in a coordinate system based on the measuring device 2.
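The embodiment leaves the matching method open, so the following is only a minimal sketch of one well-known approach: estimating the target's pose in the measuring-device coordinate system from correspondences between registered 3D model points and their detected image locations, using OpenCV's solvePnP. The correspondence extraction itself is assumed to have been done elsewhere, and all function and variable names are illustrative, not the patent's own.

```python
import cv2
import numpy as np

def estimate_target_pose(model_points, image_points, camera_matrix):
    """Estimate the 4x4 pose of the measurement target in the
    measuring-device (camera) coordinate system.

    model_points:  (N, 3) points on the pre-registered matching model
    image_points:  (N, 2) their detected pixel locations in the captured image
    camera_matrix: 3x3 intrinsic matrix of the measuring device's imaging unit
    """
    dist_coeffs = np.zeros(5)  # assume an undistorted image for simplicity
    ok, rvec, tvec = cv2.solvePnP(model_points.astype(np.float64),
                                  image_points.astype(np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    pose = np.eye(4)
    pose[:3, :3] = R
    pose[:3, 3] = tvec.ravel()
    return pose  # camera_T_target
```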
When the calculation unit 9 has obtained the position and orientation of the measurement target 3 in the captured image, it displays the GUI illustrated in FIG. 8 on the display screen of the display unit 7. In the GUI of FIG. 8, the measurement target model 23, arranged with the obtained position and orientation, is displayed in the display area 17 superimposed on the captured image.
The calculation unit 9 then obtains the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system), based on the position and orientation of the measurement target 3 obtained from the captured image, the "relative position and orientation between the measuring device 2 and the movable unit 5" registered in the storage unit 6 in advance, and the position and orientation displayed in the column 22. For example, the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system) are obtained by adding the "relative position and orientation between the measuring device 2 and the movable unit 5" and the "position and orientation of the measurement target 3 obtained from the captured image" to the position and orientation displayed in the column 22. The calculation unit 9 then displays the position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system) in the measurement target position/orientation column 21.
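"Adding" positions and orientations here corresponds to composing rigid transforms. A minimal sketch with 4x4 homogeneous matrices, under the assumption that the column-22 pose, the stored device-to-movable-unit offset, and the measured target pose are all available as such matrices (the names below are illustrative):

```python
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms from left to right."""
    result = np.eye(4)
    for T in transforms:
        result = result @ T
    return result

def target_pose_in_reference_frame(base_T_movable, movable_T_camera,
                                   camera_T_target):
    """Pose of the measurement target in the reference (virtual) coordinate
    system, from: the movable-unit pose shown in column 22 (base_T_movable),
    the pre-registered relative pose between the movable unit and the
    measuring device (movable_T_camera), and the target pose measured from
    the captured image (camera_T_target)."""
    return compose(base_T_movable, movable_T_camera, camera_T_target)
```

The measuring device model and measurement range model poses described next are obtained by the same kind of composition, just with different factors.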
Here, when the user operates the input unit 8 or the display screen of the display unit 7 to designate the "register measurement target position/orientation" button 27, the calculation unit 9 displays the GUI illustrated in FIG. 9 on the display screen of the display unit 7. The display area 17 displays images of the measurement target model 23, the measuring device model 24, and the measurement range model 25 as seen from the above viewpoint, together with the icon 18. The position and orientation of the measurement target model 23 are the "position and orientation of the measurement target 3 in the reference coordinate system (virtual coordinate system)" obtained by the processing performed when the "work measurement" button 26 was pressed. The position and orientation of the measuring device model 24 are obtained by adding the "relative position and orientation between the measuring device 2 and the movable unit 5" to the position and orientation displayed in the column 22. The position and orientation of the measurement range model 25 can be obtained by adding the "relative position and orientation of the measurement range 4 with respect to the measuring device 2" to the position and orientation of the measuring device model in the reference coordinate system.
After each model is displayed, as in the first embodiment, the user performs operations to move the measuring device model or the measurement range model so that part or all of the measurement target model fits inside the measurement range model. FIG. 10 shows a display example of the GUI in the state where this operation has been completed.
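Whether the target model fits inside the measurement range can also be checked numerically rather than only by eye. A minimal sketch, assuming the measurement range is a convex region such as the truncated pyramid of the earlier embodiments, represented by inward-facing plane equations; the names are illustrative:

```python
import numpy as np

def fraction_inside(vertices, planes):
    """Fraction of target-model vertices lying inside a convex measurement
    range.

    vertices: (N, 3) target-model vertices, expressed in the same coordinate
              system as the measurement range model
    planes:   (M, 4) rows (a, b, c, d) of inward-facing planes; a point p is
              inside when a*x + b*y + c*z + d >= 0 holds for every plane
    """
    homog = np.hstack([vertices, np.ones((len(vertices), 1))])
    inside = np.all(homog @ planes.T >= 0.0, axis=1)
    return float(inside.mean())  # 1.0 means the model is fully contained
```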
In the example of FIG. 10, the surface layer of the measurement target model 23 is aligned approximately with the center of the measurement range model 25. The position and orientation of the movable unit 5 at that time are calculated by the calculation unit 9 in the same manner as in the first embodiment and displayed in the column 22. This allows the user to know the movable unit position and orientation for performing measurement under the desired conditions.
Thus, according to the present embodiment, the condition assumed in the first embodiment, namely that the approximate position and orientation of the measurement target 3 in the reference coordinate system have been acquired in advance, is unnecessary. That is, acquiring the position and orientation using a tape measure or the like in the real space, or deriving them from the position and orientation of the installation table 10, is not required, which makes the work simpler. Moreover, as long as measurement is possible, the arrangement relationship in the real space of the measurement target model 23, the measuring device model 24, and the measurement range model 25 at that time can be displayed in the virtual space, so the user's input and setting load is smaller than in the first embodiment.
[Third Embodiment]
In the present embodiment, an image (virtual space image) that would be seen when the virtual space within the measurement range (view volume) indicated by the measurement range model is observed from a viewpoint having the position and orientation of the measuring device model is generated and displayed on the GUI. Techniques for generating an image of a "space within a prescribed view volume" as seen from a viewpoint having a prescribed position and orientation are well known, so a detailed description is omitted. For example, the GUI of FIG. 11 is displayed instead of the GUI of FIG. 6. The GUI of FIG. 11 displays an image 29 that would be seen when the space within the measurement range model 25 is observed from the position and orientation of the measuring device model 24. The display form of the image 29 is not limited to this example, and various display forms are conceivable; for example, although the image 29 is displayed within the display area 17 in FIG. 11, it may be displayed elsewhere.
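Any standard rendering pipeline can produce such a preview; a practical implementation would rasterize the model (for example with OpenGL) from the measuring device model's viewpoint. As a minimal sketch under a pinhole-camera assumption, the target model's points can be projected into the virtual image as follows (names are illustrative):

```python
import numpy as np

def project_points(points_world, world_T_camera, camera_matrix):
    """Project 3D points into the virtual image seen from the measuring
    device model.

    points_world:   (N, 3) model points in the reference coordinate system
    world_T_camera: 4x4 pose of the measuring device model (camera) in the
                    reference coordinate system
    camera_matrix:  3x3 intrinsics assumed for the virtual imaging unit
    """
    camera_T_world = np.linalg.inv(world_T_camera)
    homog = np.hstack([points_world, np.ones((len(points_world), 1))])
    pts_cam = (camera_T_world @ homog.T)[:3]  # points in the camera frame
    in_front = pts_cam[2] > 0                 # drop points behind the camera
    uvw = camera_matrix @ pts_cam[:, in_front]
    return (uvw[:2] / uvw[2]).T               # (M, 2) pixel coordinates
```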
Thus, according to the present embodiment, the user can grasp roughly how the measurement target 3 will appear before performing imaging with the measuring device 2, and can make adjustments efficiently and accurately while checking the virtual captured image (image 29).
[Fourth Embodiment]
In the present embodiment, information representing the relative positional relationship between the measurement target model and the measurement range model is displayed on the GUI. In the present embodiment, the GUI of FIG. 12 is displayed instead of the GUI of FIG. 6. The relative position column 30 displays the relative position between a specified position in the measurement range model 25 and a specified position in the measurement target model 23. In FIG. 12, an X coordinate of "3.2", a Y coordinate of "2.1", and a Z coordinate of "1.2" are displayed as the relative position (in mm) between the specified position in the measurement range model 25 and the specified position in the measurement target model 23. Both the "specified position in the measurement range model 25" and the "specified position in the measurement target model 23" are designated in advance, and the designated positions are registered in the storage unit 6. For example, the position in the measurement range model corresponding to the center of the bottom face of the truncated pyramid represented by the measurement range 4, and the position in the measurement target model corresponding to the center of the bottom face of the pallet of the measurement target 3, are designated in advance and registered in the storage unit 6. The calculation unit 9 then obtains the relative position between the specified position in the measurement range model 25 and the specified position in the measurement target model 23 and displays it in the column 30.
The closer the numerical values (X, Y, and Z coordinates) displayed in the column 30 are to 0, the closer the specified position in the measurement range model 25 and the specified position in the measurement target model 23 are. The user therefore moves the measuring device model or the measurement range model while watching the numerical values displayed in the column 30, so that the values approach 0. In the above example, moving the measuring device model or the measurement range model so that the numerical values displayed in the column 30 approach 0 moves them so that the far end of the measurement range 4 is aligned with the bottom face of the pallet.
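The values in column 30 are simply the displacement between the two pre-registered points, expressed in a common coordinate system. A minimal sketch, with illustrative names:

```python
import numpy as np

def column30_values(range_model_pose, range_local_point,
                    target_model_pose, target_local_point):
    """Relative position (mm) shown in column 30.

    *_model_pose:  4x4 pose of each model in the reference coordinate system
    *_local_point: (3,) pre-registered specified position in the model's own
                   coordinate system (e.g. the center of the frustum's far
                   face, or the center of the pallet's bottom face)
    """
    def to_reference(pose, p_local):
        return (pose @ np.append(p_local, 1.0))[:3]

    # the user adjusts the models until all three components approach 0
    return (to_reference(target_model_pose, target_local_point)
            - to_reference(range_model_pose, range_local_point))
```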
As information representing the relative positional relationship between the measurement target model and the measurement range model, other information may be displayed in addition to, or instead of, the relative position. For example, a relative orientation may be displayed in addition to the relative position. In the above example, the position in the measurement range model corresponding to the center coordinates of the far-end plane of the measurement range 4, and the position in the measurement target model corresponding to the center coordinates of the bottom face of the pallet of the measurement target 3, were designated. However, the position in the measurement range model corresponding to the centroid coordinates of the measurement range 4 and the position in the measurement target model corresponding to the centroid coordinates of the measurement target 3 may be designated instead. Alternatively, the position in the measuring device model corresponding to the center coordinates of the bottom face of the measuring device 2 and the position in the measurement target model corresponding to the center coordinates of the bottom face of the pallet of the measurement target 3 may be designated, the distance between the designated positions may be calculated, and that distance may be displayed instead of the column 30.
Although the GUI of FIG. 12 indicates the relative relationship between the measurement target model 23 and the measurement range model 25 numerically, where the measurement target is located within the measurement range may instead be indicated by, for example, color information of the model. In this case, the GUI of FIG. 13 is displayed instead of the GUI of FIG. 6.
In the GUI of FIG. 13, instead of the measurement target model 23, a measurement target model 31 is displayed that indicates, by color, which region of the measurement range model 25 each part of the measurement target model 23 belongs to. For example, the calculation unit 9 determines at which position within the measurement range model 25 each element, obtained by dividing the measurement target model 31 into prescribed units (for example, a certain volume), is located, and draws each element in a color corresponding to the determined position. In FIG. 13, among the elements constituting the measurement target model 31, elements lying far within the measurement range model 25 (far as seen from the measuring device model 24) are drawn in a dark color, and elements at intermediate positions between the near end and the far end are drawn in a light color. In the present embodiment as well, as in FIG. 6, the purpose of the adjustment is to align the far end of the measurement range 4 with the bottom face of the pallet, so the purpose is achieved by adjusting so that the bottom face of the measurement target model 31 becomes dark, as shown in FIG. 13.
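A minimal sketch of such coloring, assuming the depth axis of the measurement range runs along the measuring device model's viewing direction between a near end and a far end; names are illustrative:

```python
import numpy as np

def element_shades(element_centers, camera_T_world, near, far):
    """One gray shade per volume element: the deeper an element sits in the
    measurement range, the darker it is drawn.

    element_centers: (N, 3) centers of the target model's volume elements,
                     in the reference coordinate system
    camera_T_world:  4x4 transform from the reference frame into the
                     measuring device model's frame (z axis = view direction)
    near, far:       depths of the near and far ends of the measurement range
    """
    homog = np.hstack([element_centers, np.ones((len(element_centers), 1))])
    depth = (camera_T_world @ homog.T)[2]  # depth along the viewing axis
    t = np.clip((depth - near) / (far - near), 0.0, 1.0)
    return 1.0 - t  # 1.0 = light (near end), 0.0 = dark (far end)
```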
Alternatively, for example, instead of the measurement target model 23, a measurement target model may be displayed that indicates by color how far each part of the measurement target model 23 is from a specified position in the measuring device model 24.
There are also various modes of coloring: as in the above example, a light-to-dark gradation may be used, or a gradation of different colors. Alternatively, only the elements of the measurement target model at a specific position, such as the far-end plane of a predetermined region, may be colored. Furthermore, coloring that emphasizes elements of the measurement target model lying outside the measurement range model 25 is also conceivable.
Thus, according to the present embodiment, the relative relationship between the measurement target model and the measuring device model or the measurement range model can be grasped intuitively through numerical values or coloring, so those relative relationships can be adjusted efficiently and accurately in the virtual space.
[Fifth Embodiment]
In the present embodiment, the GUI of FIG. 14 is displayed instead of the GUI of FIG. 6. In general, the movable unit 5 has a movable range and posture constraints in the real space. Accordingly, condition information defining the range of positions and orientations that the movable unit 5 can take is created in advance and registered in the storage unit 6. After the GUI is displayed, the calculation unit 9 determines whether the position and orientation of the movable unit 5 displayed in the column 22 fall within the range of positions and orientations indicated by the condition information, and if they do not, displays a warning on the display screen of the display unit 7. In the GUI of FIG. 14, the message 32 "outside the robot's posture constraint range" is displayed as a warning, notifying the user that the orientation of the movable unit 5 displayed in the column 22 exceeds the posture constraint range. This allows the user to know whether the arrangement relationship in the virtual space is realizable in the real space without actually attempting to move the movable unit 5 in the real space.
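A minimal sketch of such a check, assuming the condition information is stored as simple per-axis limits on position and orientation; a real robot's admissible set is more complex (joint limits, reachability), and all names are illustrative:

```python
import numpy as np

def constraint_warning(position, orientation_deg, pos_limits, ang_limits):
    """Return a warning message if the column-22 pose violates the registered
    condition information, or None if the pose is admissible.

    position:        (3,) x, y, z of the movable unit
    orientation_deg: (3,) orientation angles in degrees
    pos_limits:      (3, 2) rows of (min, max) per position axis
    ang_limits:      (3, 2) rows of (min, max) per orientation angle
    """
    pos_ok = np.all((pos_limits[:, 0] <= position) &
                    (position <= pos_limits[:, 1]))
    ang_ok = np.all((ang_limits[:, 0] <= orientation_deg) &
                    (orientation_deg <= ang_limits[:, 1]))
    if pos_ok and ang_ok:
        return None
    return "outside the robot's posture constraint range"
```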
Furthermore, in the above GUI, the ranges of the measuring device model and the measurement range model corresponding to positions and orientations that the movable unit 5 cannot take (prohibited regions) may be displayed semi-transparently, thereby notifying the user of regions into which the measuring device model or the measurement range model must not be moved. For the same purpose, some message may be displayed on the display screen of the display unit 7 when the measuring device model or the measurement range model is moved into a prohibited region.
Also, if the measuring device 2 or the movable unit 5 interferes with the measurement target 3, the apparatus or the measurement target 3 may be damaged. Therefore, the calculation unit 9 performs interference determination between the measuring device model and the measurement target model, and if there is interference, displays a warning (for example, a message indicating that there is interference) on the display screen of the display unit 7. In addition, when a model imitating the geometric shape of the movable unit 5 is placed at the position and orientation of the movable unit 5, the calculation unit 9 performs interference determination between that model and the measurement target model, and if there is interference, displays a warning (for example, a message indicating that there is interference) on the display screen of the display unit 7.
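The embodiment does not fix a particular interference test. A common, cheap first pass is an axis-aligned bounding box (AABB) overlap check between two models placed in the same coordinate system, sketched below with illustrative names; an exact mesh-level collision check would be run only when the boxes overlap:

```python
import numpy as np

def aabb(vertices):
    """Axis-aligned bounding box of a model: (min_xyz, max_xyz)."""
    return vertices.min(axis=0), vertices.max(axis=0)

def may_interfere(vertices_a, vertices_b):
    """Conservative interference test between two models expressed in the
    same (reference) coordinate system: True if their AABBs overlap."""
    a_min, a_max = aabb(vertices_a)
    b_min, b_max = aabb(vertices_b)
    return bool(np.all(a_min <= b_max) and np.all(b_min <= a_max))
```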
The display form of the warning is not limited to a specific form. For example, if the position falls outside the range indicated by the condition information, the position displayed in the column 22 may be highlighted, for example shown in red; likewise, if the orientation falls outside the range indicated by the condition information, the orientation displayed in the column 22 may be highlighted, for example shown in red.
Thus, according to the present embodiment, the user can determine whether a movement is realizable in the real space, and whether the movement would cause interference, without moving the movable unit 5 in the real space, which contributes to efficient setup and to preventing physical damage to the apparatus.
[Sixth Embodiment]
In the present embodiment, the positions and orientations of the movable unit 5 for a plurality of measurement targets 3 are obtained and displayed. In the present embodiment, the GUI shown in FIG. 15 is displayed instead of the GUI of FIG. 6. The display area 17 displays measurement target models 23a, 23b, and 23c corresponding to the three measurement targets 3 (target 1, target 2, and target 3). Assume that the user has adjusted, as in the first embodiment, the position and orientation of the measuring device model 24 and the measurement range model 25 with respect to one of the measurement target models 23a, 23b, and 23c (here, the measurement target model 23a as an example). The user then operates the input unit 8 or the display screen of the display unit 7 to input the number of measurement targets 3 and their arrangement information into the measurement target arrangement column 33. In FIG. 15, the number of measurement targets 3 (3 in FIG. 15) and the spacings of the measurement targets 3 in the X-axis direction (30 mm in FIG. 15), the Y-axis direction (-200 mm in FIG. 15), and the Z-axis direction (0 mm in FIG. 15) are input. This means that the measurement target model 23b is located at a position displaced from the position of the measurement target model 23a by 30 mm in the X-axis direction, -200 mm in the Y-axis direction, and 0 mm in the Z-axis direction. Likewise, the measurement target model 23c is located at a position displaced from the position of the measurement target model 23b by 30 mm in the X-axis direction, -200 mm in the Y-axis direction, and 0 mm in the Z-axis direction. Here, for simplicity, the orientation is the same for all measurement target models.
When the user operates the input unit 8 or the display screen of the display unit 7 to designate the "calculate" button 1501, the calculation unit 9 takes the relative position and orientation between the position and orientation of the movable unit 5 obtained for the measurement target model 23a and the position and orientation of the measurement target model 23a as a relative position and orientation T. The calculation unit 9 then obtains, as the "position and orientation of the movable unit 5 corresponding to the measurement target model 23b", a position and orientation whose relative position and orientation with respect to the position and orientation of the measurement target model 23b is the relative position and orientation T. Similarly, the calculation unit 9 obtains, as the "position and orientation of the movable unit 5 corresponding to the measurement target model 23c", a position and orientation whose relative position and orientation with respect to the position and orientation of the measurement target model 23c is the relative position and orientation T. Since the positions and orientations of the movable unit 5 corresponding to target 1, target 2, and target 3 are obtained in this way, the calculation unit 9 displays them in the column 22. As a result, the position and orientation of the movable unit 5 for measuring each target can be obtained without adjusting the relative relationship of the measuring device model 24 and the measurement range model 25 for each individual measurement target model. Although the positions and orientations of the movable unit 5 are obtained here in the order of the measurement target models 23a, 23b, and 23c, the order is not limited to this.
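In homogeneous-transform terms, the relative position and orientation T is the movable-unit pose expressed in the adjusted target's own frame, and reusing it costs one matrix product per additional target. A minimal sketch, with illustrative names:

```python
import numpy as np

def movable_unit_poses(adjusted_target_pose, adjusted_movable_pose,
                       other_target_poses):
    """Movable-unit poses for several identically oriented targets.

    adjusted_target_pose:  4x4 pose of the target model used for adjustment
    adjusted_movable_pose: 4x4 movable-unit pose found for that target
    other_target_poses:    list of 4x4 poses of the remaining target models
    (all poses expressed in the reference coordinate system)
    """
    # T: the movable unit's pose expressed in the target's own frame
    T = np.linalg.inv(adjusted_target_pose) @ adjusted_movable_pose
    return [target_pose @ T for target_pose in other_target_poses]
```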
The procedure for obtaining the positions and orientations of the movable unit 5 corresponding to the plurality of measurement targets 3 is not limited to a specific procedure. For example, after the position and orientation of one measurement target model are set, the positions and orientations of the other measurement target models may be fixed by inputting the number of measurement target models and their arrangement spacing into the column 33, and the position and orientation of the measuring device model 24 and the measurement range model 25 with respect to any one of the measurement target models may be adjusted after that.
In FIG. 15, the only information that can be input into the column 33 is the number of measurement target models and the arrangement spacing, but other information may be input. For example, information defining the orientation component of each measurement target model may be input.
Also, the position and orientation of each measurement target model may be displayed in the column 21. Furthermore, although the measurement targets 3 have been described as objects of the same type, they may be objects of different types.
<Modification>
In the above embodiments and modifications, the measuring device 2 is a three-dimensional shape measuring device, but it may be another measuring device. For example, the measuring device 2 may be a measuring device that calculates the distance from the measuring device 2 to a target, or an apparatus that extracts contours from a captured image to inspect the measurement target 3. In other words, various forms of the measuring device 2 having a measurement range defined by its performance and specifications can be assumed.
Also, in the above embodiments and modifications, the measurement target 3 is small workpieces stacked in bulk or a pallet storing them, but it is not limited to these. For example, when no pallet is present, the entire approximate space in which the bulk-stacked small workpieces exist can be treated as the measurement target 3. It is also often the case that only a single, relatively large workpiece is placed, and a specific range within it may be measured; in that case, that specific range may be the measurement target 3. In other words, the measurement target does not depend on the size, material, or the like of the measurement target 3.
Also, in the above embodiments and modifications, the measurement range 4 is a truncated quadrangular pyramid region in which the measuring device 2 can achieve a certain measurement performance, but it is not limited to this. First, the measurement range 4 may be defined, without going as far as measurement performance, as a working-distance region roughly desirable as the imaging range of the measuring device. Various shapes are also conceivable for the measurement range 4: for example, besides a truncated quadrangular pyramid, it may be a truncated cone, a pyramidal shape such as a quadrangular pyramid or a cone, a rectangular parallelepiped, or a cube. In addition, the size of the measurement range 4 can vary widely depending on various conditions such as the performance of the measuring device 2.
Also, in the above embodiments and modifications, the movable unit 5 is the tip of a tool fixedly connected to the flange of the robot arm 12 in order to grip the measurement target 3, but it is not limited to this. As described above, a part of the robot arm, for example the flange of the robot arm 12, may serve as the movable unit 5. The movable unit 5 need only be a part that has a specific relative relationship with the measuring device 2 regardless of changes in position and orientation accompanying its own movement, and various parts can be used within that definition.
Also, the configuration of the image processing apparatus 1 is not limited to the configuration shown in FIG. 1. For example, although the storage unit 6 is provided within the image processing apparatus 1 in FIG. 1, it may be provided outside the image processing apparatus 1; for example, the storage unit 6 may be incorporated into the robot controller 14, or into an external device capable of communicating with the image processing apparatus 1. The same applies to the calculation unit 9.
Also, although various GUI configurations have been described in the above embodiments and modifications, the described GUI configurations are examples, and various configurations are conceivable. For example, a three-dimensional model imitating the geometric shape of the movable unit 5, or an icon representing a coordinate system based on the movable unit 5, may be displayed in the display area 17 to visually represent the position and orientation of the movable unit 5 calculated by the calculation unit 9.
Also, three-dimensional models of the mounter to which the measuring device 2 and the movable unit 5 are fixed and of the robot arm 12 may also be displayed in the display area 17, and the positions and orientations of those models may change as the measuring device model 24 and the measurement range model 25 are operated. In that case, interference determination may be performed between the three-dimensional models of the mounter and the robot arm 12 and the measurement target model 23 or the measuring device model 24, and if there is interference, a warning (for example, a message indicating that there is interference) may be displayed on the display screen of the display unit 7. It is also possible to omit the display of the measuring device model 24. In that case, the display area 17 shows the measurement target model 23 and the measurement range model 25, and their relative relationship is adjusted; since the relative position and orientation between the measuring device 2 and the measurement range 4 are constant, the position and orientation of the movable unit 5 can still be calculated as in the first embodiment.
Also, an imaging button and a confirmation display area may be added to the GUI. In this case, after performing the above adjustment in the virtual space, the user operates the programming pendant 15 to move the movable unit 5 in the real space, and then operates the input unit 8 or the display screen of the display unit 7 to press the imaging button. The measuring device 2 thereby performs imaging and sends the resulting captured image to the image processing apparatus 1, and the calculation unit 9 displays this captured image in the confirmation display area. This makes it possible to confirm, without transitioning to another screen, whether the adjustment has been made so that the image captured by the measuring device 2 is the desired captured image (the measurement target 3 appears in the desired state).
Also, in the above embodiments and modifications, the position and orientation of the movable unit 5 are displayed in the column 22, but the calculation unit 9 may output the position and orientation of the movable unit 5 to the robot controller 14 or the programming pendant 15 without displaying them.
Also, the positions and orientations of the measuring device model 24 and the measurement range model 25 may be displayed on the GUI. In that case, the positions and orientations of the measuring device model 24 and the measurement range model 25 may be changed by the user operating the input unit 8 or the display screen of the display unit 7 to change the displayed values.
Also, the method of representing the positions and orientations displayed on the GUI is not limited to a specific representation; various representations, such as Euler angles or rotation-angle representations, are possible. Moreover, although the above GUI presented an example in which the six degrees of freedom of a position and orientation are displayed in individual boxes, displaying the six degrees of freedom in a single box separated by commas is also conceivable.
Also, the method of selecting the model to be operated on the display screen of the display unit 7 is not limited to a specific method. For example, the model may be selected by directly designating its display region, or radio buttons corresponding to the respective models may be provided outside the display area 17 and the model corresponding to the radio button selected by the user may be selected as the model to be operated.
There are also various operations on the display area 17. For example, when a mouse is used as the input unit 8, the image of the virtual space displayed in the display area 17 may be enlarged or reduced by rotating the mouse wheel. As for enlarging and reducing the virtual space image, the image may simply be scaled, or the spatial range included in the virtual space image may be changed by moving the above viewpoint in the virtual space along the line-of-sight direction or in the opposite direction. Also, the viewpoint in the virtual space may be moved by moving the mouse while right-clicking.
Also, the display unit 7 may be a display device of another apparatus; for example, it may be the display device of the programming pendant 15. Furthermore, the display unit 7 may be a device separate from the image processing apparatus 1, or a device integrated with the image processing apparatus 1.
Also, the form of inputting information is not limited to the above input forms. For example, information that would be input using a mouse or keyboard may instead be notified by command communication from, for example, the robot controller 14; the input form is arbitrary.
Also, although the above embodiments and modifications specifically showed the order of operations performed by the user, these orders are examples and are not limiting. For example, the display of the measurement target model 23 and the display of the measuring device model 24 may be performed in the reverse of the order described in the above examples.
Also, in the above embodiments and modifications, the calculation unit 9 calculates the position and orientation of the movable unit 5 and displays them in the column 22 every time the measuring device model 24 or the measurement range model 25 is moved. However, the calculation and display of the position and orientation of the movable unit 5 may instead be triggered by the user inputting an instruction indicating completion of the adjustment by operating the input unit 8 or the display screen of the display unit 7.
Also, some of the procedures shown in the above embodiments and modifications can be omitted. For example, when the position and orientation of the measurement target 3 are determined in advance, such as when it is installed on a jig in the real space, setting the position and orientation of the measurement target 3 is unnecessary if that position and orientation are stored in advance in the storage unit 6 or the like. In this case, the measurement target model 23 is displayed in the display area 17 based on the stored position and orientation.
Note that the numerical values used in the above embodiments and modifications are all examples and are not intended to be limiting. Also, in the above embodiments and modifications, every model imitates the geometric shape of the original real object, but this is not limiting; as long as the original real object can be identified, the model is not necessarily limited to one imitating its geometric shape. Furthermore, when a model has been created in advance by CAD or the like, that previously created model may simply be acquired. In addition, the configurations and operation methods of the GUIs described in the above embodiments and modifications are all examples and are not intended to be limiting.
Also, some or all of the above embodiments and modifications may be used in appropriate combination, and some or all of the above embodiments and modifications may be used selectively.
[Seventh Embodiment]
The measuring apparatus that is the above-described image processing apparatus 1 can be used while supported by some support member. In the present embodiment, as an example, a control system installed on and used with a robot arm 5300 (gripping apparatus) as shown in FIG. 17 will be described. The measuring apparatus 5100 projects pattern light onto a test object 5210 placed on a support base 5350, captures it, and acquires an image. Then the control unit of the measuring apparatus 5100, or a control unit 5310 that has acquired the image data from the control unit of the measuring apparatus 5100, obtains the position and orientation of the test object 5210, and the control unit 5310 acquires the information on the obtained position and orientation. Based on that position and orientation information, the control unit 5310 controls the robot arm 5300 by sending it a drive command. The robot arm 5300 holds the test object 5210 with a robot hand or the like (gripping unit) at its tip and moves it, for example by translation or rotation. Furthermore, by assembling the test object 5210 onto other parts with the robot arm 5300, an article composed of a plurality of parts, for example an electronic circuit board or a machine, can be manufactured. An article can also be manufactured by processing the moved test object 5210. The control unit 5310 has an arithmetic device such as a CPU and a storage device such as a memory. A control unit for controlling the robot may be provided outside the control unit 5310. The measurement data measured by the measuring apparatus 5100 and the obtained images may also be displayed on a display unit 5320 such as a display.
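A minimal sketch of this measure-then-move flow; every interface name below is hypothetical, since the embodiment specifies the data flow but no API:

```python
def pick_cycle(measuring_device, controller, robot_arm):
    """One measure-and-pick cycle of the control system (illustrative only).

    measuring_device: projects pattern light and returns a captured image
    controller:       estimates the test object's pose from the image and
                      holds a fixed grasp offset (a 4x4 transform)
    robot_arm:        executes drive commands (move, grip)
    """
    image = measuring_device.capture_with_pattern()       # project + image
    base_T_object = controller.estimate_pose(image)       # object pose
    grasp_pose = base_T_object @ controller.grasp_offset  # pose for the hand
    robot_arm.move_to(grasp_pose)                         # drive command
    robot_arm.grip()                                      # hold the object
```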
(Other Examples)
The present invention supplies a program for realizing one or more functions of the above-described embodiments to a system or an apparatus via a network or a storage medium, and one or more processors in a computer of the system or the apparatus read and execute the program. It can also be realized by the following processing. Further, it can be realized by a circuit (for example, an ASIC) that realizes one or more functions.
The present invention is not limited to the above embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Accordingly, the following claims are appended to make the scope of the present invention public.
This application claims priority based on Japanese Patent Application No. 2018-129460 filed on July 6, 2018 and Japanese Patent Application No. 2019-093917 filed on May 17, 2019, the entire contents of which are incorporated herein by reference.
2: measuring device 3: measurement target 4: measurement range 5: movable unit 6: storage unit 7: display unit 8: input unit 9: calculation unit 10: installation table 11: mounter 12: robot arm 13: robot base 14: robot controller 15: programming pendant
Claims (27)
- An image processing apparatus comprising: a display unit configured to display, on a display screen, a first model representing a measurement target of a measuring instrument and a second model representing a measurement range of the measuring instrument; a changing unit configured to change the position and orientation of the second model in accordance with a user operation; and an output unit configured to output, in a state in which the position and orientation of the second model have been changed, the position and orientation of a part whose position and orientation relative to the measuring instrument are constant.
- The image processing apparatus according to claim 1, wherein the display unit displays, on the display screen, an axis corresponding to the direction of the measuring instrument or the measurement range.
- The image processing apparatus according to claim 1 or 2, wherein the display unit obtains the position and orientation of the measurement target based on an image of the measurement target captured by the measuring instrument, and displays, on the display screen, the first model arranged at the obtained position and orientation.
- The image processing apparatus according to any one of claims 1 to 3, wherein the display unit displays, on the display screen, the relative position and/or orientation between the first model and the second model.
- The image processing apparatus according to any one of claims 1 to 4, wherein the display unit draws each element constituting the first model in a color corresponding to its positional relationship with the second model.
- The image processing apparatus according to any one of claims 1 to 5, wherein the display unit displays a warning on the display screen when the position and orientation of the part exceed a predefined range of constraints.
- The image processing apparatus according to any one of claims 1 to 6, wherein the display unit displays, on the display screen, an image captured by the measuring instrument.
- The image processing apparatus according to any one of claims 1 to 7, wherein the output unit obtains and outputs the position and orientation of the part for each of a plurality of measurement targets.
- The image processing apparatus according to any one of claims 1 to 8, wherein the output unit displays the position and orientation of the part on the display screen.
- The image processing apparatus according to any one of claims 1 to 9, wherein the output unit outputs the position and orientation of the part to an apparatus for controlling the position and orientation of the part.
- The image processing apparatus according to any one of claims 1 to 10, wherein the measuring instrument is attached to a robot arm, and the part is a part of the robot arm.
- The image processing apparatus according to any one of claims 1 to 11, wherein the measuring instrument has a projection unit configured to project pattern light onto the measurement target, and an imaging unit configured to capture an image of the measurement target onto which the pattern light is projected.
- The image processing apparatus according to any one of claims 1 to 11, wherein the measuring instrument has a plurality of imaging devices that capture images of the measurement target from mutually different positions and orientations.
- The image processing apparatus according to any one of claims 1 to 13, wherein the second model is a translucent model.
- The image processing apparatus according to any one of claims 1 to 13, wherein the second model is a wire-frame model.
- The image processing apparatus according to any one of claims 1 to 15, wherein the display unit displays, on the display screen, a third model representing the measuring instrument; the changing unit changes the position and orientation of the second model when the second model or the third model is operated in accordance with a user operation; and the output unit obtains and outputs the position and orientation of the part based on the position and orientation of the second model or the third model.
- The image processing apparatus according to claim 16, wherein, when one of the second model and the third model is operated in accordance with a user operation, the changing unit changes the position and orientation of the other so that the relative position and orientation between the second model and the third model are maintained.
- The image processing apparatus according to claim 16 or 17, wherein the display unit obtains the respective positions and orientations of the second model and the third model based on a position and orientation designated by the user as the position and orientation of the part, and displays, on the display screen, the second model and the third model arranged at those respective positions and orientations.
- The image processing apparatus according to any one of claims 16 to 18, wherein the display unit displays, on the display screen, an image of the space of the measurement range represented by the second model, as seen from a viewpoint having the position and orientation of the third model.
- The image processing apparatus according to any one of claims 16 to 19, wherein the display unit displays the distance between the first model and the third model on the display screen.
- The image processing apparatus according to any one of claims 16 to 20, wherein the display unit performs an interference determination between the first model and the third model and an interference determination between a model representing the part and the first model, and displays a warning on the display screen if there is interference.
- The image processing apparatus according to any one of claims 1 to 15, wherein the display unit displays, on the display screen, an object whose position and orientation relative to the second model are determined, and the changing unit changes the position and orientation of the second model when that object is operated in accordance with a user operation.
- A system comprising: the image processing apparatus according to any one of claims 1 to 22; and a measuring instrument configured to measure a test object based on the position and orientation output by the image processing apparatus.
- A system comprising: the image processing apparatus according to any one of claims 1 to 22; a measuring instrument configured to measure a test object; and a robot to which the measuring instrument is fixed and which operates the test object based on a measurement result of the measuring instrument, wherein the robot is moved based on the position and orientation output by the image processing apparatus, and the test object is measured by the measuring instrument.
- A method of manufacturing an article, comprising: measuring a test object based on the position and orientation output by the image processing apparatus according to any one of claims 1 to 22; and manufacturing the article by processing the test object based on the measurement result.
- An image processing method performed by an image processing apparatus, comprising: a display step in which a display unit of the image processing apparatus displays, on a display screen, a first model representing a measurement target of a measuring instrument and a second model representing a measurement range of the measuring instrument; a changing step in which a changing unit of the image processing apparatus changes the position and orientation of the second model in accordance with a user operation; and an output step in which an output unit of the image processing apparatus outputs, in a state in which the position and orientation of the second model have been changed, the position and orientation of a part whose position and orientation relative to the measuring instrument are constant.
- A computer program for causing a computer to function as each unit of the image processing apparatus according to any one of claims 1 to 22.
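To make the geometry behind claims 1 and 16 concrete: because the part is rigid with respect to the measuring instrument, and hence to the measurement-range model, its pose follows from the user-placed pose of the second model by composing one fixed transform. The following is a minimal sketch under that reading; the frame names, the 4x4 homogeneous-matrix representation, and the numeric offsets are illustrative assumptions, not part of the claims.

```python
import numpy as np

# Illustrative-only reading of claims 1 and 16: the part (for example, a
# robot-arm flange) keeps a constant pose relative to the measuring
# instrument and its measurement-range model, so its world pose follows
# from the user-placed pose of the second model by one fixed transform.

T_range_part = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, -0.3],  # assumed: the part sits 0.3 m behind the range origin
    [0.0, 0.0, 0.0, 1.0],
])

def part_pose_from_range_model(T_world_range: np.ndarray) -> np.ndarray:
    """Pose of the part in world coordinates once the user has dragged
    the second model (measurement range) to T_world_range."""
    return T_world_range @ T_range_part

# Example: the user places the measurement range 0.5 m above the workpiece.
T_world_range = np.eye(4)
T_world_range[2, 3] = 0.5
print(part_pose_from_range_model(T_world_range))
```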
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-129460 | 2018-07-06 | ||
JP2018129460 | 2018-07-06 | ||
JP2019-093917 | 2019-05-17 | ||
JP2019093917A JP2020013548A (en) | 2018-07-06 | 2019-05-17 | Image processing apparatus, image processing method, system, article manufacturing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020008936A1 true WO2020008936A1 (en) | 2020-01-09 |
Family
ID=69059625
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/024987 WO2020008936A1 (en) | 2018-07-06 | 2019-06-24 | Image processing device, image processing method, system, and method for manufacturing article |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020008936A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009093611A (en) * | 2007-10-11 | 2009-04-30 | Mwtec Software Gmbh | System and method for recognizing three-dimensional object |
JP2011112402A (en) * | 2009-11-24 | 2011-06-09 | Omron Corp | Method for displaying measurement effective area in three-dimensional visual sensor and three-dimensional visual sensor |
Similar Documents
Publication | Title |
---|---|
US9529945B2 | Robot simulation system which simulates takeout process of workpieces |
JP7419271B2 | Visualizing and modifying operational boundary zones using augmented reality |
US10786906B2 | Robot system |
JP6744709B2 | Information processing device and information processing method |
JP5257335B2 | Method for displaying measurement effective area in three-dimensional visual sensor and three-dimensional visual sensor |
US6157368A | Control equipment with a movable control member |
US10279473B2 | Image processing device, image processing method, and computer program |
JP2020013548A | Image processing apparatus, image processing method, system, article manufacturing method |
US20100232683A1 | Method For Displaying Recognition Result Obtained By Three-Dimensional Visual Sensor And Three-Dimensional Visual Sensor |
US20060269123A1 | Method and system for three-dimensional measurement |
JP2014512530A | Coordinate positioning device |
KR20140106421A | System and method for calibration of machine vision cameras along at least three discrete planes |
US10410419B2 | Laser projection system with video overlay |
US10724963B2 | Device and method for calculating area to be out of inspection target of inspection system |
JP2014079864A | Welding robot, and method for arranging arrangement object on surface plate in welding robot |
KR20190128988A | Aiding maneuvering of obscured objects |
CN109715307A | Bending machine with workspace image capture device and the method for indicating workspace |
JP7250489B2 | Image processing device, its control method, and program |
WO2020008936A1 | Image processing device, image processing method, system, and method for manufacturing article |
US20240017412A1 | Control device, control method, and program |
KR20140099622A | Robot localization detecting system using a multi-view image and method thereof |
JP7443014B2 | Robot arm testing equipment |
JP2022163836A | Method for displaying robot image, computer program, and method for displaying robot image |
WO2022249295A1 | Robot simulation device |
JP2021091056A | Measuring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19829746; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19829746; Country of ref document: EP; Kind code of ref document: A1 |