WO2017121993A1 - Model building from inspection data

Publication number: WO2017121993A1
Authority: WIPO (PCT)
Application number: PCT/GB2017/050043
Other languages: French (fr)
Prior art keywords: representation, object, face, representations, method
Inventors: David Roberts McMurtry, Kapil Mahendra Bhudhia
Original assignee: Renishaw Plc
Priority applications: IN201611001548; EP16164456.2

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 5/00: Measuring arrangements characterised by the use of mechanical means
    • G01B 5/004: Measuring arrangements characterised by the use of mechanical means for measuring coordinates of points
    • G01B 5/008: Measuring arrangements characterised by the use of mechanical means for measuring coordinates of points using coordinate measuring machines

Abstract

A method for generating a model of an object. The method comprises generating, from point position data concerning a plurality of faces of the object, a representation for each face; this comprises determining a particular geometrical form for each representation as well as determining the interrelationship of the representations.

Description

MODEL BUILDING FROM INSPECTION DATA

This invention relates to a computer implemented process for building a graphical model of an object, e.g. of an object being inspected, or to be inspected, on an inspection device, such as a coordinate positioning device, for example a coordinate measuring machine (CMM).

It is known to inspect an object using an inspection device such as a coordinate positioning device, for example a coordinate measuring machine (CMM). The object could be in a pre, during, or post manufacture form. Typically a probe on the coordinate positioning device is used to collect measurement data about the object. Known probes include non-contact probes and contact probes. Collected measurement data can comprise the location in 3D space of the point taken (e.g. the X, Y, Z position) as well as optionally (especially in the case of a contact probe) the direction of measurement, and/or the orientation of the probe (if applicable, e.g. if mounted on an articulated head which can reorient the probe).

At a basic level, point measurement data could merely be stored. A point cloud of measured points could be displayed on a screen. When a vast number of points have been collected, e.g. using a non-contact line scanning probe, such a technique can be appropriate and provide a useful visualisation of the object. Additionally/alternatively, with such a vast number of points, a mesh model (e.g. a triangular mesh model) can be fitted to the points, e.g. as described in US20100268355.

In an alternative known technique, when measuring an object the user can take a number of point measurements (e.g. using a joystick to control the CMM and a contact probe) on a face of the object to be measured, and after obtaining a number of points the user can tell the software what the geometrical form of the face just measured is, e.g. a plane, cylinder, sphere, cone, NURBS (non-uniform rational basis spline) surface, etc. Such knowledge can then be used to determine a representation for graphically depicting the face of the object on a display, to aid analysis of the object and/or to aid generation of subsequent automatic measurement paths. To aid a user, software has been developed (e.g. MODUS™ available from Renishaw plc) to automatically recognise the geometrical form of a measured face, an example of which is described in EP0254515. Once the geometrical form has been determined, a representation of said face based on the geometrical form can be displayed on a display unit. The process then repeats for the next measured face of the object, independently of the previous face. The user ends up with a plurality of distinct, independent representations of the different faces measured. This can be of limited use to a user in subsequent analytical steps. Even when displayed graphically, it can be difficult for a user to associate representations with their corresponding actual face of the object, and this can cause confusion and present difficulties to the user, especially for novice users. Such a situation is shown and described below in connection with Figure 21.
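By way of illustration only, the automatic form-recognition step described above can be pictured with a minimal sketch. The three-point plane construction and the tolerance value below are illustrative assumptions, not details taken from EP0254515 or the MODUS software:

```python
import math

def fit_plane(points):
    """Fit a plane through the first three points and return
    (unit normal, point on plane). Assumes the points are not collinear."""
    p0, p1, p2 = points[0], points[1], points[2]
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],          # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n], p0

def looks_planar(points, tolerance=0.01):
    """Crude form recognition: accept 'plane' if every remaining point
    lies within `tolerance` of the plane through the first three points."""
    normal, origin = fit_plane(points)
    for p in points[3:]:
        dist = abs(sum(normal[i] * (p[i] - origin[i]) for i in range(3)))
        if dist > tolerance:
            return False
    return True

# Four coplanar points (z = 0), then the same set plus an off-plane point.
flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
print(looks_planar(flat))                     # True
print(looks_planar(flat + [(0.5, 0.5, 1)]))   # False
```

A production recogniser would try several candidate forms (plane, cylinder, sphere, cone) and use a least-squares fit rather than the first three points, but the accept/reject-by-residual logic is the same.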

The present invention relates to improvements in building a model of an object, e.g. of an object being inspected, or to be inspected, by a coordinate positioning apparatus. According to a first aspect of the invention there is provided a computer implemented method for generating a model of an object (e.g. an object measured or to be measured by a measurement device). The method comprises generating from point position data concerning a plurality of faces of the object a (specific/distinct) representation for each face. Said generating of the representations can comprise determining a particular (e.g. geometrical) form for the representation. Said generating of the representations can also comprise determining the interrelationship of the representations.

As mentioned above, existing techniques such as that disclosed in EP0254515 treat each representation of a face of the object independently. This results in the user being presented with a set of unrelated and disjointed representations. This can make it difficult for the user to navigate their way around the model when using the model to measure aspects of the object (e.g. the distance between different features). Not only this, but sometimes the faces are hidden from view behind other previously measured faces (e.g. cylinders may not be visible because they are hidden behind a plane, even though in real life the open end of the cylinder is at the plane). In contrast, the present invention determines how the representations of the measured faces of the object are related. This means that a model which resembles the object being measured/to be measured can be generated, thereby improving the ease and efficiency of navigation and measurement of features.

As will be understood, said determining of a particular (geometrical) form for the representation can be done automatically. Said determining of a particular (geometrical) form for the representation can comprise deciding on a particular (geometrical) form for the representation. Said determining of the interrelationship of the representations could be done automatically. Said generating of the representations can also comprise deciding on the interrelationship of the representations. Either, or both, of such steps can comprise seeking user confirmation that the determination/decision is correct. The representations can comprise a geometrical form, e.g. such as a plane, cylinder, cone, pyramid, sphere, or NURBS surface. A representation can comprise a geometrical representation of a face. A representation can comprise a graphical representation of a face. The model can comprise a computer model, e.g. a virtual model, for instance a 3D model, such as a computer-aided design (CAD) model.

The method can comprise determining how to configure the representations so as to resemble the object (e.g. visually resemble; in other words resemble the appearance of the object). Determining the interrelationship of the representations can be useful for determining how to configure the representations so as to resemble the object.

A representation can also comprise a boundary/border (as opposed to extending to infinity in at least one dimension). The method (e.g. said generating of the representations) can comprise determining the boundary(ies)/border(s) for a representation. The method can comprise determining the interrelationship of the boundaries/borders of the representations. Determining the interrelationship of the representations can be useful for configuring boundaries of representations (of adjacent/adjoining representations) to be coterminous. Accordingly, determining the interrelationship of the representations could comprise determining how to configure the boundaries/borders of the representations so as to (e.g. visually) resemble the object.

For example, the method could comprise determining a shared boundary/border line for at least part of the boundaries/borders of adjacent representations. That is, the method could comprise determining how to configure boundaries/borders of adjacent representations such that they are at least in part coterminous.

Accordingly, determining the interrelationship of the representations could comprise determining how to configure the boundaries/borders of adjacent representations such that they share a boundary/border line for at least part of their boundary/border. As will be understood, the boundary/border of a given representation might be shared with another representation along only part of the given representation's entire boundary/border. For example, the boundary of a given representation could be shared along only one side. Optionally, the entire boundary of a representation could be shared with an adjacent representation. As will be understood, a representation might have more than one boundary (e.g. a plane having a hole in it has an outer and an inner boundary).

The method can comprise identifying where said representations meet each other, e.g. abut each other. The method can comprise identifying where said representations intersect each other. The method can comprise linking said representations to each other, e.g. along shared boundaries, for example along a line at which they meet/abut, for example along an identified line of intersection. The method can comprise changing a property of at least one of said representations, e.g. so as to fit with at least one other representation. That is, determining the interrelationship of the representations could comprise changing a property of at least one of said representations, e.g. so as to fit with at least one other representation. In other words, the method can comprise (e.g. determining the interrelationship of the representations can comprise) changing the configuration of at least one of said representations. Optionally, incorporating into a model of said object a representation of a particular face of the object can comprise changing a property of (e.g. the configuration of) at least one of said representations, e.g. so as to fit with at least one other representation. This could comprise changing the orientation of the representation. Accordingly, the property could comprise the orientation of the representation. This could comprise changing the boundary of the representation in at least one dimension. Accordingly, the property could comprise the boundary of a representation in at least one dimension. Changing of the configuration/property of a representation could be such that where a representation meets/intersects another representation, at least one of said representations is configured such that they share a common boundary line (e.g. along the line at which they meet/intersect).
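As an illustration of identifying a shared boundary line (the application does not prescribe any particular geometry routine), the line along which two intersecting planar representations meet is perpendicular to both plane normals, so its direction can be taken from their cross product:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def plane_intersection_direction(n1, n2, eps=1e-9):
    """Direction of the line along which two planes with unit normals
    n1, n2 meet; None if the planes are parallel (no shared line)."""
    d = cross(n1, n2)
    if all(abs(c) < eps for c in d):
        return None   # parallel planes: no common boundary line
    return d

# A horizontal face (normal +z) meeting a vertical face (normal +x):
# the shared boundary line runs along the y axis.
print(plane_intersection_direction((0, 0, 1), (1, 0, 0)))   # (0, 1, 0)
print(plane_intersection_direction((0, 0, 1), (0, 0, 1)))   # None
```

With the line direction (and a point on the line, found by solving the two plane equations), the boundaries of both representations can be trimmed or extended to be coterminous along it.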

The method (e.g. changing said property/configuration) can comprise determining material and non-material faces of a representation. The method (e.g. changing said property/configuration) can comprise removing any non-material faces of a representation.

Changing said property/configuration can comprise at least one of: i) changing the location of the boundary of the representation in at least one dimension (e.g. changing the extent/size of the representation); ii) changing the shape of said representation's boundary; iii) determining a new (e.g. additional) boundary of a representation (for instance, when a cylinder intersects a plane, thereby defining an opening in the plane); iv) removing a boundary; and v) changing the orientation of the representation. Accordingly, the method can comprise changing the extent/size of said representation.

The method can comprise changing a property/configuration of said representation, up to a predetermined limited extent. For example, any change to a boundary of a representation could be limited up to a predetermined extent. For instance, the change in location of a boundary (e.g. the change in size of a representation) can be limited up to a predetermined extent. The predetermined limitation could be absolute or relative, depending on the circumstances. For example, the maximum permitted change in the location of a boundary (e.g. the maximum permitted change in the size of a representation) can depend on the size of the representation as originally determined.

Deciding on a geometrical form to represent a face of the object can comprise selecting a geometrical form from one or more of a plurality of different types of predefined geometrical forms. For example, this can comprise selecting a geometrical form from a set of predefined geometrical forms comprising two or more of: a plane, cylinder, cone, pyramid, sphere, and prism. Deciding on a geometrical form to represent a face of the object can comprise using contextual information associated with position data (as explained in more detail below).
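One way to picture selecting from a set of predefined forms is to fit each candidate and keep the one with the smallest residual. The candidate set and the point-to-surface distance functions below are illustrative assumptions, not taken from the application:

```python
def rms(values):
    """Root-mean-square of a list of residuals."""
    return (sum(v * v for v in values) / len(values)) ** 0.5

def choose_form(points, candidates):
    """Pick, from predefined geometrical forms, the one whose (already
    fitted) surface leaves the smallest RMS residual over the points.
    `candidates` maps a form name to a point-to-surface distance function."""
    return min(candidates,
               key=lambda name: rms([candidates[name](p) for p in points]))

# Points lying on the unit sphere fit a sphere better than any plane.
pts = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (-1, 0, 0), (0, -1, 0)]
candidates = {
    "sphere": lambda p: abs((p[0]**2 + p[1]**2 + p[2]**2) ** 0.5 - 1.0),
    "plane":  lambda p: abs(p[2]),   # distance from the z = 0 plane
}
print(choose_form(pts, candidates))   # sphere
```

Contextual information (e.g. the probe approach direction) could additionally be used to break ties or reject forms that are inconsistent with how the points were taken.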

Said point position data could be obtained (or are to be obtained) by a measurement probe. Accordingly, the point position data can be data determined from a measurement of an object. Optionally, the point position data can comprise points derived from an inspection plan for a measurement apparatus. For instance, said point position data can comprise those points specified to be measured in an inspection plan.

Said point position data could be obtained (or are to be obtained) by a contact measurement probe. The probe can comprise a stylus, e.g. a deflectable stylus. For example, the probe can comprise a probe body and a stylus extending therefrom (e.g. which is deflectable relative to the probe body). The probe could provide an output which is merely indicative of when the stylus is (or is not) in contact with the object (e.g. is (or is not) deflected). Accordingly, said point position data could be obtained (or are to be obtained) by what is commonly known as a touch-trigger measurement probe. Optionally, the probe could provide an output which is variable dependent on the extent of deflection of the stylus in at least one dimension. In other words, the probe could provide a measure of the extent of deflection of the stylus. Accordingly, said point position data could be obtained (or are to be obtained) by what is commonly known as a scanning probe (also known as an analogue probe). The measurement device can comprise a coordinate positioning apparatus. For example, the measurement device can comprise a coordinate measuring machine (CMM). The coordinate positioning apparatus could comprise a Cartesian or non-Cartesian positioning apparatus. The method can comprise displaying a visual representation of said model on a display device. Optionally, the model can be interrogated by an operator.

Accordingly, optionally the method comprises receiving (and responding to) a request for dimensional information concerning the object. The method can be configured to output to an operator a visual representation of said geometrical form that has been determined to represent said particular face. The method could be configured to receive an operator's acceptance or rejection of said geometrical form. This could be done before incorporating the geometrical form into the model of said object. The method could be configured such that if the geometrical form is rejected, the method continues to accept points and provide another visual representation of a geometrical form from the new set of points (which could, for example, include the original points).

The method could comprise incorporating into a model of said object a representation of a particular face of the object. This can comprise using point position data (e.g. a group of point position data) associated with said particular face of the object to determine (e.g. decide on) the geometrical form of the representation. This can further comprise determining how said representation should relate to (e.g. interact with) representations of other faces of the object (e.g. determine their interrelationship). This can comprise determining how to configure (e.g. alter) one or more representations (for example, as described above change one or more properties of one or more representations). The method can comprise, subsequent to said incorporating of said representation of a particular face, and on the basis of further data associated with another particular face of the object, incorporating into a model of said object a representation of said other particular face of the object. (Said further data could be received (e.g. obtained by a measurement device) after the receipt (e.g. after the obtaining) of the aforesaid point position data, and for example after the aforesaid representation of said particular face has been incorporated into the model). This can comprise using said further data to determine (e.g. decide on) the geometrical form of the representation of said other particular face, and optionally determining how said representation of said other particular face should interact with representations of other faces of the object so as to resemble the object. This can comprise determining how to configure (e.g. alter) one or more representations (for example, as described above change one or more properties of one or more representations).
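The face-by-face incorporation flow described above can be sketched at a high level. The `Face`/`Model` names and the strategy callbacks are hypothetical scaffolding; a real system would plug in the form recognition and boundary reconciliation routines discussed elsewhere in this application:

```python
from dataclasses import dataclass, field

@dataclass
class Face:
    form: str      # e.g. "plane", "cylinder"
    points: list   # the group of point position data for this face

@dataclass
class Model:
    faces: list = field(default_factory=list)

    def incorporate(self, points, recognise, interrelate):
        """Face-by-face incorporation: decide the geometrical form for the
        new group of points, add the representation to the model, then let
        the model reconcile the new face with the representations already
        present (e.g. trim/extend boundaries so adjacent faces meet)."""
        face = Face(form=recognise(points), points=points)
        self.faces.append(face)
        interrelate(self.faces)
        return face

# Placeholder strategies: a real recogniser would fit geometry, and a real
# interrelate step would adjust boundaries; here both are stubs.
model = Model()
model.incorporate([(0, 0, 0), (1, 0, 0), (0, 1, 0)],
                  recognise=lambda pts: "plane",
                  interrelate=lambda faces: None)
print([f.form for f in model.faces])   # ['plane']
```

Note that each call to `incorporate` may reconfigure previously added faces, which matches the description above of altering one or more existing representations when a new face arrives.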

Accordingly, the representations for said faces can be determined and incorporated into the model in a face-by-face manner. Optionally, the points can be obtained (e.g. measured) in a face-by-face manner. Optionally, the point position data is obtained manually. For example, the point position data could be input manually. For instance, the point position data could comprise measurement data obtained by an operator manually driving a measuring device so as to obtain the measurement data. Optionally, the point position data comprises measurement data obtained from a probe which is manually driven into contact with the object, e.g. (but not necessarily) under the control of an electronic user input device, such as a joystick. Data obtained by a probe under manual control can be less accurate than data obtained by a probe controlled via an automated measurement process. However, it has been found that a model generated using the invention is still good enough to enable navigation around an object to interrogate it and/or create re-runs/new measurement programs to measure the object more accurately.

Accordingly, the method can comprise, for a plurality of faces of an object, receiving measurement data (e.g. a group of measurement points) concerning a particular face of an object which has been obtained via the manual control of a measurement device (e.g. contact measurement device), automatically determining the geometrical form of a representation for said face, and automatically incorporating said representation into a model of said object. Said incorporating can comprise adjusting the boundary of said representation of said measured face, and/or the boundary of at least one previously determined representation of a previously measured face. As described above, this could be done (e.g. sequentially) in a face-by-face manner.

Point position data could comprise the location in space (e.g. either virtual or actual, such as within a measurement volume) of a point. The point position data can comprise, for example, coordinate data. Coordinate data could comprise Cartesian or non-Cartesian coordinate data. Point position data can comprise measurement data concerning an object measured by a measurement device, e.g. measurement data obtained by a measurement device, such as a measurement probe. Point position data can comprise contextual information. For example, point position data can comprise information concerning the direction from which a point was measured (e.g. by a measurement probe). For example, the point position data can comprise information concerning the direction in which a contact probe was moved in order to obtain measurement data. The method (e.g. said determining the geometrical form) can comprise analysing the relative positions of the points. Optionally, the method (e.g. said determining the geometrical form) can comprise analysing contextual information associated with said point position data. The point position data can be grouped, e.g. into groups corresponding to particular faces of the object measured. Accordingly, the method can comprise analysing a group of point position data to decide on a particular geometrical form for a representation of said face. The point position data within a group could be assumed, known and/or identified as relating to a particular face of the object. The generating of representations of faces of the object could take place in a face-by-face manner, e.g. as faces are measured. Point position data could be obtained (e.g. measured) in a face-by-face manner. The number of points associated with the face of the object could be not more than 50, optionally not more than 20 points, for example not more than 10 points, for instance not more than 6 points. Accordingly, a group of point position data could comprise not more than 50 points, optionally not more than 20 points, for example not more than 10 points, for instance not more than 6 points.
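A minimal data-structure sketch of point position data with contextual information, grouped per face (the field names are illustrative assumptions, not from the application):

```python
from dataclasses import dataclass

@dataclass
class MeasuredPoint:
    """One touch point: a position in the measurement volume plus
    contextual information (the direction the probe was moved to
    take the point)."""
    x: float
    y: float
    z: float
    move_direction: tuple   # unit vector, e.g. (0.0, 0.0, -1.0)

# Point position data grouped by the measured face of the object.
groups = {
    "top_face": [
        MeasuredPoint(0.0, 0.0, 5.0, (0.0, 0.0, -1.0)),
        MeasuredPoint(1.0, 0.0, 5.0, (0.0, 0.0, -1.0)),
    ],
}
print(len(groups["top_face"]))   # 2
```

The move direction is useful contextual information: for a contact probe it indicates which side of the fitted surface is material, which feeds the material/non-material face determination described above.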

As will be understood, any one, any combination, or all of the above mentioned steps can be performed automatically. Optionally, this can additionally comprise the computer implemented method seeking user confirmation of any one, any combination, or all of the determinations/decisions made (e.g. such as the geometrical form, boundaries, changes to boundaries, and/or identified lines of intersection, etc.).

The method can comprise using heuristic techniques for determining the geometrical form of a representation and/or how it interrelates to other representations. For example, the determining of the form/shape/orientation of a representation can be based on how the points were taken. For example, for a planar feature, the first and second points could be used to determine the length of the geometrical form (e.g. the length of a rectangular geometrical form) and the third and fourth points could be used to determine the breadth of the geometrical form (e.g. the width of the rectangular geometrical form). As another example, the order in which the faces of the object are measured (and, for example, the order in which the representations are determined) could be used to determine the interrelationship of the representations, e.g. to resolve geometric ambiguity.
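The point-order heuristic for a planar feature described above can be sketched directly. This is an illustrative reading of that heuristic, not a prescribed algorithm:

```python
def rectangle_from_point_order(points):
    """Heuristic sizing of a rectangular planar representation from the
    order in which the points were taken: the first and second points
    set the length, the third and fourth set the breadth."""
    p1, p2, p3, p4 = points[:4]
    length = ((p2[0] - p1[0])**2 + (p2[1] - p1[1])**2 + (p2[2] - p1[2])**2) ** 0.5
    breadth = ((p4[0] - p3[0])**2 + (p4[1] - p3[1])**2 + (p4[2] - p3[2])**2) ** 0.5
    return length, breadth

pts = [(0, 0, 0), (8, 0, 0),   # first two points span the length
       (2, 0, 0), (2, 3, 0)]   # next two points span the breadth
print(rectangle_from_point_order(pts))   # (8.0, 3.0)
```

Similarly, measurement order can act as a tie-breaker: when two interpretations of how a new face meets existing faces are geometrically plausible, preferring a relationship with the most recently measured face is one simple way to resolve the ambiguity.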

Accordingly, the present invention provides a computer implemented method for generating a three-dimensional computer model of an object measured by a measurement machine. The method can comprise receiving measurement points obtained or to be obtained, e.g. by a measurement probe mounted on the measurement machine. The method can comprise generating a three-dimensional computer model of said object from said measurement points. The three-dimensional computer model could comprise a plurality of geometrical representations which are arranged such that the three-dimensional computer model resembles the object. As explained above, the contact measurement probe could comprise a touch-trigger probe. Optionally, the contact measurement probe comprises a scanning/analogue probe.

Accordingly, this application describes a computer implemented method for generating a model of an object comprising a plurality of faces measured or to be measured by a measurement device. The method could comprise incorporating into a model of said object a representation of a particular face of the object. This can comprise using point position data associated with said particular face of the object to decide on the geometrical form of the representation. The method can further comprise determining how said representation should interact with representations of other faces of the object so as to resemble the object. Accordingly, this application describes a computer implemented method for generating a model of an object comprising a plurality of features measured or to be measured by a measurement machine. The method can comprise receiving a group of measurement points concerning a feature of the object. The method can further comprise analysing said measurement points to decide on a shape to represent the feature of the object. The method can further comprise incorporating said shape into a model of said object comprising a plurality of other shapes representing other features of the object. Said incorporating can comprise determining how the shapes should interact with each other so as to resemble the object.

According to a second aspect of the invention there is provided an apparatus (e.g. comprising at least one processor device) configured to perform any of the above described methods. For example, the apparatus can be configured to generate a model of an object (e.g. an object measured or to be measured by a measurement device) from point position data concerning a plurality of faces of the object. The apparatus can, for example, be configured to generate a (distinct/specific/separate) representation for each face. The apparatus can be configured to determine a particular geometrical form for the representation as well as the interrelationship of the representations. The apparatus could comprise a measurement apparatus, for example a coordinate positioning apparatus, for instance a coordinate measuring machine. The apparatus could comprise a measurement probe, for instance a contact probe. The apparatus could comprise at least one controller for communicating instructions to the measurement apparatus. The apparatus could comprise one or more processor devices (e.g. a personal computer or the like) in communication with a controller. The one or more processor devices (e.g. a personal computer or the like) in communication with a measurement device, e.g. in communication with a measurement device's controller, can be configured to perform any of the above described methods. The one or more processor devices could be configured to receive point position data (e.g. measurement data) from a measurement device (e.g. via a controller). The apparatus can comprise one or more manual input devices (e.g. a joystick) via which an operator can control the measurement device to obtain point position data.

According to a third aspect of the invention there is provided a computer program (e.g. a computer program product) comprising instructions which, when executed by at least one processor device, cause the at least one processor device to perform any of the above described methods. According to a fourth aspect of the invention there is provided a computer readable medium comprising instructions which, when executed by at least one processor device, cause the at least one processor device to perform any of the above described methods.

Embodiments of the invention will now be described, by way of example only, with reference to the following drawings, in which:

Figure 1 is a schematic system hardware diagram illustrating an example situation in which the invention might be used;

Figure 2 is a schematic software architecture diagram illustrating an example set-up in which the invention might be used;

Figure 3 is a schematic process diagram illustrating an overview of a process incorporating the invention, from point acquisition to display of a graphical model on a screen;

Figure 4 is a schematic process diagram illustrating an example of the way in which the computer program decides how to update the model of the object;

Figure 5 illustrates an example object to be measured;

Figures 6 to 18 illustrate the stages in the generation of a model according to the invention via the measurement of different faces of the object shown in Figure 5, where in particular: Figures 6a and 6b illustrate provisional and accepted representations of the top face of the object of Figure 5;

Figures 7a, 7b and 7c illustrate the provisional, expanded, and accepted representations of the front face of the object of Figure 5; Figures 8a, 8b and 8c illustrate the provisional, expanded, and accepted representations of the upper cylindrical face of the bore in the top face of the object of Figure 5; Figures 9a and 9b illustrate the provisional and accepted representations of the face defining the shelf in the bore in the top face of the object of Figure 5;

Figures 10a and 10b illustrate the provisional and accepted representations of the lower cylindrical face of the bore in the top face of the object of Figure 5;

Figures 11a and 11b illustrate the provisional and accepted representations of the main shaft face of the boss located in the bore in the top face of the object of Figure 5; Figures 12a and 12b illustrate the provisional and accepted representations of the bottom face of the bore in the top face of the object of Figure 5;

Figures 13a and 13b illustrate the provisional and accepted representations of the frusto-conical face of the boss of the bore in the top face of the object of Figure 5;

Figures 14a and 14b illustrate the provisional and accepted representations of the planar top face of the boss of the bore in the top face of the object of Figure 5;

Figures 15a and 15b illustrate the provisional and accepted representations of the planar top face of the corner chamfered face feature that extends between the top, front and side faces of the object of Figure 5;

Figures 16a and 16b illustrate the provisional and accepted representations of the part cylindrical face of the left-hand bore in the front of the object of Figure 5;

Figures 17a and 17b illustrate the provisional and accepted representations of the bottom face of the two bores in the front of the object of Figure 5; Figures 18a and 18b illustrate the provisional and accepted representations of the part cylindrical face of the right-hand bore in the front of the object of Figure 5; Figure 19 illustrates the final model of the object determined by the measurement of the above mentioned features;

Figure 20 illustrates the way in which measurement data can be extracted from, and shown with respect to, the model of the object; and

Figure 21 illustrates representations of the same faces of the object of Figure 5 obtained via a prior art technique.

Referring to Figure 1, there is shown a coordinate positioning apparatus 2 according to an embodiment of the invention, comprising a coordinate positioning machine in the form of a coordinate measuring machine ("CMM") 14, a controller 4, having connected to it a joystick 6 and a computer 8 having display 12 and input device(s) 15 (e.g. keyboard, mouse, etc). The CMM 14 comprises a platform 20 onto which an object 10 to be inspected can be placed and a bridge 22. The bridge 22 can be moved along the platform 20 in one linear dimension (in this case labelled the "y" axis) via motors (not shown) under the control of the controller 4. The bridge 22 carries a quill 18 which can be moved along the bridge (in this case labelled the "x" axis) and also perpendicularly to the y and x axes (i.e. along the "z" axis as shown) via motors (not shown) under the control of the controller 4. In particular, the quill 18 is mounted on a carriage 13 which in turn is mounted on the bridge 22 which in turn is mounted on the platform 20. The bridge 22 is driveable in the Y direction relative to the platform 20 by motors in a known manner, the carriage 13 is driveable in the X direction relative to the bridge 22 (and hence relative to the platform 20) by motors in a known manner, and the quill 18 is driveable in the Z direction relative to the carriage 13 (and hence relative to the bridge 22 and platform 20) by motors in a known manner. Accordingly, the quill 18 is moveable in three orthogonal dimensions, X, Y and Z relative to the platform 20.

The quill 18 carries a head 16 which in turn carries a probe 26 which has a stylus 28. The head 16 is articulated in that it has bearings and motors (not shown) that facilitate rotation of the probe 26 and hence stylus 28 about first and second orthogonal axes (shown as "Al" and "A2" in Figure 1) under the control of the controller 4. The CMM 14 comprises position encoders (not shown) which report the position of the bridge, the quill and the probe in each of the three linear and two rotational degrees of freedom to the controller 4. From this, the controller 4 (and/or software within the PC 8) can determine the tip centre point. Additional information reported by the CMM 14 (and also the controller 4 to the PC 8) can include stylus tip radius information (e.g. determined from a calibration routine), probe 26 orientation, and the move direction (e.g. the direction in which the probe 26 was being moved at the point the measurement was taken).
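The determination of the stylus tip centre point from the reported axis positions is, in essence, a forward kinematics calculation over the three linear axes and the two head angles A1 and A2. The following is a minimal sketch of such a calculation; the rotation convention chosen here (A1 about the machine Z axis, A2 tilting the stylus away from straight down) and the assumption that (x, y, z) locates the head's rotation centre are illustrative assumptions, not the actual convention of the controller 4.

```python
import math

def stylus_tip_position(x, y, z, a1, a2, stylus_length):
    """Sketch of the stylus tip centre point for a 5-axis head.

    Assumed convention (hypothetical): (x, y, z) is the head's rotation
    centre; a1 rotates about the machine Z axis; a2 tilts the stylus away
    from straight down (-Z); angles are in radians.
    """
    # Unit direction of the stylus after the two head rotations.
    dx = math.sin(a2) * math.sin(a1)
    dy = -math.sin(a2) * math.cos(a1)
    dz = -math.cos(a2)
    # The tip centre sits one stylus length along that direction.
    return (x + stylus_length * dx,
            y + stylus_length * dy,
            z + stylus_length * dz)
```

With both head angles at zero the stylus hangs straight down, so the tip sits one stylus length below the rotation centre.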

In the described embodiment, the coordinate positioning machine 14 is a serial CMM (i.e. in which the three linear degrees of freedom are provided by three independent, orthogonal axes of motion, arranged in series, in this case with an additional two rotational degrees of freedom thereby making the machine a so-called 5-axis machine). However, as will be understood, the invention can also be used with other types of measuring machines, e.g. different types of coordinate positioning machines, such as parallel CMMs, manual or automatic (robot) measuring arms or the like. The invention can also be used with machines other than dedicated CMMs; for example, it can be used with coordinate positioning machines such as machine tools. Furthermore, as will be understood, the invention is also suitable for use with Cartesian and non-Cartesian positioning machines, such as polar and spherical coordinate positioning machines. In the embodiment described, the probe 26 is a contact touch-trigger probe which issues a signal when contact is detected between the probe 26 (and in particular the stylus 28, and more particularly the stylus tip) and the object 10. An example of such a probe is described in more detail in GB 1445977. The controller 4 receives the signal and in response can determine and report the position, orientation and any other relevant measurement information to the PC 8. The probe 26 can be moved about and into the object in a number of ways. For example, the user can move the probe 26 using the joystick 6. Optionally, in some embodiments, the user might be able to directly manipulate the probe 26 itself by manually manipulating the probe 26, head 16 and/or another part of the CMM. Other possible example embodiments could include the user entering commands via a keyboard (e.g. either directly into the controller 4 or into the PC 8), or the controller executing a prewritten measurement program.

In a typical situation, especially in the case of an unknown object (e.g. an object for which the controller 4 and/or PC 8 has no computer drawing/model and/or for which there is no pre-written measurement program) the user will at least initially use the joystick to manually control the probe so as to acquire a few measurement points on the object 10. Even in the case of a known object, the user might initially use a joystick to take a few points on the object 10 as part of an object registration process. In accordance with an embodiment of the invention, the PC 8 receives such measurement points and from them determines and provides a graphical model of the object on the screen 12.

Referring now to Figure 2, there is shown a schematic illustration of the software architecture of the computer 8. As shown, the software in the computer 8 can, at least conceptually, be distilled into a number of modules. As will be understood, additional software can be present, and only those modules relevant to the example of the invention are shown in Figure 2 and described below. In this particular example, the computer 8 comprises two main programs running concurrently, a Universal CMM Controller ("UCC") server program 30 and a measurement program 40. As will be understood, additional programs 50 might also be running on the computer 8, either concurrently or at different times to the UCC server 30 and the measurement program 40 (and these may or may not communicate with each other and/or the UCC server 30 and/or the measurement program 40). For example, an underlying operating system may be running on the computer 8. The UCC server 30 can be a known piece of software which is essentially an interface between the controller 4 and the measurement program 40 (and optionally other programs 50) running on the computer 8. As will be understood, it might be that the measurement program 40 (and/or other programs 50) can communicate directly with the controller 4, in which case the UCC server 30 program might be bypassed, or not needed at all.

The measurement program 40 comprises, in no particular order, a communications (comms) module 41, an analysis module 42 which itself comprises a model builder component 43 and a geometrical form recognition component 44, a model module 45, a face representation database module 46, a path generator 47, a three dimensional (3D) renderer module 48 and a user interface (U/I) module 49.

The comms module 41 acts as an interface between the measurement program 40 and the UCC server 30. For example, the comms module receives and passes measurement results from the UCC server 30 to the analysis module 42. It can also, for example, pass movement instructions from the path generator 47 to the UCC server 30.

The analysis module's 42 geometrical form recognition component 44 is configured to analyse the measurement data associated with a particular face of the object to decide on the geometrical form suitable for the representation of the face of the object. For example, based on said measurement data associated with a particular face, should a plane, cylinder, sphere, cone, NURBS surface, etc. be used for the representation of the face? The model builder component 43 is configured to determine a geometrical model of the object using the recognised geometrical forms, as described in more detail below. Details associated with the representation for a particular face of an object are stored in the face representation database module 46 (for example, the determined geometrical form, the measured points and the direction of measurement associated with the points are stored). Data (for example, but not necessarily, CAD data) representing a geometrical model of the object is stored in the model module 45. As will be understood, in an alternative embodiment, the information stored in the face representation database 46 could be stored in the model module 45, rather than or in addition to being stored in a separate database. Furthermore, in an alternative embodiment, it might not be necessary to store such information concerning individual faces. Rather, only the model could be required.
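In the simplest case, the decision made by a geometrical form recognition component could be reached by testing candidate forms against the measured points and accepting a form whose fit residual is small. The sketch below illustrates this for a plane only, with an invented tolerance; it is an illustrative assumption, not the actual logic of component 44, and a fuller implementation would similarly test cylinders, spheres, cones, etc.

```python
import math

def plane_fit_residual(points):
    # Plane through the first three points (assumed non-collinear); the
    # residual is the largest out-of-plane distance over all points.
    p0, p1, p2 = points[0], points[1], points[2]
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    # Plane normal = u x v, normalised.
    n = (u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0])
    length = math.sqrt(sum(c*c for c in n))
    n = tuple(c / length for c in n)
    return max(abs(sum(nc*(pc - oc) for nc, oc, pc in zip(n, p0, p)))
               for p in points)

def recognise_form(points, tol=0.01):
    # Accept a plane once four or more points fit it within tolerance.
    if len(points) >= 4 and plane_fit_residual(points) < tol:
        return "plane"
    return None  # not enough evidence yet: keep collecting points
```

A real recogniser would also weigh the probe move directions, as the description notes, rather than point positions alone.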

The 3D renderer module 48 determines how to render and display the geometrical model stored in the model module 45 as well as other graphical representations (such as measured points to be displayed on the screen 12), and the U/I module acts as an interface between the measurement program 40 and any input and output devices (e.g. the display 12).

In this embodiment, the measurement program 40 also includes a path generator module 47 which can be used to generate subsequent measurement paths, e.g. scan paths, using the data stored in the feature database 46 and/or model module 45. The path generator 47 could use known techniques (e.g. as described in WO2009/024783) for automatically generating instructions for controlling a CMM to measure the object modelled using the technique of the invention. Various arrows are displayed in Figure 2 in order to illustrate various links between the modules. As will be understood, such arrows are merely illustrative to aid understanding, and additional or alternative lines of communication may be present as appropriate. Furthermore, as will also be understood by a person skilled in the art, the architecture of Figure 2 is merely one exemplary way of implementing the concept of the invention, and there are many other equally viable ways. An example process 300 incorporating the invention is illustrated in Figure 3. This will be explained with reference to Figures 4 to 20.

The example process 300 begins at step 302 where an object 10 to be inspected is placed on the CMM's platform 20. An example object 400 which will be used to help illustrate the invention is shown in Figure 5. As will be understood, various set up procedure(s) and/or calibration step(s) might also be performed at this stage (and/or prior to step 302) in line with normal and well established procedures when measuring an object on a CMM. For example, a probe, such as the touch trigger probe 26, will at some point have to be loaded onto the CMM 14 and calibrated, e.g. by taking measurements of a known, calibration object.

At step 304 of this example, a single point on a face of the object is measured. For example, a single point on the object's 400 top face 406 is obtained by an operator using the joystick 6 to manually drive the probe's 26 stylus tip into the top face 406 until the probe's stylus 28 is deflected from its rest position, which causes the probe 26 to output a trigger signal. The position of the probe 26 (and hence the tip of the probe's stylus) is measured by the CMM's position encoders and reported to the controller 4 which in turn passes the measurements to the measurement program 40 via the UCC server 30. In particular, the measurements are received and processed by the analysis module 42 and a corresponding point representing the point actually measured is displayed on the screen 12. At the same time, the geometrical form recognition component 44 determines at step 308 if a geometrical form can be recognised from the current and previously obtained points. For example, the geometrical form recognition component 44 uses logic such as that explained in EP0254515 to establish if the current point along with previous points define a predefined geometrical form such as a plane, cylinder (e.g. internal such as a bore, or external such as a boss), sphere, etc. As will be understood, along with measurement data regarding the position of the probe at the point the measurement was obtained, additional data such as the direction in which the probe was moved to obtain the measurement can be received and used by the geometrical form recognition component 44.

If no geometrical form can be recognised at this stage (e.g. because, as is the case here, only a single point has been measured), the process returns to step 304 at which another point is obtained by the operator and displayed on the screen. This loop continues until sufficient points and information have been obtained in order for the geometrical form recognition component 44 to recognise a geometrical form. Once a geometrical form has been recognised, the geometrical form recognition component 44 generates a provisional representation of the face just measured, based on the determined geometrical form, which is then displayed on the screen 12 at step 310. This provisional representation of the face has a provisional boundary. The operator is prompted at step 312 to confirm/accept or reject the recognised geometrical form. If rejected, the process returns to step 304 at which another point is taken. Optionally, as illustrated by step 313, previously measured points could be removed by the operator (in case it appears that some points are adversely affecting the geometrical form recognition stage) by selecting the points to delete on the screen and instructing their deletion (and/or a point "undo" functionality could be provided). If the operator confirms/accepts the geometrical form, then at step 314 the model builder component 43 incorporates a representation of the face into the model of the object. The updated model is stored in the model module 45 which in turn is displayed on the screen 12. If the operator has measured all the desired faces of the object, then the operator can choose to finish the process. Alternatively, additional points can be measured, thereby bringing the process back to step 304.

An example process of constructing the model (step 314) is illustrated in more detail in connection with Figure 4. The process begins with the geometrical form recognition component 44 feeding the confirmed/accepted recognised geometrical form into the model builder component 43 at step 350. In particular, in this embodiment, this involves the geometrical form recognition component 44 feeding the above mentioned provisional representation of the face into the model builder component 43. The model builder component 43 then at step 352 intersects said provisional representation of the face fed to it with existing representations already present in the model. This is done by changing (in this case extending) the boundary of the representation of the face (as explained above, the boundary was provisional). The boundary of the representation could be extended to infinity, but it has been found preferable to expand the boundary by a limited extent according to a set of predefined rules (which could, for example, be heuristically determined). For instance, if the representation of the face comprises a rectangular plane, then the size of the rectangle could be increased by an amount that is dependent on the size of the determined rectangle. The change (e.g. expansion/contraction) of the representation could also be based on how the points on the face were measured and/or other faces that have already been measured. For example, if the representation of the face comprises a cylindrical bore, the points for which were obtained by measuring through a previously determined planar face, then the bore could be extended up to said planar face. Specific examples of different ways in which the boundary of a representation of a face can be changed are illustrated below.
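A size-dependent expansion rule of the kind described above, growing a rectangular representation by an amount that depends on its determined size, could be sketched as follows; the growth factor and the cap on the margin are invented for illustration and would, as the text notes, be chosen heuristically.

```python
def expand_rectangle(width, height, factor=0.25, max_margin=10.0):
    """Hypothetical expansion rule for a rectangular planar representation.

    Each side grows by a margin proportional to that side's length,
    capped at max_margin, so small faces grow a little and large faces
    never grow without bound.
    """
    margin_w = min(width * factor, max_margin)
    margin_h = min(height * factor, max_margin)
    # The margin is applied on both sides of the rectangle.
    return (width + 2 * margin_w, height + 2 * margin_h)
```

For example, a 4 x 2 rectangle would be expanded to 6 x 3 under these hypothetical parameters.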

At step 354 the point/line of intersection of the newly determined representation of a face with existing (previously recognised) representations of other faces is established and they are stitched together, e.g. linked in the (e.g. CAD) model so that adjacent representations of faces and their common edge are known. At step 356 the material and non-material surfaces of adjacent representations are determined. Such determination could comprise, for example, determining which parts can be identified as a legal surface, and which are impossible surfaces. This could, for example, comprise taking into consideration the direction of measurement obtained for measurement points associated with a representation. Then at step 358, the model stored in the model module 45 is updated and displayed on the screen.

The process of Figures 3 and 4 will now be described with reference to an example, illustrated in Figures 5 to 20. Figure 5 illustrates an example object 400 to be inspected. The operator desires to know various critical measurements about the object 400. For example, in this example the operator wants to know the diameter "D" of a boss 402 that is located within a bore 404 in the top face 406 of the object. Also, the operator wants to know the distance "d" between the centres of overlapping first 408 and second 410 bores in the front face 412 of the object. Furthermore, the operator wants to know the angle "A" of inclination of a chamfered face 414 which is formed at the corner of the top 406, front 412 and side 416 faces of the object 400.

Accordingly, in accordance with step 302 of the process, the operator loads the object 400 onto the CMM's 14 platform 20. The operator then uses the joystick 6 to manually drive the probe's 26 stylus 28 into the object to measure a point. In this example, the operator decides to measure the top face 406 of the object first.

Accordingly, the operator controls the probe 26 to obtain a plurality of measurement points on the top face 406, looping around steps 304 to 308 until the geometrical form recognition component 44 recognises a geometrical form and displays a provisional representation of the top face, in accordance with the recognised geometrical form, on the screen 12. Ideally, and as would be natural to an operator, the operator obtains points that are spread apart as much as is reasonably practical over the face being measured. The screen is updated to show the position of each point within the model volume. Figure 6a illustrates an example of what the user can be presented with on the screen 12 at the stage of the geometrical form recognition component 44 recognising the geometrical form of the measured face. As shown, the user has collected four points on the top face 406. These points 530, 532, 534, 536 are displayed on the screen. Furthermore, the geometrical form recognition component 44 has determined that these four points define a planar geometrical form 538 since they all lie in a plane and were obtained in the same direction. Accordingly, as displayed in Figure 6a, a provisional planar representation 538 which contains all four measured points is displayed.
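The condition that the points "were obtained in the same direction" could be checked with a simple cone test on the recorded probe move directions, as sketched below; the angular tolerance is a hypothetical value chosen only for illustration.

```python
import math

def same_direction(dirs, max_angle_deg=15.0):
    """Were all probe approach directions within a small cone of the first?

    dirs: a list of (dx, dy, dz) move-direction vectors, one per measured
    point, as recorded when each measurement was taken.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    ref = dirs[0]
    cos_limit = math.cos(math.radians(max_angle_deg))
    # Every direction must lie within max_angle_deg of the reference.
    return all(dot(ref, d) / (norm(ref) * norm(d)) >= cos_limit
               for d in dirs)
```

Four coplanar points approached from, say, straight above would pass this test; a point approached sideways would fail it, hinting at a different face.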

If the operator considers that this is not representative of the face measured, then the user can reject the geometrical form and repeat steps 304 to 308 of the process, and optionally delete one or more of the measured points 530, 532, 534, 536. If the operator considers that this is representative of the face measured, then the user can accept the geometrical form, at which point the process proceeds to step 314. With reference to Figure 4, this involves feeding the accepted geometrical form (and in this embodiment the provisional planar representation of the face) into the model builder component 43. However, since there are no other representations already in the model, the representation is expanded according to the predefined rules mentioned above and the expanded representation is added to the model as the model's first representation.

The model is then displayed on the screen at step 358. As shown, although the expanded representation is added and stored in the model, the unexpanded version of the representation is displayed for ease of reference by a user. This is illustrated in Figure 6b.

The operator then controls the probe 26 via the joystick 6 so as to take another point measurement, and so the process returns back to step 304. In this case the operator controls the probe 26 to measure points on the front face 412 of the object. The process loops around steps 304 to 308 until sufficient points have been obtained for the feature to be recognised.

Figure 7a illustrates that after three points have been obtained, a planar geometrical form has been recognised, and a provisional planar representation 540 has been generated. The operator confirms/accepts that the face 412 has been correctly recognised as a plane and so the process proceeds to step 314, and in particular to the process shown in Figure 4. Accordingly, the confirmed/accepted geometrical form (and in particular, in this case, the provisional planar representation 540) is passed to the model builder component 43 which at step 352 intersects it with existing representations of geometrical forms. This involves, as schematically illustrated by Figure 7b, expanding the boundary of the provisional planar representation 540 of the front face 412 in all directions.

The process then determines at step 354 if the extended representation intersects any other representations, and if so it stitches them together along the intersection. This is schematically illustrated by Figure 7b in which the expanded planar representation 540 for the front face 412 and the expanded representation of the top planar face 538 are schematically shown. As shown, these representations meet/intersect along line 542. This line of intersection defines a shared/coterminous boundary of the two representations. In this particular embodiment of the invention the representations are stitched together along the line of intersection.
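Finding the line along which two planar representations meet (such as line 542) has a standard closed form when each plane is written as n·p = d. The sketch below is illustrative of that general construction, not of the particular implementation used in the embodiment.

```python
def plane_intersection(n1, d1, n2, d2):
    """Line where planes n1.p = d1 and n2.p = d2 meet.

    Returns (point_on_line, direction_vector), or None if the planes are
    (near-)parallel and so share no single line.
    """
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    axis = cross(n1, n2)            # direction of the intersection line
    denom = sum(c * c for c in axis)
    if denom < 1e-12:
        return None                 # parallel planes: no line to stitch along
    # Closed-form point lying on both planes.
    u, v = cross(n2, axis), cross(axis, n1)
    point = tuple((d1 * ui + d2 * vi) / denom for ui, vi in zip(u, v))
    return point, axis
```

For instance, the horizontal plane z = 0 and the vertical plane x = 2 intersect along a line through (2, 0, 0) running parallel to the y axis.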

At step 356, the material "M" and non-material "NM" surfaces of the representations are determined. In this case, in connection with the front face 412, the material face "M" is identified as the region below the line of intersection 542 and the non-material face "NM" is identified as the region above the line of intersection 542. In connection with the top planar face 538, the material face "M" is identified as the region behind the line of intersection 542 and the non-material face "NM" is identified as the region in front of the line of intersection 542. This can be determined, for example, based on the location and direction from which measurements were obtained (e.g. for the planar representation 540 of the front face, as well as for the planar representation 538 of the top face 406). The boundaries of the planar representation 540 of the front face are then updated so as to conform to the determined line of intersection 542 and the determined material face "M", and the model stored in the model module 45, along with the model displayed on the screen 12, are updated. The displayed updated model is illustrated in Figure 7c. As can be seen, this has resulted in the representations of the top and front faces of the object having a shared/coterminous boundary. As illustrated, the boundaries of the representations of the top and front faces of the object are not shared/coterminous for their entirety, but rather are only shared/coterminous in part, i.e. in this embodiment along one side of their boundary.

This process can be repeated for numerous other faces of the object. Figures 8a to 8c illustrate the same process being used to update the displayed model after the upper cylindrical face 450 of the bore 404 has been measured. In summary, as shown in Figure 8a, three points around the surface of the upper cylindrical face 450 of the bore 404 have been measured by a probe from above the planar representation 538 of the top face 406. These points lie just below the planar representation 538 of the top face 406. Due to the arrangement of the measured points and the directions in which they were obtained, the feature recognition component 44 recognises these points as defining a bore (an internal cylindrical geometrical form) and so, as illustrated in Figure 8b, the circle defined by these three points is extended in opposite directions (i.e. up and down in the embodiment and orientation shown) by an amount defined by predetermined rules, to define a provisional bore representation 552. In this case, it is known that the measurement points were obtained by a probe which extended through the top planar representation 538, and so the circle is extended upwards until it meets the top planar representation 538, and also in the opposite direction by a small predetermined amount. The line 554 in Figure 8b represents the line along which the provisional bore representation 552 intersects the planar representation 538 of the top face 406, and along which the bore representation 552 is stitched to the planar representation 538.
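The "circle defined by these three points" has a standard closed-form construction: the circumscribed circle of the three points, whose centre and radius give the position and size of the provisional bore. A sketch of that general construction follows (using barycentric circumcentre weights); it illustrates the geometry rather than the actual code of the embodiment.

```python
import math

def circle_through(a, b, c):
    """Centre and radius of the circle through three 3D points.

    Uses the barycentric form of the circumcentre; the three points are
    assumed non-collinear (as three touch points around a bore would be).
    """
    def sq_dist(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

    # Squared side lengths opposite each point.
    la, lb, lc = sq_dist(b, c), sq_dist(c, a), sq_dist(a, b)
    # Barycentric weights of the circumcentre.
    wa = la * (lb + lc - la)
    wb = lb * (lc + la - lb)
    wc = lc * (la + lb - lc)
    w = wa + wb + wc
    centre = tuple((wa * pa + wb * pb + wc * pc) / w
                   for pa, pb, pc in zip(a, b, c))
    return centre, math.sqrt(sq_dist(centre, a))
```

The centre then lies on the provisional bore's axis, and extruding the circle along that axis (up to the planar face, and down by a small amount) yields the cylinder 552.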

Material "M" and non-material "NM" parts of the bore representation 552 and the planar representation 538 are then determined based on the location and direction from which measurements were obtained. For example, it is known that the measurements were obtained by the probe 26 passing through the planar representation 538 of the top face 406 and so it can be determined that the circular region (shown in Figure 8b as the circular hashed area) of the planar representation 538 of the top face 406 does not exist and hence is not material ("NM"). Figure 8c illustrates the updated model displayed on the screen 12. As can be seen, in this embodiment, the process has resulted in a new (additional) boundary being created for the representation 538 of the top face. In particular, the representation now comprises two boundaries, an outer (rectangular-shaped) boundary and an inner (circular-shaped) boundary. Furthermore, as can be seen, the bore representation comprises two boundaries; an upper and a lower boundary. In this situation, the entirety of the upper boundary of the bore representation 552 is shared with/coterminous with the entirety of the inner boundary of the representation of the top face.
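For a planar representation, the material/non-material decision can in the simplest case be reduced to comparing the face normal with the recorded probe approach direction: a touch probe arrives from free space, so material lies on the side the probe was moving towards. The following is a hedged sketch of that test only; the embodiment's actual determination may combine more information (point locations, neighbouring faces, etc.).

```python
def material_side(plane_normal, approach_direction):
    """Orient a planar face's normal to point out of the material.

    approach_direction is the probe move direction recorded when the
    point was taken; the material lies on the side the probe moved
    towards, so the outward (non-material-facing) normal opposes it.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    if dot(plane_normal, approach_direction) > 0:
        # Normal currently points into the material: flip it.
        return tuple(-c for c in plane_normal)
    return plane_normal
```

For the top face 406, probed downwards along -Z, this yields an outward normal along +Z, i.e. material below the plane.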

Figures 9 to 14 illustrate the various stages (in particular the screen displayed to the user at steps 310 and 358) during the measurements of the remaining faces defining the bore 404 and the boss 402. In particular, Figures 9a and 9b illustrate the measurement of the face defining the shelf 405 in the bore and the incorporation into the model of a planar representation of said shelf (which, as illustrated, comprised intersecting the recognised planar representation with the side walls of the previously determined cylinder and throwing away the non-material parts thereof). As illustrated by the example of Figures 9a and 9b, the geometrical form recognised by the measurement of the face which defines the shelf 405 in the bore was a plane, and the provisional representation determined and depicted was a rectangular plane. However, the result of performing steps 352 to 358 of the process of Figure 4 resulted in not only the boundary of the representation being contracted, but also in the shape of the boundary changing from rectangular to circular (but the geometrical form, a plane, remains the same).

Figure 10a illustrates the result of the measurement of the object's lower cylindrical face 407 and Figure 10b illustrates the incorporation into the model of the determined representation for that face. Figure 11a illustrates the result of the measurement of the main shaft face 403 of the boss 402 and Figure 11b illustrates the incorporation into the model of the determined representation for that face. Figure 12a illustrates the result of the measurement of the bottom face of the bore 404 and Figure 12b illustrates the incorporation into the model of the determined representation for that face. Figure 13a illustrates the result of the measurement of the frusto-conical face 409 of the boss 402 and Figure 13b illustrates the incorporation into the model of the determined representation for that face. Figure 14a illustrates the result of the measurement of the planar top face 411 of the boss 402 and Figure 14b illustrates the incorporation into the model of the determined representation for that face.

Figure 15a illustrates the result of the measurement of the corner chamfered face feature 414 of the object, and the provisional representation 560 determined. Figure 15b illustrates the displayed updated model after the provisional representation 560 has been intersected with the representations of the top 406, front 412 and side 416 faces of the object 400 in accordance with the invention. Figures 16 to 18 illustrate the various stages (in particular the screen displayed to the user at steps 310 and 358) during the measurements of the various faces of the overlapping first 408 and second 410 bores in the front face 412 of the object 400.

In the embodiment described, the operator has now measured all of the faces of the object required to obtain the desired dimensions and, as illustrated in Figure 19, a model 500 which visually resembles the object 400 has been obtained and displayed on the screen 12. In the embodiment described, the model 500 is manipulable such that the user can change the view point of the model 500. Also, as shown in Figure 20, the user can interact with the model so as to obtain the desired dimensions. These dimensions could be derived from the measured points obtained during the generation of the model 500. Alternatively, the object 400 could be re-measured via an automated process (e.g. by repeating the measurements obtained manually, and/or by generating a new inspection program based on the model 500 determined), and the dimensions could be derived from the measured points obtained by the automated measuring process. Optionally, the analysis module 42 analyses the model to determine the measurements.

However, as will be understood, the dimensions can be determined in other ways. For example, another module/program/process could analyse the model or other data (e.g. the obtained measurement data).

Figure 21 illustrates the model provided by a prior art software package called MODUS™ available from Renishaw® plc. The model is that obtained after measuring the same faces of the same object 400 as that measured in the above example of the invention. As can be seen, although individual representations for particular faces have been recognised and displayed, there is no consideration as to how the representations interact with each other. Accordingly, this results in a model which is not visually representative of the object 400 measured. It is therefore much more difficult for an operator to identify and select the faces/features to obtain the desired dimensions. This results in a much higher level of training and experience being required, and even with such training and experience the process of selecting the correct faces/features to obtain the desired dimensions is slow and error prone.

The above embodiments describe how the boundary/boundaries of a representation can be changed. As will be understood, other properties of a representation could be changed. For example, the orientation of a representation could be changed, e.g. so as to fit with other representations. For example, if a planar face is measured in two parts by an operator (e.g. by measuring two adjacent planes) and the planar representations as determined are nearly but not perfectly parallel/coplanar to each other, then a decision could be made to reorient at least one of the planes such that they are "snapped" into alignment with each other, such that they become parallel/coplanar to each other. As another example, if a bore is measured through a planar face, and the axis of the cylindrical form determined from that measurement is nearly, but not quite, perpendicular to the planar face, a decision could be made to reorient the cylindrical form and/or the planar face such that the representations do become perpendicular to each other. As with the above described limits on the extent by which a boundary can be repositioned (e.g. the limit on the extent by which the size of a representation can change), limits could be applied as to how much a representation can be reoriented.
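The "snapping" of a nearly-perpendicular bore axis onto a face normal, subject to a limit on how far the representation may be reoriented, could look like the following; the 2-degree tolerance is an invented illustrative value, and a real implementation might instead split the correction between the two representations.

```python
import math

def snap_perpendicular(axis, plane_normal, max_angle_deg=2.0):
    """Snap a bore axis onto the normal of the face it was measured through.

    If the measured axis is within max_angle_deg of the plane normal, the
    (sign-matched, normalised) plane normal is adopted as the axis;
    otherwise the measured axis is kept unchanged, implementing the limit
    on how much a representation may be reoriented.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    cos_ang = min(1.0, abs(dot(axis, plane_normal))
                  / (norm(axis) * norm(plane_normal)))
    if math.degrees(math.acos(cos_ang)) > max_angle_deg:
        return axis  # too far out of alignment: leave the measured axis alone
    sign = 1.0 if dot(axis, plane_normal) >= 0 else -1.0
    n = norm(plane_normal)
    return tuple(sign * c / n for c in plane_normal)
```

An axis tilted a fraction of a degree from the face normal is snapped onto it; an axis lying in the face's plane is left untouched.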

CLAIMS:
1. A computer implemented method for generating a model of an object, the method comprising generating from point position data concerning a plurality of faces of the object a representation for each face which comprises determining a particular geometrical form for the representation and the interrelationship of the representations.
2. A method as claimed in claim 1, comprising determining how to configure the representations so as to resemble the object.
3. A method as claimed in claim 1 or 2, comprising determining how to configure boundaries of adjacent representations such that they are at least in part coterminous.
4. A method as claimed in any of claims 1 to 3, comprising linking adjacent representations along shared boundaries.
5. A method as claimed in any preceding claim, comprising incorporating into a model of said object a representation of a particular face of the object, comprising using point position data associated with said particular face of the object to determine the geometrical form of the representation, and further comprising determining how said representation should relate to representations of other faces of the object.
6. A method as claimed in any preceding claim, in which determining the interrelationship of the representations comprises changing a property of at least one of said representations so as to fit with at least one other representation, optionally said changing being restricted to a change up to a predetermined limited extent.
7. A method as claimed in claim 6 in which said property comprises a boundary of the representation in at least one dimension.
8. A method as claimed in claim 6 or 7, in which changing said property comprises one, or any combination, of: i) changing the location of the boundary of the representation in at least one dimension, ii) changing the shape of the representation's boundary, iii) determining an additional boundary of a representation, iv) removing a boundary, and v) changing the orientation of a representation.
9. A method as claimed in any preceding claim, in which deciding on a geometrical form to represent a face of the object comprises selecting a geometrical form from one or more of a plurality of different types of predefined geometrical forms.
10. A method as claimed in any preceding claim, in which said point position data is obtained or is to be obtained by a contact measurement probe, optionally a touch-trigger measurement probe.
11. A method as claimed in any preceding claim, in which said point position data is obtained or is to be obtained via a measurement device comprising a coordinate positioning apparatus, optionally a coordinate measuring machine.
12. A method as claimed in any preceding claim, comprising displaying a visual representation of said model on a display device.
13. A method as claimed in any preceding claim, which is configured to output to an operator a visual representation of said geometrical form that has been decided to represent said particular face, and to receive an operator input accepting or rejecting said geometrical form.
14. A method as claimed in any preceding claim, in which said point position data is obtained manually.
15. A computer implemented method for generating a three-dimensional computer model of an object measured by a measurement machine, the method comprising:
receiving measurement points obtained by a contact measurement probe mounted on the measurement machine; and
generating a three-dimensional computer model of said object from said measurement points, the three-dimensional computer model comprising a plurality of geometrical representations which are arranged such that the three-dimensional computer model resembles the object.
16. An apparatus comprising at least one processor device configured to receive measurement points concerning a plurality of faces of an object obtained by a contact measurement probe mounted on a measurement machine and configured to generate from said measurement points a representation for each face which comprises determining a particular geometrical form for the representation and the interrelationship of the representations.
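The overall pipeline of claims 15 and 16 (receive measurement points per face, fit a geometrical form to each) can be sketched as follows. This is a hedged illustration only: the claims do not prescribe any particular fitting algorithm, and the names `fit_plane` and `build_model` are assumptions. It shows one common technique, a least-squares plane fit via SVD, applied to each face's measured points.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of measured points.
    Returns (centroid, unit normal); the normal is the direction of
    least variance of the points about their centroid."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The last right-singular vector of the centred points is the
    # direction of smallest spread, i.e. the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def build_model(measured_faces):
    """Hypothetical model builder: one planar representation per measured
    face, each determined from that face's own measurement points."""
    return [fit_plane(pts) for pts in measured_faces]
```

In practice each face would be fitted against a library of predefined geometrical forms (plane, cylinder, sphere, etc., per claim 9), with the representations' boundaries and orientations then adjusted to fit one another as described in the body of the application.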
PCT/GB2017/050043 2016-01-15 2017-01-10 Model building from inspection data WO2017121993A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
IN201611001548 2016-01-15
IN201611001548 2016-01-15
EP16164456.2 2016-04-08
EP16164456 2016-04-08

Publications (1)

Publication Number Publication Date
WO2017121993A1 2017-07-20

Family

ID=57799733

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2017/050043 WO2017121993A1 (en) 2016-01-15 2017-01-10 Model building from inspection data

Country Status (1)

Country Link
WO (1) WO2017121993A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1445977A (en) 1972-09-21 1976-08-11 Rolls Royce Probes
EP0254515A2 (en) 1986-07-25 1988-01-27 Renishaw plc Co-ordinate measuring
WO2009001298A2 (en) * 2007-06-26 2008-12-31 Densys Ltd. Supplemental scene reference surface devices for three-dimensional mapping
WO2009024783A1 (en) 2007-08-20 2009-02-26 Renishaw Plc Course of motion determination
US20100268355A1 (en) 2009-04-21 2010-10-21 Hon Hai Precision Industry Co., Ltd. Programming system for a coordinate measuring machine and method thereof
WO2013050729A1 (en) * 2011-10-06 2013-04-11 Renishaw Plc Measurement method
US20140240490A1 (en) * 2013-02-25 2014-08-28 Siemens Aktiengesellschaft Method for object marking using a three-dimensional surface inspection system using two-dimensional recordings and method



Legal Events

Code Title / Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 17700463; country of ref document: EP; kind code: A1)
NENP Non-entry into the national phase in: DE
122 Ep: PCT application non-entry in European phase (ref document number: 17700463; country of ref document: EP; kind code: A1)