CN105204840B - Editing environment for machine vision system of part program - Google Patents

Editing environment for machine vision system of part program Download PDF

Info

Publication number
CN105204840B
CN105204840B CN201510334979.0A CN201510334979A CN105204840B
Authority
CN
China
Prior art keywords
instructions
mode
image acquisition
image analysis
operations
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510334979.0A
Other languages
Chinese (zh)
Other versions
CN105204840A (en)
Inventor
M. Delaney
B.E. Saylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitutoyo Corp
Original Assignee
Mitutoyo Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/307,458 external-priority patent/US9167215B2/en
Application filed by Mitutoyo Corp filed Critical Mitutoyo Corp
Publication of CN105204840A publication Critical patent/CN105204840A/en
Application granted granted Critical
Publication of CN105204840B publication Critical patent/CN105204840B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/401Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

In a machine vision system editing environment for a part program, a continuous stream of image acquisition operations is performed in a run mode. Previously, continuous-stream image acquisition operations were implemented by using a different programming representation and syntax in the part program for programming and grouping the image acquisition operations. A new, general syntax and representation is utilized in which such continuous image acquisition operations are recorded in the same manner as conventional operations, and execution of the part program proceeds in two stages. First, the portion of the part program that is to produce a continuous image acquisition stream is scanned for image acquisition operations, and the most efficient order for acquiring the images is determined. Then, while the image acquisition is performed, that portion of the part program is scanned again and the image analysis operations are executed.
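The two-stage run mode execution described in the abstract can be illustrated with a toy planner. This is a hypothetical sketch, not the patented implementation: the function name, the `(name, position)` tuple format, and the greedy nearest-neighbor heuristic standing in for the "most efficient order" are all assumptions made for illustration, since the document does not specify how that order is computed.

```python
# Hypothetical sketch of the first stage of run mode execution: scan the
# stream segment for image acquisition operations and choose an efficient
# acquisition order (here: a simple nearest-neighbor tour of stage positions).
import math

def plan_acquisition_order(acquisitions, start=(0.0, 0.0)):
    """Greedy nearest-neighbor ordering of (name, (x, y)) acquisitions."""
    remaining = list(acquisitions)
    order, here = [], start
    while remaining:
        # Pick the acquisition whose stage position is closest to the
        # current position, to minimize travel between exposures.
        nxt = min(remaining, key=lambda a: math.dist(here, a[1]))
        remaining.remove(nxt)
        order.append(nxt)
        here = nxt[1]
    return order

acquisitions = [("img1", (10.0, 0.0)), ("img2", (1.0, 0.0)), ("img3", (5.0, 0.0))]
print(plan_acquisition_order(acquisitions))
```

In the second stage, the segment would be scanned again and the image analysis operations executed against the images acquired in this planned order.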

Description

Editing environment for machine vision system of part program
Cross Reference to Related Applications
This application is a continuation-in-part of U.S. patent application No. 13/297,220, filed November 15, 2011, the disclosure of which is incorporated herein by reference in its entirety.
Technical Field
The present disclosure relates generally to machine vision inspection systems, and more particularly to an editing environment for part programs including continuous high-speed image acquisition in such systems.
Background
Precision machine vision inspection systems (or simply "vision systems") may be used to acquire precise dimensional measurements of inspected objects and to detect various other object characteristics. Such systems may include a computer, a camera and optics system, and a precision stage that is movable in multiple directions to allow the camera to scan the features of a workpiece being inspected. One exemplary prior art system is the QUICK VISION® series of PC-based vision systems and the QVPAK® software available from Mitutoyo America Corporation (MAC), located in Aurora, Illinois. The features and operation of the QUICK VISION® series of vision systems and the QVPAK® software are generally described, for example, in the QVPAK 3D CNC Vision Measuring Machine User's Guide, published January 2003, and the QVPAK 3D CNC Vision Measuring Machine Operation Guide, published September 1996, each of which is hereby incorporated by reference in its entirety. This type of system, as exemplified by the QV-302 Pro model, is able to use a microscope-type optical system to provide images of a workpiece at various magnifications, and to move the stage as necessary to traverse the workpiece surface beyond the limits of any single video image. A single video image typically encompasses only a portion of the workpiece being observed or inspected, given the desired magnification, measurement resolution, and physical size limitations of such systems.
Machine vision inspection systems generally utilize automated video inspection. U.S. Patent No. 6,542,180 teaches various aspects of such automated video inspection and is incorporated herein by reference in its entirety. As taught in the '180 patent, automated video inspection metrology instruments generally have a programming capability that allows an automatic inspection event sequence to be defined by the user for each particular workpiece configuration. This can be implemented by text-based programming, for example, or through a recording mode that progressively "learns" the inspection event sequence by storing a sequence of machine control instructions corresponding to a sequence of inspection operations performed by a user with the aid of a graphical user interface, or through a combination of both methods. Such a recording mode is often referred to as "learn mode" or "training mode." Once the inspection event sequence is defined in learn mode, the sequence can then be used to automatically acquire (and additionally analyze or inspect) images of a workpiece in "run mode."
Video tools (or simply "tools") and other graphical user interface features may be used manually to accomplish manual inspection and/or machine control operations (in "manual mode"). Their set-up parameters and operation can also be recorded during learn mode in order to create an automatic inspection program, or "part program." Video tools may include, for example, edge/boundary detection tools, autofocus tools, shape or pattern matching tools, dimension measuring tools, and the like. Other graphical user interface features may include dialog boxes related to data analysis, step-and-repeat loop programming, and the like. Such tools are routinely used in a variety of commercially available machine vision inspection systems, such as the QUICK VISION® series of vision systems and the associated QVPAK® software discussed above.
Machine control instructions including the specific inspection event sequence (i.e., how to acquire each image and how to analyze/inspect each acquired image) are generally stored as a "part program" or "workpiece program" that is specific to the particular workpiece configuration. For example, a part program defines how to acquire each image, such as how to position the camera relative to the workpiece, at what lighting level, at what magnification level, and so on. Furthermore, the part program defines how to analyze/inspect an acquired image, for example, by using one or more video tools such as edge/boundary detection video tools. The ability to create part programs with instructions that perform a predetermined sequence of inspection operations provides several benefits, including enhanced inspection repeatability, as well as the ability to automatically execute the same part program repeatedly on one or more compatible machine vision inspection systems.
In general purpose machine vision inspection systems intended for rapid programming of a variety of workpieces, such as the QUICK VISION® series of PC-based vision systems mentioned above, image acquisition operations have traditionally been interleaved with image analysis operations and/or feature inspection operations performed on a recently acquired image (referred to herein as "interspersed" type operations). However, the demand for higher throughput from general purpose machine vision inspection systems is ever increasing. In one approach, this demand may be met by performing image acquisition using continuous relative motion between the camera and the workpiece stage (as opposed to intermittently stopping and starting the relative motion, which is required for interspersed type operations), thereby significantly improving inspection throughput. Such operation is referred to herein as continuous motion type operation. It is advantageous for such systems to include flash illumination to support image acquisition during continuous motion without image blur.
High-speed "in-line" vision inspection systems for high-speed production lines provide continuous motion type image acquisition. However, such in-line vision systems are generally dedicated to a single production line and may acquire the "same" image repeatedly, for example, for successive workpieces on a conveyor system. In such a case, the motion speed, flash illumination parameters, and so on are the same for each image. Furthermore, the workpiece configurations and/or image acquisition parameters and the like rarely change. Thus, the programming methods of such systems do not support rapid programming of an unlimited variety of workpieces, camera positions, image acquisition parameters, etc., by relatively unskilled users.
In contrast, experience has shown that a general purpose machine vision inspection system must assist relatively unskilled users in quickly programming an unlimited variety of workpieces, camera positions, image acquisition parameters, and the like. Previous programming methods for general purpose machine vision inspection systems have not made programming of continuous motion type operations sufficiently simple or fast. Furthermore, previous programming methods have not made programming of continuous motion type operations combined with interspersed type operations sufficiently simple or fast. Programming systems and methods that can be used alone or in combination to overcome these problems and disadvantages are desirable.
One exemplary prior art method that may overcome some of these problems and disadvantages is disclosed in U.S. Patent No. 7,590,276, which is hereby incorporated by reference in its entirety. As described in the '276 patent, a method of part programming is provided that allows a user to easily define, in learn mode operation, a plurality of image acquisition operations interspersed with related image analysis operations, in a natural and intuitively understandable relationship. Then, in the resulting part program, the image acquisition operations for at least some of the images may be automatically rearranged into a continuous motion image acquisition sequence that acquires and stores the images in a "non-interspersed" manner in order to increase the throughput of the machine vision inspection system.
However, as shown in the '276 patent, one drawback of some previous programming methods is that continuous-stream image acquisition operations have typically been implemented by analyzing the various operations entered by a user in learn mode and changing or "rearranging" their programming representation, syntax, and order in the part program, such that the image acquisition instructions are grouped together to acquire multiple images using continuous motion, and their respective image analysis instructions are rearranged to follow the image acquisition instructions, so that the image analysis operations do not interleave with or interrupt the high-speed image acquisition during continuous motion. Thus, when the part program instructions are recalled for editing or viewing, the image analysis instructions may be separated from the acquisition instructions for their respective images. This has proven confusing to users of such systems, because the associated image acquisition and analysis instructions are separated by intervening "unrelated" image acquisition and image processing instructions, which is not intuitive and can lead to inefficiency and errors when a user attempts to read or edit the "rearranged" part program instructions. In other words, the rearranged programming representation and syntax used to group image acquisition operations together in a part program makes programming and editing such part programs more difficult for users. Accordingly, there is a need for part programming syntax and for editing operations and features that overcome these and other deficiencies, to allow more efficient, intuitive, and flexible programming and editing of continuous image acquisition part programs for precision machine vision inspection systems.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In various embodiments, an editing environment is provided for a precision machine vision system in which continuous, or uninterrupted, streams of image acquisition operations are performed in a run mode. The precision machine vision inspection system includes an imaging portion, a stage for holding one or more workpieces in a field of view (FOV) of the imaging portion, a control portion, a display, and a user interface.
In various embodiments, the machine vision inspection system further includes a learn mode operable to receive user input to control operations of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program. The learn mode is also operable to edit the part program and to execute previously recorded part program instructions according to an edit mode of execution. In some embodiments, the learn mode may include user interface features such as a user-controllable stream mode instruction element and a user-editable part program representation of the part program instructions. The user-controllable stream mode instruction element may be used to designate a stream mode segment, where a stream mode segment comprises a part program segment designated for stream mode execution. In various embodiments, stream mode execution may include performing image acquisition operations in sequential order in a continuous motion sequence, wherein the stage and the imaging portion move continuously, or approximately continuously, relative to each other to acquire at least two images. The editable part program representation includes image acquisition instruction representations corresponding to image acquisition operations and image analysis instruction representations corresponding to image analysis operations, and in some embodiments may include a stream mode segment representation.
In various embodiments, the machine vision inspection system includes a run mode operable to execute a previously created part program. The run mode generally includes a stream mode of execution for executing at least some of the part program instructions. In some embodiments wherein stream mode segments are defined in the part program, the run mode may include a non-stream mode of execution for executing part program instructions not located in a defined stream mode segment, and a stream mode of execution for executing part program instructions located in a defined stream mode segment. In some embodiments, the non-stream mode of execution executes the image acquisition instructions and corresponding image analysis instructions of a first plurality of part program instructions not located in a stream mode segment, such that the image acquisition operations and corresponding image analysis operations are performed in a manner consistent with a first order.
In various embodiments, the learn mode is configured such that the editable part program representation may represent a first plurality of part program instructions, including image acquisition instructions and corresponding image analysis instructions, in a first order, wherein the first order corresponds to the order of the corresponding controlled operations performed to create the part program. Additionally, the learn mode may be further configured such that the edit mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions, such that the image acquisition operations and corresponding image analysis operations are performed in a manner consistent with the first order, regardless of whether the first plurality of part program image acquisition and corresponding image analysis instructions are included in a stream mode segment (if one is defined).
In various embodiments, the run mode is configured such that when the first plurality of part program image acquisition and corresponding image analysis instructions are included in a stream mode segment, the stream mode of execution executes those instructions according to a second order (e.g., by default, or in some embodiments simply because they are included in a defined stream mode segment). In one embodiment, the second order comprises executing the first plurality of part program image acquisition instructions such that their corresponding image acquisition operations are performed in sequential order, independent of the performance of the corresponding image analysis operations. Additionally, the second order may further comprise executing the first plurality of part program image analysis instructions such that their corresponding image analysis operations are performed after their corresponding images have been acquired in the continuous sequential image acquisition operations.
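The "second order" described above can be sketched in a few lines. This is a minimal illustration under invented assumptions (instructions modeled as `(operation, image)` tuples; the function name is hypothetical), not the patented implementation:

```python
# Hypothetical sketch: reorder a stream mode segment from the recorded
# "first order" (acquisition and analysis interspersed, as entered in learn
# mode) into the "second order" used at run time (all acquisitions first,
# in their recorded sequence, followed by the analyses of their images).

def second_order(instructions):
    acquisitions = [i for i in instructions if i[0] == "acquire"]
    analyses = [i for i in instructions if i[0] == "analyze"]
    return acquisitions + analyses

# First order, as recorded in learn mode.
first_order = [
    ("acquire", "image1"),
    ("analyze", "image1"),
    ("acquire", "image2"),
    ("analyze", "image2"),
]

print(second_order(first_order))
```

Note that the recorded part program itself is unchanged; only the run mode execution order differs, which is what keeps the editable representation intuitive.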
In some embodiments, wherein the machine vision inspection system includes flash illumination or another means of allowing fast image exposure without blur, the performance of the image acquisition operations in sequential order is completed during a truly continuous motion sequence in which the stage and the imaging portion move continuously relative to each other (e.g., the stage moves continuously relative to the imaging portion). In some configurations, the stage motion is operable to physically move the workpiece in a horizontal plane (e.g., an X-Y plane) without moving the imaging portion, while the imaging portion may be moved in a vertical direction (e.g., a Z direction) without moving the workpiece. In other configurations, the stage motion is operable to physically move the workpiece in one horizontal direction (e.g., an X direction) without moving the imaging portion, while the imaging portion may be moved in a different horizontal direction (e.g., a Y direction) and a vertical direction (e.g., a Z direction) without moving the workpiece. In various embodiments, image acquisition operations may be performed for at least two images in a continuous motion sequence. In other embodiments, wherein the machine vision inspection system lacks flash illumination, or the relevant motion must be slowed or temporarily stopped to limit image blur, the performance of the image acquisition operations in sequential order is completed during an approximately continuous motion sequence. The approximately continuous motion sequence may include motion operations that prevent image blur (e.g., slowing or temporarily stopping the otherwise continuous motion as dictated by the image acquisition requirements), but it does not include motion delays associated with performing image analysis or the like.
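The distinction between a truly continuous and an approximately continuous motion sequence can be illustrated with a toy motion planner. All names (`motion_plan`, `pause_for_exposure`, etc.) are invented for this sketch; it only shows that blur-prevention pauses are tied to exposures, never to image analysis:

```python
# Hypothetical sketch: build a motion plan for a sequence of acquisitions.
# With flash illumination the motion is truly continuous; without it, the
# plan slows/pauses only for each exposure, as image acquisition requires.

def motion_plan(acquisitions, has_flash):
    plan = []
    for name, position in acquisitions:
        plan.append(("move_to", position))
        if not has_flash:
            # Blur prevention: briefly pause the otherwise continuous
            # motion for this exposure only. No analysis-related delays.
            plan.append(("pause_for_exposure", name))
        plan.append(("expose", name))
    return plan

print(motion_plan([("a", (0, 0)), ("b", (1, 0))], has_flash=True))
print(motion_plan([("a", (0, 0)), ("b", (1, 0))], has_flash=False))
```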
In various embodiments, the performance of the image analysis operations after their respective images are acquired takes place at least partially during the sequential performance of the image acquisition operations.
In some embodiments, the learn mode is configured such that when a part program including a stream mode segment is recalled for editing, the editable part program representation is displayed in the first order, and the edit mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order. According to another aspect of this disclosure, the learn mode is configured such that a user may select a part program instruction representation included in a stream mode segment as the location at which to begin edit mode execution of the corresponding part program instruction, after which subsequent part program instructions are executed in a manner consistent with the first order.
In some embodiments, the stream mode segment includes a first set of instructions and a second set of instructions. The first set of instructions includes first image acquisition instructions and first image analysis instructions, the first image analysis instructions comprising video tool instructions of a first video tool; the second set of instructions includes second image acquisition instructions and second image analysis instructions, the second image analysis instructions comprising video tool instructions of a second video tool. In the edit mode of execution, the first image acquisition instructions and the first image analysis instructions, including the video tool instructions of the first video tool, are executed before execution of the second image acquisition instructions and the second image analysis instructions, including the video tool instructions of the second video tool, is begun. In the run mode, the stream mode segment is executed in the stream mode due to the stream mode instruction element, and in the stream mode of execution the first and second image acquisition instructions are executed in sequential order, independent of execution of the corresponding first and second image analysis operations, which include the video tool instructions of the first and second video tools. According to another aspect of this disclosure, the first and second video tools comprise edge detection video tools.
In some embodiments, the part program further includes a non-stream mode segment, and in the learn mode the non-stream mode segment can be distinguished from the stream mode segment due to the lack of a stream mode instruction element for the non-stream mode segment. The non-stream mode segment includes a third set of instructions and a fourth set of instructions: the third set includes third image acquisition instructions and third image analysis instructions, the third image analysis instructions comprising video tool instructions of a third video tool, and the fourth set includes fourth image acquisition instructions and fourth image analysis instructions, the fourth image analysis instructions comprising video tool instructions of a fourth video tool. In the edit mode of execution, the third image acquisition instructions and the third image analysis instructions, including the video tool instructions of the third video tool, are executed before execution of the fourth image acquisition instructions and the fourth image analysis instructions, including the video tool instructions of the fourth video tool, is begun. In the run mode, the non-stream mode segment is executed in the non-stream mode due to the lack of a stream mode instruction element for that segment, and in the run mode execution of the non-stream mode segment the third image acquisition instructions and third image analysis instructions are likewise executed before execution of the fourth image acquisition instructions and fourth image analysis instructions is begun, in a manner consistent with the first order.
In some embodiments, when a part program including a stream mode segment is recalled for editing, the editable part program representation is displayed in the first order, and the edit mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
In some embodiments, in the learn mode, the user may select any part program instruction in the stream mode segment as the location at which to begin execution of the part program instructions, after which subsequent part program instructions are executed in a manner consistent with the first order.
In some embodiments, the stream mode segment includes a first set of instructions and a second set of instructions. The first set of instructions includes first image acquisition instructions and first image analysis instructions, the first image analysis instructions comprising video tool instructions of a first video tool, and the second set of instructions includes second image acquisition instructions and second image analysis instructions, the second image analysis instructions comprising video tool instructions of a second video tool. In one embodiment, in the edit mode of execution, the first image acquisition instructions and the first image analysis instructions, including the video tool instructions of the first video tool, are executed before execution of the second image acquisition instructions and the second image analysis instructions, including the video tool instructions of the second video tool, is begun. Then, in the run mode, the stream mode segment is executed in the stream mode due to the stream mode instruction element. In the run mode execution of the stream mode segment, the first and second image acquisition instructions are executed in sequential order, independent of execution of the corresponding first and second image analysis operations, which include the video tool instructions of the first and second video tools. In one embodiment, the first and second video tools comprise edge detection video tools.
In some embodiments, the part program further comprises a non-stream mode segment, and in the learn mode the non-stream mode segment can be distinguished from the stream mode segment due to the lack of a stream mode instruction element for the non-stream mode segment. In one embodiment, the non-stream mode segment includes a third set of instructions and a fourth set of instructions. The third set of instructions includes third image acquisition instructions and third image analysis instructions comprising video tool instructions of a third video tool, and the fourth set of instructions includes fourth image acquisition instructions and fourth image analysis instructions comprising video tool instructions of a fourth video tool. In the edit mode of execution, the third image acquisition instructions and the third image analysis instructions, including the video tool instructions of the third video tool, are executed before execution of the fourth image acquisition instructions and the fourth image analysis instructions, including the video tool instructions of the fourth video tool, is begun. Then, in the run mode, the non-stream mode segment is executed in the non-stream mode due to the lack of a stream mode instruction element for that segment, and in the run mode execution of the non-stream mode segment the third image acquisition instructions and third image analysis instructions are likewise executed before execution of the fourth image acquisition instructions and fourth image analysis instructions is begun, consistent with the first order.
In some embodiments, in the learn mode, the first and second sets of instructions are executed in the first order, wherein at least some of the first and second image acquisition instructions are interspersed with the video tool instructions of the first and second video tools, and the instructions are displayed on the user interface in the first order. Then, in the run mode, in the stream mode, the part program instructions corresponding to the stream mode segment are processed to determine an image acquisition routine that includes the image acquisition instructions of the first and second sets of instructions but does not include the video tool instructions; the image acquisition routine is executed to acquire the images, and the video tool instructions are executed in a second pass through the stream mode segment.
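The mechanism above can be sketched as a small producer/consumer: one pass extracts and runs the image acquisition routine, while a second pass over the same segment runs the video tool (image analysis) instructions against the acquired images. This is an illustrative sketch only; the function names, the tuple encoding, and the assumption that analyses consume images in acquisition order are all invented for the example:

```python
# Hypothetical sketch of stream mode run mode execution: the acquisition
# routine (acquisition instructions only) runs on its own thread, and a
# second pass over the segment executes the video tool instructions, each
# waiting only for its own image rather than interrupting acquisition.
import queue
import threading

def run_stream_segment(segment, acquire, analyze):
    acquired = queue.Queue()

    def acquisition_routine():
        # First pass: image acquisition instructions only, in order.
        for op, arg in segment:
            if op == "acquire":
                acquired.put((arg, acquire(arg)))

    t = threading.Thread(target=acquisition_routine)
    t.start()

    # Second pass: video tool instructions, in their recorded order.
    # Each analysis blocks only until its own image is available, so it
    # can overlap later acquisitions.
    results = []
    for op, arg in segment:
        if op == "analyze":
            name, image = acquired.get()
            results.append((name, analyze(image)))
    t.join()
    return results
```

For example, with `segment = [("acquire", "A"), ("analyze", "A"), ("acquire", "B"), ("analyze", "B")]`, the acquisitions run back-to-back while the analyses catch up behind them.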
In some embodiments, in the run mode, execution of at least a portion of the video tool instructions in the second pass through the stream mode segment is completed in parallel with execution of the image acquisition routine.
In some embodiments, in the run mode, the plurality of image acquisition instructions are executed in succession during execution of the image acquisition routine, while any video tool instructions interspersed with them in the learn mode remain unexecuted, such that the instructions are executed in a second order different from the first order.
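The reordering described above can be sketched in Python; the instruction names and the dict-based representation are purely illustrative assumptions, not the part program's actual form:

```python
# Hypothetical sketch: reordering interleaved learn-mode instructions into
# the run-mode "second order" described above. Names are invented.
def stream_reorder(instructions):
    """Split an interleaved instruction list: image acquisition first
    (in recorded sequence), image analysis deferred until afterwards."""
    acquisition = [i for i in instructions if i["kind"] == "acquire"]
    analysis = [i for i in instructions if i["kind"] == "analyze"]
    return acquisition + analysis

# First order, as recorded in learn mode (acquisition interspersed with tools):
learned = [
    {"name": "acquire_C1", "kind": "acquire"},
    {"name": "circle_tool_C1", "kind": "analyze"},
    {"name": "acquire_C2", "kind": "acquire"},
    {"name": "circle_tool_C2", "kind": "analyze"},
]

second_order = stream_reorder(learned)
# All acquisitions now precede any analysis:
print([i["name"] for i in second_order])
# → ['acquire_C1', 'acquire_C2', 'circle_tool_C1', 'circle_tool_C2']
```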
In some embodiments, in the run mode, the platform and the imaging portion are continuously moved relative to each other to acquire images while the image acquisition routine is being executed.
In some embodiments, stream mode instruction representations specified at the beginning and end of the stream mode segment identify the stream mode segment of the part program.
In some embodiments, in the learn mode, the part program instructions in the stream mode segment displayed in the part program representation are made to have the same appearance as similar part program instructions located outside the stream mode segment, so that the user need not use a different programming representation or syntax when programming or editing operations located inside the stream mode segment as opposed to outside it.
While the foregoing summary emphasizes embodiments that include explicit stream mode segment markers or indications, such markers and/or the identification of separate operating modes or operation streams may be unintelligible or irrelevant to some users. For example, stream mode execution may be available on all machines owned by a user and/or may be the default or only execution mode for run mode execution on a machine, so there may be no particular need or reason to identify and distinguish one or more stream mode operations. In some embodiments, some or all stream-mode-compatible instructions and/or operations may be executed automatically in the stream mode during the run mode. In some implementations, this may be the default or only setting for run mode execution. In other implementations, the user may select stream mode or non-stream mode execution as a global "automatic" mode of execution during the run mode.
In any case, it should be understood that many or all of the foregoing features and benefits of the non-stream edit mode representation and execution remain desirable and beneficial to a user during the learn mode and/or editing operations, regardless of whether stream mode instructions are explicitly marked in a user interface or defined in a part program to indicate stream mode execution during the run mode, as described in more detail with reference to FIGS. 6A, 6B, and 7 below.
Drawings
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
FIG. 1 is a diagram of various typical components of a general purpose precision machine vision inspection system;
FIG. 2 is a block diagram of the control system portion and the vision component portion of a machine vision inspection system similar to the system shown in FIG. 1 and including functionality in accordance with the present invention;
FIG. 3 is a diagram of an editing interface including a representation of a part program, including a stream mode segment for a continuous stream of image acquisition operations in a run mode;
FIG. 4 is a diagram of a user interface including a portion of a workpiece on which the part program steps of FIG. 3 are performed;
FIGS. 5A and 5B are flow diagrams of one embodiment of a routine for providing an editing environment for a part program that includes a stream mode segment;
FIGS. 6A and 6B are flow diagrams of one embodiment of a routine for implementing an embodiment in which a part program may be executed during the run mode using a stream mode of operation, while a "non-stream mode" of edit mode representation and execution may be used during the learn mode; and
FIG. 7 is a timing diagram 700 comparing one aspect of a non-stream mode or edit mode of execution with a stream mode of execution.
Detailed Description
FIG. 1 is a block diagram of an exemplary machine vision inspection system 10 that may be used in accordance with the methods described herein. The machine vision inspection system 10 includes a vision measuring machine 12 operatively connected to exchange data and control signals with a control computer system 14. The controlling computer system 14 is also operatively connected to exchange data and control signals with a monitor or display 16, a printer 18, a joystick 22, a keyboard 24, and a mouse 26. The monitor or display 16 may display a user interface suitable for controlling and/or programming the operation of the machine vision inspection system 10.
The vision measuring machine 12 includes a movable workpiece platform 32 and an optical imaging system 34 that may include a zoom lens or interchangeable lenses. The zoom lens or interchangeable lenses may generally provide various magnifications for the images provided by the optical imaging system 34. The machine vision inspection system 10 is generally comparable to the QUICK VISION® series of vision systems and the QVPAK® software discussed above, and to similar state-of-the-art commercially available precision machine vision inspection systems. The machine vision inspection system 10 is also described in commonly assigned U.S. Patent Nos. 7,454,053, 7,324,682, 8,111,938, and 8,111,905, the entire contents of each of which are incorporated herein by reference.
Fig. 2 is a block diagram of a control system portion 120 and a vision component portion 200 of a machine vision inspection system 100 similar to the machine vision inspection system shown in fig. 1, and including functionality in accordance with the present disclosure. As described in more detail below, the control system portion 120 can be used to control the vision components portion 200. The vision assembly portion 200 includes an optical assembly portion 205, light sources 220, 230, and 240, and a workpiece stage 210 having a central transparent portion 212. The workpiece stage 210 is controllably movable along X and Y axes that lie in a plane that is substantially parallel to the stage surface on which the workpiece 20 may be located. The optical assembly portion 205 includes a camera system 260 and an interchangeable objective lens 250, and may include a lens disk assembly 280 having lenses 286 and 288. As an alternative to a lens tray assembly, a fixed or manual interchangeable magnification-varying lens, or a zoom lens arrangement, etc. may also be included. The optical assembly portion 205 can be moved in a controlled manner along a Z-axis that is substantially perpendicular to the X-axis and the Y-axis using a controllable motor 294.
A workpiece 20 to be imaged using the machine vision inspection system 100 or a pallet or fixture carrying a plurality of workpieces 20 is placed on the workpiece platform 210. The workpiece stage 210 may be controlled to move relative to the optical assembly portion 205 to move the interchangeable objective lens 250 between positions on the workpiece 20 and/or between multiple workpieces 20. One or more of the stage lamp 220, the in-line lamp 230, and the surface lamp 240 may emit source light 222, 232, or 242, respectively, to illuminate one or more workpieces 20. The source light is reflected or transmitted as workpiece light 255, and the workpiece light 255 passes through the interchangeable objective lens 250 and lens disk assembly 280 and is converged by the camera system 260. Images of one or more workpieces 20 captured by the camera system 260 are output to the control system portion 120 by signal line 262. The light sources 220, 230, and 240 may be connected to the control system portion 120 via signal lines or buses 221, 231, and 241, respectively. To change the image magnification, the control system portion 120 may rotate the lens disk assembly 280 along the shaft 284 to select a lens disk via the signal line or bus 281.
In various exemplary embodiments, the controllable motor 294 may be used to move the optical assembly portion 205 in a vertical Z-axis direction relative to the workpiece stage 210, and the controllable motor 294 drives an actuator, connecting cable, or the like to move the optical assembly portion 205 along the Z-axis to change the focus of the image of the workpiece 20 captured by the camera system 260. The term Z-axis as used herein refers to an axis that is intended to be used to focus an image obtained by the optical assembly portion 205. In use, the controllable motor 294 is connected to the input/output interface 130 via a signal line 296.
As shown in FIG. 2, in various exemplary embodiments, the control system portion 120 includes a controller 125, an input/output interface 130, a memory 140, a workpiece program generator and executor 170, and a power supply portion 190. Each of the components, as well as additional components described below, may be interconnected by one or more data/control buses and/or application programming interfaces, or by direct connections between different elements.
In various embodiments according to the present disclosure, the workpiece program generator and executor 170 includes an edit section 172 that provides or activates various operations and user interface features related to editing a part program, as described in more detail below. It is to be understood that the terms "workpiece program" and "part program" may be used interchangeably herein. In general, the editing section 172 includes an editing operation controller 174 that controls the operations for the editing functions, and an editing interface 176 that provides the user interface features for the editing functions. The workpiece program generator and executor 170 also includes a stream mode section 178 that provides various functions related to the present disclosure, as described in more detail below.
As shown in FIG. 2, the input/output interface 130 includes an imaging control interface 131, a motion control interface 132, an illumination control interface 133, and a lens control interface 134. The motion control interface 132 may include a position control element 132a and a velocity/acceleration control element 132b, although these elements may be combined and/or indistinguishable. The lighting control interface 133 includes lighting control elements 133a-133n that control the selection, power, on/off switching, and strobe pulse timing, etc. (as applicable) for each respective light source of the machine vision inspection system 100.
Memory 140 includes an image file storage portion 141, a workpiece program storage portion 142, which may include one or more part programs, etc., and a video tool portion 143. The video tool portion 143 includes a video tool portion 143a and other video tool portions (e.g., 143m) that determine the GUI, image processing operations, etc., for each respective video tool. Many known video tools are included in commercially available machine vision inspection systems, such as the QUICK VISION® series of vision systems and the associated QVPAK® software discussed above. The video tool portion 143 also includes a region of interest (ROI) generator 143x that supports automatic, semi-automatic, and/or manual operations to define various ROIs that may be operated upon by the various video tools included in the video tool portion 143.
In general, the memory portion 140 stores data that may be used to operate the vision system components portion 200 to capture or acquire an image of the workpiece 20 such that the acquired image of the workpiece 20 has desired image characteristics. The memory portion 140 may also store inspection result data, and may further store data usable to operate the machine vision inspection system 100 to manually or automatically perform various inspection and measurement operations on the acquired images (e.g., implemented in part as video tools) and to output the results via the input/output interface 130. The memory portion 140 may also contain data defining a user interface operable through the input/output interface 130.
The signal lines or buses 221, 231, and 241 of the platform light 220, the coaxial light 230, and the surface light 240, respectively, are connected to the input/output interface 130. Signal line 262 from camera system 260 and signal line 296 from controllable motor 294 are connected to input/output interface 130. In addition to carrying image data, signal line 262 may also carry signals originating from controller 125 that initiate image acquisition.
One or more display devices 136 (e.g., display 16 of FIG. 1) and one or more input devices 138 (e.g., joystick 22, keyboard 24, and mouse 26 of FIG. 1) may also be connected to the input/output interface 130. The display devices 136 and the input devices 138 may be used to display a user interface, which may include various user interface features usable to perform inspection operations, create and/or modify part programs, view images captured by the camera system 260, and/or directly control the vision system components portion 200.
In various exemplary embodiments, when a user uses the machine vision inspection system 100 to create a part program for the workpiece 20, the user generates the part program instructions either by explicitly encoding the instructions automatically, semi-automatically, or manually using a workpiece programming language, and/or by operating the machine vision inspection system 100 in a learn mode to generate the instructions by providing a desired image acquisition training sequence. For example, the training sequence may include positioning a workpiece feature in the field of view (FOV), setting light levels, focusing or autofocusing, acquiring an image, and providing an analysis training sequence applied to the image (e.g., using a video tool). The learn mode operates so that the sequence(s) are captured or recorded and converted into corresponding part program steps (e.g., instructions). When the part program is executed, these part program steps cause the machine vision inspection system to reproduce the trained image acquisition and analysis operations to automatically inspect one or more workpieces 20 matching the workpiece used when creating the part program.
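As a rough illustration of how learn-mode recording might capture controlled operations as part program steps, the following hypothetical sketch uses invented operation names and parameters; the actual system records steps in its own part program representation:

```python
# Hypothetical sketch of learn-mode recording: each controlled operation
# performed by the user is captured as a part program step, in order.
class LearnModeRecorder:
    def __init__(self):
        self.part_program = []

    def record(self, op_name, **params):
        # Append one step and return its index in the part program.
        self.part_program.append({"op": op_name, "params": params})
        return len(self.part_program) - 1

rec = LearnModeRecorder()
rec.record("set_lights", stage=40, coax=0)       # set light levels
rec.record("move_stage", x=12.5, y=3.0)          # position feature in FOV
rec.record("acquire_image")                      # acquire an image
rec.record("circle_tool", roi=(10, 10, 50, 50))  # analysis training step

# Steps are recorded in the order the operations were performed:
assert [s["op"] for s in rec.part_program] == [
    "set_lights", "move_stage", "acquire_image", "circle_tool"]
```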
Related editing functions are also described in "Machine Vision System Program Editing Environment Including Real Time Context Generation Features" (U.S. Pub. No. 2013/0123945), "Machine Vision System Program Editing Environment Including Synchronized User Interface Features" (U.S. Pub. No. 2013/0125044), and "System and Method Utilizing an Editing Initialization Block in a Part Program Editing Environment in a Machine Vision System" (U.S. Pub. No. 2013/0120567), each of which is incorporated herein by reference.
FIG. 3 is a diagram of an editing interface 300, the editing interface 300 including various measurement and/or operation selection fields, such as selection field 310, and an editable representation of a part program 320 corresponding to various part program instructions, including a non-stream mode segment representation 330 and a stream mode segment representation 340. The non-stream mode segment representation 330 includes a set of part program instruction representations 331 and 332, and the stream mode segment representation 340 includes a set of part program instruction representations 341-349. The stream mode segment 340 is a segment of the part program 320 designated for stream mode execution, as described in more detail below. The operation of the specific part program instructions 331-349 will be described in more detail below with reference to FIG. 4.
Fig. 4 is a diagram illustrating a user interface 400, the user interface 400 including a view window 410 having a workpiece portion 415. The user interface 400 also includes various measurement and/or operation selection fields, such as selection fields 420 and 440, a real-time X-Y-Z (position) coordinate window 430, a light control window 450, and a video tool parameters box 460. As described in more detail below, various features on the workpiece 415 may be determined according to the associated part program instructions of fig. 3.
The following description references the part program instruction representations 321-350 of FIG. 3 and the corresponding actions associated with the workpiece 415 of FIG. 4. As shown in FIG. 3, the part program 320 begins with instruction representations 321, 331, and 332, which respectively indicate the starting point, setting the lights, and moving the platform to the desired location. With respect to FIG. 4, the light settings may be observed from the illumination in the view window 410, and the movement of the platform may be observed in the view window 410 and shown in the real-time X-Y-Z (position) coordinate window 430. As described in more detail below, instruction representations 331 and 332 are part of the non-stream mode segment representation 330 shown in FIG. 3, which is followed by the stream mode segment representation 340. It is to be understood that, for purposes of illustration, the non-stream mode segment representation 330 includes only two instruction representations; a more detailed embodiment may include additional instruction representations, such as video tool instruction representations, etc., as described in more detail below.
Instruction representation 341 is a stream mode instruction element that specifies the beginning of the stream mode segment 340. The instruction representation 341 may be inserted by activating the stream mode portion 178, for example, through a menu selection under the program menu 360. Next, instruction representations 342, 343, 344, and 345 indicate setting the magnification, turning off path smoothing, setting the allowed motion, and setting the lights, respectively.
Next, instruction representation 346 indicates that a circle tool is to be opened to measure circle C1, as shown by the respective instruction representations 346A-346C. More specifically, instruction representation 346A represents the setup for the measurement (e.g., including movement of the platform to a specified position and acquisition of a corresponding image), while instruction representation 346B represents the use of the circle tool to determine the edge points of circle C1 in the acquired image. The function and operation of the circle tool, as well as other edge detection video tools, are known in the art and described in detail in the previously incorporated references. As shown in FIG. 4, the circle tool is shown in the view window 410 covering a circular feature (e.g., circle C1) on the workpiece 415. The edge points determined by the circle tool may then be used by instruction representation 346C to define circle C1.
Likewise, instruction representations 347 and 348 indicate that the circle tool is to be opened to measure circles C2 and C3, respectively, as shown by the corresponding instruction representations 347A-347C and 348A-348C. More specifically, instruction representations 347A and 348A represent the setup for the measurements (e.g., including movement of the platform to the specified positions and acquisition of the corresponding images), while instruction representations 347B and 348B represent the use of the circle tool to determine the edge points of circles C2 and C3 in the acquired images. With respect to FIG. 4, the platform is moved so that the view window 410 shows the movement from circle feature C1 to circle features C2 and C3, respectively, as their respective images are acquired. The edge points determined using the circle tool are then used by instruction representations 347C and 348C to define circles C2 and C3. Instruction representation 349 is a stream mode instruction element that specifies the end of the stream mode segment 340. Instruction representation 350 represents the end of the part program.
With respect to the original creation of the part program 320, the part program instructions are recorded in the learn mode according to user input (e.g., user input provided through a user interface to control operation of the machine vision inspection system). Thereafter, in the edit mode, the part program instruction representations 321-350 described above are provided in an editing interface, such as the editing interface 300 of FIG. 3, so that the corresponding part program instructions are represented in a simplified form (e.g., as compared to being written out in a programming language) for convenience and ease of use. As described in more detail below, while the sequence of part program instruction representations 341-349 in the stream mode segment 340 is represented in a first order (i.e., the order shown in FIG. 3) in the edit mode, in the run mode the corresponding part program instructions may be executed according to a second order in accordance with the present disclosure.
More specifically, in one embodiment, the run mode includes a stream mode of execution that is applied to identified stream mode segments. The stream mode is configured such that the part program instructions corresponding to the stream mode segment representation 340 are processed to identify the image acquisition operations included in the segment (e.g., a plurality of image acquisition operations). Once this processing is complete, acquisition of the plurality of images can begin. Further, the part program instructions corresponding to the stream mode segment representation 340 are processed to identify the image analysis operations corresponding to the acquired images, and the identified image analysis instructions (e.g., video tool operations) may be initiated once their corresponding images have been acquired, provided that this does not interrupt the sequence of image acquisition operations (e.g., the continuous motion for acquiring images may continue without depending on the image analysis operations).
As a specific example, the sequence of operations of the part program 320 is executed in the following manner. Instruction representation 321 for the start node, and instruction representations 331 and 332 for setting the lights and moving the platform (which are part of the non-stream mode segment representation 330), are executed in the order shown. As described above, once instruction representation 341 is reached, indicating the start of the stream mode segment representation 340, stream mode processing begins. In the stream mode, a first pass through the stream mode segment 340 determines a list of image acquisition operations, which is then executed, followed by a second pass through the stream mode segment 340 that begins performing the image analysis operations (e.g., video tool operations).
More specifically, during the first pass through the stream mode segment 340, in one embodiment, any operations required to acquire an image are added to the list for the image acquisition routine, while any operations not required for image acquisition are ignored. In a specific embodiment, instruction representations 342-345, which are part of setting up the machine vision inspection system to acquire images, are accordingly added to the list of operations that will form part of the image acquisition routine. In contrast, part program instruction representation 346, indicating that the circle measurement tool is to be opened, is not added to the list because it is not needed to acquire the corresponding image. However, instruction representation 346A, which represents the setup for the measurement and includes moving to the position and acquiring the image, is added to the list for the image acquisition routine. Instruction representations 346B and 346C, relating to the running of the circle tool and the definition of the circle, are not needed for image acquisition, so they are ignored rather than added to the list. In a similar manner, instruction representation 347 is ignored, instruction representation 347A is added to the list, instruction representations 347B, 347C, and 348 are ignored, instruction representation 348A is added to the list, and instruction representations 348B and 348C are ignored.
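The first-pass filtering described above can be sketched as follows; the `kind` classification and step identifiers are assumptions made for illustration only:

```python
# Hypothetical first pass over a stream mode segment: keep only the
# operations needed to acquire images; ignore analysis-only operations.
ACQUISITION_KINDS = {"setup", "move_and_acquire"}  # assumed classification

def build_acquisition_routine(segment):
    return [step for step in segment if step["kind"] in ACQUISITION_KINDS]

segment = [
    {"id": "342-345", "kind": "setup"},           # magnification, lights, etc.
    {"id": "346",  "kind": "open_tool"},          # ignored: not needed to acquire
    {"id": "346A", "kind": "move_and_acquire"},   # added to the routine
    {"id": "346B", "kind": "run_tool"},           # ignored
    {"id": "346C", "kind": "define_circle"},      # ignored
    {"id": "347A", "kind": "move_and_acquire"},   # added to the routine
    {"id": "348A", "kind": "move_and_acquire"},   # added to the routine
]

routine = build_acquisition_routine(segment)
assert [s["id"] for s in routine] == ["342-345", "346A", "347A", "348A"]
```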
After the first pass through the stream mode segment 340, once the image acquisition routine has been determined (e.g., based on instruction representations 342-345, 346A, 347A, and 348A), execution of the image acquisition routine may begin. While the image acquisition routine executes, a second pass is made through the stream mode segment 340, during which the image analysis operations (e.g., video tool operations) are performed. In the second pass through the stream mode segment 340, in one embodiment, instruction representations 342-345, which do not include any image analysis operations, are ignored. Instruction representation 346, which indicates that the circle measurement tool is to be opened, is executed.
Instruction representation 346A is a specific example that implicates both image acquisition and image analysis operations, as described in more detail below. Briefly, in the second pass, instruction representation 346A, which previously indicated an image acquisition operation, now indicates that the image is to be loaded, i.e., an image analysis operation, and instruction representation 346A is therefore also executed. More specifically, if the image acquisition routine started as described above has already acquired the image indicated by instruction representation 346A, the image is loaded. If the image has not yet been acquired, the processing of the second pass pauses until the image acquisition routine acquires the image. Thus, as illustrated by instruction representation 346A, certain instruction representations may represent operations performed during both the first and second passes through the stream mode segment 340.
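The blocking behavior of the second pass, which waits for the image acquisition routine to produce each image, might be sketched with a thread and a queue; the names and mechanism are assumptions for illustration, not taken from the actual system:

```python
# Hypothetical sketch: the acquisition routine runs concurrently and
# publishes images; the second (analysis) pass blocks until each image
# it needs has been acquired.
import queue
import threading

acquired = queue.Queue()

def acquisition_routine(image_ids):
    # Continuous-motion capture: images are produced in sequential order.
    for img in image_ids:
        acquired.put(img)

def load_image(expected_id):
    # Second-pass "load image" step: blocks until the acquisition routine
    # has produced the image. Images arrive in acquisition order.
    img = acquired.get()
    assert img == expected_id
    return img

t = threading.Thread(target=acquisition_routine, args=(["C1", "C2", "C3"],))
t.start()
# The second pass loads each image as it becomes available, waiting if needed.
loaded = [load_image(i) for i in ["C1", "C2", "C3"]]
t.join()
```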
After instruction representation 346A is executed, the second pass continues with instruction representations 346B and 346C, which are executed to run the circle tool and define circle C1 based on the edge points determined by the circle tool. Likewise, instruction representation 347 is executed; if the image acquisition routine has acquired the corresponding image of circle C2, instruction representation 347A for loading it is executed, and if the image has not yet been acquired, execution waits. After instruction representation 347A is executed, instruction representations 347B, 347C, and 348 are executed; if the corresponding image is available, instruction representation 348A is executed, otherwise execution waits until the image has been acquired. After instruction representation 348A is executed, instruction representations 348B and 348C are executed, and instruction representation 349 indicates the end of the stream mode segment 340 and the end of the stream mode. It is to be understood that, in one embodiment, any additional instruction representations following the stream mode segment 340 are outside of the stream mode and would be executed in the order shown, in the non-stream mode, similar to the instruction representations 331 and 332 of the non-stream mode segment 330.
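The two passes walked through above can be sketched end-to-end as follows, run serially for clarity even though a real system overlaps them in time; the step identifiers mirror the example, but the `kind` labels are invented for illustration:

```python
# Hypothetical end-to-end sketch of the two passes through a stream mode
# segment. A step of kind "acquire" plays a dual role: it acquires its
# image in the first pass and loads it in the second.
def execute_stream_segment(segment):
    order = []
    # First pass: execute only what is needed to acquire images.
    for step in segment:
        if step["kind"] in ("setup", "acquire"):
            order.append(step["id"])
    # Second pass: analysis operations; "acquire" steps reappear here as
    # the loading of their images, while pure setup steps are ignored.
    for step in segment:
        if step["kind"] == "acquire":
            order.append(step["id"] + ":load")
        elif step["kind"] != "setup":
            order.append(step["id"])
    return order

segment = [
    {"id": "342-345", "kind": "setup"},
    {"id": "346",  "kind": "open_tool"},
    {"id": "346A", "kind": "acquire"},
    {"id": "346B", "kind": "run_tool"},
    {"id": "346C", "kind": "define"},
]
# Acquisition happens first; the tool steps run in the second pass.
assert execute_stream_segment(segment) == [
    "342-345", "346A", "346", "346A:load", "346B", "346C"]
```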
It is to be understood that the operating and editing environment described above for stream mode segments provides advantages over certain prior implementations. More specifically, in certain previous implementations, one exhaustive list of commands was used for image acquisition, and a different exhaustive list of commands was used to analyze the acquired images. To achieve continuous-motion image acquisition, the image acquisition instructions were organized into separate lists at programming time and appeared in the part program in a different programming representation and syntax. This made editing and "debugging" of the corresponding part program more difficult. More specifically, when a user returned to the recorded operations of a part program that performed continuous-motion image acquisition using a different programming representation and syntax, determining how to edit or reprogram an instruction located inside the continuous-motion image acquisition routine, as opposed to outside it, was more troublesome. In addition, debugging of the part program was further complicated because there was no readily visible representation of the steps in the order they were originally performed, which made it more difficult to determine which instructions produced which results.
According to the present disclosure, as described above, a more satisfactory editing environment is provided in which the part program instruction representations can be presented in their original order. Further, instruction representations within a stream mode segment (e.g., stream mode segment 340) are made to have the same appearance as instruction representations outside the stream mode segment. As a specific example, if instruction representations 346, 346A, 346B, and 346C had been recorded outside of the stream mode segment 340 (e.g., within the non-stream mode segment 330), they would have the same appearance in the editing interface 300, except for not being "contained" under instruction representation 341. This contrasts with the prior approach described above, in which such instruction representations inside a stream mode section were provided in a different programming representation and syntax than those outside. As described above, the representation of the part program 320 shown in the editing interface 300 allows debugging of the part program to proceed in sequential order, even within the stream mode segment 340. More specifically, during debugging (e.g., in the edit mode), the user may choose to have the instruction representations in the stream mode segment 340 execute in a stepwise process in the first order (i.e., the order shown in the editing interface 300), which is the order in which the instruction representations were originally programmed, in order to simplify determining which instruction representations lead to which results.
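Stepwise first-order execution for debugging might be sketched as follows, with invented step names and a placeholder `execute` callable standing in for the actual machine operations:

```python
# Hypothetical sketch: in edit mode the stream-segment instructions are
# single-stepped in their recorded (first) order, so each result can be
# traced back to the instruction that produced it.
def debug_step_through(instructions, execute):
    trace = []
    for step in instructions:            # first order: as recorded
        result = execute(step)
        trace.append((step, result))     # pair each instruction with its result
    return trace

recorded = ["346A:acquire", "346B:run_circle_tool", "346C:define_circle"]
trace = debug_step_through(recorded, execute=lambda s: f"result of {s}")
# Each result is paired with the instruction that produced it, in order:
assert [t[0] for t in trace] == recorded
```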
FIGS. 5A and 5B are flow diagrams 500A and 500B of one embodiment of a routine 500 for providing an editing environment for a part program that includes a stream mode segment. As shown in FIG. 5A, in block 510, a learn mode is provided that is operable to receive user input to control operation of a machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program. The learn mode is also operable to edit the part program and to execute previously recorded part program instructions according to an edit mode of execution.
In block 520, user interface features are provided, including a user-controllable stream mode instruction element and a user-editable part program representation of part program instructions. The user-controllable stream mode instruction element may be used to designate a stream mode segment, wherein the stream mode segment comprises a part program segment designated for stream mode execution. The editable part program representation may include image acquisition instruction representations corresponding to image acquisition operations, image analysis instruction representations corresponding to image analysis operations, and a stream mode segment representation.
In block 530, a run mode is provided that is operable to execute a previously created part program, the run mode including a non-stream mode for executing part program instructions that are not located in a stream mode segment and a stream mode for executing part program instructions that are located in a stream mode segment. From block 530, the routine continues to point A, as described in more detail below with reference to FIG. 5B.
As shown in FIG. 5B, from point A, the routine continues to block 540. In block 540, the learning mode is configured such that the editable part program representation may represent a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, wherein the first order corresponds to the order of the respective controlled operations performed to create the part program. The learning mode may be further configured such that the edit mode of execution executes the part program instructions to perform the image acquisition operations and the respective image analysis operations in a manner consistent with the first order, regardless of whether the first plurality of part program instructions are included in a stream mode segment.
In block 550, the run mode is configured such that, when the first plurality of part program instructions are included in a stream mode segment, the stream mode executes the instructions according to a second order. The second order includes executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order, without depending on performing the respective image analysis operations. In one embodiment, the image acquisition operations may be performed in sequential order during a continuous motion sequence. The second order further includes executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired.
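As a minimal sketch of the reordering described in block 550, the second order can be pictured as a partition of the recorded instruction list. The `Instruction` model and its field names below are hypothetical, chosen only for illustration; nothing here is the disclosure's actual part program syntax.

```python
from dataclasses import dataclass

# Hypothetical instruction model for illustration; these names are not
# part of the part program representation described in this disclosure.
@dataclass
class Instruction:
    kind: str    # "acquire" or "analyze"
    image: str   # label of the image this instruction acquires or analyzes

def second_order(first_order):
    """Reorder instructions from the first order (acquire/analyze
    interleaved, as recorded) into the second order: all acquisitions
    first, in their original relative order, followed by the analyses."""
    acquires = [i for i in first_order if i.kind == "acquire"]
    analyzes = [i for i in first_order if i.kind == "analyze"]
    return acquires + analyzes

recorded = [Instruction("acquire", "A"), Instruction("analyze", "A"),
            Instruction("acquire", "B"), Instruction("analyze", "B"),
            Instruction("acquire", "C"), Instruction("analyze", "C")]
reordered = second_order(recorded)
```

Note that in the actual stream mode each analysis may begin as soon as its own image exists, in parallel with later acquisitions; the strict partition above is only the simplest serial rendering of the second order.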
Although the foregoing description has emphasized embodiments that include explicit stream mode segment marking or indication, stream mode segment marking and/or the identification of separate operating modes or operation streams may be unintelligible or irrelevant to some users. For example, stream mode execution may be available on all machines owned by a user and/or may be a default or the only execution mode for run mode execution on a machine, so there may be no need to specifically identify or invoke the stream mode operation(s). Thus, stream mode user interface representations such as the part program representation 341, and corresponding code markers, may be omitted in some embodiments; for example, in various embodiments similar to those described below with reference to FIGS. 6A, 6B, and 7. For example, in some embodiments, some or all of the stream mode compatible instructions and/or operations may be automatically executed in the stream mode during the run mode. In some implementations, this may be a default or the only setting for run mode execution. In other implementations, the user may select stream mode or non-stream mode execution as a global "auto" mode of execution during the run mode.
In any case, it should be understood that many or all of the foregoing features and benefits of the non-stream edit mode of representation and execution remain desirable and beneficial to a user during learning mode and/or editing operations, for the reasons outlined previously, regardless of whether stream mode instructions are explicitly marked in a user interface or defined in a part program to indicate stream mode execution during the run mode.
In a general overview of such embodiments, a precision machine vision inspection system may include an imaging section, a stage for carrying one or more workpieces in a field of view (FOV) of the imaging section, a control section, a display, and a user interface. The machine vision inspection system may further include a learning mode operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program; operable to edit a part program; and operable to execute previously recorded part program instructions according to an edit mode of execution. The learning mode may include user interface features including an editable part program representation of part program instructions, which includes image acquisition instruction representations corresponding to image acquisition operations and image analysis instruction representations corresponding to image analysis operations. The machine vision inspection system may also include a run mode operable to execute a previously created part program, the run mode including a stream mode for executing part program instructions. The learning mode may be configured such that the editable part program representation represents a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, the first order corresponding to the order of the respective controlled operations performed to create the part program, and such that the edit mode of execution executes the part program image acquisition instructions and the respective image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and the respective image analysis operations in a manner consistent with the first order.
The run mode may be configured such that the stream mode executes the first plurality of part program instructions according to a second order, the second order including executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order, without depending on performing the respective image analysis operations, and executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired.
FIGS. 6A and 6B are flow diagrams 600A and 600B of one embodiment of a routine for implementing embodiments consistent with those just outlined, in which a part program may generally be executed during the run mode using a stream mode of operation, while during the learning mode an easier-to-use "non-stream" edit mode of representation and execution may be used.
As shown in FIG. 6A, at block 610, a learning mode is provided that is operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program. The learning mode is further operable to edit the part program and to execute previously recorded part program instructions according to an edit mode of execution. At block 620, user interface features are provided that include an editable part program representation of part program instructions. The editable part program representation may include image acquisition instruction representations corresponding to image acquisition operations and image analysis instruction representations corresponding to image analysis operations. At block 630, a run mode is provided that is operable to execute a previously created part program, the run mode including a stream mode for executing part program instructions. From block 630, the routine continues to point A, as will be described in more detail below with reference to FIG. 6B.
As shown in FIG. 6B, from point A, the routine continues to block 640. At block 640, the learning mode is configured such that the editable part program representation represents a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, the first order corresponding to the order of the respective controlled operations performed to create the part program. The learning mode is further configured such that the edit mode of execution executes the part program instructions to perform the image acquisition operations and the corresponding image analysis operations in a manner consistent with the first order. At block 650, the run mode is configured such that the stream mode executes the first plurality of part program instructions according to a second order. The second order includes executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order, without depending on performing the respective image analysis operations. In one embodiment, the image acquisition operations may be performed in sequential order during a continuous motion sequence. The second order may also include executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired.
In an embodiment, executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order may comprise performing the image acquisition operations during a continuous image acquisition sequence in which the stage and the imaging section are moved substantially continuously relative to each other to acquire the images, except for image acquisition motion operations required to limit image blur at the imaging positions, and without motion delays due to image analysis operations. For example, with respect to the part program 320 shown in FIG. 3, during the operations corresponding to the instruction representation 346A, the setup for the measurement may include moving the stage to a specified position and acquiring the corresponding image when the motion has stopped, or slowed sufficiently to limit image blur. Similar motions may be used for the operations corresponding to the instruction representations 347A and 348A.
The foregoing description has emphasized embodiments in which the machine vision inspection system includes strobe (flash) illumination or other means to allow rapid image exposure without blur during stream mode execution of the part program. In such a system, the performance of the image acquisition operations in sequential order may be accomplished during a truly continuous motion sequence in which the stage and the imaging section move continuously relative to each other without stopping. However, in other embodiments in which the machine vision inspection system lacks strobe illumination, or requires slowing or temporarily stopping the associated motion to limit image blur, the performance of the image acquisition operations in sequential order during stream mode execution may be accomplished using a substantially continuous motion sequence, and the significant benefits outlined previously may still be obtained. In various embodiments, the substantially continuous motion sequence may then include the motion operations necessary to prevent image blur (e.g., slowing or temporarily stopping the otherwise continuous motion as dictated by the image acquisition requirements), but not the motion delays associated with performing image analysis and the like, so as to reduce or minimize the overall execution time of the part program during stream mode execution. This idea is clarified with reference to FIG. 7.
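The distinction between truly continuous and substantially continuous motion can be sketched as follows. The `acquisition_motion_plan` function, its parameters, and the step names are illustrative assumptions for this sketch, not an interface defined in this disclosure:

```python
# Illustrative sketch: build a motion plan for a continuous image
# acquisition sequence. Names and the settle time are assumptions.
def acquisition_motion_plan(positions, has_strobe_illumination, settle_s=0.05):
    """Return a list of motion-controller steps. With strobe (flash)
    illumination the stage never stops ("truly continuous" motion).
    Without it, a brief pause is inserted at each imaging position,
    solely to limit image blur ("substantially continuous" motion).
    In neither case is any delay inserted for image analysis."""
    plan = []
    for pos in positions:
        plan.append(("move_to", pos))
        if not has_strobe_illumination:
            plan.append(("pause", settle_s))  # blur-limiting pause only
        plan.append(("expose", pos))
    return plan

continuous = acquisition_motion_plan([(0, 0), (1, 0)], True)
substantially_continuous = acquisition_motion_plan([(0, 0), (1, 0)], False)
```

Either plan excludes analysis-related delays; the two differ only in whether a blur-limiting pause precedes each exposure.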
FIG. 7 is a timing diagram 700 depicting certain features of one exemplary embodiment of a non-stream or edit mode of execution 710 and one exemplary embodiment of a stream mode of execution 740. The time sequence associated with the non-stream or edit mode of execution 710 does not reflect a sequential image acquisition sequence. Instead, first, an image acquisition operation 720A is performed, which may include the following operations: moving the stage and the imaging section relative to each other to place a feature in the field of view of the imaging system; providing image illumination; and other operations required to acquire an image. After the image acquisition operation 720A, an image analysis operation 730A is performed on the acquired image, which may include, for example, edge detection operations based on the image acquired during the image acquisition operation 720A. After the image analysis operation 730A is complete, an image acquisition operation 720B is performed (e.g., similar to the operation 720A), and after its completion, an image analysis operation 730B is performed. This pattern continues through image acquisition operations 720C and 720D and image analysis operations 730C and 730D, and so on. Each image analysis operation thus interrupts and delays the subsequent image acquisition. Such an implementation is beneficial and understandable during the learning mode, when user actions are required to define the image acquisition parameters, and the image analysis results need to be evaluated to confirm that the desired image has produced the expected analysis results before the user proceeds to define or verify the next part program operation. In such a case, the image analysis time does not significantly delay the overall learning mode process, which is in any case paced by user input and evaluation.
In contrast, the time sequence associated with the stream mode of execution 740 reflects a sequential image acquisition sequence. First, an image acquisition operation 720A' is performed, which in this example is similar to the image acquisition operation 720A and includes the following operations: moving the stage and the imaging section relative to each other to place a feature in the field of view of the imaging system; providing image illumination; and other operations required to acquire an image. After the image acquisition operation 720A' is completed, the image acquisition operation 720B' is initiated and performed immediately, because the motion between image acquisition positions is typically the most time-consuming operation in an inspection sequence and governs its overall throughput. The subsequent image acquisition operations 720C', 720D', and so on, follow a similar pattern for the same reason, forming a sequential image acquisition sequence. In modern computers, the image analysis operations corresponding to any particular image may be initiated at any practical time after that image is acquired and/or stored in memory. In this example, an image analysis operation 730A' (e.g., similar or identical to the image analysis operation 730A), which may include, for example, operations for edge detection analysis of an edge included in the image of the operation 720A', is initiated immediately after the image acquisition operation 720A' is completed and is performed in parallel with the image acquisition operation 720B'. The subsequent image analysis operations 730B', 730C', 730D', and so on, follow a similar pattern. The result is an improvement in the time required for the stream mode of execution 740 to perform the same number of image acquisition and analysis operations, as compared to the non-stream or edit mode of execution 710.
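The two timing patterns of FIG. 7 can be imitated with a thread pool, as in the following sketch; the function names and the sleep-based stand-ins for machine operations are assumptions for illustration only:

```python
import concurrent.futures
import time

# Stand-ins for real machine operations; names and timings are illustrative.
def acquire_image(i):
    time.sleep(0.005)          # stage motion + exposure
    return f"image-{i}"

def analyze_image(img):
    time.sleep(0.005)          # e.g., an edge-detection video tool
    return f"edges-of-{img}"

def run_non_stream(n):
    """Non-stream/edit mode 710: each analysis delays the next acquisition."""
    return [analyze_image(acquire_image(i)) for i in range(n)]

def run_stream(n):
    """Stream mode 740: acquisitions stay sequential in the main thread;
    each analysis is handed to a worker thread as soon as its image
    exists, so analysis of image k overlaps acquisition of image k+1."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(analyze_image, acquire_image(i))
                   for i in range(n)]
        return [f.result() for f in futures]

results = run_stream(4)
```

Both modes produce the same analysis results in the same order; only the wall-clock interleaving differs, which is the point of the timing diagram.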
In embodiments in which the machine vision inspection system includes strobe illumination or other means to allow rapid image exposure without blur, the performance of the image acquisition operations in sequential order during the stream mode 740 may be accomplished during a truly continuous motion sequence in which the stage and the imaging section move continuously relative to each other. This is the ideal case and will typically yield the shortest execution time. In embodiments in which the machine vision inspection system lacks strobe illumination, or requires slowing or temporarily stopping the associated motion to limit image blur, the performance of the image acquisition operations in sequential order during the stream mode 740 may be accomplished during a "substantially continuous" motion sequence. A substantially continuous motion sequence may include motion operations that prevent image blur (e.g., slowing or temporarily stopping the otherwise continuous motion as dictated by the image acquisition requirements), but does not include motion delays associated with performing image analysis or the like. Stream mode execution in such embodiments is still beneficial. Experimental results indicate that one exemplary part program for a non-strobed or "traditional" machine vision inspection system, which acquires 16 images in a 4×4 array at 1 mm steps in the X-Y plane, executes approximately 40% faster using the stream mode of execution 740 than using the non-stream or edit mode of execution 710, even though the motion is temporarily stopped or sufficiently slowed during each image acquisition operation of the substantially continuous image acquisition sequence.
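A simple timing model makes the source of such savings concrete. The per-operation times below are hypothetical illustrations, not the experimental conditions behind the approximately 40% figure reported above:

```python
# Back-of-envelope timing model for the two execution modes; the 0.2 s
# per-operation times and the resulting percentage are hypothetical.
def non_stream_time(acquire_s, analyze_s, n_images):
    # Mode 710: acquisition and analysis strictly alternate.
    return n_images * (acquire_s + analyze_s)

def stream_time(acquire_s, analyze_s, n_images):
    # Mode 740: acquisitions run back to back; each analysis overlaps the
    # following acquisitions, so (with enough workers) only the final
    # image's analysis extends the total.
    return n_images * acquire_s + analyze_s

speedup = 1 - stream_time(0.2, 0.2, 16) / non_stream_time(0.2, 0.2, 16)
# roughly 0.47 for these inputs, the same order as the measured ~40%
```

The model shows why the benefit grows with the number of images: the analysis cost is paid once rather than once per image.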
Thus, in general, various embodiments outlined herein may include performing image acquisition operations during a continuous image acquisition sequence, the image acquisition operations including at least one of: (a) an operation in which the stage and the imaging section are continuously moved relative to each other to acquire an image, or (b) an operation in which the stage and the imaging section are substantially continuously moved relative to each other to acquire an image, except for an image acquisition motion operation required to limit image blurring at an imaging position, and there is no motion delay due to an image analysis operation.
While various preferred and exemplary embodiments of the present invention have been shown and described, it will be understood that various changes may be made therein without departing from the spirit and scope of the invention.

Claims (29)

1. A precision machine vision inspection system comprising an imaging section, a stage for carrying one or more workpieces in a field of view (FOV) of the imaging section, a control section, a display, and a user interface, wherein the machine vision inspection system further comprises:
a learning mode operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program; operable to edit a part program; and operable to execute previously recorded part program instructions according to an edit mode of execution, the learning mode including user interface features including:
a user-controllable stream mode instruction element operable to specify a stream mode segment, the stream mode segment comprising a part program segment designated for stream mode execution;
an editable part program representation of part program instructions, comprising image acquisition instruction representations corresponding to image acquisition operations, image analysis instruction representations corresponding to image analysis operations, and a stream mode segment representation; and
a run mode operable to execute a previously created part program, the run mode including a non-stream mode for executing part program instructions not located in a stream mode segment and a stream mode for executing part program instructions located in a stream mode segment, wherein,
the learning mode is configured such that:
the editable part program representation represents a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, the first order corresponding to an order of the respective controlled operations performed to create the part program;
the edit mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order, regardless of whether the first plurality of part program instructions are included in a stream mode segment; and
the run mode is configured such that:
when the first plurality of part program instructions are included in a stream mode segment, the stream mode executes the first plurality of part program instructions according to a second order, the second order comprising:
executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in sequential order during a continuous image acquisition sequence, without depending on performing the respective image analysis operations, and executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired, the continuous image acquisition sequence being one in which the stage and the imaging section are moved substantially continuously relative to each other to acquire the images, except for image acquisition motion operations required to limit image blurring at the imaging positions.
2. The system of claim 1, wherein the performance of the image acquisition operations in sequential order is accomplished during a continuous motion sequence in which the stage and the imaging section are continuously moved relative to each other to acquire the images.
3. The system of claim 1, wherein the image analysis operations are performed, after their respective images are acquired, at least in part during the sequential performance of the image acquisition operations.
4. The system of claim 1, wherein the non-stream mode executes the image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions that are not located in a stream mode segment to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
5. The system of claim 1, wherein the learning mode is configured such that, when a part program including a stream mode segment is recalled for editing, the editable part program representation is displayed in the first order, and the edit mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
6. The system of claim 1, wherein the learning mode is configured such that a user can select a part program instruction representation included in a stream mode segment as a location for initiating edit mode execution of the respective part program instructions.
7. The system of claim 6, wherein the subsequent part program instructions are executed in a manner consistent with the first order.
8. The system of claim 1, wherein the stream mode segment includes a first set of instructions and a second set of instructions, the first set of instructions including first image acquisition instructions and first image analysis instructions, the first image analysis instructions comprising video tool instructions of a first video tool, and the second set of instructions including second image acquisition instructions and second image analysis instructions, the second image analysis instructions comprising video tool instructions of a second video tool, wherein, during the edit mode of execution, the first image acquisition instructions and the first image analysis instructions comprising the video tool instructions of the first video tool are executed before execution of the second image acquisition instructions and the second image analysis instructions comprising the video tool instructions of the second video tool is started; and
during the run mode, the stream mode segment is executed in the stream mode due to the stream mode instruction element, and during run mode execution of the stream mode segment, the first and second image acquisition instructions are executed in sequential order independent of executing their respective first and second image analysis operations comprising the video tool instructions of the first and second video tools.
9. The system of claim 8, wherein the part program further comprises a non-stream mode segment, and during the learning mode, the non-stream mode segment is distinguished from the stream mode segment due to the absence of a stream mode instruction element for the non-stream mode segment, the non-stream mode segment comprising a third set of instructions and a fourth set of instructions, the third set of instructions comprising third image acquisition instructions and third image analysis instructions, the third image analysis instructions comprising video tool instructions of a third video tool, and the fourth set of instructions comprising fourth image acquisition instructions and fourth image analysis instructions, the fourth image analysis instructions comprising video tool instructions of a fourth video tool, wherein, during the edit mode of execution, the third image acquisition instructions and the third image analysis instructions comprising the video tool instructions of the third video tool are executed before execution of the fourth image acquisition instructions and the fourth image analysis instructions comprising the video tool instructions of the fourth video tool is started; and
during the run mode, the non-stream mode segment is executed in the non-stream mode due to the absence of a stream mode instruction element for the non-stream mode segment, and during run mode execution of the non-stream mode segment, the third and fourth image acquisition instructions are not executed in sequential order independent of executing their respective third and fourth image analysis operations comprising the video tool instructions of the third and fourth video tools.
10. The system of claim 8, wherein the first and second video tools comprise edge detection video tools.
11. The system of claim 8, wherein during the learning mode, the first and second sets of instructions are executed in the first order, wherein at least some of the first and second image acquisition instructions are interspersed with the video tool instructions of the first video tool and the second video tool, the instructions being displayed on the user interface in the first order; and
during the run mode, the part program instructions corresponding to the stream mode segment are processed in the stream mode to determine an image acquisition routine that includes the image acquisition instructions of the first and second sets of instructions but does not include the video tool instructions, the image acquisition routine is executed to acquire the images, and the video tool instructions are executed while the image acquisition routine is executed.
12. The system of claim 11, wherein, during the run mode, at least a portion of the execution of the video tool instructions during a second pass through the stream mode segment in the stream mode is completed in parallel with the execution of the image acquisition routine.
13. The system of claim 11, wherein, during the run mode, during execution of the image acquisition routine, a plurality of the image acquisition instructions are executed in succession, before any of the video tool instructions that were interspersed during the learning mode have been executed, such that the instructions are executed in the second order, which is different from the first order.
14. The system of claim 11, wherein, during the run mode, the stage and the imaging section are continuously moved relative to each other while the image acquisition routine is executed to acquire the images.
15. The system of claim 1, wherein a stream mode instruction representation identifies the stream mode segment of the part program at a beginning and an end of the stream mode segment.
16. The system of claim 1, wherein, during the learning mode, the part program instructions in the stream mode segment, as displayed in the part program representation, are made to have the same appearance as similar part program instructions located outside the stream mode segment, such that a user need not use a different programming representation or syntax when programming or editing operations inside the stream mode segment as opposed to outside it.
17. A method for operating a precision machine vision inspection system, the system comprising an imaging section, a stage for carrying one or more workpieces in a field of view (FOV) of the imaging section, a control section comprising a processor, a display, and a user interface, the method comprising:
providing a learning mode operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program; operable to edit a part program; and operable to execute previously recorded part program instructions according to an edit mode of execution, the learning mode including user interface features including:
a user-controllable stream mode instruction element operable to specify a stream mode segment, the stream mode segment comprising a part program segment designated for stream mode execution;
an editable part program representation of part program instructions, comprising image acquisition instruction representations corresponding to image acquisition operations, image analysis instruction representations corresponding to image analysis operations, and a stream mode segment representation; and
providing a run mode operable to execute a previously created part program, the run mode including a non-stream mode for executing part program instructions not located in a stream mode segment and a stream mode for executing part program instructions located in a stream mode segment, wherein,
the learning mode is configured such that:
the editable part program representation represents a first plurality of part program image acquisition instructions and respective image analysis instructions in a first order, the first order corresponding to an order of the respective controlled operations performed to create the part program;
the edit mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order, regardless of whether the first plurality of part program instructions are included in a stream mode segment;
the run mode is configured such that:
when the first plurality of part program instructions are included in a stream mode segment, the stream mode executes the first plurality of part program instructions according to a second order, the second order comprising:
executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order, without depending on performing the respective image analysis operations, and executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired, wherein executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in a sequential order comprises performing the image acquisition operations during a continuous image acquisition sequence that includes at least one of: (a) an operation in which the stage and the imaging section are continuously moved relative to each other to acquire images, or (b) an operation in which the stage and the imaging section are substantially continuously moved relative to each other to acquire images, except for image acquisition motion operations required to limit image blurring at the imaging positions.
18. The method of claim 17, wherein when a part program comprising a stream mode segment is recalled for editing, the editable part program representation is displayed in the first order, and the editing mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
19. A precision machine vision inspection system comprising an imaging section, a stage for carrying one or more workpieces in a field of view (FOV) of the imaging section, a control section, a display, and a user interface, wherein the machine vision inspection system further comprises:
a learning mode operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program, operable to edit a part program, and operable to execute previously recorded part program instructions according to an editing mode of execution, the learning mode including user interface features comprising:
an editable part program representation of part program instructions, comprising image acquisition instruction representations corresponding to image acquisition operations and image analysis instruction representations corresponding to image analysis operations; and
a run mode operable to execute a previously created part program, the run mode comprising a stream mode for executing part program instructions, wherein:
the learning mode is configured such that:
the editable part program representation represents a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, the first order corresponding to the order of the respective controlled operations performed to create the part program; and
the editing mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order; and
the run mode is configured such that:
the stream mode executes the first plurality of part program instructions according to a second order comprising:
executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in sequential order during a successive image acquisition sequence, without relying on performing the respective image analysis operations, and executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired, wherein during the successive image acquisition sequence the stage and the imaging portion are moved substantially continuously relative to each other to acquire images, except for image acquisition motion operations required to limit image blurring at the imaging positions.
20. The system of claim 19, wherein the image analysis operations are performed at least in part during the successive image acquisition sequence, after their respective images are acquired.
21. The system of claim 19, wherein the learning mode is configured such that when a part program executed using the stream mode during the run mode is recalled for editing, the editable part program representation is displayed in the first order, and the editing mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
22. The system of claim 19, wherein a part program executed using the stream mode during the run mode comprises a first set of instructions and a second set of instructions, the first set of instructions including first image acquisition instructions and first image analysis instructions, the first image analysis instructions including video tool instructions for a first video tool, and the second set of instructions including second image acquisition instructions and second image analysis instructions, the second image analysis instructions including video tool instructions for a second video tool, wherein in the editing mode of execution, the first image acquisition instructions and the first image analysis instructions, including the video tool instructions for the first video tool, are executed before execution of the second image acquisition instructions and the second image analysis instructions, including the video tool instructions for the second video tool, begins; and
during the run mode, the part program is executed in the stream mode, wherein the first and second image acquisition instructions are executed in sequential order without relying on performing the respective first and second image analysis operations, including executing the video tool instructions for the first and second video tools.
23. The system of claim 22, wherein during the learning mode, the first and second sets of instructions are executed in the first order, wherein at least some of the first and second image acquisition instructions are interspersed with the video tool instructions for the first video tool and the second video tool, the instructions being displayed on the user interface in the first order; and
during the run mode, the part program instructions are processed to determine an image acquisition sequence or routine that includes the image acquisition instructions of the first and second sets of instructions but does not include the video tool instructions, the image acquisition sequence or routine is executed to acquire images, and the video tool instructions are executed while the image acquisition sequence or routine is being executed.
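The partitioning described in claim 23 can be sketched as follows. This is an assumption-laden illustration, not the patent's implementation: the `kind`/`id` instruction fields, the `stream_execute` name, and the queue hand-off are all invented for the example, and it assumes one video tool instruction per acquired image, executed in acquisition order. The key point it demonstrates is that the acquisition routine contains no video tool instructions, so image analysis can overlap acquisition without interrupting stage motion.

```python
import queue
import threading

# Illustrative sketch (not the patent's actual implementation):
# interleaved part program instructions are partitioned into an image
# acquisition routine, run as one continuous sequence on a worker
# thread, while video tool (image analysis) instructions execute on
# the main thread as buffered images become available.
def stream_execute(instructions, acquire, run_video_tool):
    acq_steps = [ins for ins in instructions if ins["kind"] == "acquire"]
    tool_steps = [ins for ins in instructions if ins["kind"] == "video_tool"]
    images = queue.Queue()

    def acquisition_routine():
        # No video tool instructions here, so the stage motion that
        # accompanies acquisition is never interrupted by analysis.
        for ins in acq_steps:
            images.put((ins["id"], acquire(ins)))

    worker = threading.Thread(target=acquisition_routine)
    worker.start()
    results = {}
    for tool in tool_steps:  # analysis overlaps the acquisition sequence
        image_id, image = images.get()
        results[image_id] = run_video_tool(tool, image)
    worker.join()
    return results
```

A real system would also carry per-image metadata (stage position, lighting) from the acquisition routine to the matching video tool, but the queue-based decoupling shown here is the essential structural idea.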
24. The system of claim 19, wherein:
the learning mode includes user interface features that further include a user-controllable stream mode instruction element usable to define stream mode segments, which are part program segments designated for stream mode execution;
the editable part program representation of the part program instructions further comprises a stream mode segment representation;
the run mode includes, in addition to the stream mode, a non-stream mode for executing part program instructions not located in a stream mode segment, the stream mode being used to execute part program instructions located in a stream mode segment;
during the learning mode, the editing mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order, regardless of whether the first plurality of part program instructions is included in a stream mode segment; and
during the run mode, the stream mode executes the first plurality of part program instructions according to the second order only when the first plurality of part program instructions is included in a stream mode segment.
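The dispatch rule of claim 24 reduces to a simple conditional per segment. A minimal sketch, under assumed data shapes: the segment dictionary layout and the callback names (`run_first_order` for non-stream execution, `run_second_order` for stream execution) are illustrative only.

```python
# Hypothetical sketch of claim 24's run mode dispatch: part program
# segments marked as stream mode segments run in the second (streamed)
# order; all other segments run in the first (interleaved) order.
def execute_run_mode(segments, run_first_order, run_second_order):
    for segment in segments:
        if segment.get("stream_mode"):
            run_second_order(segment["instructions"])
        else:
            run_first_order(segment["instructions"])
```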
25. A method for operating a precision machine vision inspection system, the system comprising an imaging section, a stage for carrying one or more workpieces in a field of view (FOV) of the imaging section, a control section comprising a processor, a display, and a user interface, the method comprising:
providing a learning mode operable to receive user input to control operation of the machine vision inspection system and to record instructions corresponding to the controlled operations in order to create a part program, operable to edit a part program, and operable to execute previously recorded part program instructions according to an editing mode of execution, the learning mode including user interface features comprising:
an editable part program representation of part program instructions, comprising image acquisition instruction representations corresponding to image acquisition operations and image analysis instruction representations corresponding to image analysis operations; and
providing a run mode operable to execute a previously created part program, the run mode comprising a stream mode for executing part program instructions, wherein:
configuring the learning mode such that:
the editable part program representation represents a first plurality of part program instructions, including image acquisition instructions and respective image analysis instructions, in a first order, the first order corresponding to the order of the respective controlled operations performed to create the part program;
the editing mode of execution executes the part program image acquisition instructions and corresponding image analysis instructions of the first plurality of part program instructions to perform the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order; and
configuring the run mode such that:
the stream mode executes the first plurality of part program instructions according to a second order comprising:
executing the first plurality of part program image acquisition instructions to perform their respective image acquisition operations in sequential order during a successive image acquisition sequence, without relying on performing the respective image analysis operations, and executing the first plurality of part program image analysis instructions to perform their respective image analysis operations after their respective images are acquired, wherein during the successive image acquisition sequence the stage and the imaging portion are moved substantially continuously relative to each other to acquire images, except for image acquisition motion operations required to limit image blurring at the imaging positions.
26. The method of claim 25, wherein the image analysis operations are performed at least in part during the successive image acquisition sequence, after their respective images are acquired.
27. The method of claim 25, wherein the learning mode is configured such that when a part program executed using the stream mode during the run mode is recalled for editing, the editable part program representation is displayed in the first order, and the editing mode of execution performs the image acquisition operations and corresponding image analysis operations in a manner consistent with the first order.
28. The method of claim 25, wherein a part program executed using the stream mode during the run mode comprises a first set of instructions and a second set of instructions, the first set of instructions including first image acquisition instructions and first image analysis instructions, the first image analysis instructions including video tool instructions for a first video tool, and the second set of instructions including second image acquisition instructions and second image analysis instructions, the second image analysis instructions including video tool instructions for a second video tool, wherein in the editing mode of execution, the first image acquisition instructions and the first image analysis instructions, including the video tool instructions for the first video tool, are executed before execution of the second image acquisition instructions and the second image analysis instructions, including the video tool instructions for the second video tool, begins; and
during the run mode, the part program is executed in the stream mode, wherein the first and second image acquisition instructions are executed in sequential order without relying on performing the respective first and second image analysis operations, including executing the video tool instructions for the first and second video tools.
29. The method of claim 28, wherein during the learning mode, the first and second sets of instructions are executed in the first order, wherein at least some of the first and second image acquisition instructions are interspersed with the video tool instructions for the first video tool and the second video tool, the instructions being displayed on the user interface in the first order; and
during the run mode, the part program instructions are processed to determine an image acquisition sequence or routine that includes the image acquisition instructions of the first and second sets of instructions but does not include the video tool instructions, the image acquisition routine is executed to acquire images, and the video tool instructions are executed while the image acquisition routine is being executed.
CN201510334979.0A 2014-06-17 2015-06-16 Editing environment for machine vision system of part program Active CN105204840B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/307,458 US9167215B2 (en) 2011-11-15 2014-06-17 Machine vision system editing environment for a part program in which a continuous stream of image acquisition operations are performed during a run mode
US14/307,458 2014-06-17

Publications (2)

Publication Number Publication Date
CN105204840A CN105204840A (en) 2015-12-30
CN105204840B true CN105204840B (en) 2020-05-12

Family

ID=54706996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510334979.0A Active CN105204840B (en) 2014-06-17 2015-06-16 Editing environment for machine vision system of part program

Country Status (3)

Country Link
JP (1) JP6585934B2 (en)
CN (1) CN105204840B (en)
DE (1) DE102015208337A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6647094B2 (en) * 2016-03-16 2020-02-14 株式会社ミツトヨ Part program generator for surface texture measuring machine

Citations (3)

Publication number Priority date Publication date Assignee Title
GB8516462D0 (en) * 1984-07-17 1985-07-31 Electronic Automation Ltd Vision inspection method
CN103105193A (en) * 2011-11-15 2013-05-15 株式会社三丰 Precision machine vision inspection system and method for operating the same
CN103106077A (en) * 2011-11-15 2013-05-15 株式会社三丰 Machine vision system

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
US6542180B1 (en) 2000-01-07 2003-04-01 Mitutoyo Corporation Systems and methods for adjusting lighting of a part based on a plurality of selected regions of an image of the part
US7324682B2 (en) 2004-03-25 2008-01-29 Mitutoyo Corporation System and method for excluding extraneous features from inspection operations performed by a machine vision inspection system
US7454053B2 (en) 2004-10-29 2008-11-18 Mitutoyo Corporation System and method for automatically recovering video tools in a vision system
US7590276B2 (en) 2004-12-20 2009-09-15 Mitutoyo Corporation System and method for programming interrupting operations during moving image acquisition sequences in a vision system
US7394926B2 (en) * 2005-09-30 2008-07-01 Mitutoyo Corporation Magnified machine vision user interface
US8111938B2 (en) 2008-12-23 2012-02-07 Mitutoyo Corporation System and method for fast approximate focus
US8111905B2 (en) 2009-10-29 2012-02-07 Mitutoyo Corporation Autofocus video tool and method for precise dimensional inspection
US9013574B2 (en) 2011-11-15 2015-04-21 Mitutoyo Corporation Machine vision system program editing environment including synchronized user interface features
US9223306B2 (en) 2011-11-15 2015-12-29 Mitutoyo Corporation System and method utilizing an editing initialization block in a part program editing environment in a machine vision system
WO2013096929A1 (en) * 2011-12-23 2013-06-27 Mitutoyo Corporation Points from focus operations using multiple light settings in a machine vision system

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
GB8516462D0 (en) * 1984-07-17 1985-07-31 Electronic Automation Ltd Vision inspection method
CN103105193A (en) * 2011-11-15 2013-05-15 株式会社三丰 Precision machine vision inspection system and method for operating the same
CN103106077A (en) * 2011-11-15 2013-05-15 株式会社三丰 Machine vision system
JP2013104877A (en) * 2011-11-15 2013-05-30 Mitsutoyo Corp Machine vision system program edition environment including a real time context generation function
JP2013105500A (en) * 2011-11-15 2013-05-30 Mitsutoyo Corp Machine vision system edition environment for part program in which continuous stream of image acquiring operation is executed in execution mode

Also Published As

Publication number Publication date
CN105204840A (en) 2015-12-30
DE102015208337A1 (en) 2015-12-17
JP6585934B2 (en) 2019-10-02
JP2016004043A (en) 2016-01-12

Similar Documents

Publication Publication Date Title
US8902307B2 (en) Machine vision system editing environment for a part program in which a continuous stream of image acquisition operations are performed during a run mode
JP6071452B2 (en) System and method for using editing initialization block in part program editing environment of machine vision system
US7590276B2 (en) System and method for programming interrupting operations during moving image acquisition sequences in a vision system
US9013574B2 (en) Machine vision system program editing environment including synchronized user interface features
JP6122282B2 (en) Machine vision system program editing environment including real-time context generation function
JP4933785B2 (en) Video tool control method for image inspection system
JP6239232B2 (en) High performance edge focus tool
EP1653224B1 (en) Smear-limit based method for controlling vision systems for consistently accurate and high-speed inspection
JP6322456B2 (en) System and method for acquiring an image having an offset used to enhance edge resolution
US20110231787A1 (en) Gui for programming step and repeat operations in a machine vision inspection system
JP6418780B2 (en) System, GUI, and method for editing step and repeat motion commands in machine vision inspection system
JP7319903B2 (en) High-speed TAG lens-assisted 3D measurement and extended depth-of-field imaging
US9167215B2 (en) Machine vision system editing environment for a part program in which a continuous stream of image acquisition operations are performed during a run mode
CN105204840B (en) Editing environment for machine vision system of part program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant