US20110317907A1 - Optimized Distribution of Machine Vision Processing - Google Patents
Optimized Distribution of Machine Vision Processing
- Publication number
- US20110317907A1 (application US 13/159,126)
- Authority
- US
- United States
- Prior art keywords
- vision
- computer
- given
- tools
- tool parameters
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
- G06V10/95—Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
Abstract
A system and method is provided for remotely analyzing machine vision data. An indication of a choice of vision software is sent from a first computer to a remote second computer. The second computer, using the selected vision software, processes image data to provide a result that is transmitted from the second computer to a designated location.
Description
- 1. Copyright Notice
- This patent document contains information subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent, as it appears in the U.S. Patent and Trademark Office files or records, but otherwise reserves all copyright rights whatsoever.
- 2. Field of the Invention
- Aspects of the present invention generally relate to machine vision. Other aspects of the present invention relate to providing access to the results of machine vision operations.
- 3. Description of Background Information
- Machine vision technology is used around the world to automatically gauge part dimensions, guide robotic equipment, identify products, and inspect for defects in industries that include, but are not limited to, semiconductors, electronics, automotive parts, consumer products, electrical components, medical devices, and packaging.
FIG. 2, for example, illustrates a machine vision system 200 in which an image acquisition subsystem 202, positioned on a production line, captures and stores an image of the part to be inspected. A machine vision computer 204 then uses machine vision image and data analysis software to extract information from the image and to produce a result that can be used to make decisions about the image. Once the vision system has processed and analyzed the image, the result 206 is communicated to the operator, or, as illustrated, to other manufacturing equipment 208 on the factory floor. The result may be used to control manufacturing equipment or to determine the quality of a part, or it may be input to another image analysis operation.
- The machine vision software on computer 204 performs image analysis operations. Examples of image analysis operations include, but are not limited to, pattern location algorithms, gauging algorithms, character recognition algorithms, and image filters such as a Gaussian filter.
- Suppliers of machine vision software may protect their software from unauthorized duplication by using a hardware or software security method. In addition, or as a substitute for hardware or software security methods, users of machine vision may be forced to support awkward licensing schemes imposed by machine vision vendors in an effort to protect their software from unauthorized use or duplication. Depending on the type of security used, licensees may be required to pay for licenses not needed to support their usage.
- In addition, machine vision systems are difficult to maintain in the field. For instance, it may be challenging to update a machine vision system with a new version of software, or a new license, after it has been installed on a manufacturing production line. Moreover, customers wishing to test proprietary machine vision software on a particular part may be required to purchase and install software and the associated licenses, which is a significant deterrent to “quick-turn” software development.
- To be more specific, vendors may use one of the following three security methods to prevent improper copying or use of their software:
- Hardware security, for example, a security code may be programmed into a hardware device, such as an EEPROM on a frame grabber, or a hardware dongle which plugs into a
- The present invention is further described in the detailed description which follows, by reference to the noted drawings by way of non-limiting exemplary embodiments, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
- FIG. 1 is an illustrative diagram of an exemplary visual inspection system that can be employed by, for example, a machine vision system for inspecting at least one characteristic of a material, such as a continuous web product having a generally uniform structure, using a set of optical arrangements in an inspection process in accordance with the exemplary embodiment of the invention;
- FIG. 2 is a flow chart illustrating a method designed in accordance with a first exemplary embodiment of the invention in which a material, such as a continuous web product or material, is inspected for at least one characteristic of the material using a set of optical arrangements;
- FIG. 3 is a flow chart showing the method of FIG. 2 in greater detail;
- FIG. 4 is a flow chart illustrating a method designed in accordance with a second exemplary embodiment of the invention in which a material, such as a continuous web product or material, is inspected for at least one characteristic of the material using a set of optical arrangements;
- FIG. 5 is a flow chart showing the method of FIG. 4 in greater detail;
- FIG. 6 is a flow chart showing the method of FIG. 5 in greater detail;
- FIG. 7 is an illustrative diagram of the elements of the visual inspection system illustrated in FIG. 1; and
- FIG. 8 is an illustrative diagram showing a modification of the visual inspection system illustrated in FIG. 7.
- Referring to FIG. 1, in accordance with the present invention, there is provided a visual inspection system that can be employed by, for example, a machine vision system for inspecting at least one characteristic of a material using a set of optical arrangements in an inspection process such as commonly occurs in automated manufacturing. For example, the visual inspection system can be employed in a machine vision system 10 for a manufacturing line such as a manufacturing line 12, as shown in FIG. 1.
- Using the inspection system, sample-object 14, e.g., a continuous web product or material, can be inspected for compliance with metrics, such as the quantity and size of holes, pits, cracks, streaks, bugs, blisters, bumps, splashes, grooves, dirt, bubbles, ripples, wrinkles, dents, or any other optically visible defect that makes it less valuable for the user or customer. Such continuous web products may include paper, metals, plastic foils and non-woven materials whereby the visual quality of these products or product surfaces may be monitored.
- Image(s) of the sample-object 14 illuminated by a light source 13 are obtained by an imaging device or camera 16. As shown in
FIG. 1, the first computer sends to the second computer an indication of the one or more specific vision tools for the second computer to run on the acquired image data. The first computer also sends to the second computer parameters needed by the second computer to run the selected vision software. The second computer runs the selected vision software on the transferred image to obtain a result. The result is sent from the second computer to a designated location.
- The present invention is further described in the detailed description which follows, by reference to the noted drawings by way of non-limiting exemplary embodiments, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
- FIG. 1 illustrates an embodiment of the present invention;
- FIG. 2 illustrates a prior art machine vision system;
- FIGS. 3(a) and 3(b) illustrate an embodiment of the invention in which a first computer arrangement is shown;
- FIG. 4 illustrates an embodiment of the invention in which a second computer arrangement is shown;
- FIG. 5 is a flow diagram explaining the process of a first computer of the present invention; and
- FIG. 6 is a flow diagram explaining the process of a second computer of the present invention.
- FIG. 1 is a diagram illustrating an embodiment of a networked machine vision system 100. The illustrated system 100 includes a first computer 102, a second computer 104, and a communications link (an internetwork) 106. In the system of FIG. 1, image data is sent from first computer 102 to remotely located second computer 104 via communications link 106. Vision tool parameters may also be sent, if needed, from first computer 102 to remotely located second computer 104. Second computer 104 analyzes the image data, and any vision tool parameters that may be sent, using a chosen vision tool to produce a result. The result is sent from second computer 104 to first computer 102 via the communications link 106. The communications link located between the first and second computer may be an Internet connection or wide area network (WAN) connection.
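The exchange just described, in which image data and any vision tool parameters travel from the first computer to the second and a result comes back, can be sketched as a serialized request and reply. This is only an illustration under assumed names (`VisionRequest`, `encode_request`, `decode_request` are not terms from the patent):

```python
import json
from dataclasses import dataclass, field

# Hypothetical message sent from first computer 102 to second computer 104
# over communications link 106; field names are illustrative only.
@dataclass
class VisionRequest:
    image_data: bytes                                    # acquired image bytes
    vision_tool: str                                     # selected vision tool
    tool_parameters: dict = field(default_factory=dict)  # optional parameters

def encode_request(req: VisionRequest) -> bytes:
    """Serialize the request for transmission over the communications link."""
    payload = {
        "vision_tool": req.vision_tool,
        "tool_parameters": req.tool_parameters,
        "image_hex": req.image_data.hex(),  # hex-encode the binary image data
    }
    return json.dumps(payload).encode("utf-8")

def decode_request(raw: bytes) -> VisionRequest:
    """Reconstruct the request on the receiving (second) computer."""
    payload = json.loads(raw.decode("utf-8"))
    return VisionRequest(
        image_data=bytes.fromhex(payload["image_hex"]),
        vision_tool=payload["vision_tool"],
        tool_parameters=payload["tool_parameters"],
    )
```

In practice the link could be any transport (the text names Internet or WAN connections); the round trip above only illustrates that the tool selection and parameters accompany the image data.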
- FIG. 3(a) illustrates an arrangement of the first computer 102 in an embodiment of the present invention. First computer 102 includes a collector 302, a transmitter 304, and a receiver 306. FIG. 3(b) further shows collector 302 of FIG. 3(a). Collector 302 includes an image acquirer 308 (for example, a frame grabber), client data procurer 310, and selector 312. Collector 302 may use a protocol such as DCOM (Distributed Component Object Model) or CORBA (Common Object Request Broker Architecture) to provide a standard method for inter-process communication in a heterogeneous environment. DCOM and CORBA are examples of distributed object computing infrastructures which automate many common network programming tasks such as object registration, location, and activation; request demultiplexing; framing and error handling; parameter marshalling and demarshalling; and operation dispatching. Client data procurer 310 sends an acquisition command 316 to image acquirer 308 to acquire an image 314.
- The
image acquirer 308 may be stored on first computer 102 or remote from first computer 102, such that the image acquirer 308 used to acquire the image data may be local or networked. Software demonstration image data may be stored at second computer 104 or on a computer connected to second computer 104. The source of the image may include a camera, x-ray, scanning electron microscope, or focused ion beam. Image acquirer 308 responds to the acquisition command 316 by returning an image 314 to client data procurer 310. Next, selector 312 selects a vision operation tool to be used to analyze the image 314. Multiple vision tools may be indicated by selector 312. The selected vision tool may include such vision operations as guidance, inspection, gauging or identification. Most vision operations require additional parameters that precisely specify their behavior and correspond to the selected vision tool; however, these additional parameters may not be needed in every case. The corresponding parameters for the selected vision tools may include, for example: in a guidance operation, a model pattern and constraints on the alignment operation such as minimum match quality and allowable scale and rotation change; or, in a gauging operation, the polarity of an edge transition, the expected angle of the edge, the type of scoring operation to perform, or the minimum edge intensity.
- Vision tool parameters that correspond to the selected vision tool may be entered manually at the
first computer 102, for example, by using a keyboard, mouse, or touchpad in a software demonstration scenario. The vision tool parameters may also be entered, for example in a manufacturing or production environment, using a keyboard, mouse, touchpad, or an application program in collector 302. The acquired image data 314, vision tool parameters (if any), and the selected vision tool are sent by transmitter 304 to second computer 104 to be analyzed using the selected vision tool. The data transmitted by transmitter 304 may also contain information such as client account information or a password. The image data 314 may be transmitted using a format such as JPEG or .bmp. The receiver 306 is used to receive an analyzed result of the selected vision tool from the second computer 104.
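For illustration only, the guidance and gauging parameters named above might be grouped into simple structures like the following. The field names are assumptions made for this sketch, not terms from the patent:

```python
from dataclasses import dataclass

# Illustrative parameter sets for two of the vision operations described in
# the text; all field names are hypothetical.
@dataclass
class GuidanceParams:
    model_pattern: str        # identifier of the trained model pattern
    min_match_quality: float  # minimum acceptable match score, 0..1
    max_scale_change: float   # allowable scale change, e.g. 1.1 = +/-10%
    max_rotation_deg: float   # allowable rotation change in degrees

@dataclass
class GaugingParams:
    edge_polarity: str         # e.g. "dark_to_light" or "light_to_dark"
    expected_angle_deg: float  # expected angle of the edge
    scoring: str               # type of scoring operation to perform
    min_edge_intensity: int    # minimum edge contrast to accept

def validate_guidance(p: GuidanceParams) -> bool:
    """Basic sanity check before the parameters are sent to the second computer."""
    return 0.0 <= p.min_match_quality <= 1.0 and p.max_scale_change > 0
```

Grouping parameters this way matches the text's point that each selected vision tool carries its own corresponding parameter set, entered manually or by an application program.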
- FIG. 4 illustrates an arrangement of the second computer 104 in an embodiment of the present invention. Second computer 104 includes a receiver 402, a validator 408, an analyzer 404, and a transmitter 406. Receiver 402 receives the information and data transmitted from the first computer 102. The received information includes acquired image data, corresponding vision tool parameters (if any), the selected vision tool, and client identifiers. Once the information is received on receiver 402, the information and data are verified by validator 408. Validator 408 functions to ensure client account security, so that client identifiers such as account information and passwords are verified. The image data and vision tool parameters (if any) may be verified within the selected vision tool or by a validator independent from validator 408. If an incorrect client identifier entry is received, validator 408 will return an error message to first computer 102. Once the received client identifier information is verified by validator 408, analyzer 404 uses the selected vision tool to analyze the acquired image data and any corresponding vision tool parameters. If an invalid image or invalid vision tool parameters have been provided by first computer 102, analyzer 404 will return an error result to first computer 102. Otherwise, if the image data and vision tool parameters (if any) are valid, analyzer 404 will perform the selected vision operation to obtain an analyzed result. The analyzed result will be sent by transmitter 406 to a designated location. The designated location may include first computer 102 or a computer other than the first computer 102.
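The second computer's receive, validate, analyze, and transmit sequence described above can be sketched as a single handler. The account store and tool registry below are invented stand-ins for the validator 408 and analyzer 404, not part of the patent:

```python
# Hypothetical credential store and tool registry standing in for the
# validator and the machine vision engine on the second computer.
ACCOUNTS = {"client-1": "secret"}
TOOLS = {"gauging": lambda image, params: {"edge_found": bool(image)}}

def handle_request(account: str, password: str,
                   image: bytes, tool: str, params: dict) -> dict:
    # Validator step: verify client identifiers before any analysis runs;
    # an incorrect identifier yields an error message for the first computer.
    if ACCOUNTS.get(account) != password:
        return {"error": "invalid client identifier"}
    # An invalid image or unknown vision tool yields an error result
    # rather than an analyzed result.
    if not image or tool not in TOOLS:
        return {"error": "invalid image or vision tool"}
    # Analyzer step: run the selected vision tool on the image data and
    # parameters to produce the analyzed result for transmission.
    return {"result": TOOLS[tool](image, params)}
```

The ordering mirrors the text: credentials are checked first, image and parameter validity second, and only then is the vision operation performed.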
- FIG. 5 explains the operation of the first computer in an embodiment of the invention. At P500, first computer 102 acquires image data. The image may be acquired from a location remote from first computer 102 or stored at a location on first computer 102.
- At P502, a vision operation tool is selected to analyze the acquired image. The selected vision operation tool is remotely located from the
first computer 102: in a different part of the same building (site), at a different site, or in another part of the country (or world). Multiple vision operation tools may be selected to conduct various data analyses.
- At P503, vision tool parameters, if any, are entered. The vision tool parameters correspond to each vision tool selected at P502.
- At P504, the acquired image data, selected vision tool(s), corresponding vision tool parameters (if any), and client account information are sent from first computer 102 to second computer 104 via a communications link 106.
- At P514, an analyzed result or error message is received from second computer 104. The analyzed result or error message is obtained from the processing of the acquired image data and any corresponding vision tool parameters using the selected vision operation tool.
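Steps P500 through P514 can be summarized as one client-side routine. The callables here are illustrative stand-ins for the collector, selector, transmitter, and receiver components; none of the names come from the patent:

```python
def first_computer_flow(acquire, select_tools, enter_params, send, receive):
    """Sketch of steps P500-P514 on the first computer; each callable
    stands in for a component of FIG. 3 (names are hypothetical)."""
    image = acquire()                             # P500: acquire image data
    tools = select_tools()                        # P502: select remote tool(s)
    params = {t: enter_params(t) for t in tools}  # P503: parameters per tool
    send(image, tools, params)                    # P504: transmit over the link
    return receive()                              # P514: analyzed result or error
```

A caller would plug in an actual frame-grabber read for `acquire` and network I/O for `send`/`receive`; the sketch only fixes the order of the numbered steps.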
- FIG. 6 explains the operation of the second computer in an embodiment of the invention. At P600, acquired image data, a selected vision tool, vision tool parameters (if any), and client account information are received from first computer 102.
- At P602, the client account information received at P600 is validated. The validation maintains client account security by verifying that a correct client identifier and password have been entered. If any information fails to be validated, an error message is sent to the
first computer 102.
- At P604, the image data, vision tool, and any vision tool parameters are verified to ensure that the correct type, number, and values required for the selected vision tool have been entered.
- At P606, the acquired image data and any vision tool parameters are processed using the selected vision tool to produce an analyzed result.
- At P610, the analyzed result is sent from the second computer 104 to a designated location via a communications link. The designated location may include the first computer 102 or a location other than the first computer 102. The communications link 106, as discussed above, may include an Internet connection or a wide area network (WAN) connection.
- If more than one vision tool is selected, the second computer 104 may execute P604, P606 and P610 in turn for each selected vision tool.
- The present invention may be implemented by hardware or by a combination of hardware and software. The software may be recorded on a medium and executed by a computer. The medium may be, but is not limited to, for example, a floppy disk, a CD-ROM, a writable CD, a Read-Only Memory (ROM), or an Electrically Erasable Programmable Read-Only Memory (EEPROM).
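The per-tool loop described above, in which P604, P606 and P610 run in turn for each selected vision tool, might be sketched as follows. `run_tool` and `send_result` are hypothetical stand-ins for the analyzer and transmitter:

```python
def analyze_all(image: bytes, selected_tools: list, params_by_tool: dict,
                run_tool, send_result) -> None:
    """When more than one vision tool is selected, execute steps
    P604/P606/P610 in turn for each tool (callable names are illustrative)."""
    for tool in selected_tools:
        params = params_by_tool.get(tool, {})   # P604: look up/verify parameters
        result = run_tool(tool, image, params)  # P606: produce analyzed result
        send_result(tool, result)               # P610: send to the destination
```

Each tool's result is dispatched as soon as it is produced, matching the text's "in turn" sequencing rather than batching all results at the end.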
- While the invention has been described with reference to certain illustrated embodiments, the words that have been used herein are words of description, rather than words of limitation. Changes may be made, within the purview of the appended claims, without departing from the scope and spirit of the invention in its aspects. Although the invention has been described herein with reference to particular structures, acts, and materials, the invention is not limited to the particulars disclosed, but rather extends to all equivalent structures, acts, and materials, such as are within the scope of the appended claims.
Claims (18)
1-68. (canceled)
69. Apparatus comprising:
a computer including a receiver configured to receive, from a remote source via a communications network, image data including at least one given image to be analyzed by one or more given vision tools that have been selected, and corresponding vision tool parameters corresponding to the selected one or more given vision tools that have been selected to analyze the given image;
the computer being configured to, following receiving certain data by the receiver, cause a machine vision engine to analyze, with the selected one or more given vision tools, the given image to be analyzed in accordance with the corresponding vision tool parameters received by the receiver; and
wherein the machine vision engine includes one or more individually selectable vision tools having been configured to, when selected, carry out vision operations including pattern location.
70. The apparatus according to claim 69, wherein the vision operations include guidance.
71. The apparatus according to claim 69, wherein the vision operations include inspection.
72. The apparatus according to claim 69, wherein the vision operations include gauging.
73. The apparatus according to claim 69, wherein the vision operations include identification.
74. The apparatus according to claim 69, wherein the vision operations include a selectable guidance vision tool configured to, when selected, (i) obtain guidance operation vision tool parameters including a model pattern and alignment operation constraints, and (ii) carry out a corresponding guidance operation corresponding to the obtained guidance operation vision tool parameters.
75. The apparatus according to claim 74, wherein the obtained guidance operation vision tool parameters include parameters defining a minimum match quality and allowable scale and rotation change.
76. The apparatus according to claim 69, wherein the computer includes the machine vision engine.
77. The apparatus according to claim 70, wherein the communications network includes an internetwork.
78. The apparatus according to claim 77, wherein the internetwork includes the Internet.
79. The apparatus according to claim 69, wherein the selected one or more given vision tools have been selected at a location remote from the computer.
80. The apparatus according to claim 69, further comprising a validator configured to verify associated validation data to ensure client account security, the associated validation data having been associated with the received given image, the selected one or more given vision tools, and the corresponding vision tool parameters.
81. The apparatus according to claim 80, wherein the associated validation data has been received by the receiver.
82. A system comprising:
a computer in a manufacturing facility, the computer including a vision tool parameters input configured to receive, at the computer, corresponding vision tool parameters corresponding to at least one of the selected one or more given vision tools;
a transmitter configured to send, from the computer to a machine vision engine located remotely from the computer and via a communications network, (i) image data including at least one given image to be analyzed by the selected one or more given vision tools, and (ii) the corresponding vision tool parameters; and
wherein the machine vision engine includes vision tools including the selected one or more given vision tools, the selectable vision tools having been configured to, when selected, carry out vision operations including pattern location on the given image.
83. Apparatus comprising:
a computer including a receiver configured to receive, from a remote source via a communications network, image data including at least one given image to be analyzed by one or more given vision tools that have been selected, and corresponding vision tool parameters corresponding to the selected one or more given vision tools that have been selected to analyze the given image; and
the computer being configured to, following receiving certain data by the receiver, cause a machine vision engine to analyze, with the selected one or more given vision tools, the given image to be analyzed in accordance with the corresponding vision tool parameters received by the receiver;
wherein the machine vision engine includes one or more individually selectable vision tools having been configured to, when selected, carry out vision operations.
84. The apparatus according to claim 83, wherein the vision operations include guidance.
85. The apparatus according to claim 83, wherein the vision operations include inspection.
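Informally, the apparatus recited in claims 69-85 combines a receiver, an optional validator (claim 80), and a machine vision engine holding individually selectable vision tools. A hypothetical summary in code; class and method names are illustrative, not claim language:

```python
class MachineVisionEngine:
    """Holds individually selectable vision tools (e.g. pattern location,
    guidance, inspection, gauging, identification)."""
    def __init__(self, tools):
        self.tools = tools  # tool name -> callable(image, params)

    def analyze(self, image, selected, params_by_tool):
        """Analyze the given image with each selected tool, in accordance
        with the corresponding vision tool parameters."""
        return {name: self.tools[name](image, params_by_tool.get(name, {}))
                for name in selected}

class Apparatus:
    """Computer with a receiver; on receipt of image data, selected tools,
    and parameters, causes the engine to analyze the given image."""
    def __init__(self, engine, validator=None):
        self.engine = engine
        self.validator = validator  # optional validation step, per claim 80

    def receive(self, image, selected, params_by_tool, validation_data=None):
        if self.validator and not self.validator(validation_data):
            return {"error": "validation failed"}
        return self.engine.analyze(image, selected, params_by_tool)
```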
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/159,126 US20110317907A1 (en) | 2000-12-29 | 2011-06-13 | Optimized Distribution of Machine Vision Processing |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US75017300A | 2000-12-29 | 2000-12-29 | |
US09/842,948 US7962898B1 (en) | 2000-12-29 | 2001-04-27 | Optimized distribution of machine vision processing |
US13/159,126 US20110317907A1 (en) | 2000-12-29 | 2011-06-13 | Optimized Distribution of Machine Vision Processing |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US75017300A Continuation | 2000-12-29 | 2000-12-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110317907A1 true US20110317907A1 (en) | 2011-12-29 |
Family
ID=44122025
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/842,948 Expired - Fee Related US7962898B1 (en) | 2000-12-29 | 2001-04-27 | Optimized distribution of machine vision processing |
US13/159,126 Abandoned US20110317907A1 (en) | 2000-12-29 | 2011-06-13 | Optimized Distribution of Machine Vision Processing |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/842,948 Expired - Fee Related US7962898B1 (en) | 2000-12-29 | 2001-04-27 | Optimized distribution of machine vision processing |
Country Status (1)
Country | Link |
---|---|
US (2) | US7962898B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9172916B2 (en) | 2010-12-12 | 2015-10-27 | Event Capture Systems, Inc. | Web monitoring system |
US20150339812A1 (en) * | 2012-05-09 | 2015-11-26 | Sight Machine, Inc. | System and method of distributed processing for machine-vision analysis |
CN108344743A (en) * | 2018-02-02 | 2018-07-31 | 佛山职业技术学院 | One kind being based on machine vision drug blister package defect inspection method and system |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7962898B1 (en) * | 2000-12-29 | 2011-06-14 | Cognex Corporation | Optimized distribution of machine vision processing |
US8595689B2 (en) | 2008-12-24 | 2013-11-26 | Flir Systems Ab | Executable code in digital image files |
US9858165B2 (en) * | 2012-09-10 | 2018-01-02 | Kpit Cummins Infosystems, Ltd. | Method and apparatus for designing vision based software applications |
US10672046B2 (en) * | 2012-12-31 | 2020-06-02 | Baker Hughes, A Ge Company, Llc | Systems and methods for non-destructive testing online stores |
US11581713B2 (en) * | 2018-03-06 | 2023-02-14 | Duke Energy Corporation | Methods and apparatuses for robotic breaker racking |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7962898B1 (en) * | 2000-12-29 | 2011-06-14 | Cognex Corporation | Optimized distribution of machine vision processing |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4641356A (en) | 1984-08-24 | 1987-02-03 | Machine Vision International Corporation | Apparatus and method for implementing dilation and erosion transformations in grayscale image processing |
US4975972A (en) | 1988-10-18 | 1990-12-04 | At&T Bell Laboratories | Method and apparatus for surface inspection |
US4985846A (en) | 1989-05-11 | 1991-01-15 | Fallon Patrick J | Acoustical/optical bin picking system |
US5040228A (en) | 1989-08-28 | 1991-08-13 | At&T Bell Laboratories | Method and apparatus for automatically focusing an image-acquisition device |
US5081529A (en) * | 1990-12-18 | 1992-01-14 | Eastman Kodak Company | Color and tone scale calibration system for a printer using electronically-generated input images |
US5327265A (en) * | 1992-05-01 | 1994-07-05 | Mcdonald Bruce A | Modem accessable image database system for on-demand printing |
CA2093448C (en) * | 1992-07-17 | 1999-03-09 | Albert D. Edgar | Expert system for image enhancement |
US5481712A (en) * | 1993-04-06 | 1996-01-02 | Cognex Corporation | Method and apparatus for interactively generating a computer program for machine vision analysis of an object |
JPH07261279A (en) * | 1994-02-25 | 1995-10-13 | Eastman Kodak Co | Selection system and method of photograph picture |
US5694484A (en) * | 1995-05-15 | 1997-12-02 | Polaroid Corporation | System and method for automatically processing image data to provide images of optimal perceptual quality |
US5835627A (en) * | 1995-05-15 | 1998-11-10 | Higgins; Eric W. | System and method for automatically optimizing image quality and processing time |
US5768401A (en) | 1995-08-02 | 1998-06-16 | Lucent Technologies Inc. | Balanced focus system and method for achieving optimal focus of different areas of an object that are concurrently imaged |
JPH09130783A (en) * | 1995-10-31 | 1997-05-16 | Matsushita Electric Ind Co Ltd | Distributed video monitoring system |
US5673334A (en) | 1995-11-30 | 1997-09-30 | Cognex Corporation | Method and apparatus for inspection of characteristics on non-rigid packages |
US5821993A (en) | 1996-01-25 | 1998-10-13 | Medar, Inc. | Method and system for automatically calibrating a color camera in a machine vision system |
US5982362A (en) | 1996-05-30 | 1999-11-09 | Control Technology Corporation | Video interface architecture for programmable industrial control systems |
US5715051A (en) | 1996-10-21 | 1998-02-03 | Medar, Inc. | Method and system for detecting defects in optically transmissive coatings formed on optical media substrates |
JP4330665B2 (en) * | 1996-10-30 | 2009-09-16 | 株式会社リコー | Client server system and image processing apparatus |
US6017157A (en) * | 1996-12-24 | 2000-01-25 | Picturevision, Inc. | Method of processing digital images and distributing visual prints produced from the digital images |
IL119948A (en) * | 1996-12-31 | 2004-09-27 | News Datacom Ltd | Voice activated communication system and program guide |
US5867322A (en) * | 1997-08-12 | 1999-02-02 | Eastman Kodak Company | Remote approval of lenticular images |
US6971066B2 (en) | 1997-08-18 | 2005-11-29 | National Instruments Corporation | System and method for deploying a graphical program on an image acquisition device |
US6608638B1 (en) * | 2000-02-07 | 2003-08-19 | National Instruments Corporation | System and method for configuring a programmable hardware instrument to perform measurement functions utilizing estimation of the hardware implentation and management of hardware resources |
US6025854A (en) | 1997-12-31 | 2000-02-15 | Cognex Corporation | Method and apparatus for high speed image acquisition |
JP4026944B2 (en) * | 1998-08-06 | 2007-12-26 | キヤノン株式会社 | Video transmission device and control method thereof |
JP2000115619A (en) | 1998-09-30 | 2000-04-21 | Canon Inc | Camera control device, system and method and storage medium with camera control program stored therein |
US7092860B1 (en) * | 1999-02-03 | 2006-08-15 | Mitutoyo Corporation | Hardware simulation systems and methods for vision inspection systems |
US6578017B1 (en) * | 1999-02-26 | 2003-06-10 | Information Decision Technologies, Llc | Method to aid object detection in images by incorporating contextual information |
US6381357B1 (en) | 1999-02-26 | 2002-04-30 | Intel Corporation | Hi-speed deterministic approach in detecting defective pixels within an image sensor |
US6944584B1 (en) | 1999-04-16 | 2005-09-13 | Brooks Automation, Inc. | System and method for control and simulation |
US6298474B1 (en) * | 1999-04-30 | 2001-10-02 | Intergral Vision, Inc. | Method and system for interactively developing a graphical control-flow structure and associated application software for use in a machine vision system and computer-readable storage medium having a program for executing the method |
US6813621B1 (en) * | 1999-08-12 | 2004-11-02 | Hewlett-Packard Development Company, L.P. | Processing graphic images having various file formats |
US6798531B1 (en) * | 1999-10-12 | 2004-09-28 | Eastman Kodak Company | Printing and delivery of digital images and text via a central receiving agency |
US7302114B2 (en) * | 2000-01-18 | 2007-11-27 | Branders.Com, Inc. | Methods and apparatuses for generating composite images |
US6493677B1 (en) * | 2000-01-19 | 2002-12-10 | Jones Soda Co. | Method and apparatus for creating and ordering customized branded merchandise over a computer network |
US20010055069A1 (en) * | 2000-03-10 | 2001-12-27 | Hudson Edison T. | One camera system for component to substrate registration |
KR100388419B1 (en) * | 2000-05-10 | 2003-06-25 | 김석배 | Electronic commerce system using real images in internet |
US6915273B1 (en) * | 2000-05-23 | 2005-07-05 | Eastman Kodak Company | Method for providing customized photo products over a network using images captured from a digital camera |
US7029715B2 (en) * | 2000-05-25 | 2006-04-18 | Hdn Development Corporation | Methods and systems for automatically extruding and cutting dough-based products having pre-selected weights |
US6763515B1 (en) | 2000-06-05 | 2004-07-13 | National Instruments Corporation | System and method for automatically generating a graphical program to perform an image processing algorithm |
US6781724B1 (en) * | 2000-06-13 | 2004-08-24 | Eastman Kodak Company | Image processing and manipulation system |
CA2347181A1 (en) * | 2000-06-13 | 2001-12-13 | Eastman Kodak Company | Plurality of picture appearance choices from a color photographic recording material intended for scanning |
US6931633B1 (en) * | 2000-08-01 | 2005-08-16 | National Instruments Corporation | System and method of evaluating the performance of an image processing algorithm |
DE10040899A1 (en) | 2000-08-18 | 2002-02-28 | Gavitec Gmbh | Method and device for decoding optical codes |
AU2001292559A1 (en) * | 2000-08-24 | 2002-03-04 | Immersive Technologies Llc | Computerized image system |
US7869067B2 (en) * | 2000-10-20 | 2011-01-11 | Visioneer, Inc. | Combination scanner and image data reader system including image management and software |
US7200838B2 (en) | 2000-12-20 | 2007-04-03 | National Instruments Corporation | System and method for automatically generating a graphical program in response to a state diagram |
US6931602B1 (en) * | 2000-12-22 | 2005-08-16 | Cognex Corporation | Approach facilitating the selection of various machine vision functionality from among different platforms |
US7383536B1 (en) | 2000-12-29 | 2008-06-03 | Petry John | Remote machine vision application program development method |
US7627860B2 (en) | 2001-08-14 | 2009-12-01 | National Instruments Corporation | Graphically deployment of a program with automatic conversion of program type |
DE10148160A1 (en) | 2001-09-28 | 2003-04-24 | Siemens Ag | Method and device for providing data |
US7305114B2 (en) | 2001-12-26 | 2007-12-04 | Cognex Technology And Investment Corporation | Human/machine interface for a machine vision sensor and method for installing and operating the same |
- 2001-04-27: US application US09/842,948 filed; granted as US7962898B1; status: not active (Expired - Fee Related)
- 2011-06-13: US application US13/159,126 filed; published as US20110317907A1; status: not active (Abandoned)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7962898B1 (en) * | 2000-12-29 | 2011-06-14 | Cognex Corporation | Optimized distribution of machine vision processing |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9172916B2 (en) | 2010-12-12 | 2015-10-27 | Event Capture Systems, Inc. | Web monitoring system |
US20150339812A1 (en) * | 2012-05-09 | 2015-11-26 | Sight Machine, Inc. | System and method of distributed processing for machine-vision analysis |
US10134122B2 (en) * | 2012-05-09 | 2018-11-20 | Sight Machine, Inc. | System and method of distributed processing for machine-vision analysis |
CN108344743A (en) * | 2018-02-02 | 2018-07-31 | 佛山职业技术学院 | One kind being based on machine vision drug blister package defect inspection method and system |
Also Published As
Publication number | Publication date |
---|---|
US7962898B1 (en) | 2011-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110317907A1 (en) | Optimized Distribution of Machine Vision Processing | |
Munoz et al. | Mixed reality-based user interface for quality control inspection of car body surfaces | |
JP6882991B2 (en) | How to assemble an electric switch and an assembly aid that simplifies the assembly of such a switch | |
CN104977874B (en) | The enabled mobile device of industry | |
JP4235293B2 (en) | Computer-implemented device for coordinating paint-related process steps | |
EP0985992B1 (en) | Simultaneous manufacturing and product engineering integrated with knowledge networking | |
US8510476B2 (en) | Secure remote diagnostic customer support network | |
US10805335B2 (en) | Application security management system and edge server | |
CN110378273B (en) | Method and device for monitoring operation flow | |
CN113220537B (en) | Software monitoring method, device, equipment and readable storage medium | |
US20050182580A1 (en) | Measurement data collection apparatus | |
CN113888480A (en) | MES-based quality tracing method and system | |
CN109754148A (en) | Use the inspection workflow of Object identifying and other technologies | |
EP3264344A1 (en) | Mapping rule updating method, device and system | |
Winchell | Inspection and measurement in manufacturing: keys to process planning and improvement | |
US20220284699A1 (en) | System and method of object detection using ai deep learning models | |
DE102018008366B4 (en) | Process and system for gesture-based control of a test process at an industrial, partially automated visual inspection station of an industrial technical process or an industrial technical production plant | |
JP2009071230A (en) | System and method for analyzing defect distribution, and program | |
KR102000938B1 (en) | Method for Setting Inspection Criteria Automatically by Indicating Regions and Smart Learning Method for X-ray Inspection Using the Same | |
JP2006277475A (en) | Method for providing inspection result information and its providing system | |
JP7392821B2 (en) | Automatic testing method and device for control software and computer program | |
CN113743964B (en) | Product supervision system and method | |
KR101803383B1 (en) | Semiconductor Factory Automation Solution System based on SECS Communication Protocol | |
Nakajo | A method of identifying latent human errors in work systems | |
JP3998642B2 (en) | Printed solder inspection service method and printed solder inspection apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: COGNEX CORPORATION, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PETRY, JOHN;MARRION, JR., CYRIL C.;EAMES, ANDREW;SIGNING DATES FROM 20010310 TO 20010419;REEL/FRAME:026695/0035 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |