KR101748768B1 - Image generating method and apparatus for machine vision test, method and apparatus for simulating machine vision test, and machine vision testing system - Google Patents
- Publication number
- KR101748768B1 (Application No. KR1020150144656A)
- Authority
- KR
- South Korea
- Prior art keywords
- machine vision
- image
- vision inspection
- virtual test
- setting
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/12—Circuits of general importance; Signal processing
- G01N2201/127—Calibration; base line adjustment; drift compensation
- G01N2201/12707—Pre-test of apparatus, e.g. dark test, sensor test
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Abstract
The present invention relates to a method and apparatus for generating virtual test images for machine vision inspection, a method and apparatus for simulating machine vision inspection, and a machine vision test system. According to a first aspect of the present invention, a method for generating a plurality of virtual test images for machine vision inspection comprises: obtaining a sample image from which the plurality of virtual test images are to be generated; setting at least one condition applicable to the sample image so as to reproduce the various environments in which machine vision inspection is performed; and generating the plurality of virtual test images using the set conditions. The machine vision inspection checks attributes of an object using one or more algorithms among pattern matching, point, line, and face fitting, edge, color, and location, and the various environments are environments that can alter the image used in the machine vision inspection.
Description
The present invention relates to a virtual test image generation method and apparatus for machine vision inspection, a machine vision inspection simulation method and apparatus, and a machine vision test system. More particularly, it relates to a method, apparatus, and system for generating a plurality of virtual test images and for simulating machine vision inspection using those virtual test images.
In recent years, industrial automation has accelerated as elements such as machine vision, robots, sensors, PLCs, and computers have been combined in the automotive, electronics, display, semiconductor, and food and beverage industries, among others.
Here, machine vision means inspecting or measuring objects through cameras (visual perception), CPUs, and software, automating work that was conventionally judged by the naked eye.
Machine vision is used mainly for industrial purposes rather than by end users. To maximize efficiency, machine vision must recognize smaller objects, recognize fast-moving objects, and read multiple objects simultaneously rather than a single object. The characteristics of a machine-vision-dedicated camera therefore differ greatly from those of a consumer camera.
That is, a consumer camera senses the visible-light region, whereas a machine-vision-dedicated camera mainly senses the infrared region. To sense infrared rays, machine vision cameras use various types of infrared illumination and employ pinhole lenses or telecentric lenses in addition to consumer-grade lenses. Machine vision cameras also use grayscale (black-and-white) sensors more often than color sensors.
The image captured by the machine vision camera is interpreted by vision software. Vision software can match patterns of objects in the captured image, fit points, lines, or faces, discriminate colors, measure objects (gauging), provide location information for robot guidance, read 1D or 2D bar codes displayed on an object, or perform optical character recognition (OCR). Through these processes an object can be inspected, measured, or read, so that industrial automation can be realized.
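Among these vision-software functions, pattern matching recurs throughout this document. As a rough illustration only (not the vision software described here), a minimal pattern matcher can slide a template over an image and score each position with normalized cross-correlation; images are plain 2-D lists of grayscale values, and all function names are invented for this sketch:

```python
import math

def ncc(a, b):
    # Normalized cross-correlation of two equal-size grayscale patches
    # (2-D lists of 0-255 values); a score of 1.0 is a perfect match.
    h, w = len(a), len(a[0])
    n = h * w
    ma = sum(map(sum, a)) / n
    mb = sum(map(sum, b)) / n
    num = sum((a[i][j] - ma) * (b[i][j] - mb) for i in range(h) for j in range(w))
    da = math.sqrt(sum((a[i][j] - ma) ** 2 for i in range(h) for j in range(w)))
    db = math.sqrt(sum((b[i][j] - mb) ** 2 for i in range(h) for j in range(w)))
    return num / (da * db) if da > 0 and db > 0 else 0.0

def match_template(image, template, threshold=0.9):
    # Slide the template over every position and keep the best score.
    th, tw = len(template), len(template[0])
    best_score, best_loc = -2.0, None
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            patch = [row[x:x + tw] for row in image[y:y + th]]
            score = ncc(patch, template)
            if score > best_score:
                best_score, best_loc = score, (x, y)
    return {"found": best_score >= threshold,
            "score": best_score, "location": best_loc}
```

On an image that contains the template verbatim, the matcher scores near 1.0 at the matching location; degraded field images lower the score, which is exactly the failure mode the background section goes on to describe.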
The problem is that machine vision is applied in a wide variety of industries, and the process environment for automating each product in each industry differs greatly. In addition, the various lights and lenses used with machine vision cameras produce different optical environments. In other words, because of these varied process and optical environments, it is difficult to predict the image a machine vision camera will capture. In practice, matching a machine vision camera to an automation environment built for a specific purpose takes more time than expected, and it frequently happens that the image captured by the camera cannot be read by the vision software.
It is therefore difficult to apply a particular machine vision camera to every industrial site. Before deploying a camera to an industrial site, there is a need to pre-test whether a sample image from that site can be inspected with the specific machine vision camera and vision software.
At this time, the sample image is likely to differ from the images the machine vision camera will actually capture. That is, because the conditions under which the sample image was photographed (shooting angle, shooting distance, shooting range, resolution, and so on) may differ from the conditions of the actual deployment, the sample image may be completely different from the images obtained by the camera in the field.
For example, suppose a customer (a company that needs automation) sends a sample image of the object to be inspected to a machine vision equipment vendor and asks whether pattern matching is possible. The vendor may find that pattern matching succeeds when vision software is run on the sample image. After a machine vision camera is installed on site, however, pattern matching on the images captured by that camera often fails.
That is, the sample image provided by the customer is likely one in which external variables of the actual automation process, such as high-speed motion or site illumination, are not fully reflected.
Therefore, a technique for solving the above-mentioned problem is needed.
Korean Patent Registration No. 10-1465684, a prior-art document, proposes a vision system in which a rack-type lighting control device adjusts the brightness on the inspection object so that the environment in which the vision system operates approaches an ideal one. However, because the illumination brightness is controlled only after the machine vision system has been applied to the actual environment, the effect of acquiring brighter or darker images in advance cannot be expected. As a result, the prior art makes it difficult to precisely test the sample delivered by the customer before the actual environment is configured.
Meanwhile, the background art described above is technical information that the inventor possessed for, or acquired in, the derivation of the present invention, and is not necessarily a known technology disclosed to the general public before the filing of the present application.
An embodiment of the present invention provides a virtual test image generation method and apparatus for machine vision inspection, a machine vision inspection simulation method and apparatus, and a machine vision test system.
According to a first aspect of the present invention, a method for generating a plurality of virtual test images for machine vision inspection comprises: obtaining a sample image from which the plurality of virtual test images are to be generated; setting at least one condition applicable to the sample image so as to reproduce the various environments in which machine vision inspection is performed; and generating the plurality of virtual test images using the set conditions. The machine vision inspection checks attributes of an object using one or more algorithms among pattern matching, point, line, and face fitting, edge, color, and location, and the various environments are environments that can alter the image used in the machine vision inspection.
According to a second aspect of the present invention, a method for simulating machine vision inspection using a plurality of virtual test images comprises: obtaining a sample image from which the plurality of virtual test images are to be generated; setting at least one condition applicable to the sample image so as to match the various environments in which machine vision inspection is performed; generating the plurality of virtual test images using the set conditions; triggering a machine vision inspection for each or some of the plurality of virtual test images; and, when the triggering signal is input, performing the machine vision inspection and outputting the result.
According to a third aspect of the present invention, an apparatus for generating a plurality of virtual test images for machine vision inspection comprises: an image obtaining unit that obtains a sample image from which the plurality of virtual test images are to be generated; a setting unit that sets at least one condition applicable to the sample image so as to reproduce the various environments in which machine vision inspection is performed; and an image generation unit that generates the plurality of virtual test images using the set conditions. The machine vision inspection checks attributes of an object using one or more algorithms among pattern matching, point, line, and face fitting, edge, color, and location, and the various environments are environments that can alter the image used in the machine vision inspection.
According to a fourth aspect of the present invention, an apparatus for simulating machine vision inspection using a plurality of virtual test images comprises: an operation request unit that obtains the plurality of virtual test images and triggers the machine vision inspection; and a vision inspection unit that, when the triggering signal is input, performs the machine vision inspection and outputs the result.
According to a fifth aspect of the present invention, a machine vision test system comprises: an image generation apparatus that generates a plurality of virtual test images for machine vision inspection; and a simulation apparatus that simulates machine vision inspection using the plurality of virtual test images.
According to any one of the above-described means for solving the problems, an embodiment of the present invention provides a virtual test image generation method and apparatus for machine vision inspection, a machine vision inspection simulation method and apparatus, and a machine vision test system.
Further, according to any one of the means for solving the problems, a large number of target images that may be distorted under various circumstances can be generated. This presents the variety of situations that machine vision device designers find difficult to predict, enabling the design of machine vision devices that operate more effectively.
Further, according to any one of the means for solving the problems, defects of a machine vision apparatus can be found in advance by simulating its operation. Because defects can be discovered and repaired before the machine vision device is installed in a factory or similar site, machine vision devices can be built and operated with minimal resources.
In addition, according to any one of the means for solving the problems, the generated images can be used in machine vision inspections such as area, pattern matching, edge (position, angle, width, pitch), stain, blob, intensity, color, OCR, and barcode inspection, as well as in the Auto-Teach process for creating a golden template.
The effects obtainable from the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the description below.
FIGS. 1A to 1C are block diagrams of a machine vision test system according to an embodiment of the present invention.
FIGS. 2A and 2B are block diagrams illustrating a machine vision apparatus in accordance with an embodiment of the present invention.
FIG. 3 is a block diagram showing an image generating apparatus and a simulation apparatus according to an embodiment of the present invention.
FIGS. 4A and 4B are diagrams for explaining a user interface provided by the image generating apparatus and the simulation apparatus according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating an image generating method and a simulation method according to an embodiment of the present invention.
FIGS. 6 to 11 are diagrams for explaining an image generating method and a simulation method according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry them out. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. To clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.
Throughout the specification, when a part is said to be "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is said to "comprise" an element, this means that it may include other elements as well, rather than excluding them, unless specifically stated otherwise.
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
FIGS. 1A through 1C are block diagrams for explaining a machine vision test system according to an embodiment of the present invention.
Referring to FIG. 1A, a machine vision test system according to an embodiment of the present invention is described.
The
At this time, the machine vision inspection can check the properties of the object using one or more algorithms among pattern matching, point/line/face fitting, edge, color, and location.
An application (i.e., vision software) can be installed on the
The
The
On the other hand, the
On the other hand, the
In this regard, each of the above-described
Further, each of the
FIGS. 1B and 1C each illustrate a machine vision test system implemented in a different configuration.
That is, FIG. 1B shows a test system implemented with a machine vision camera (M) and a PC (workstation).
On the other hand, FIG. 1C shows a test system implemented with a smart camera (S). Here, a smart camera is a stand-alone system that can operate by itself without the help of a PC, though it can also operate together with one.
FIGS. 2A and 2B are block diagrams illustrating a machine vision device according to an embodiment of the present invention.
2A, a
The
At this time, the
The
Meanwhile, the
For this, the
The
For this, the
The
Also, the preprocessed image can be transmitted to the
On the other hand, the
To this end, the
Also, the
Meanwhile, the
Such a
The
A user terminal (not shown) is located outside the
The user terminal (not shown) may additionally include a module for interacting with the user. The user may input a command to the
The user terminal (not shown) may be implemented as a computer, a portable terminal, a television, a wearable device, or the like. Here, the computer includes, for example, a desktop, a laptop, or a notebook equipped with a web browser. The portable terminal is, for example, a portable device with guaranteed mobility and may include all kinds of handheld wireless communication devices. The television may include an Internet Protocol Television (IPTV), an Internet TV, a terrestrial TV, a cable TV, and the like. The wearable device is an information processing device of a type that can be worn directly on the human body, for example a watch, glasses, an accessory, a garment, or shoes, and can be connected to a remote server or another terminal via a network.
The
Meanwhile, the
The
For this, the
FIG. 3 is a block diagram illustrating an image generating apparatus and a simulation apparatus according to an embodiment of the present invention.
The
The
The
The
To this end, the
On the other hand, the
At this time, a 'diverse environment' means an environment that can change the image used for machine vision inspection.
Such a condition can be determined, for example, by comparing the sample image with a captured image that could not be read normally even though machine vision inspection was attempted.
In order to set such a condition, the
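One simple way such a condition could be derived can be sketched as follows, under the assumption that both images are grayscale 2-D lists; the function names and the mean-brightness heuristic are illustrative, not taken from the patent:

```python
def mean_brightness(img):
    # img: 2-D list of grayscale values (0-255).
    return sum(map(sum, img)) / (len(img) * len(img[0]))

def derive_brightness_condition(sample, field):
    # Compare the customer's sample image with a field image that failed
    # inspection; a positive offset means the field comes out brighter.
    return mean_brightness(field) - mean_brightness(sample)

def apply_brightness(img, offset):
    # Apply the derived offset to the sample image, clamped to 8-bit range,
    # to obtain a virtual test image that mimics the field conditions.
    return [[max(0, min(255, int(round(p + offset)))) for p in row]
            for row in img]
```

For example, a field image that averages 40 gray levels darker than the sample yields an offset of -40, which can then be applied to the sample to reproduce the darker site illumination.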
According to an embodiment of the present invention, the
First, the setting unit can set a condition to be applied to the sample image as a filter value, which may be at least one of brightness, contrast, shading, rotation, shift, zoom, perspective, Gaussian noise, blur, scratch, lens flare, and dust.
Second, the setting unit can set a condition to be applied to the sample image as an environmental element, for example illumination non-uniformity, camera position deviation, noise, vibration, or poor focus; an environmental element can be expressed by combining one or more filter values.
Third, the setting unit can set conditions automatically, for example in correspondence with the number of images to be generated, or from a level with little deformation of the sample image up to a level with much deformation.
Table 1 below illustrates the relationship between environmental elements and filter values. The level of a filter can be set by its filter value, the level of an environmental element can be expressed by combining the levels of several filters, and when images are generated automatically the overall level can be adjusted by combining the levels of environmental elements.
- Contrast: Constant
- Radial Shading: Location (x, y), Radius
- Linear Shading: Constant
- Shift: x-offset
- Zoom: Scaling factor
- Perspective: y-offset
- Salt & Pepper Noise: Probability
- Motion Blur: Direction (0-3), Line size
- Radial Blur: Location (x, y), Radius
- Zoom Blur: Location (x, y), Line size
- Crack: Count
- Lens Flare: Location (x, y)
- Dust: Location (x, y), Scale
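To illustrate how filter values like those in Table 1 might act on an image, here is a hedged Python sketch of two of the filters (contrast and salt-and-pepper noise) on a grayscale image represented as a 2-D list; the function names and the mid-gray contrast model are assumptions for illustration, not the patent's implementation:

```python
import random

def contrast_filter(img, constant):
    # 'Contrast' filter (Table 1): scale each pixel's deviation from
    # mid-gray (128) by a constant, clamped to the 0-255 range.
    return [[max(0, min(255, int(round(128 + (p - 128) * constant))))
             for p in row] for row in img]

def salt_and_pepper_filter(img, probability, seed=None):
    # 'Salt & Pepper Noise' filter (Table 1): each pixel is replaced by
    # pure black (0) or pure white (255) with the given probability.
    rng = random.Random(seed)
    return [[rng.choice((0, 255)) if rng.random() < probability else p
             for p in row] for row in img]
```

A constant of 1.0 leaves the image unchanged, values above 1.0 exaggerate contrast, and a noise probability near 1.0 corrupts nearly every pixel, giving a graded range of deformation levels for the generated test images.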
Fourth, the setting unit can set a condition to be applied to the sample image as a template, in which conditions matching a particular machine vision inspection environment are stored in advance.
Fifth, the setting unit can set a condition to be applied to the sample image as a user definition, in which at least one of an automatic setting, a filter setting, an environmental element setting, and a template setting is stored in advance and defined by the user.
On the other hand, the
To this end, the
At this time, there may be a plurality of conditions, and one or more virtual test images may be generated as the plurality of conditions are applied to the sample image simultaneously or sequentially.
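The sequential application of multiple conditions can be sketched as a small pipeline; the filter registry and the two toy filters below are assumptions for illustration, standing in for the Table 1 filters:

```python
def generate_virtual_images(sample, condition_sets, filters):
    # condition_sets: one list of (filter_name, params) pairs per virtual
    # image to generate. filters: registry mapping names to callables.
    images = []
    for conditions in condition_sets:
        img = [row[:] for row in sample]        # start from a fresh copy
        for name, params in conditions:
            img = filters[name](img, **params)  # apply conditions in order
        images.append(img)
    return images

# Two trivial filters standing in for the real ones.
FILTERS = {
    "brightness": lambda img, offset: [[min(255, max(0, p + offset))
                                        for p in row] for row in img],
    "invert": lambda img: [[255 - p for p in row] for row in img],
}
```

Each condition set yields one virtual test image, so a single sample image fans out into as many degraded variants as there are condition sets.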
In addition, if there are a plurality of generated virtual test images, the
Meanwhile, according to a preferred embodiment of the present invention, the
3, the
The
To this end, the
That is, the
On the other hand, when the triggering signal is inputted, the
That is, the
For this, the
The
The
That is, the
On the other hand, if it is determined that an error has occurred in the operation of the machine vision apparatus, the
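The simulation step, running the inspection over every virtual test image and reporting which ones fail, can be sketched as follows; the `brightness_in_range` check is a toy stand-in for real vision software, and all names are invented for the sketch:

```python
def simulate(virtual_images, inspect):
    # Run the (stand-in) machine vision inspection on every virtual test
    # image and report OK/NG per image, so the conditions under which
    # inspection fails can be found before field installation.
    report = []
    for i, img in enumerate(virtual_images):
        report.append((i, "OK" if inspect(img) else "NG"))
    return report

def brightness_in_range(img, lo=50, hi=200):
    # Toy inspection: pass only if the mean brightness is within range.
    mean = sum(map(sum, img)) / (len(img) * len(img[0]))
    return lo <= mean <= hi

result = simulate([[[100, 100]], [[255, 255]]], brightness_in_range)
# first image passes (mean 100), second fails (mean 255)
```

An "NG" entry points back to the condition set that produced that virtual image, identifying the environment (for example, over-bright illumination) in which the machine vision device would mis-operate.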
Hereinafter, how the present invention can be implemented in a user-interface environment will be described with reference to FIGS. 4A and 4B, which are diagrams for explaining the user interface provided by the image generating apparatus and the simulation apparatus according to an embodiment of the present invention.
The
The
The
The
FIG. 5 is a flowchart illustrating an image generating method and a simulation method according to an embodiment of the present invention.
The image generating method according to the embodiment shown in FIG. 5 includes steps that are processed in a time-series manner in each of the apparatuses described above.
FIGS. 6 to 11 are diagrams for explaining an image generating method and a simulation method according to an embodiment of the present invention, and are referred to in the description below.
First, a method by which the
That is, the
That is, the
Meanwhile, the
At this time, the
If the condition is set as described above, the
That is, the
As a result, the
That is, assuming that the camera focus is in a bad state or the camera is vibrated, the
In addition, the
In addition, the
Further, assuming that noise is applied to the
Furthermore, the
The
On the other hand, using one or more virtual test images generated as described above, a method for simulating the
That is, the
That is, the
At this time, if the inspection result performed by the
On the other hand, since the
The term 'unit' used in this embodiment means software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a 'unit' performs certain roles. However, 'unit' is not limited to software or hardware. A 'unit' may be configured to reside on an addressable storage medium and may be configured to run on one or more processors. Thus, by way of example, a 'unit' may include components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
The functions provided within the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'.
In addition, the components and 'units' may be implemented to run on one or more CPUs in a device or a secure multimedia card.
Each of the image generation method and the simulation method according to the embodiment described with reference to FIG. 5 may also be implemented in the form of a recording medium containing instructions executable by a computer, such as a program module executed by a computer. Computer-readable media can be any available media that can be accessed by a computer, including volatile and nonvolatile media and removable and non-removable media. Computer-readable media may include both computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. Communication media typically include computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or another transport mechanism, and include any information delivery media.
Further, each of the image generation method and the simulation method according to an embodiment of the present invention may be implemented as a computer program (or a computer program product) including instructions executable by a computer. The computer program includes programmable machine instructions that are processed by a processor and can be implemented in a high-level programming language, an object-oriented programming language, an assembly language, or a machine language. The computer program may also be recorded on a tangible computer-readable recording medium (e.g., memory, hard disk, magnetic/optical medium, or solid-state drive).
Thus, each of the image generation method and the simulation method according to the embodiment of the present invention can be implemented by the computer program as described above being executed by the computing device. The computing device may include a processor, a memory, a storage device, a high-speed interface connected to the memory and a high-speed expansion port, and a low-speed interface connected to the low-speed bus and the storage device. Each of these components is connected to each other using a variety of buses and can be mounted on a common motherboard or mounted in any other suitable manner.
Here, the processor can process instructions within the computing device, for example instructions stored in the memory or the storage device, so as to display graphical information for providing a graphical user interface (GUI) on an external input/output device such as a display connected to the high-speed interface. As another example, multiple processors and/or multiple buses may be used along with multiple memories and memory types as appropriate. The processor may also be implemented as a chipset of chips comprising multiple independent analog and/or digital processors.
The memory also stores information within the computing device. In one example, the memory may comprise volatile memory units or a collection thereof. In another example, the memory may be comprised of non-volatile memory units or a collection thereof. The memory may also be another type of computer readable medium such as, for example, a magnetic or optical disk.
The storage device can provide a large amount of storage space to the computing device. The storage device may be a computer-readable medium or a configuration including such a medium, and may include, for example, devices in a storage area network (SAN) or other configurations: a floppy disk device, a hard disk device, an optical disk device, a tape device, flash memory, or another similar semiconductor memory device or device array.
It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all respects and not restrictive. For example, each component described as a single entity may be implemented in distributed fashion, and components described as distributed may be implemented in combined form.
The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.
100: Simulation system
20: Machine vision device
30: Image generating device
310: image obtaining unit 320: setting unit
330: image generator
340: operation requesting unit 350: vision checker
Claims (20)
Obtaining a sample image for generating the plurality of virtual test images;
Setting at least one condition applicable to the sample image so as to reproduce various environments in which the machine vision inspection is performed; And
And generating the plurality of virtual test images by applying the set conditions to the sample images,
Wherein the machine vision inspection inspects an attribute of an object using at least one algorithm among pattern matching, point, line, and face fitting, edge, color, and location, and
Wherein the various environments are environments that can change the image used in the machine vision inspection.
Further comprising triggering a machine vision inspection for each or a portion of the plurality of virtual test images.
And when the triggering signal is inputted, performing the machine vision inspection and outputting the result.
Wherein the setting step comprises:
A condition to be applied to the sample image is set as a filter value,
Wherein the filter value is at least one of brightness, contrast, shading, rotation, shift, zoom, perspective, Gaussian noise, blur, scratch, lens flare, and dust.
Wherein the setting step comprises:
Setting a condition to be applied to the sample image as an environmental element,
Characterized in that the environmental element is at least one of illumination non-uniformity, camera position deviation, noise, vibration and poor focus.
Wherein the environmental element is extracted by combining filter values of at least one of brightness, contrast, shading, rotation, shift, zoom, perspective, Gaussian noise, blur, scratch, lens flare, and dust.
Wherein the setting step comprises:
A condition to be applied to the sample image is set as a template,
Wherein the template is a condition in which a condition conforming to the machine vision inspection environment is stored in advance.
Wherein the setting step comprises:
Automatically setting a condition to be applied to the sample image,
Wherein the automatic setting sets conditions automatically in correspondence with the number of images to be generated, or, in response to an input, sets conditions automatically from a level with a small degree of deformation of the sample image to a level with a large degree of deformation.
Wherein the setting step comprises:
Setting a condition to be applied to the sample image as user-defined,
Wherein the user definition is one in which at least one of an automatic setting, a filter setting, an environment element setting, and a template setting is previously stored and defined by a user.
Obtaining a sample image for generating the plurality of virtual test images;
Setting at least one condition applicable to the sample image so as to match various environments in which machine vision inspection is performed;
Generating the plurality of virtual test images by applying the set conditions to the sample images;
Triggering a machine vision inspection for each or a portion of the plurality of virtual test images; And
And when the triggering signal is inputted, performing the machine vision inspection and outputting a result.
An image obtaining unit obtaining a sample image for generating the plurality of virtual test images;
A setting unit configured to set at least one condition applicable to the sample image so as to reproduce various environments in which the machine vision inspection is performed; And
And an image generator for generating the plurality of virtual test images by applying the set conditions to the sample images,
Wherein the machine vision inspection inspects an attribute of an object using at least one algorithm among pattern matching, point, line, and face fitting, edge, color, and location, and
Wherein the various environments are environments in which changes can be made to the images used in the machine vision inspection.
In addition,
A condition to be applied to the sample image is set as a filter value,
Wherein the filter value is at least one of brightness, contrast, shading, rotation, shift, zoom, perspective, Gaussian noise, blur, scratch, lens flare, and dust.
In addition,
Setting a condition to be applied to the sample image as an environmental element,
Wherein the environmental element is at least one of illumination non-uniformity, camera position deviation, noise, vibration, and defective focus.
Wherein the environmental element is extracted by combining filter values of at least one of brightness, contrast, shading, rotation, shift, zoom, perspective, Gaussian noise, blur, scratch, lens flare, and dust.
In addition,
A condition to be applied to the sample image is set as a template,
Wherein the template is a set of conditions matching the machine vision inspection environment, stored in advance.
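A template as claimed is simply a named, pre-stored set of conditions. It can be sketched as a dictionary of condition lists; the template names and their contents here are illustrative assumptions:

```python
# Hypothetical pre-stored templates matching typical inspection environments.
TEMPLATES = {
    "backlit conveyor": [("Brightness", 15), ("Blur", 1)],
    "dusty factory":    [("Dust", 0.1), ("Contrast", 0.8)],
}

def load_template(name):
    """Return a copy of the stored conditions so callers cannot mutate the template."""
    return list(TEMPLATES[name])
```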
In addition,
Automatically setting a condition to be applied to the sample image,
Wherein the automatic setting sets the conditions automatically in correspondence with the number of images to be generated, or automatically in response to an input level, where a lower level applies less variation to the sample image.
In addition,
Setting a condition to be applied to the sample image as user-defined,
Wherein the user definition is one in which at least one of an automatic setting, a filter setting, an environment element setting, and a template setting has been stored and defined by the user.
An operation request unit for acquiring a plurality of virtual test images and triggering a machine vision inspection for each, or a portion, of the plurality of virtual test images; and
A vision inspection unit for performing the machine vision inspection and outputting a result when the triggering signal is input.
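The two units in the claims above can be sketched as two cooperating objects: an operation request unit that selects images and triggers inspection, and a vision inspection unit that runs on each trigger. The class names, the mean-intensity "inspection," and the `subset` parameter are my own assumptions:

```python
class VisionInspectionUnit:
    """Performs a toy machine vision inspection when triggered."""
    def __init__(self, threshold):
        self.threshold = threshold

    def on_trigger(self, image):
        # Toy inspection: pass if mean intensity exceeds the threshold.
        mean = sum(image) / len(image)
        return {"mean": mean, "pass": mean > self.threshold}

class OperationRequestUnit:
    """Acquires virtual test images and triggers inspection on each, or a subset."""
    def __init__(self, inspector):
        self.inspector = inspector

    def run(self, virtual_images, subset=None):
        selected = virtual_images if subset is None else [virtual_images[i] for i in subset]
        return [self.inspector.on_trigger(img) for img in selected]
```

Passing `subset=[1]` inspects only the second virtual image, matching the claim's "each or a portion" wording.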
An image generation device for generating a plurality of virtual test images for machine vision inspection by applying, to a sample image, conditions that reproduce various environments in which the machine vision inspection is performed; and
A simulation apparatus for simulating machine vision inspection using the plurality of virtual test images.
Further comprising a machine vision device that performs machine vision inspection to check the properties of an object using one or more of pattern matching, point, line, face fitting, edge, color, and location algorithms,
Wherein the simulation apparatus is configured to cause the machine vision apparatus to perform a machine vision inspection on each or a portion of the images of the plurality of virtual test images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150144656A KR101748768B1 (en) | 2015-10-16 | 2015-10-16 | Image generating method and apparatus for machine vision test, method and apparatus for simulating machine vision test, and machine vision testing system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150144656A KR101748768B1 (en) | 2015-10-16 | 2015-10-16 | Image generating method and apparatus for machine vision test, method and apparatus for simulating machine vision test, and machine vision testing system |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170044933A (en) | 2017-04-26 |
KR101748768B1 (en) | 2017-06-19 |
Family
ID=58705018
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150144656A KR101748768B1 (en) | 2015-10-16 | 2015-10-16 | Image generating method and apparatus for machine vision test, method and apparatus for simulating machine vision test, and machine vision testing system |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101748768B1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102018896B1 (en) * | 2017-11-20 | 2019-09-06 | 인곡산업 주식회사 | Vision inspection equipment platform to check the quality of endmill |
KR102135632B1 (en) | 2018-09-28 | 2020-07-21 | 포항공과대학교 산학협력단 | Neural processing device and operating method thereof |
CN110296992A (en) * | 2019-08-15 | 2019-10-01 | 西北农林科技大学 | It is a kind of that rear wild cabbage inside and outside quality detection evaluating apparatus is adopted based on machine vision |
KR20220162728A (en) * | 2020-10-14 | 2022-12-08 | 엘지전자 주식회사 | Artificial intelligence device and method for generating learning data |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004294358A (en) * | 2003-03-28 | 2004-10-21 | Hitachi High-Technologies Corp | Method and apparatus for inspecting defect |
JP2005017159A (en) | 2003-06-27 | 2005-01-20 | Hitachi High-Technologies Corp | Inspection recipe setting method in flaw inspection device and flaw inspecting method |
JP2009192440A (en) | 2008-02-16 | 2009-08-27 | Mega Trade:Kk | Storage structure of test program file of automatic inspection device |
KR101182768B1 (en) * | 2012-02-21 | 2012-09-13 | 주식회사 엠비젼 | Examining apparatus and method for machine vision system |
2015-10-16: Application KR1020150144656A filed; granted as KR101748768B1 (active, IP Right Grant)
Also Published As
Publication number | Publication date |
---|---|
KR20170044933A (en) | 2017-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101748768B1 (en) | Image generating method and apparatus for machine vision test, method and apparatus for simulating machine vision test, and machine vision testing system | |
JP6955211B2 (en) | Identification device, identification method and program | |
US11295434B2 (en) | Image inspection system and image inspection method | |
EP3109826A2 (en) | Using 3d vision for automated industrial inspection | |
US11158039B2 (en) | Using 3D vision for automated industrial inspection | |
KR102108956B1 (en) | Apparatus for Performing Inspection of Machine Vision and Driving Method Thereof, and Computer Readable Recording Medium | |
WO2019186915A1 (en) | Abnormality inspection device and abnormality inspection method | |
US8315457B2 (en) | System and method for performing multi-image training for pattern recognition and registration | |
US20170061209A1 (en) | Object processing method, reference image generating method, reference image generating apparatus, object processing apparatus, and recording medium | |
KR101964805B1 (en) | Guide providing method and apparatus for machine vision | |
US20220366658A1 (en) | Systems and methods of augmented reality guided image capture | |
US20140320638A1 (en) | Electronic device and method for detecting surface flaw of sample | |
JP2012242281A (en) | Method, device and program for calculating center position of detection object | |
JP7127046B2 (en) | System and method for 3D profile determination using model-based peak selection | |
JP2022009474A (en) | System and method for detecting lines in vision system | |
US11012612B2 (en) | Image inspection system and image inspection method | |
CN116908185A (en) | Method and device for detecting appearance defects of article, electronic equipment and storage medium | |
US10791661B2 (en) | Board inspecting apparatus and method of compensating board distortion using the same | |
JP7380332B2 (en) | Image processing device, control method and program for the image processing device | |
KR20170131257A (en) | Image analysis method and apparatus for machine vision | |
JP2006145228A (en) | Unevenness defect detecting method and unevenness defect detector | |
Pokharel | Machine Vision and Object Sorting: PLC Communication with LabVIEW using OPC | |
KR102491420B1 (en) | Vision inspection method for vision inspection using dynamic plate based on artificial intelligence learning | |
CN110274911B (en) | Image processing system, image processing apparatus, and storage medium | |
CN116046791B (en) | Method and device for detecting defect of adhesive dispensing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |