CN113495162A - Control system of automatic optical detection equipment - Google Patents


Info

Publication number
CN113495162A
Authority
CN
China
Prior art keywords: abstraction layer, light source, camera, trigger, abstract
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010200109.5A
Other languages
Chinese (zh)
Inventor
杨咏亘
张振昌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Horng Terng Automation Co Ltd
Original Assignee
Horng Terng Automation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Horng Terng Automation Co Ltd filed Critical Horng Terng Automation Co Ltd
Priority to CN202010200109.5A priority Critical patent/CN113495162A/en
Publication of CN113495162A publication Critical patent/CN113495162A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
    • G01N 35/00584 Control arrangements for automatic analysers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/01 Arrangements or apparatus for facilitating the optical investigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination

Abstract

The invention relates to a control system for automatic optical inspection equipment. The system comprises a control computer that controls one or more inspection stations, each of which can be provided with a physical light source controller, physical cameras and a physical trigger. The control computer controls the physical devices of each inspection station through a main program, a light source abstraction layer, a camera abstraction layer and a trigger abstraction layer. The light source abstraction layer, the trigger abstraction layer and the camera abstraction layer generate corresponding abstract objects (an abstract light source controller, an abstract camera group and an abstract trigger) according to the user's settings and have preset control rules, so the main program can control each physical device through the abstraction layers, or obtain the resources it requires from them.

Description

Control system of automatic optical detection equipment
Technical Field
The present invention relates to a control system for automatic optical inspection equipment, and more particularly to a control system that can flexibly adapt to changes in the hardware architecture.
Background
Automatic Optical Inspection (AOI) equipment is widely used in many fields, for example on the production lines of high-tech industries. It relies on machine vision instead of human vision, measuring the appearance of a product to inspect it for defects.
Referring to fig. 13, a conventional automatic optical inspection apparatus generally includes four hardware units: an image capturing unit, a control computer, a mechanism and driving motor unit, and an electric control unit. The control computer controls the operation of the whole apparatus; its internal software is responsible for image processing, device communication, motion control and so on. The image capturing unit, connected to the control computer, comprises an illumination light source and a camera (CCD) and photographs the appearance of the object under inspection. The mechanism and driving motor unit and the electric control unit execute the corresponding mechanical operations according to the instructions of the control computer.
In some cases the original AOI apparatus must be expanded or reduced; for example, when the computing performance of the original control computer is no longer sufficient for the amount of data, a second control computer must be added to share part of the processing work.
For example, referring to fig. 14, assume the original hardware architecture uses a single control computer to control a first inspection station and a second inspection station. The first inspection station includes a light source controller, a trigger and a first camera group; the second inspection station includes a light source controller, a trigger and a second camera group, and the number of cameras in each camera group may differ. When the control computer receives an in-position signal from the hardware, it sets the current of the light source controller of the relevant inspection station, and the trigger synchronously triggers the light source and the cameras. The images captured by the first and second camera groups are transmitted to the control computer for subsequent analysis.
Fig. 15 shows a new control computer added to the architecture of fig. 14, with some cameras transferred to its control: the original control computer is responsible for the first inspection station and receives the images captured by the first camera group, while the new control computer is responsible for the second inspection station and receives the images captured by the second camera group.
During such architectural changes, the programmer must directly modify the original program code to fit the new architecture. The parts of the original program code that may require modification include:
a camera: and adjusting the corresponding program code aiming at the camera hardware to complete the image data required by the image processing program.
A light source controller: the function of the light source controller is to control the illumination light source. When a single control computer is expanded into two control computers, a programmer must modify the program code of the original light source controller for the light source controllers of different detection stations, so that each control computer corresponds to its own light source controller.
Triggering an interface: the function of the trigger interface is to start each camera in the corresponding camera group to take images, and each trigger interface has a plurality of control channels (channels) which can be respectively connected with a plurality of cameras. Because the control right of the camera group is transferred to a new control computer, the cameras respectively controlled by different trigger interfaces are changed, and the program code must be modified according to the actual situation of each control channel.
An image processing program: since the original two-phase cluster is already under the responsibility of different control computers, the image processing program in the control computer also needs to be modified along with the new hardware architecture.
As the foregoing example shows, whenever the hardware is changed the programmer must modify the original program code to conform to the new architecture, which is time-consuming and inefficient.
Disclosure of Invention
The main purpose of the present invention is to provide a more flexible control system for automatic optical inspection equipment, one that can be adapted quickly to a new hardware configuration, without changing the original program code in the control computer, when the light source controller, cameras or trigger of the equipment must be changed.
The control system of the invention controls at least one inspection station of the automatic optical inspection equipment, each inspection station comprising a physical light source controller, physical cameras and a physical trigger. The control system comprises:
a control computer, comprising:
a main program, providing a user interface through which the user inputs setting parameters, the main program comprising a global control module and at least one main control module;
a light source abstraction layer, used to control the physical light source controller or to establish a software light source;
a camera abstraction layer, which generates at least one abstract camera group according to the user's setting parameters and is controlled by the main control module through the abstract camera group, the camera abstraction layer being used to control the physical cameras or to establish software cameras;
and a trigger abstraction layer, used to control the physical trigger or to establish a software trigger.
The light source abstraction layer, the trigger abstraction layer and the camera abstraction layer generate corresponding abstract objects (an abstract light source controller, an abstract camera group and an abstract trigger) according to the user's settings, and detect whether the physical light source controller, cameras and trigger actually exist, so as to decide from the physical hardware configuration whether software virtualization is needed. When actual hardware is detected, the hardware device is controlled through the abstraction layer; when no actual hardware is detected, the function of the hardware is virtualized by the abstraction layer.
The main program only needs to send control instructions to the abstract objects; control of the physical devices is achieved through the abstraction layers, without the main program controlling the devices directly. Therefore, when a hardware device is changed, the burden of modifying the program is reduced.
Drawings
FIG. 1 is a system architecture diagram of the present invention;
FIG. 2A is a schematic diagram of an 8-channel abstract light source controller abstracted from two 4-channel physical light source controllers by the light source abstraction layer according to the present invention;
FIG. 2B is a schematic diagram of an 8-channel physical light source controller abstracted into two 4-channel abstract light source controllers by the light source abstraction layer according to the present invention;
FIG. 3A is a schematic diagram of the present invention using a camera abstraction layer to respectively use image data captured by different camera groups as image data sources of different inspection stations;
FIG. 3B is a schematic diagram of the present invention using a camera abstraction layer to use image data captured by the same camera group as the source of image data for different inspection stations;
FIG. 4A is a diagram illustrating the use of a trigger abstraction layer to perform hardware triggering according to the present invention;
FIG. 4B is a diagram illustrating the use of a trigger abstraction layer to perform software triggering in accordance with the present invention;
FIG. 5 is a block diagram of the global control module and the main control module in a single control computer according to the present invention;
FIG. 6 is a schematic control flow diagram of the main control module according to the present invention;
FIG. 7 is a diagram illustrating instruction translation by an abstraction layer controlling actual hardware according to the present invention;
FIG. 8 is a diagram illustrating an abstraction layer receiving signals from actual hardware according to the present invention;
FIG. 9 is a block diagram of the global control module and the main control module of the two control computers according to the present invention;
FIGS. 10A-10C are schematic diagrams of a camera group splitting process according to the present invention;
FIGS. 11A-11D are schematic diagrams illustrating a trigger splitting process according to the present invention;
fig. 12A to 12D are schematic diagrams illustrating a splitting flow of the light source controller according to the present invention;
FIG. 13 is a schematic diagram of the composition of an Automated Optical Inspection (AOI) system;
FIG. 14 is a schematic diagram of an AOI system with a single control computer controlling two inspection stations;
FIG. 15 is a schematic diagram of the AOI system using two control computers to control two inspection stations respectively.
Detailed Description
For the three main devices in an Automatic Optical Inspection (AOI) system, namely the camera, the trigger (trigger interface) and the light source controller, the present invention establishes respective Hardware Abstraction Layers (HAL): a camera abstraction layer, a trigger abstraction layer and a light source abstraction layer. These abstraction layers allow the AOI system to simplify the modification work whenever the hardware must be changed.
The light source abstraction layer, the trigger abstraction layer and the camera abstraction layer detect, according to the user's settings, whether a physical light source controller, trigger and cameras exist, so as to decide from the physical hardware configuration whether software virtualization is required. Because every physical light source controller, trigger or camera has a unique identification code (ID, such as the serial number of the machine or its IP address, usually an addressing scheme provided by the hardware manufacturer), the presence of actual hardware can be judged from these identification codes. When actual hardware is detected, it is controlled through the abstraction layer; when no actual hardware is detected, the function of the hardware is virtualized by the abstraction layer. The control and function of each abstraction layer are described in detail below.
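The detection-and-fallback behaviour described above can be sketched as follows. This is a hypothetical illustration, not the patent's actual implementation; all names are invented. An abstraction layer compares each configured identification code against the hardware actually found and substitutes a software stub when nothing matches.

```python
# Hypothetical sketch: resolve each configured ID (serial number, IP address,
# ...) to a physical device if detected, otherwise to a virtual stand-in.

class PhysicalDevice:
    """Wrapper for hardware that was actually detected."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.is_virtual = False

class VirtualDevice:
    """Software-virtualized stand-in used when no physical hardware is found."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.is_virtual = True

def resolve_devices(configured_ids, detected_ids):
    """Return one device object per configured ID, physical when detected."""
    devices = []
    for dev_id in configured_ids:
        if dev_id in detected_ids:
            devices.append(PhysicalDevice(dev_id))   # control real hardware
        else:
            devices.append(VirtualDevice(dev_id))    # virtualize its function
    return devices

# "SN-001" is detected on the machine; "192.168.0.9" is not.
resolved = resolve_devices(["SN-001", "192.168.0.9"], {"SN-001"})
```

The main program never sees the difference: it addresses the returned objects uniformly, which is what lets the rest of the system ignore whether a given station's hardware is real or virtualized.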
Referring to fig. 1, the system architecture of the present invention is shown. A control computer 1 in the AOI system controls different hardware, mainly comprising a physical light source controller 11, physical cameras 12 and a physical trigger 13; the physical trigger 13 triggers the physical light source controller 11 and the physical cameras 12 simultaneously. The number of devices shown in fig. 1 is for illustration only.
A software control system is installed in the control computer 1. It includes a main program 20, a light source abstraction layer 30, a camera abstraction layer 40 and a trigger abstraction layer 50. The abstraction layers 30, 40 and 50 may be independent programs that communicate with each other through a standard protocol, or they may be integrated with the main program 20 in the same executable file.
Each control computer 1 has a main program 20, and the main program 20 includes a global control module 21 and as many main control modules 22 as there are inspection stations; for example, if the control computer 1 controls two inspection stations, the main program 20 has two main control modules 22. The main program 20 provides a visual interface for the user to configure the system and to display its current state; through this interface the user can set the camera groups required by each inspection station and the cameras each group contains, as well as the triggers, light source controllers and so on.
The light source abstraction layer 30 establishes an abstract light source controller 31 according to the setting parameters (i.e. the setting table of the light source abstraction layer 30) entered by the user through the user interface, and the light source abstraction layer 30 and the main control module 22 exchange instructions through a predefined communication protocol. The light source abstraction layer 30 determines whether physical light source controllers 11 exist and how many there are, and accordingly decides whether to control the physical light source controllers 11 directly or to virtualize a software light source. If physical light source controllers 11 are available, then as shown in fig. 2A the light source abstraction layer 30 may abstract two 4-channel physical light source controllers 11 into a single 8-channel abstract light source controller 31, a one-to-many abstraction; each 4-channel physical light source controller 11 can be triggered to output a driving current that turns on the physical illumination source (e.g. an LED light source) connected to each channel. Conversely, as shown in fig. 2B, the light source abstraction layer 30 may abstract a single 8-channel physical light source controller 11 into two 4-channel abstract light source controllers 31, a many-to-one abstraction. If the light source abstraction layer 30 determines that no physical light source controller 11 exists, it virtualizes a software light source, although the software light source has essentially no function.
As another example, if a user needs to construct a 12-channel abstract light source controller 31, the light source abstraction layer 30 can build it in any of the following ways; whichever way is chosen, the main control module 22 sees a single 12-channel light source controller:
(1) A single 12-channel physical light source controller, selected by its identification code.
(2) 4 channels + 8 channels: a 4-channel physical light source controller and an 8-channel physical light source controller are combined by selecting their two identification codes.
(3) 4 + 4 + 4 channels: three 4-channel physical light source controllers are combined by selecting their three identification codes.
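A setting table like option (2) above might map the 12 abstract channels onto physical controllers as in the following sketch. The controller IDs and function names are invented for illustration; the patent does not specify the table's format.

```python
# Hypothetical sketch: build a map from abstract channel number to
# (physical controller ID, local channel) from an ordered list of
# physical controllers and their channel counts.

def build_channel_map(physical_controllers):
    """physical_controllers: list of (controller_id, channel_count) pairs,
    in the order their channels appear in the abstract controller."""
    channel_map = {}
    abstract_channel = 0
    for controller_id, channel_count in physical_controllers:
        for local_channel in range(channel_count):
            channel_map[abstract_channel] = (controller_id, local_channel)
            abstract_channel += 1
    return channel_map

# Option (2): a 4-channel plus an 8-channel physical controller
# combined into one 12-channel abstract light source controller.
cmap = build_channel_map([("LSC-4CH", 4), ("LSC-8CH", 8)])
```

With such a map, the main control module can address channels 0 to 11 uniformly while the abstraction layer routes each channel to the right physical device.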
The camera abstraction layer 40 establishes abstract camera groups 41A and 41B and assigns cameras to each group according to the settings the user enters through the user interface. The camera abstraction layer 40 determines whether physical cameras 12 exist and how many there are, and decides accordingly whether to control the physical cameras 12 directly or to virtualize software cameras. As with the light source, the camera abstraction layer and the main control module 22 exchange commands through a predefined communication protocol. As shown in fig. 1 and fig. 3A, suppose the user creates two abstract camera groups, with group 41A designated to contain two cameras and group 41B designated to contain one. If the camera abstraction layer 40 detects the two physical cameras 12 designated for the abstract camera group 41A, the image data they actually capture is provided to the main program 20; if it does not detect them, two software cameras are virtualized, i.e. image data is read from a database (such as a hard disk) and provided to the main program 20. Similarly, the camera abstraction layer 40 determines whether the other abstract camera group 41B has a corresponding physical camera 12 and, if not, virtualizes a software camera.
In fig. 3A, the camera abstraction layer 40 uses the image data of the different abstract camera groups 41A and 41B as the image data of different inspection stations: the image data of a first inspection station comes from the abstract camera group 41A, and that of a second inspection station from the abstract camera group 41B. In other embodiments, as shown in fig. 3B, the camera abstraction layer 40 may use the image data of a single abstract camera group 41 as the image data of several inspection stations, i.e. different inspection stations may share the image data of the same abstract camera group 41.
The image data generated by the cameras can be divided into synchronous and asynchronous images according to how the images are produced. Synchronous images are image data captured by different cameras in the same time sequence; asynchronous images are groups of images generated by the same camera group in different time sequences, as shown in the following table:
(number of cameras, number of captures)    number of images obtained
(1, 1)                                     1
(1, 2)                                     2
(2, 1)                                     2
(2, 2)                                     4
Taking (1,2) in the above table as an example: one camera performs two image captures, obtaining 2 images. Taking (2,2) as an example: two cameras each capture twice, obtaining 4 images.
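The accounting above can be illustrated with a short sketch (names are hypothetical): n cameras capturing over m time sequences yield n×m images, where the images sharing one time index form a synchronous set and the per-group sequence over time is the asynchronous dimension.

```python
# Hypothetical sketch: simulate a capture run of a camera group and group
# the resulting image tags by time index.

def capture_run(camera_ids, num_captures):
    """Return {time_index: [(camera_id, image_tag), ...]}.
    Images under the same time index are synchronous (different cameras,
    same time sequence); successive time indices are asynchronous."""
    images = {}
    for t in range(num_captures):
        images[t] = [(cam, f"{cam}-t{t}") for cam in camera_ids]
    return images

run = capture_run(["camA", "camB"], 2)       # the (2, 2) case
total = sum(len(frames) for frames in run.values())
```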
The trigger abstraction layer 50 creates an abstract trigger 51 according to the setting parameters the user enters through the user interface. The trigger abstraction layer 50 determines whether a physical trigger 13 exists and decides whether to control the physical trigger 13 or to virtualize a software trigger. Referring to fig. 4A, taking a physical trigger 13 with 2 channels as an example: when the trigger abstraction layer 50 receives an in-position signal, the two channels of the physical trigger 13 are made to send out trigger signals synchronously to control a camera or a light source. Referring to fig. 4B, the trigger abstraction layer 50 may also send out the trigger signal itself after receiving the in-position signal, directly controlling the camera or light source; that is, the triggering is completed as a software trigger.
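The two trigger paths of figs. 4A and 4B can be sketched roughly as follows; this is a hypothetical illustration with invented names. When a physical trigger exists, the in-position signal is forwarded to it so that all channels fire synchronously; when none exists, the abstraction layer emits the trigger signals itself.

```python
# Hypothetical sketch of the hardware-trigger vs. software-trigger decision.

class _DemoPhysicalTrigger:
    """Stand-in for a detected physical trigger (the fig. 4A path)."""
    def __init__(self):
        self.pulses = 0
    def fire_all(self):
        self.pulses += 1  # one pulse fans out to every channel at once

def make_trigger(physical_trigger, targets):
    """Return an on-in-position handler.

    physical_trigger: object with fire_all(), or None when no physical
    trigger was detected.  targets: camera / light-source control functions,
    used only on the software path (fig. 4B)."""
    if physical_trigger is not None:
        return physical_trigger.fire_all          # hardware trigger
    def software_trigger():                       # software trigger
        for fire in targets:
            fire()
    return software_trigger

fired = []
software_path = make_trigger(None, [lambda: fired.append("light"),
                                    lambda: fired.append("camera")])
software_path()          # the abstraction layer fires the targets itself

phys = _DemoPhysicalTrigger()
hardware_path = make_trigger(phys, [])
hardware_path()          # the in-position signal is forwarded to the hardware
```

Either way the caller sees a single callable, which mirrors how the main control module remains unaware of whether an abstract trigger is backed by hardware.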
Referring to fig. 5, the global control module 21 in the control computer 1 assigns an inspection station to each of the main control modules 22A and 22B, and each main control module controls the abstract light source controller, abstract camera group and abstract trigger of its station. For example, after the control computer 1 assigns the main control module 22A to the first inspection station and the main control module 22B to the second, the main control module 22A controls the abstract light source controller 31A, the abstract camera group 41A and the abstract trigger 51A of the first inspection station, while the main control module 22B controls the abstract light source controller 31B, the abstract camera group 41B and the abstract trigger 51B of the second. What each main control module 22A, 22B controls is not a real physical device but an abstract object; the physical devices are controlled by the corresponding abstraction layers. Within the same control computer 1, however, even the abstract light source controllers 31A and 31B of different inspection stations still share the same light source abstraction layer 30, and likewise for the camera abstraction layer 40 and the trigger abstraction layer 50.
Referring to fig. 6, the control flow of a single main control module 22 is roughly as follows: when the main control module 22 receives an in-position signal (S1), it sets the light sources to be controlled and the trigger conditions according to the inspection process (S2) and waits for the trigger (S3); when a camera is triggered it captures an image, and the main control module 22 finally receives the image data (S4). The image inspection program in the main control module 22 then analyses the received images according to the inspection process.
General commands for the different abstraction layers are pre-established in the main control module 22; a general command is a command the main control module 22 issues to an abstract object. For example, general commands for light source control are pre-established for the light source abstraction layer 30, general commands for camera control for the camera abstraction layer 40, and corresponding general commands for the trigger abstraction layer 50 to control its abstract object. Different translation rules are pre-established in each abstraction layer: when an abstraction layer receives a general command, it translates it into an actual hardware instruction and outputs that instruction to control the physical device, i.e. the physical light source controller 11, physical camera 12 or physical trigger 13. The actual hardware instructions are usually API (application programming interface) calls provided by the hardware manufacturers, so the translation rules are established according to the specifications of the different manufacturers; even for the same general command, actual hardware from different manufacturers has its own corresponding actual hardware instruction.
For example, if the general command sent by the main control module 22 is "camera image capture at position A", and the camera abstraction layer 40 determines from the specification, model and manufacturer of the actual hardware that the device comes from manufacturer A, the camera abstraction layer 40 translates the general command into the actual hardware instruction conforming to manufacturer A's specification; if the camera abstraction layer 40 finds that the actual hardware comes from manufacturer B, it translates the general command into the actual hardware instruction conforming to manufacturer B's specification.
In this way the general commands apply to hardware from any manufacturer, and the main control module 22 is only responsible for issuing the pre-established high-level commands; the low-level commands that control the actual hardware devices are produced by the abstraction layers. The programmer only has to define the common general commands and the translation rules for the different hardware manufacturers, and each abstraction layer can then control the actual hardware.
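The pairing of pre-established general commands with per-manufacturer translation rules might be sketched as a lookup table. The vendor names and instruction strings below are invented examples, not real camera APIs.

```python
# Hypothetical sketch: one translation rule set per manufacturer maps a
# generic command name to that vendor's actual hardware instruction.

TRANSLATION_RULES = {
    "manufacturer_a": {"capture": "AcqStart",     "set_exposure": "ExpSet"},
    "manufacturer_b": {"capture": "GrabOneFrame", "set_exposure": "SetShutter"},
}

def translate(manufacturer, generic_command):
    """Map a generic command to the manufacturer-specific instruction."""
    try:
        return TRANSLATION_RULES[manufacturer][generic_command]
    except KeyError:
        raise ValueError(
            f"no translation rule for {manufacturer!r} / {generic_command!r}")
```

The main control module only ever emits the generic names ("capture", "set_exposure"); supporting a new manufacturer means adding one rule set, with no change to the main program.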
Referring to fig. 7, how the main control module 22 controls a physical device through the abstraction layers is described further, taking the light source abstraction layer 30 as an example. Assume the general command sent by the main control module 22 controls a 12-channel light source (i.e. the abstract light source controller has 12 channels). After the light source abstraction layer 30 receives the general command, it decides from its own setting table whether the command must be split.
Taking the upper part of fig. 7 as an example, if the setting table shows that the 12-channel abstract light source controller is formed by combining an 8-channel and a 4-channel physical light source controller, then when the light source abstraction layer 30 receives the general command from the main control module 22 it splits the command between two software sub-controllers and translates it into two sets of actual hardware instructions, which control the 8-channel and the 4-channel physical light source controllers respectively.
Taking the lower part of fig. 7 as an example, if the setting table shows that the 12-channel abstract light source controller is implemented by a single 12-channel physical light source controller, then when the light source abstraction layer 30 receives the general command from the main control module 22 it translates the command into one actual hardware instruction that controls the 12-channel physical light source controller.
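The splitting decision of fig. 7 can be sketched as follows (all names hypothetical): a generic command addressing the abstract channels is partitioned per physical controller before translation. With a single 12-channel controller, the layout contains one entry and no real splitting occurs; with an 8-channel plus a 4-channel controller, two instruction batches result.

```python
# Hypothetical sketch: split a generic multi-channel command into one batch
# per physical light source controller according to the setting table.

def split_command(channel_currents, layout):
    """channel_currents: {abstract_channel: current_mA}.
    layout: list of (controller_id, channel_count) in abstract-channel order.
    Returns {controller_id: {local_channel: current_mA}}."""
    per_controller = {cid: {} for cid, _ in layout}
    base = 0
    for controller_id, channel_count in layout:
        for channel, current in channel_currents.items():
            if base <= channel < base + channel_count:
                # remap abstract channel to the controller's local channel
                per_controller[controller_id][channel - base] = current
        base += channel_count
    return per_controller

# Upper half of fig. 7: abstract channels 0..7 live on the 8-channel
# controller, channels 8..11 on the 4-channel controller.
batches = split_command({0: 120, 9: 80},
                        [("LSC-8CH", 8), ("LSC-4CH", 4)])
```

Each batch would then be translated separately into the hardware instructions of its own controller, matching the "two sets of actual hardware instructions" described for the upper half of fig. 7.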
Referring to fig. 8, a reply from a physical device to the main control module 22 occurs only for a physical camera, so the camera abstraction layer 40 is used as the illustration. A physical camera outputs a completion signal when it finishes capturing an image. Since the abstract camera group 41A was configured with the two synchronized physical cameras 12A and 12B, it determines whether both have finished capturing and, if so, calls back the main control module 22 to report that the image capture operation is complete. The other abstract camera group 41B was configured with a single physical camera 12, so it determines whether that camera has finished capturing and, if so, replies to the main control module 22.
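The callback behaviour above might look like the following sketch (the class and camera names are hypothetical): the abstract camera group counts per-camera completion signals and calls back to the main control module only once every member camera has finished capturing.

```python
# Hypothetical sketch: aggregate per-camera completion signals and call
# back to the main control module when the whole group is done.

class AbstractCameraGroup:
    def __init__(self, camera_ids, callback):
        self.pending = set(camera_ids)   # cameras still capturing
        self.callback = callback         # notifies the main control module
    def on_capture_done(self, camera_id):
        self.pending.discard(camera_id)
        if not self.pending:             # every member camera has finished
            self.callback()

completed = []
group = AbstractCameraGroup({"cam12A", "cam12B"},
                            lambda: completed.append("done"))
group.on_capture_done("cam12A")   # one camera finished: no callback yet
group.on_capture_done("cam12B")   # both finished: callback fires
```

A single-camera group (like 41B) is just the degenerate case with one pending member, so the same mechanism covers both groups in fig. 8.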
Referring to fig. 9, taking two control computers 1A and 1B as an example, each has its own global control module, 21A and 21B respectively. The control computer 1A assigns the main control module 22A to the first inspection station, and the main control module 22A controls the abstract light source controller 31A, the abstract camera group 41A and the abstract trigger 51A of that station. The other control computer 1B assigns the main control module 22B to the second inspection station, controlling the abstract light source controller 31B, the abstract camera group 41B and the abstract trigger 51B of that station.
The procedure by which the present invention follows a change in the hardware configuration of the AOI system is described further below.
First, the splitting of a camera group is described. Referring to fig. 10A, assume a control computer 1A originally has two main control modules 22A and 22B corresponding to two inspection stations: the main control module 22A controls an abstract camera group 41A, which corresponds to two physical cameras 121 and 122, and the main control module 22B controls another abstract camera group 41B, which corresponds to one physical camera 123. When another control computer 1B is added, the user connects the physical camera 123, originally connected to the control computer 1A, to the new control computer 1B, completing the change in hardware wiring. Through the user interface of the main program 20 of the original control computer 1A, the abstract camera group 41B is removed; through the user interface of the main program on the new control computer 1B, an abstract camera group 41B is added, with the physical camera 123 designated as its member. Referring to figs. 10B and 10C, the main control module 22B in the original control computer 1A is moved to the new control computer 1B, and the global control module 21B in the control computer 1B then sets the main control module 22B to control the abstract camera group 41B. This completes the splitting of the camera group and transfers the abstract camera group 41B to the control of the newly added computer 1B.
The trigger splitting process is similar. Referring to fig. 11A, a control computer 1A hosts two main control modules 22A and 22B corresponding to two inspection stations. The main control module 22A controls an abstract trigger 51A, which controls the first and second channels on the physical trigger 13A; each of these channels may be connected to a light source or a camera. The other main control module 22B controls another abstract trigger 51B, which controls the third channel on the physical trigger 13A. Referring to fig. 11B, when another control computer 1B is added, a physical trigger 13B is installed on the control computer 1B, and the device originally driven by the third channel (such as a camera or an illumination light source) is connected to the physical trigger 13B, completing the hardware change. Referring to fig. 11C, the abstract trigger 51B is removed through the user interface of the main program 20 on the original control computer 1A, the main control module 22B is moved from the original control computer 1A to the new control computer 1B, and an abstract trigger 51B is added on the new control computer 1B through the user interface of the main program. Referring to fig. 11D, the global control module 21B on the control computer 1B sets the main control module 22B to control the abstract trigger 51B, completing the trigger splitting process and transferring the abstract trigger 51B to the new control computer 1B.
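One way to read the trigger example is that an abstract trigger maps logical channels onto channels of a physical trigger board, so rewiring a device to a new board only changes that mapping. A hedged sketch of this idea; the class, the mapping format, and the command string are assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the trigger reassignment in figs. 11A-11D.

class AbstractTrigger:
    """Maps logical trigger channels to (physical board, board channel)."""
    def __init__(self, name, mapping):
        self.name = name
        self.mapping = dict(mapping)

    def fire(self, channel):
        board, ch = self.mapping[channel]
        # Stand-in for issuing a real pulse on the physical board.
        return f"pulse board {board} channel {ch}"

# Before: abstract trigger 51B drives channel 3 on physical trigger 13A.
t51b = AbstractTrigger("51B", {3: ("13A", 3)})

# After the hardware change, the same logical channel points at the newly
# installed board 13B; callers of fire(3) are unaffected.
t51b.mapping[3] = ("13B", 1)
```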
Regarding the splitting process of the light source controller, referring to fig. 12A, a control computer 1A hosts two main control modules 22A and 22B corresponding to two inspection stations. The main control module 22A controls a 12-channel abstract light source controller 31A, which controls two physical light sources 11A and 11B; the other main control module 22B controls an 8-channel abstract light source controller 31B, which controls one physical light source controller 11C. In fig. 12B, after another control computer 1B is added, the physical light source controller 11C is reconnected to the control computer 1B, completing the hardware change. Referring to fig. 12C, the abstract light source controller 31B is removed through the user interface of the main program 20 on the original control computer 1A; the user then configures the desired light sources as needed and adds an abstract light source controller 31B on the new control computer 1B through the user interface of the main program. Referring to fig. 12D, the main control module 22B is moved from the original control computer 1A to the new control computer 1B, and the global control module 21B on the control computer 1B sets the main control module 22B to control the abstract light source controller 31B, completing the splitting process of the light source controller and transferring the abstract light source controller 31B to the newly added control computer 1B.
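The 12-channel and 8-channel controllers in fig. 12A suggest that an abstract light source controller carries a fixed channel count and allocates channel ranges to the physical light sources it drives. A minimal sketch under that assumption; the channel assignments below are invented for illustration:

```python
# Hypothetical sketch of a channel-limited abstract light source controller.

class AbstractLightController:
    """An abstract controller with a fixed number of output channels."""
    def __init__(self, name, channels):
        self.name = name
        self.channels = channels
        self.sources = {}  # light source id -> (first channel, last channel)

    def attach(self, source, first, last):
        if last >= self.channels:
            raise ValueError("not enough channels on this controller")
        self.sources[source] = (first, last)

# Mirroring fig. 12A: a 12-channel controller 31A driving two physical
# light sources 11A and 11B (channel ranges are invented examples).
ctrl_31a = AbstractLightController("31A", channels=12)
ctrl_31a.attach("11A", 0, 5)
ctrl_31a.attach("11B", 6, 11)
```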
As can be seen from the foregoing splitting processes for the light source controller, the camera group, and the trigger, pre-constructing the light source abstraction layer 30, the camera abstraction layer 40, and the trigger abstraction layer 50, which can be shared by different control computers, greatly reduces the need for a programmer to edit and modify the original program code directly. Although the examples above add a control computer, the present invention can also consolidate an architecture in which multiple control computers control different inspection stations into a single control computer that controls multiple inspection stations. The user only needs to configure the control computer according to the actual hardware environment, and the control computer controls the actual hardware through the light source abstraction layer 30, the camera abstraction layer 40, and the trigger abstraction layer 50, making the overall system more flexible to design and maintain.
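The reason the abstraction layers avoid code edits is that the main control modules issue only general commands, which each layer translates into vendor-specific hardware commands via preset translation rules. A hedged sketch of such a translation table; the vendor names and command formats are invented, not taken from any real device protocol:

```python
# Hypothetical sketch: general commands translated into hardware commands
# by per-vendor translation rules, so callers never see device details.

TRANSLATION_RULES = {
    # (general command, vendor) -> rule building the hardware command string
    ("set_brightness", "vendor_x"): lambda ch, v: f"VX:CH{ch}:PWM={v}",
    ("set_brightness", "vendor_y"): lambda ch, v: f"$B {ch} {v}\r\n",
}

def translate(command, vendor, *args):
    """Look up the translation rule and build the hardware command."""
    try:
        rule = TRANSLATION_RULES[(command, vendor)]
    except KeyError:
        raise NotImplementedError(f"no rule for {command}/{vendor}")
    return rule(*args)

# The same general command yields different hardware commands per vendor.
cmd_x = translate("set_brightness", "vendor_x", 1, 128)  # "VX:CH1:PWM=128"
cmd_y = translate("set_brightness", "vendor_y", 1, 128)
```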

Claims (10)

1. A control system for automatic optical inspection equipment, the control system being configured to control at least one inspection station of the automatic optical inspection equipment, each inspection station having an actual hardware configuration that includes at least one of a physical light source controller, a physical camera, and a physical trigger, the control system comprising:
a control computer, comprising:
a main program, providing a user interface for a user to input setting parameters, wherein the main program comprises a global control module and at least one main control module;
a light source abstraction layer, for generating at least one abstract light source controller according to the user's setting parameters, wherein the abstract light source controller is controlled by the main control module, and the light source abstraction layer is used to control the physical light source controller or to establish a software light source;
a camera abstraction layer, for generating at least one abstract camera group according to the user's setting parameters, wherein the abstract camera group is controlled by the main control module, and the camera abstraction layer is used to control the physical camera or to establish a software camera;
and a trigger abstraction layer, used to control the physical trigger or to establish a software trigger.
2. The control system as claimed in claim 1, wherein the main program comprises a plurality of main control modules, and the global control module assigns the inspection station controlled by each main control module; each main control module shares the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer to control its corresponding inspection station.
3. The control system as claimed in claim 2, wherein each main control module is responsible for processing the image data generated by its corresponding inspection station.
4. The control system as claimed in claim 3, wherein the light source abstraction layer handles data conversion between the abstract light source controller and the physical light source controller.
5. The control system as claimed in claim 3, wherein, when the camera abstraction layer determines that no physical camera group exists in the inspection station, the camera abstraction layer fetches image data from a database to serve as a software camera.
6. The control system as claimed in claim 3, wherein the software trigger is a trigger signal generated by the trigger abstraction layer.
7. The control system as claimed in claim 3, wherein the physical trigger is a trigger interface having a plurality of trigger channels.
8. The control system as claimed in any one of claims 1 to 7, wherein general commands corresponding to the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer are preset in the main control module; and the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer each have preset translation rules for translating the general commands into actual hardware commands.
9. The control system as claimed in claim 8, wherein the actual hardware commands translated by the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer are used to control the physical light source controller, the physical camera, and the physical trigger respectively;
and wherein, after receiving a general command sent by the main control module, the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer compare it against the actual hardware configuration of the inspection station and build the actual hardware command according to the comparison result and the corresponding translation rule.
10. The control system as claimed in claim 8, wherein, when comparing the actual hardware configuration of the inspection station, the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer compare the specification, model, and manufacturer of the hardware.
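The comparison step in claims 8 to 10 (matching the station's actual hardware by specification, model, and manufacturer before applying a translation rule) can be sketched as a simple lookup. The models, manufacturer names, and protocol labels below are invented examples, not real devices:

```python
# Hypothetical sketch of the hardware-comparison step in claims 8-10:
# pick the translation rule matching the hardware's model and manufacturer.

KNOWN_RULES = {
    # (model, manufacturer) -> name of the translation rule to apply
    ("LC-8", "AcmeLight"): "acme_8ch_protocol",
    ("LC-12", "AcmeLight"): "acme_12ch_protocol",
}

def pick_rule(model, manufacturer):
    """Compare the actual hardware against known rules; fall back if unknown."""
    return KNOWN_RULES.get((model, manufacturer), "unsupported")

rule = pick_rule("LC-12", "AcmeLight")  # matches the 12-channel rule
```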
CN202010200109.5A 2020-03-20 2020-03-20 Control system of automatic optical detection equipment Pending CN113495162A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010200109.5A CN113495162A (en) 2020-03-20 2020-03-20 Control system of automatic optical detection equipment

Publications (1)

Publication Number Publication Date
CN113495162A 2021-10-12

Family

ID=77993629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010200109.5A Pending CN113495162A (en) 2020-03-20 2020-03-20 Control system of automatic optical detection equipment

Country Status (1)

Country Link
CN (1) CN113495162A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024007206A1 (en) * 2022-07-06 2024-01-11 宁德时代新能源科技股份有限公司 Debugging method and apparatus for production line devices, and production line system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040025173A1 (en) * 2002-04-24 2004-02-05 Gil Levonai Interaction abstraction system and method
CN101512359A (en) * 2006-07-10 2009-08-19 阿斯特瑞昂公司 System and method for performing processing in a testing system
JP2011221803A (en) * 2010-04-09 2011-11-04 Toyota Motor Corp Test tool and test method
CN203287332U (en) * 2013-05-17 2013-11-13 深圳明锐理想科技有限公司 Immovable automatic optical check system
US20190293566A1 (en) * 2016-05-31 2019-09-26 Shanghai Micro Electronics Equipment (Group) Co., Ltd. Automatic optical inspection device and method
US20200019147A1 (en) * 2018-07-11 2020-01-16 Siemens Aktiengesellschaft Abstraction layers for automation applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination