CN117951032A - Debugging method, device, equipment and storage medium of visual detection scheme - Google Patents
- Publication number: CN117951032A
- Application number: CN202410204954.8A
- Authority: CN (China)
- Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Landscapes
- Stored Programmes (AREA)
Abstract
The present application relates to the field of visual inspection, and in particular to a method, an apparatus, a device, and a storage medium for debugging a visual inspection scheme. The method comprises the following steps: determining a workflow included in a visual inspection task; searching a predetermined toolbox set for two or more toolboxes corresponding to the workflow; determining the connection relationship between the input data and the output data of the toolboxes; and debugging according to the connection relationship of the toolboxes while displaying the running information corresponding to the current debugging step. Because the scheme can be modified and reconstructed efficiently from the toolboxes in the toolbox set, and the running information for the current debugging step is displayed during debugging, the convenience of debugging and the efficiency of generating an inspection scheme can be effectively improved.
Description
Technical Field
The present application relates to the field of visual inspection, and in particular, to a method, an apparatus, a device, and a storage medium for debugging a visual inspection scheme.
Background
Machine vision products undertake a number of critical tasks in modern industrial automation, including, for example, guide positioning, defect detection, size measurement, and content recognition. By using advanced image acquisition equipment and intelligent image processing technology to automate complex visual judgment work, machine vision products can remarkably improve production efficiency, reduce cost, improve product quality, and meet the requirements of large-scale customized production.
As the requirements of machine vision inspection keep changing, covering a wide variety of products, defect indicators, inspection items, and product patterns, the visual inspection scheme often needs to be modified and debugged. With a centralized encapsulation mode and a debugging method that depends on breakpoints, modification and reconstruction are cumbersome, which is unfavorable for improving the efficiency of scheme generation.
Disclosure of Invention
In view of the above, embodiments of the present application provide a method, an apparatus, a device, and a storage medium for debugging a visual inspection scheme, so as to solve the problem in the prior art that, because the requirements of machine vision inspection keep changing, debugging, modification, and reconstruction are cumbersome, which is unfavorable for improving the efficiency of generating an inspection scheme.
A first aspect of an embodiment of the present application provides a method for debugging a visual inspection scheme, where the method includes:
determining a workflow included in the visual inspection task;
searching for two or more toolboxes corresponding to the workflow in a predetermined toolbox set;
determining a connection relationship between input data and output data of the toolbox;
and debugging according to the connection relation of the tool box, and displaying the operation information corresponding to the current debugging step.
With reference to the first aspect, in a first possible implementation manner of the first aspect, debugging is performed according to a connection relationship of the toolbox, and running information corresponding to a current debugging step is displayed, including:
Determining the precedence relationship of the tool boxes according to the connection relationship of the tool boxes;
and debugging according to the precedence relationship of the tool box, and displaying parameter editing information, variable information and/or image detection results corresponding to the current debugging step.
With reference to the first aspect, in a second possible implementation manner of the first aspect, determining a connection relationship between input data and output data of the toolbox includes:
Converting the calculated parameters of the first toolbox into first output data of an intermediate structure through the first toolbox;
and when the custom data types of the first output data and the input data of the second toolbox are matched, converting the first output data of the intermediate structure input by the second toolbox into a data structure available by the second toolbox through the second toolbox.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the custom data type includes one or more of a point type, a line type, a circle type, an image type, and an encapsulation area type.
With reference to the second possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the first output data of the intermediate structure is data in a binary form.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, debugging according to a connection relationship of the toolbox includes:
determining identity information of the debugging personnel, wherein the identity information comprises research and development personnel, technical support personnel, and client maintenance personnel;
according to the correspondence between the identity information of the debugging personnel and the debugging rights, determining a first right corresponding to the research and development personnel, a second right corresponding to the technical support personnel, and a third right corresponding to the client maintenance personnel, wherein the first right is higher than the second right and the second right is higher than the third right.
With reference to the first aspect to the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, after performing debugging according to the connection relationship of the toolbox and displaying operation information corresponding to a current debugging step, the method further includes:
storing a visual detection scheme corresponding to the connection relation of the tool box;
And detecting the product according to the stored visual detection scheme.
A second aspect of an embodiment of the present application provides a debugging device for a visual inspection scheme, the device including:
A workflow determination unit configured to determine a workflow included in the visual inspection task;
A tool box searching unit, configured to search more than two tool boxes corresponding to the workflow in a predetermined tool box set;
A connection relation determining unit for determining a connection relation between input data and output data of the tool box;
and the debugging unit is used for debugging according to the connection relation of the tool box and displaying the running information corresponding to the current debugging step.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the debug unit includes:
A precedence relationship determining subunit, configured to determine a precedence relationship of the tool box according to the connection relationship of the tool box;
and the debugging display subunit is used for debugging according to the precedence relationship of the tool box and displaying parameter editing information, variable information and/or image detection results corresponding to the current debugging step.
With reference to the second aspect, in a second possible implementation manner of the second aspect, the connection relation determining unit includes:
a first converting subunit, configured to convert, by a first toolbox, the calculated parameters of the first toolbox into first output data of an intermediate structure;
And the second conversion subunit is used for converting the first output data of the intermediate structure input by the second toolbox into a data structure available by the second toolbox through the second toolbox when the custom data types of the first output data and the input data of the second toolbox are matched.
With reference to the second possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the custom data type includes one or more of a point type, a line type, a circle type, an image type, and an encapsulation area type.
With reference to the second possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the first output data of the intermediate structure is data in binary form.
With reference to the second aspect, in a fifth possible implementation manner of the second aspect, the debug unit includes:
the identity information determining subunit is used for determining the identity information of the debugging personnel, wherein the identity information comprises research and development personnel, technical support personnel, and client maintenance personnel;
and the permission determining subunit is used for determining, according to the correspondence between the identity information of the debugging personnel and the debugging permissions, a first permission corresponding to the research and development personnel, a second permission corresponding to the technical support personnel, and a third permission corresponding to the client maintenance personnel, wherein the first permission is higher than the second permission and the second permission is higher than the third permission.
With reference to the second aspect to the fifth possible implementation manner of the second aspect, in a sixth possible implementation manner of the second aspect, the apparatus further includes:
The detection scheme storage unit is used for storing a visual detection scheme corresponding to the connection relation of the tool box;
And the product detection unit is used for detecting the product according to the stored visual detection scheme.
A third aspect of an embodiment of the present application provides a debugging device of a visual inspection scheme, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspects when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any of the first aspects.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: two or more corresponding toolboxes are searched for based on the workflow of the visual inspection task, the mapping relationship between the input data and the output data of the toolboxes is determined, the visual inspection scheme formed by the toolboxes is debugged based on the determined mapping relationship, and the running information corresponding to the current debugging step is displayed. Because the scheme can be modified and reconstructed efficiently from the toolboxes in the toolbox set, and the running information for the current debugging step is displayed during debugging, the convenience of debugging and the efficiency of generating an inspection scheme can be effectively improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic implementation flow diagram of a debugging method of a visual inspection scheme according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the connection of a tool box in a workflow provided by an embodiment of the present application;
FIG. 3 is a diagram of data mapping for a different workflow provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a debug interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a debugging device for a visual inspection scheme according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a debugging device for a visual inspection scheme according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
A machine vision product refers to the hardware and software components that implement machine vision technology; by simulating human visual functions, it acquires information from images or videos for analysis, decision-making, and control, and is widely applied in fields such as industrial automation, robotics, quality inspection, and logistics sorting.
However, because inspection objects and inspection requirements vary widely, an approach based on centralized encapsulation and breakpoint debugging makes scheme modification and reconstruction cumbersome, which is unfavorable for improving scheme generation efficiency.
Based on the above, the embodiments of the present application provide a debugging method of a visual inspection scheme, so as to solve the problem that, during debugging of a visual inspection scheme, scheme modification and reconstruction are cumbersome and scheme generation efficiency is hard to improve. The following detailed description refers to the accompanying drawings.
Fig. 1 is a schematic implementation flow diagram of a method for debugging a visual inspection scheme according to an embodiment of the present application, where the method includes:
in S101, a workflow included in the visual inspection task is determined.
The visual inspection task in the embodiment of the application refers to analyzing and interpreting an image or a video by using a machine vision technology to obtain results including defect inspection, assembly inspection, label inspection and the like of an inspection object.
The workflow of the visual inspection task may include a software flow for completing the visual inspection task. For example, for visual inspection tasks of defect detection of a product, the workflow may include product localization, lighting control, image acquisition, image preprocessing, feature extraction and recognition, and decision output. For the two-dimensional code reading and tracking task, the workflow can comprise the steps of image acquisition triggering, bar code decoding, verification storage, operation response and the like.
In S102, two or more toolboxes corresponding to the workflow are searched for in a predetermined toolbox set.
After the workflow included in the visual inspection task is determined, two or more toolboxes corresponding to the workflow can be searched for in the toolbox set according to the workflow, and the required toolboxes can be selected by a drag operation.
The toolbox set in the embodiment of the application may comprise two or more toolboxes. A toolbox may be understood as a software module corresponding to one or more of the steps set for implementing the workflow. For example, a toolbox may correspond to a minimum step in the workflow, or a toolbox may correspond to two or more steps. In general, the steps in the workflow may be divided according to the functions performed by the toolboxes, so that each toolbox corresponds to one step. It will be appreciated that a toolbox may include a plurality of sub-toolboxes; depending on the actual workflow, either the toolbox containing the sub-toolboxes may be selected as a whole, or one or more of its sub-toolboxes may be selected individually.
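The toolbox/sub-toolbox relationship described above can be sketched as a simple tree structure. This is a minimal illustrative sketch, not the patent's implementation; the `Toolbox` class and the step names are hypothetical stand-ins.

```python
from dataclasses import dataclass, field

@dataclass
class Toolbox:
    """A toolbox wraps one step (or several steps) of a workflow.

    A toolbox with sub-toolboxes acts as a composite; selecting it
    selects all of its leaf steps.
    """
    name: str
    sub_toolboxes: list["Toolbox"] = field(default_factory=list)

    def flatten(self) -> list[str]:
        """Return the names of all leaf toolboxes, expanding sub-toolboxes."""
        if not self.sub_toolboxes:
            return [self.name]
        names: list[str] = []
        for sub in self.sub_toolboxes:
            names.extend(sub.flatten())
        return names

# A toolbox for a defect-detection workflow; "image preprocessing" is
# itself composed of sub-toolboxes, as the description allows.
preprocess = Toolbox("image preprocessing",
                     [Toolbox("noise filtering"), Toolbox("edge enhancement")])
workflow = Toolbox("defect detection",
                   [Toolbox("product positioning"),
                    Toolbox("image acquisition"),
                    preprocess])
print(workflow.flatten())
# ['product positioning', 'image acquisition', 'noise filtering', 'edge enhancement']
```

Selecting the composite `preprocess` toolbox, or only one of its sub-toolboxes, corresponds to the two selection modes described above.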
For example, for the visual inspection task of defect detection of a product, the workflow may comprise the steps of product positioning, illumination control, image acquisition, image preprocessing, feature extraction and recognition, and decision output. With the steps corresponding to the toolboxes one to one, the corresponding toolboxes may comprise a product positioning toolbox, an illumination control toolbox, an image acquisition toolbox, an image preprocessing toolbox, a feature extraction and recognition toolbox, and a decision output toolbox.
For the two-dimensional code reading and tracking task, the workflow of the two-dimensional code reading and tracking task can comprise the steps of image acquisition triggering, bar code decoding, verification storage, operation response and the like, and the corresponding tool boxes can comprise an image acquisition triggering tool box, a bar code decoding tool box, a verification storage tool box, an operation response tool box and the like.
Variables, internal parameters, and external parameters may be preset in each toolbox. For example, the internal parameters may include image filtering parameters, threshold parameters, feature extraction parameters, and the like used in image preprocessing. The image filtering parameters may include the standard deviation of a filter, used to determine the degree of image smoothing; the threshold parameters may include the threshold used in binarization to distinguish the gray-scale difference between normal and defective images; and the feature extraction parameters may include fitting accuracy parameters, such as those used when detecting circles, and hyperparameters set when training machine learning models, such as the maximum depth of a decision tree.
External parameters may include, for example, camera parameters, exposure parameters, mechanical positioning parameters, software configuration parameters, and the like. The camera parameters may include exposure time, focal length, resolution, etc., the exposure parameters may include light source intensity, light source angle, etc., the mechanical positioning parameters may include workpiece position accuracy, movement speed, etc., and the software configuration parameters may include image acquisition frequency, etc.
Variables of the toolbox may include, for example, pixel coordinate variables, area measurements, defect level variables, and the like. The detected or analyzed values can be recorded by the variables of the tool box.
In S103, a connection relationship between input data and output data of the toolbox is determined.
In the embodiment of the application, the tool boxes corresponding to the workflow of the visual detection task can reliably complete the visual detection task through the connection and mapping of the input data and the output data.
For example, the toolboxes for the visual inspection task of defect detection of a product may include a product positioning toolbox, an illumination control toolbox, an image acquisition toolbox, an image preprocessing toolbox, a feature extraction and recognition toolbox, and a decision output toolbox. The input data and the output data may partially overlap. For example, both may include raw image data, processed image data, recognition results, and the like; the input parameters may further include, for example, image correction parameters, noise filtering parameters, edge enhancement parameters, feature extraction parameters (including, for example, feature descriptor parameters, region segmentation parameters, morphology parameters, and the like), trained model parameters, and system state parameters; and the output data may further include, for example, decision results, alarm signals, and the like.
The mapping relationship between the tool boxes can be determined according to the matching relationship between the output data and the input data of the tool boxes. For example, the matching relationship may include a matching relationship of a base data type, and/or a matching relationship of a custom data type.
The basic data types may include, for example, integer (int), long integer (long), double-precision floating point (double), Boolean (bool), and the like. Custom data types may include data types such as points, lines, circles, images, and regions. A point may be represented by two coordinate values in two-dimensional space, or three coordinate values in three-dimensional space. A line may be represented by the coordinates of two points. A circle may be represented by center-point coordinates and a radius value, an image by a matrix of pixels, and a region by the boundary of a closed shape (e.g., a list of polygon vertices). When the basic data type and/or the custom data type of the input data of the second toolbox is the same as that of the output data of the first toolbox, the two are considered to be matched, and a data connection relationship or mapping relationship can be established.
For example, fig. 2 is a schematic diagram of the mapping between input data and output data of a workflow comprising a toolbox A (a first toolbox) and a toolbox B (a second toolbox). The interactive data of toolbox A comprises input data A1, input data A2, output data A1, output data A2, and output data A3; the interactive data of toolbox B comprises input data B1, input data B2, input data B3, output data B1, and output data B2. When the basic data types, or the custom data types, of output data A1 and input data B1, of output data A2 and input data B2, and of output data A3 and input data B3 are consistent, toolbox A and toolbox B are considered to be matched, and the connection relationship between toolbox A and toolbox B can be established.
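The type-matching rule above can be illustrated with a small sketch. The `Point`/`Circle` classes and the `can_connect` check are hypothetical illustrations of the custom data types named in the description, not the patent's actual data model.

```python
from dataclasses import dataclass

# Hypothetical custom data types, mirroring the point / circle types
# named in the description.
@dataclass
class Point:
    x: float
    y: float

@dataclass
class Circle:
    center: Point
    radius: float

def can_connect(output_type: type, input_type: type) -> bool:
    """An output port may be connected to an input port only when
    their (basic or custom) data types are the same."""
    return output_type is input_type

# Toolbox A outputs a fitted circle; toolbox B expects a circle as input,
# so a connection (mapping) relationship can be established.
assert can_connect(Circle, Circle)
# A circle output cannot feed a point input: no connection is established.
assert not can_connect(Circle, Point)
```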
In a possible implementation manner, in order to improve the processing efficiency of the toolbox and improve the flexibility of the toolbox, a data conversion module may be arranged in the toolbox and used for converting data to be output by the toolbox into data with an intermediate structure, so that mapping connection between different basic types of data is convenient to achieve.
For example, for the above toolbox A (first toolbox) and toolbox B (second toolbox), the data to be output by toolbox A can be mapped into first output data in the intermediate format; after toolbox B receives the first output data, it converts the first output data into a data structure usable by toolbox B through data structure mapping. In this way, data of different basic data types in different toolboxes can interact effectively through the input and output interfaces, which improves the decoupling and flexibility of the toolboxes and is beneficial to the accuracy with which the toolboxes process data.
In a possible implementation, the data of the intermediate structure may be data in binary form. Compared with text or other high-level formats, binary data occupies less memory and is faster to read and write, which is beneficial to data processing efficiency; it can also describe the metadata structure clearly, so that the original data structure can be accurately restored during decoding, achieving efficient data conversion.
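A binary intermediate form of this kind can be sketched with a fixed-layout encoding. This is only an illustrative sketch using Python's standard `struct` packing; the circle payload layout is a hypothetical example, not the patent's format.

```python
import struct

def encode_circle(cx: float, cy: float, r: float) -> bytes:
    """Pack a circle (center x, center y, radius) into a compact binary
    intermediate form produced by the first toolbox."""
    # '<3d': little-endian, three float64 values -> 24 bytes total
    return struct.pack("<3d", cx, cy, r)

def decode_circle(payload: bytes) -> dict:
    """Restore the receiving toolbox's usable structure from the
    binary intermediate form."""
    cx, cy, r = struct.unpack("<3d", payload)
    return {"center": (cx, cy), "radius": r}

blob = encode_circle(120.5, 64.0, 9.75)
assert len(blob) == 24                        # far smaller than a text encoding
assert decode_circle(blob)["radius"] == 9.75  # original structure restored exactly
```

The fixed field layout is what lets the decoder restore the original structure exactly, matching the efficiency argument made above.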
In a possible implementation, as shown in fig. 3, the connection mapping between input data and output data is not limited to a single workflow, but may also span different workflows; for example, the output data of a first toolbox (toolbox 12) in a first workflow may be mapped into a second workflow as the input data of a second toolbox (toolbox 21).
In addition, as shown in fig. 3, a third tool box may be included in the first workflow, and the third tool box (tool box 13) may correspond to one sub-flow, and a plurality of sub-tool boxes may be included in the sub-flow. In the actual use process, the needed sub-tool boxes can be selected according to the needs to generate sub-processes, and tool boxes are generated according to the sub-tool boxes in the sub-processes.
In S104, debugging is performed according to the connection relationship of the tool box, and running information corresponding to the current debugging step is displayed.
In the embodiment of the application, the needed toolbox can be selected from the toolbox set, the connection relation of the data of the selected toolbox can be constructed in a dragging mode and the like, and the debugging operation can be performed based on the connection relation of the constructed toolbox.
Fig. 4 is a schematic diagram of a debug interface according to an embodiment of the present application. As shown in fig. 4, the debug interface includes a main menu area, a toolbox area, a parameter editing area, a scheme editing area, a variable viewing area, and an image display area.
The main menu area provides menus related to the visual inspection scheme, including, for example, creating, saving, and opening a visual inspection scheme. The toolbox area provides the toolboxes needed to construct the visual inspection scheme, including, for example, an image acquisition toolbox, an illumination control toolbox, and the like.
The parameter editing area is used to edit the parameters of the different toolboxes, including the internal and/or external parameters of each toolbox. For example, for the image acquisition toolbox, the editable parameters may include exposure time, focal length, and resolution; for the illumination control toolbox, they may include light source intensity, light source angle, and the like.
The scheme editing area may be used to hold the toolboxes selected by dragging them from the toolbox area, and to display the input and output interfaces of each toolbox. It can receive an instruction to connect an input interface and an output interface, and establishes a data connection channel between them when their custom data types match.
For debugging purposes, the scheme editing area may also include debugging keys, including, for example, a play key, a refresh key, and the like. By clicking the play key, the toolboxes set in the scheme editing area can be executed step by step according to the workflow.
The variable viewing area is used to view the state of the variables of the visual inspection scheme being debugged in the scheme editing area. The information viewable there may include, for example, one or more of the internal execution state of the code, exception information, variable state, execution results, and execution time. During debugging, the state of the variables corresponding to the toolbox or step currently being debugged can be displayed step by step according to the debugging keys.
The image display area is used for displaying the detection state of the visual detection scheme during debugging. One or more of the lines, data values and other content detected during debugging can be displayed in the image display area, so that a user can intuitively check the feedback information of each tool box's working steps. This makes it easy to locate abnormal positions in the visual detection scheme and improves the efficiency of generating an effective visual detection scheme.
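The stepwise execution and per-step reporting described above can be sketched as follows. This is only an illustrative skeleton: `run_step` stands in for a real tool box's processing, and all names and the shape of the collected state are assumptions for the example.

```python
import time

def run_step(toolbox_name, params):
    """Placeholder for executing one tool box; a real step would run the
    tool box's image-processing code (assumption for illustration)."""
    return {"toolbox": toolbox_name, "params": params, "status": "ok"}

def debug_workflow(workflow):
    """Step through the tool boxes in workflow order, collecting the per-step
    information that the variable viewing and image display areas present."""
    states = []
    for name, params in workflow:
        start = time.perf_counter()
        try:
            result = run_step(name, params)
            exception = None
        except Exception as exc:  # surface exception info for this step
            result, exception = None, repr(exc)
        states.append({
            "toolbox": name,
            "result": result,
            "exception": exception,
            "execution_time": time.perf_counter() - start,
        })
    return states

steps = debug_workflow([("image_acquisition", {"exposure_ms": 5}),
                        ("line_finder", {"threshold": 0.8})])
```

Because each step records its own result, exception and execution time, a failing tool box can be isolated immediately rather than only after the whole scheme runs.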
In addition, after debugging is completed by editing the tool box parameters, the currently debugged visual detection scheme can be stored on the device, so that it can be conveniently invoked when a product undergoes visual detection.
In a possible implementation, in order to improve the reliability of the visual inspection scheme in use, debugging permissions can be set for the visual inspection scheme. Debugging personnel with different identity information correspond to different usage permissions. For example, research and development personnel correspond to a first permission, technical support personnel to a second permission, and client maintenance personnel to a third permission, where the first permission is higher than the second permission, and the second permission is higher than the third permission.
The first permission, corresponding to the research and development personnel, may open all permissions, allowing an effective visual detection scheme to be constructed that meets a client's specific visual detection requirements. The second permission, corresponding to the technical support personnel, may include adjustment permissions for most parameters, making it convenient to adjust parameters or set ranges for a client's product. The third permission, corresponding to the client maintenance personnel, may include parameter adjustment permissions within a preset range, reducing the risk that an erroneous modification of the scheme introduces other detection problems.
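The three-tier permission model above can be sketched as an ordered level check. The role keys and numeric levels below are illustrative assumptions, not identifiers from the patent.

```python
# Higher level = broader editing rights, mirroring first > second > third permission.
ROLE_LEVEL = {
    "research_and_development": 3,  # first permission: all parameters editable
    "technical_support": 2,         # second permission: most parameters editable
    "client_maintenance": 1,        # third permission: preset range only
}

def can_edit(role: str, required_level: int) -> bool:
    """Return True if the debugger's role grants at least the required level."""
    return ROLE_LEVEL.get(role, 0) >= required_level
```

For instance, `can_edit("client_maintenance", 2)` evaluates to `False`, so a maintenance account cannot change parameters reserved for technical support.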
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
Fig. 5 is a schematic diagram of a debugging device for a visual inspection scheme according to an embodiment of the present application. As shown in fig. 5, the apparatus includes:
A workflow determination unit 501 is configured to determine a workflow included in the visual inspection task.
A tool box searching unit 502, configured to search for two or more tool boxes corresponding to the workflow in a predetermined tool box set.
A connection relation determining unit 503 for determining a connection relation between input data and output data of the tool box.
And the debugging unit 504 is configured to debug according to the connection relationship of the tool box, and display operation information corresponding to the current debugging step.
The debugging device of the visual inspection scheme shown in fig. 5 corresponds to the debugging method of the visual inspection scheme shown in fig. 1.
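Units 501-504 of the apparatus in fig. 5 can be mirrored in a minimal sketch. The class and method names, and the representation of tool boxes as plain strings, are assumptions made for illustration only.

```python
class VisualInspectionDebugger:
    def determine_workflow(self, task):
        """Unit 501: determine the workflow included in the inspection task."""
        return task.get("workflow", [])

    def find_toolboxes(self, workflow, toolbox_set):
        """Unit 502: look up the tool boxes corresponding to the workflow
        in a predetermined tool box set."""
        return [toolbox_set[name] for name in workflow if name in toolbox_set]

    def determine_connections(self, toolboxes):
        """Unit 503: pair each tool box's output with the next one's input
        (a simple linear connection relation for this sketch)."""
        return list(zip(toolboxes, toolboxes[1:]))

    def debug(self, connections):
        """Unit 504: run each connection in order and report per-step info."""
        return [f"step {i}: {a} -> {b}" for i, (a, b) in enumerate(connections)]

dbg = VisualInspectionDebugger()
wf = dbg.determine_workflow({"workflow": ["image_acquisition", "line_finder"]})
boxes = dbg.find_toolboxes(wf, {"image_acquisition": "GrabTool",
                                "line_finder": "LineTool"})
report = dbg.debug(dbg.determine_connections(boxes))
```

Each unit maps to one method so the four-step pipeline of the method embodiment is visible in the control flow.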
Fig. 6 is a schematic diagram of a debugging device for a visual inspection scheme provided by an embodiment of the present application. As shown in fig. 6, the debugging device 6 of the visual inspection scheme of this embodiment includes: a processor 60, a memory 61 and a computer program 62 stored in the memory 61 and executable on the processor 60, such as a debugging program of a visual inspection scheme. The processor 60, when executing the computer program 62, implements the steps in the above embodiments of the debugging method of the visual inspection scheme. Alternatively, the processor 60, when executing the computer program 62, implements the functions of the modules/units in the above apparatus embodiments.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 62 in the debugging device 6 of the visual inspection scheme.
The debugging device 6 of the visual detection scheme can be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, or the like. The debugging device of the visual inspection scheme may include, but is not limited to, the processor 60 and the memory 61. It will be appreciated by those skilled in the art that fig. 6 is merely an example of the debugging device 6 of the visual detection scheme and does not constitute a limitation of it; the device may comprise more or fewer components than illustrated, combine certain components, or have different components. For example, the debugging device of the visual detection scheme may further comprise an input-output device, a network access device, a bus, and the like.
The processor 60 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 61 may be an internal storage unit of the debugging device 6 of the visual detection scheme, for example a hard disk or a memory of the debugging device 6. The memory 61 may also be an external storage device of the debugging device 6, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a Secure Digital (SD) card, or a flash card provided on the debugging device 6 of the visual inspection scheme. Further, the memory 61 may comprise both an internal storage unit and an external storage device of the debugging device 6. The memory 61 is used for storing the computer program and other programs and data required by the debugging device of the visual inspection scheme. The memory 61 may also be used for temporarily storing data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division into functional units and modules is illustrated by way of example; in practical applications, the above functions may be allocated to different functional units and modules as needed, i.e., the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not described here again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the procedures in the methods of the above embodiments by means of a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program can implement the steps of the respective method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content included in the computer-readable medium may be appropriately added or removed according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (10)
1. A method of debugging a visual inspection scheme, the method comprising:
determining a workflow included in the visual inspection task;
Searching two or more toolboxes corresponding to the workflow in a preset toolbox set;
determining a connection relationship between input data and output data of the toolbox;
and debugging according to the connection relation of the tool box, and displaying the operation information corresponding to the current debugging step.
2. The method according to claim 1, wherein the step of debugging according to the connection relation of the tool box and displaying the operation information corresponding to the current debugging step comprises:
Determining the precedence relationship of the tool boxes according to the connection relationship of the tool boxes;
and debugging according to the precedence relationship of the tool box, and displaying parameter editing information, variable information and/or image detection results corresponding to the current debugging step.
3. The method of claim 1, wherein determining a connection relationship between input data and output data of the tool box comprises:
Converting the calculated parameters of the first toolbox into first output data of an intermediate structure through the first toolbox;
and when the custom data types of the first output data and the input data of the second toolbox are matched, converting the first output data of the intermediate structure input by the second toolbox into a data structure available by the second toolbox through the second toolbox.
4. The method of claim 3, wherein the custom data types include one or more of a point type, a line type, a circle type, an image type, and a package region type.
5. A method according to claim 3, wherein the first output data of the intermediate structure is data in binary form.
6. The method of claim 1, wherein the debugging according to the connection relationship of the tool kit comprises:
determining identity information of debugging personnel, wherein the identity information comprises research personnel, technical support personnel and client maintenance personnel;
According to the corresponding relation between the identity information of the debugging personnel and the debugging rights, determining a first right corresponding to the research and development personnel, a second right corresponding to the technical support personnel and a third right corresponding to the client maintenance personnel, wherein the first right is higher than the second right, and the second right is higher than the third right.
7. The method according to any one of claims 1-6, wherein after performing debugging according to the connection relation of the tool box and displaying the operation information corresponding to the current debugging step, the method further comprises:
storing a visual detection scheme corresponding to the connection relation of the tool box;
And detecting the product according to the stored visual detection scheme.
8. A debugging device for a visual inspection scheme, the device comprising:
A workflow determination unit configured to determine a workflow included in the visual inspection task;
A tool box searching unit, configured to search for two or more tool boxes corresponding to the workflow in a predetermined tool box set;
A connection relation determining unit for determining a connection relation between input data and output data of the tool box;
and the debugging unit is used for debugging according to the connection relation of the tool box and displaying the running information corresponding to the current debugging step.
9. A debugging device of a visual inspection scheme comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410204954.8A CN117951032A (en) | 2024-02-23 | 2024-02-23 | Debugging method, device, equipment and storage medium of visual detection scheme |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117951032A true CN117951032A (en) | 2024-04-30 |
Family
ID=90794359
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410204954.8A Pending CN117951032A (en) | 2024-02-23 | 2024-02-23 | Debugging method, device, equipment and storage medium of visual detection scheme |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117951032A (en) |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |