CN111652874B - Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement - Google Patents

Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement

Info

Publication number
CN111652874B
CN111652874B (application number CN202010501965.4A)
Authority
CN
China
Prior art keywords
gauge
virtual
detected object
visual image
stop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010501965.4A
Other languages
Chinese (zh)
Other versions
CN111652874A
Inventor
杨薛鹏
陈晨
王佩闯
黄渭
刘志龙
丁昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Songzhi Intelligent Technology Shenzhen Co ltd
Original Assignee
Songzhi Intelligent Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Songzhi Intelligent Technology Shenzhen Co ltd filed Critical Songzhi Intelligent Technology Shenzhen Co ltd
Priority to CN202010501965.4A priority Critical patent/CN111652874B/en
Publication of CN111652874A publication Critical patent/CN111652874A/en
Application granted granted Critical
Publication of CN111652874B publication Critical patent/CN111652874B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to the field of industrial machine vision and provides a method, a device, a terminal and a computer-readable storage medium for go/no-go gauge measurement, which enable efficient and convenient inspection of mass-produced products in a non-contact manner. The method comprises the following steps: generating an original visual image of the detected object; processing the original visual image of the detected object into a second visual image that contrasts with a virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object; importing a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object; performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge; and performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge. The technical scheme provided by the application realizes non-contact measurement, can detect products with complex shapes, and remarkably improves detection efficiency.

Description

Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement
Technical Field
The application relates to the field of industrial machine vision, and in particular to a method, a device, a terminal and a computer-readable storage medium for go/no-go gauge measurement.
Background
In large-scale industrial production, measuring a large number of products one by one with scaled measuring tools (e.g., vernier calipers, dial gauges) is very time-consuming and labor-intensive. Because qualified products fall within a measurement range, that is, a product whose measured value lies within that range is qualified, go gauges and no-go gauges are used for measurement instead. The go gauge and the no-go gauge are jointly called a go/no-go gauge; it belongs to the family of fixed gauges and is used as a measurement standard for mass product inspection.
However, the conventional measurement method using a physical go/no-go gauge has the following drawbacks:
1) A large number of measurement locations and parameters must be set, so operation is inconvenient and inefficient, and the detection area is difficult to define for products with complex shapes;
2) When a product requires non-contact measurement, the existing go/no-go gauge measurement method cannot meet the requirement.
Disclosure of Invention
The application provides a method, a device, a terminal and a computer-readable storage medium for go/no-go gauge measurement, which enable efficient and convenient inspection of mass-produced products in a non-contact manner.
In one aspect, the application provides a go/no-go gauge measurement method, comprising the following steps:
generating an original visual image of the detected object;
processing the original visual image of the detected object into a second visual image that contrasts with a virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object;
importing a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object;
performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge;
and performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
In another aspect, the present application provides a go/no-go gauge measurement device, including:
an image generation module, configured to generate an original visual image of the detected object;
an image processing module, configured to process the original visual image of the detected object into a second visual image that contrasts with a virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object;
an image importing module, configured to import a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object;
a first virtual measurement module, configured to perform go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge;
and a second virtual measurement module, configured to perform no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
In a third aspect, the present application provides a terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to the above technical solution when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of a method according to the above-mentioned technical solution.
According to the above technical scheme, the measured object becomes the original visual image and the second visual image of the detected object, and the go/no-go gauge becomes a virtual go gauge and a virtual no-go gauge. During measurement, the result is obtained from the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge, and from the coverage relationship between the second visual image and the edge of the virtual no-go gauge; that is, measurement of a physical object is converted into comparison between images. On one hand, this realizes non-contact measurement; on the other hand, it can detect products with complex shapes and remarkably improves detection efficiency, since no complicated measurement positions or parameters need to be set for a large number of detected objects.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the application, and that other drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a go/no-go gauge measurement method provided by an embodiment of the application;
FIG. 2 is a schematic diagram of generating a virtual go gauge and a virtual no-go gauge according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a go/no-go gauge measurement device according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the protection scope of the application.
In this specification, adjectives such as first and second may be used solely to distinguish one element or action from another element or action without necessarily requiring or implying any actual such relationship or order. Where the environment permits, reference to an element or component or step (etc.) should not be construed as limited to only one of the element, component, or step, but may be one or more of the element, component, or step, etc.
In the present specification, for convenience of description, the dimensions of the various parts shown in the drawings are not drawn to actual scale.
The application provides a go/no-go gauge measurement method, which mainly comprises steps S101 to S105 as shown in FIG. 1 and is described in detail as follows:
step S101: and generating an original visual image of the detected object.
As an embodiment of the present application, an industrial visual inspection system, for example one composed of an industrial camera, an industrial lens, an industrial light source, and the software and hardware modules of vision measurement software, may be used to generate the original visual image of the detected object. Of course, other visual inspection systems may also be used; in principle, any system is acceptable as long as the generated original visual image is sufficiently clear.
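By way of illustration only (this is not part of the original disclosure), a minimal sketch of acquiring the original visual image with the OpenCV library is given below; the camera index and the use of OpenCV's generic capture API are assumptions, and an actual industrial setup would use the camera vendor's SDK together with a calibrated lens and light source.

import cv2

# Assumed: the industrial camera is exposed as the first OpenCV-compatible device.
cap = cv2.VideoCapture(0)
ok, original_visual_image = cap.read()   # BGR frame showing the detected object
cap.release()
if not ok:
    raise RuntimeError("failed to capture the original visual image")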
Step S102: processing the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object.
Unlike the prior art, which uses a physical go/no-go gauge to measure the object, the embodiment of the present application measures with a virtual go gauge and a virtual no-go gauge, both of which are generated based on the detected object. In the embodiment of the present application, the virtual go gauge and the virtual no-go gauge are essentially images, and may be generated before the original visual image of the detected object is generated. That is, before generating the original visual image of the detected object, the method of the present application further includes: enlarging the drawing of the detected object based on the positive tolerance of its edge to generate the virtual go gauge, and shrinking the drawing of the detected object based on the negative tolerance of its edge to generate the virtual no-go gauge. For example, assuming that the edge size of the detected object is 50 mm and the tolerance is ±0.2 mm, the detected object is redrawn according to the size standard of 50.2 mm, and the object redrawn at 50.2 mm serves as the virtual go gauge; the detected object is also redrawn according to the size standard of 49.8 mm, and the object redrawn at 49.8 mm serves as the virtual no-go gauge. Assuming the detected object is circular, again with an edge size of 50 mm and a tolerance of ±0.2 mm, then, as shown in FIG. 2, in the three concentric circles on the left, the dotted circles represent the edge of the detected object at 50±0.2 mm: the short-dashed circle represents the edge at 50+0.2 mm, i.e., 50.2 mm, and the dash-dotted circle represents the edge at 50-0.2 mm, i.e., 49.8 mm. The detected object redrawn according to the 50.2 mm size standard is the virtual go gauge, shown as the circle labeled 21 in the figure, and the detected object redrawn according to the 49.8 mm size standard is the virtual no-go gauge, shown as the circle labeled 23 in the figure.
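For the circular example above, a minimal sketch of rendering the virtual go gauge and the virtual no-go gauge as images is given below, assuming the OpenCV and NumPy libraries; the pixel calibration, canvas size, and function name are illustrative assumptions and not the patented drawing procedure.

import cv2
import numpy as np

MM_PER_PX = 0.05          # assumed calibration: one pixel corresponds to 0.05 mm
CANVAS = (1200, 1200)     # assumed drawing area, large enough to enclose both gauges

def draw_circular_gauge(diameter_mm: float) -> np.ndarray:
    """Draw a filled circle of the given diameter as a binary mask."""
    mask = np.zeros(CANVAS, dtype=np.uint8)
    center = (CANVAS[1] // 2, CANVAS[0] // 2)
    radius_px = int(round(diameter_mm / 2 / MM_PER_PX))
    cv2.circle(mask, center, radius_px, 255, thickness=-1)
    return mask

nominal, tol = 50.0, 0.2
virtual_go_gauge = draw_circular_gauge(nominal + tol)     # 50.2 mm upper limit, circle 21 in FIG. 2
virtual_no_go_gauge = draw_circular_gauge(nominal - tol)  # 49.8 mm lower limit, circle 23 in FIG. 2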
In the embodiment of the application, the original visual image of the detected object is processed into a second visual image that contrasts with the virtual no-go gauge so that the coverage relationship between the virtual no-go gauge and the image of the detected object can be conveniently observed. Conversely, if the original visual image of the detected object were not processed into an image contrasting with the virtual no-go gauge, the coverage relationship could not be observed when the original visual image of the detected object actually covers the virtual no-go gauge.
As an embodiment of the present application, processing the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge may be done by binarizing the original visual image of the detected object so that it is inverted into a binarized image. In general, binarization may first convert the original visual image of the detected object to grayscale and then further process the grayscale image into a binarized image. Specifically, one method of binarizing the original visual image of the detected object is: converting the original visual image of the detected object to grayscale and segmenting the result with a skin color segmentation algorithm; then determining a preset threshold for the segmented image based on Otsu's method and binarizing the segmented grayscale image with that preset threshold. Another method of binarizing the original visual image of the detected object may be: converting the original visual image of the detected object to grayscale; judging whether the grayscale image obtained from the original visual image meets a preset sharpness; if so, binarizing the grayscale image with a global iterative threshold to obtain a binarized image; otherwise, calculating the mean gray value of the grayscale image, determining a threshold coefficient from the mean gray value, determining a local binarization threshold based on the mean gray value and the threshold coefficient, and binarizing the grayscale image with that binarization threshold to obtain a binarized image.
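A minimal sketch of the first binarization route (grayscale conversion followed by Otsu's global threshold, with the result inverted so that it contrasts with the virtual no-go gauge) is shown below, assuming the OpenCV library; the segmentation step and the sharpness-based local-threshold fallback described above are omitted for brevity, and the function name is an assumption.

import cv2

def to_second_visual_image(original_bgr):
    """Binarize and invert the original visual image to obtain the second visual image."""
    gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu's method selects the global threshold automatically; THRESH_BINARY_INV
    # inverts the result so the object and the virtual no-go gauge contrast with each other.
    _, binarized = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binarized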
Step S103: importing a virtual go gauge and a virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object.
In the embodiment of the application, importing the virtual go gauge and the virtual no-go gauge refers to importing the virtual go gauge and the virtual no-go gauge illustrated in FIG. 2 into the industrial visual inspection system described in the previous embodiment.
Step S104: performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge.
As an embodiment of the present application, go-gauge measurement of the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge can be implemented through the following steps S1041 to S1043:
step S1041: and redrawing the original visual image of the detected object and the virtual through gauge in the same area, or redrawing the original visual image of the detected object and the virtual through gauge in the same area after scaling the original visual image of the detected object and the virtual through gauge in the same proportion, so as to obtain a first drawing and a second drawing respectively.
If the original visual image of the detected object and the size of the virtual go gauge are both moderate, the original visual image and the virtual go gauge can be directly redrawn in the same area, and the first drawing and the second drawing are respectively obtained. If the original visual image of the object to be detected and the virtual through gauge are not moderate in size (for example, the size is too large or too small), the original visual image of the object to be detected and the virtual through gauge can be drawn in the same area again after being scaled in the same proportion, and the first drawing and the second drawing are obtained respectively. For example, if the sizes of the original visual image of the object to be detected and the virtual through gauge are too large, the original visual image of the object to be detected and the virtual through gauge can be drawn again in the same area after being reduced in the same proportion, so as to obtain the first drawing and the second drawing respectively.
It should be noted that, in either of the above modes, the first drawing and the second drawing are drawn, and that area of the drawing should be sufficient to enclose the original visual image of the object to be detected and the virtual compass or the scaled image of the object to be detected.
Step S1042: the edges of the first plot and the second plot obtained in step S1041 are compared.
Step S1043: and if the edge of the first drawing does not exceed the edge of the second drawing, determining that the detected object is qualified.
The first drawing is actually an original visual image of the detected object or a drawing obtained by redrawing the original visual image of the detected object after scaling the original visual image of the detected object with the virtual through gauge, and the second drawing is a drawing obtained by redrawing the virtual through gauge or the virtual through gauge after scaling the original visual image of the detected object with the same proportion, so that according to the measurement principle of the through gauge, if the edge of the first drawing does not exceed the edge of the second drawing, the detected object is determined to be qualified, otherwise, the detected object is determined to be unqualified.
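Assuming that the first drawing and the second drawing have been rendered as binary masks in the same area (for example, as in the earlier sketches), the go-gauge decision of steps S1042 and S1043 can be expressed as a simple pixel-set comparison; the following is an illustrative sketch rather than the patented routine.

import cv2
import numpy as np

def go_gauge_pass(first_drawing: np.ndarray, second_drawing: np.ndarray) -> bool:
    """True if the edge of the first drawing does not exceed the edge of the second drawing."""
    # Pixels of the detected object that fall outside the virtual go gauge.
    outside = cv2.bitwise_and(first_drawing, cv2.bitwise_not(second_drawing))
    return cv2.countNonZero(outside) == 0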
Step S105: performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
As an embodiment of the present application, no-go-gauge measurement of the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge can be implemented through the following steps S1051 to S1053:
Step S1051: redrawing the second visual image and the virtual no-go gauge in the same area, or redrawing the second visual image and the virtual no-go gauge in the same area after scaling them by the same proportion, so as to obtain a third drawing and a fourth drawing respectively.
If the second visual image of the detected object and the virtual no-go gauge are both of moderate size, they can be redrawn directly in the same area to obtain the third drawing and the fourth drawing respectively. If the second visual image of the detected object and the virtual no-go gauge are not of moderate size (for example, too large or too small), they can be redrawn in the same area after being scaled by the same proportion, to obtain the third drawing and the fourth drawing respectively. For example, if the second visual image of the detected object and the virtual no-go gauge are too large, they can be redrawn in the same area after being reduced by the same proportion, to obtain the third drawing and the fourth drawing respectively.
Step S1052: the edges of the third drawing are compared with the edges of the fourth drawing.
Step S1053: if the edge of the third drawing does not completely cover the edge of the fourth drawing, the detected object is determined to be qualified.
The third drawing is in fact the second visual image of the detected object, or that image redrawn after being scaled by the same proportion as the virtual no-go gauge, and the fourth drawing is the redrawn virtual no-go gauge, or the virtual no-go gauge redrawn after the same proportional scaling. Therefore, according to the measurement principle of the no-go gauge, if the edge of the third drawing does not completely cover the edge of the fourth drawing, the detected object is determined to be qualified; otherwise, it is determined to be unqualified.
It should be noted that, in either of the above modes of obtaining the third drawing and the fourth drawing, the drawing area should be large enough to enclose the second visual image of the detected object and the virtual no-go gauge, or their scaled images.
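Under the same assumptions, the no-go-gauge decision of steps S1052 and S1053 can be sketched as follows: the detected object is qualified if at least one pixel of the fourth drawing's edge is left uncovered by the third drawing. The edge-extraction call, thresholds, and names are illustrative assumptions, not the patented routine.

import cv2
import numpy as np

def no_go_gauge_pass(third_drawing: np.ndarray, fourth_drawing: np.ndarray) -> bool:
    """True if the edge of the third drawing does not completely cover the edge of the fourth drawing."""
    edge = cv2.Canny(fourth_drawing, 50, 150)                      # edge of the virtual no-go gauge
    uncovered = cv2.bitwise_and(edge, cv2.bitwise_not(third_drawing))
    return cv2.countNonZero(uncovered) > 0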
As can be seen from the go/no-go gauge measurement method illustrated in FIG. 1, the measured object becomes the original visual image and the second visual image of the detected object, and the go/no-go gauge becomes a virtual go gauge and a virtual no-go gauge. During measurement, the result is obtained from the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge, and from the coverage relationship between the second visual image and the edge of the virtual no-go gauge; that is, measurement of a physical object is converted into comparison between images. On one hand, this realizes non-contact measurement; on the other hand, it can detect products with complex shapes and remarkably improves detection efficiency, since no complicated measurement positions, parameters, or the like need to be set for a large number of detected objects.
Referring to FIG. 3, a go/no-go gauge measurement device according to an embodiment of the present application includes an image generation module 301, an image processing module 302, an image importing module 303, a first virtual measurement module 304 and a second virtual measurement module 305, which are described in detail below:
an image generation module 301, configured to generate an original visual image of the detected object;
an image processing module 302, configured to process the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object;
an image importing module 303, configured to import a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object;
a first virtual measurement module 304, configured to perform go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge;
a second virtual measurement module 305, configured to perform no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
Optionally, the apparatus illustrated in FIG. 3 may further include a virtual go gauge generating module and a virtual no-go gauge generating module, where:
the virtual go gauge generating module is configured to enlarge the drawing of the detected object based on the positive tolerance of the edge of the detected object before the image generation module 301 generates the original visual image of the detected object, so as to generate the virtual go gauge;
and the virtual no-go gauge generating module is configured to shrink the drawing of the detected object based on the negative tolerance of the edge of the detected object before the image generation module 301 generates the original visual image of the detected object, so as to generate the virtual no-go gauge.
Optionally, the image processing module 302 illustrated in FIG. 3 may include a binarization unit for binarizing the original visual image of the detected object so that the original visual image of the detected object is inverted into a binarized image.
Optionally, the first virtual measurement module 304 illustrated in FIG. 3 may include a first redrawing unit, a first comparing unit, and a first determining unit, where:
the first redrawing unit is used for redrawing the original visual image of the detected object and the virtual go gauge in the same area, or redrawing them in the same area after scaling them by the same proportion, so as to obtain a first drawing and a second drawing respectively;
the first comparing unit is used for comparing the edge of the first drawing with the edge of the second drawing;
and the first determining unit is used for determining that the detected object is qualified if the edge of the first drawing does not exceed the edge of the second drawing.
Optionally, the second virtual measurement module 305 illustrated in FIG. 3 may include a second redrawing unit, a second comparing unit, and a second determining unit, where:
the second redrawing unit is used for redrawing the second visual image and the virtual no-go gauge in the same area, or redrawing them in the same area after scaling them by the same proportion, so as to obtain a third drawing and a fourth drawing respectively;
the second comparing unit is used for comparing the edge of the third drawing with the edge of the fourth drawing;
and the second determining unit is used for determining that the detected object is qualified if the edge of the third drawing does not completely cover the edge of the fourth drawing.
As can be seen from the description of the above technical solution, the measured object becomes the original visual image and the second visual image of the detected object, and the go/no-go gauge becomes a virtual go gauge and a virtual no-go gauge. During measurement, the result is obtained from the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge, and from the coverage relationship between the second visual image and the edge of the virtual no-go gauge; that is, measurement of a physical object is converted into comparison between images. On one hand, this realizes non-contact measurement; on the other hand, it can detect products with complex shapes and remarkably improves detection efficiency, since no complicated measurement positions, parameters, or the like need to be set for a large number of detected objects.
FIG. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application. As shown in FIG. 4, the terminal 4 of this embodiment mainly includes: a processor 40, a memory 41 and a computer program 42 stored in the memory 41 and executable on the processor 40, such as a program implementing the go/no-go gauge measurement method. The steps in the above embodiment of the go/no-go gauge measurement method are implemented when the processor 40 executes the computer program 42, for example steps S101 to S105 shown in FIG. 1. Alternatively, the processor 40 may implement the functions of the modules/units in the above-described apparatus embodiments when executing the computer program 42, for example the functions of the image generation module 301, the image processing module 302, the image importing module 303, the first virtual measurement module 304, and the second virtual measurement module 305 shown in FIG. 3.
Illustratively, the computer program 42 of the go/no-go gauge measurement method mainly includes: generating an original visual image of the detected object; processing the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object; importing a virtual go gauge and a virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object; performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge; and performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge. The computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to complete the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution of the computer program 42 in the terminal 4. For example, the computer program 42 may be divided into the image generation module 301, the image processing module 302, the image importing module 303, the first virtual measurement module 304, and the second virtual measurement module 305 (modules in a virtual apparatus), whose specific functions are as follows: the image generation module 301 is configured to generate an original visual image of the detected object; the image processing module 302 is configured to process the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object; the image importing module 303 is configured to import a virtual go gauge and a virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object; the first virtual measurement module 304 is configured to perform go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge; and the second virtual measurement module 305 is configured to perform no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
The terminal 4 may include, but is not limited to, a processor 40, a memory 41. It will be appreciated by those skilled in the art that fig. 4 is merely an example of terminal 4 and is not intended to limit terminal 4, and may include more or fewer components than shown, or may combine certain components, or different components, e.g., a computing device may also include an input-output device, a network access device, a bus, etc.
The processor 40 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal 4, such as a hard disk or a memory of the terminal 4. The memory 41 may also be an external storage device of the terminal 4, such as a plug-in hard disk provided on the terminal 4, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like. Further, the memory 41 may also include both an internal storage unit and an external storage device of the terminal 4. The memory 41 is used to store computer programs and other programs and data required by the terminal. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that the above-described functional units and modules are merely illustrated for convenience and brevity of description, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above device may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other manners. For example, the apparatus/terminal embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a non-transitory computer-readable storage medium. Based on this understanding, the present application may implement all or part of the procedures in the methods of the above embodiments by instructing related hardware through a computer program. The computer program of the go/no-go gauge measurement method may be stored in a computer-readable storage medium, and when executed by a processor, the computer program may implement the steps of each method embodiment described above, that is: generating an original visual image of the detected object; processing the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object; importing a virtual go gauge and a virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object; performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge; and performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The non-transitory computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the non-transitory computer-readable medium may be adjusted according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the non-transitory computer-readable medium does not include electrical carrier signals and telecommunication signals. The above embodiments are only used to illustrate the technical solution of the present application and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments can still be modified, or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included in the scope of the present application.
The foregoing description of the embodiments is provided to illustrate the general principles of the application and is not meant to limit the scope of the application, nor to limit the application to the particular embodiments; any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the application are intended to be included within the scope of the application.

Claims (8)

1. A go/no-go gauge measurement method, the method comprising:
generating an original visual image of the detected object;
processing the original visual image of the detected object into a second visual image that contrasts with a virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object;
importing a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the design size of the detected object;
performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge;
performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge;
wherein the performing go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge comprises:
redrawing the original visual image of the detected object and the virtual go gauge in the same area, or redrawing them in the same area after scaling them by the same proportion, so as to obtain a first drawing and a second drawing respectively;
comparing the edge of the first drawing with the edge of the second drawing;
if the edge of the first drawing does not exceed the edge of the second drawing, determining that the detected object is qualified;
and wherein the performing no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge comprises:
redrawing the second visual image and the virtual no-go gauge in the same area, or redrawing them in the same area after scaling them by the same proportion, so as to obtain a third drawing and a fourth drawing respectively;
comparing the edge of the third drawing with the edge of the fourth drawing;
and if the edge of the third drawing does not completely cover the edge of the fourth drawing, determining that the detected object is qualified.
2. The go/no-go gauge measurement method according to claim 1, wherein before generating the original visual image of the detected object, the method further comprises:
enlarging the drawing of the detected object based on the positive tolerance of the edge of the detected object to generate the virtual go gauge;
and shrinking the drawing of the detected object based on the negative tolerance of the edge of the detected object to generate the virtual no-go gauge.
3. The go/no-go gauge measurement method according to claim 1, wherein the processing the original visual image of the detected object into a second visual image that contrasts with the virtual no-go gauge comprises:
binarizing the original visual image of the detected object so that the original visual image of the detected object is inverted into a binarized image.
4. A go/no-go gauge measurement device, the device comprising:
an image generation module, configured to generate an original visual image of the detected object;
an image processing module, configured to process the original visual image of the detected object into a second visual image that contrasts with a virtual no-go gauge, wherein the virtual no-go gauge is generated based on the detected object;
an image importing module, configured to import a virtual go gauge and the virtual no-go gauge, wherein the virtual go gauge is generated based on the detected object;
a first virtual measurement module, configured to perform go-gauge measurement on the detected object according to the coverage relationship between the original visual image of the detected object and the edge of the virtual go gauge;
and a second virtual measurement module, configured to perform no-go-gauge measurement on the detected object according to the coverage relationship between the second visual image and the edge of the virtual no-go gauge.
5. The go/no-go gauge measurement device of claim 4, further comprising:
a virtual go gauge generating module, configured to enlarge the drawing of the detected object based on the positive tolerance of the edge of the detected object before the image generation module generates the original visual image of the detected object, so as to generate the virtual go gauge;
and a virtual no-go gauge generating module, configured to shrink the drawing of the detected object based on the negative tolerance of the edge of the detected object before the image generation module generates the original visual image of the detected object, so as to generate the virtual no-go gauge.
6. The go/no-go gauge measurement device of claim 4, wherein the image processing module comprises:
a binarization unit, configured to binarize the original visual image of the detected object so that the original visual image of the detected object is inverted into a binarized image.
7. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 3 when executing the computer program.
8. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 3.
CN202010501965.4A 2020-06-04 2020-06-04 Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement Active CN111652874B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010501965.4A CN111652874B (en) Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010501965.4A CN111652874B (en) Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement

Publications (2)

Publication Number Publication Date
CN111652874A CN111652874A (en) 2020-09-11
CN111652874B true CN111652874B (en) 2023-10-20

Family

ID=72348831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010501965.4A Active CN111652874B (en) Method, device, terminal and computer-readable storage medium for go/no-go gauge measurement

Country Status (1)

Country Link
CN (1) CN111652874B (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002123840A (en) * 2000-10-17 2002-04-26 Nippon Telegr & Teleph Corp <Ntt> Processing method and processor for providing presence type virtual reality
CN101303219A (en) * 2007-05-11 2008-11-12 郑州大学 GPS-based virtual gage design and applying method thereof
CN106127758A (en) * 2016-06-21 2016-11-16 四川大学 A kind of visible detection method based on virtual reality technology and device
CN106247969A (en) * 2016-09-21 2016-12-21 哈尔滨工业大学 A kind of deformation detecting method of industrial magnetic core element based on machine vision
CN106841231A (en) * 2017-03-15 2017-06-13 北方工业大学 Visual precision measurement system and method for tiny parts
CN109522566A (en) * 2017-09-19 2019-03-26 桂林电子科技大学 A kind of position error assessment method of three projection planes system
CN107992281A (en) * 2017-10-27 2018-05-04 网易(杭州)网络有限公司 Visual display method and device, storage medium, the equipment of compensating sound information
WO2020056108A1 (en) * 2018-09-12 2020-03-19 Brain Corporation Systems and methods for detecting blind spots for robots
CN109520436A (en) * 2018-11-28 2019-03-26 扬州市职业大学 A kind of butterfly spring three-dimensional dimension automatic measurement system and its measurement method based on machine vision
CN110286768A (en) * 2019-06-27 2019-09-27 Oppo广东移动通信有限公司 Dummy object display methods, terminal device and computer readable storage medium
CN110345842A (en) * 2019-08-20 2019-10-18 芜湖欧宝机电有限公司 Logical only detection device and internal orifice dimension detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A CCD-based non-contact dimension measurement system; Shan Guijun et al.; Video Engineering (电视技术); 2013-09-22; full text *

Also Published As

Publication number Publication date
CN111652874A (en) 2020-09-11

Similar Documents

Publication Publication Date Title
CN110660066B (en) Training method of network, image processing method, network, terminal equipment and medium
CN110706182B (en) Method and device for detecting flatness of shielding case, terminal equipment and storage medium
CN109886928B (en) Target cell marking method, device, storage medium and terminal equipment
CN109166156B (en) Camera calibration image generation method, mobile terminal and storage medium
CN111667448B (en) Image processing method, device and equipment
EP3594897A1 (en) Measuring method and apparatus for damaged part of vehicle
CN113240630B (en) Speckle image quality evaluation method and device, terminal equipment and readable storage medium
CN111259890A (en) Water level identification method, device and equipment of water level gauge
US11776202B2 (en) Image processing method and apparatus, computer storage medium, and electronic device
CN109405736B (en) Semiconductor IC component size measuring method and device and terminal equipment
CN108364313B (en) Automatic alignment method, system and terminal equipment
CN109461133B (en) Bridge bolt falling detection method and terminal equipment
CN104966089B (en) A kind of method and device of image in 2 D code edge detection
CN108182708B (en) Calibration method and calibration device of binocular camera and terminal equipment
CN114862929A (en) Three-dimensional target detection method and device, computer readable storage medium and robot
WO2019001164A1 (en) Optical filter concentricity measurement method and terminal device
CN111311671A (en) Workpiece measuring method and device, electronic equipment and storage medium
CN109801428B (en) Method and device for detecting edge straight line of paper money and terminal
CN111652874B (en) Method, device, terminal and computer readable storage medium for measuring go-no go gauge
CN110335219B (en) Correction method and correction device for pixel distortion and terminal
CN108564571B (en) Image area selection method and terminal equipment
CN112785650A (en) Camera parameter calibration method and device
CN111445431B (en) Image segmentation method, terminal equipment and computer readable storage medium
CN109829968B (en) Method and device for generating normal texture map, storage medium and electronic equipment
CN113781392A (en) Method for detecting adhesive path, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant