CN114549438B - Reaction kettle buckle detection method and related device - Google Patents

Reaction kettle buckle detection method and related device

Info

Publication number
CN114549438B
CN114549438B (application CN202210126449.7A)
Authority
CN
China
Prior art keywords
buckle
image
reaction kettle
boundary
identified
Prior art date
Legal status
Active
Application number
CN202210126449.7A
Other languages
Chinese (zh)
Other versions
CN114549438A (en)
Inventor
柯建华
徐迎
李广义
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority: CN202210126449.7A
Publication of CN114549438A
Application granted
Publication of CN114549438B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 7/70 Determining position or orientation of objects or cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)

Abstract

The application relates to the technical field of artificial intelligence and image recognition, and in particular to a reaction kettle buckle detection method and a related device, which are used to improve the efficiency of reaction kettle buckle detection and to reduce potential safety hazards. After acquiring an image to be identified, the smart camera extracts features of the reaction kettle and of the buckle, determines a first position of the reaction kettle and a second position of the buckle in the image to be identified, obtains the relative positional relationship between the reaction kettle and the buckle, and gives an alarm when it determines that no buckle is present in the preset detection area. In this way, reaction kettle buckle detection no longer requires real-time inspection by a third party: the smart camera performs the detection directly and raises an alarm whenever the buckle is absent from the detection area or material leaks out. The detection process is thus automated, the potential safety hazard is greatly reduced, and detection efficiency is greatly improved.

Description

Reaction kettle buckle detection method and related device
Technical Field
The application relates to the technical field of artificial intelligence and image recognition, in particular to a reaction kettle buckle detection method and a related device.
Background
In processes such as material transport and reaction, chemical enterprises widely use the reaction kettle as a reaction vessel; traditionally, kettle-to-kettle and kettle-to-pipeline connections are fixed by rigid structures such as buckles. However, during handling, transport, processing, reaction, and similar operations, a third party must intervene to inspect these connections.
However, in the related art, the above method has the following disadvantages:
1. The potential safety hazard of reaction kettle buckle detection is high.
Because a third party must carry out real-time detection of the reaction kettle buckle, any operational mistake by that third party during handling, transport, processing, reaction, and similar operations can cause material leakage and, in turn, a major safety accident; reaction kettle buckle detection therefore carries a high potential safety hazard.
2. The efficiency of reaction kettle buckle detection is low, and the operation and maintenance cost is high.
At present, reaction kettle buckle detection relies mainly on on-site inspection by an intervening third party, which prevents the detection process from being automated. The third party is also a source of instability: when it makes a detection error, the detection is interrupted and a new third party must be brought in to detect again, and the repeated detection wastes inspection time. Detection efficiency is therefore low, and the operation and maintenance cost is high.
Disclosure of Invention
The embodiment of the application provides a reaction kettle buckle detection method and a related device, which are used for improving the reaction kettle buckle detection efficiency and reducing potential safety hazards.
The embodiment of the application provides the following specific technical scheme:
in a first aspect, a method for detecting a reactor buckle is provided, which includes:
acquiring an image to be identified, wherein the image to be identified comprises a reaction kettle and a buckle;
respectively extracting features of the reaction kettle and the buckle in the image to be recognized, determining a first position of the reaction kettle in the image to be recognized, and determining a second position of the buckle in the image to be recognized;
acquiring the relative position relation between the reaction kettle and the buckle based on the first position and the second position;
and when the relative position relation is determined not to meet the preset position condition and the detection area is not provided with the buckle, giving an alarm.
In a second aspect, a device for detecting a buckle of a reaction kettle is provided, comprising:
the device comprises a first acquisition module, a second acquisition module and a display module, wherein the first acquisition module is used for acquiring an image to be identified, and the image to be identified comprises a reaction kettle and a buckle;
the determining module is used for respectively extracting the characteristics of the reaction kettle and the buckle in the image to be recognized, determining a first position of the reaction kettle in the image to be recognized and determining a second position of the buckle in the image to be recognized;
the second obtaining module is used for obtaining the relative position relation between the reaction kettle and the buckle based on the first position and the second position;
and the warning module is used for giving an alarm when it is determined that the relative positional relationship does not satisfy the preset position condition, the unsatisfied preset position condition indicating that no buckle is present in the preset detection area.
Optionally, before the image to be recognized is acquired, the first acquiring module is configured to:
acquiring a reference image, wherein the reference image and the image to be identified have the same shooting angle;
and in the reference image, carrying out position detection on the reaction kettle, and determining that the distance between the reaction kettle and each boundary of the reference image exceeds a set threshold value.
Optionally, before the image to be recognized is acquired, the first acquiring module is further configured to:
shooting and recording the assembling process of the reaction kettle and the buckle.
Optionally, the reference image only includes a reaction kettle, and when the image to be identified is acquired, the first acquiring module is further configured to:
acquiring an original image to be identified, and giving an alarm when the original image to be identified is determined to be the same as the reference image;
and continuously acquiring a new image to be identified until the image to be identified is determined to contain a reaction kettle and a buckle.
Optionally, feature extraction is performed on the reaction kettle and the buckle respectively, a first position of the reaction kettle in the image to be recognized is determined, and a second position of the buckle in the image to be recognized is determined, where the determining module is further configured to:
establishing an image coordinate system based on the image to be identified;
respectively extracting characteristic contour lines of the reaction kettle and the buckle from the image to be identified based on an edge detection algorithm to obtain a first contour coordinate set of the reaction kettle and a second contour coordinate set of the buckle;
the first location is determined based on the first set of contour coordinates, and the second location is determined based on the second set of contour coordinates.
Optionally, based on the first position and the second position, a relative positional relationship between the reaction kettle and the buckle is obtained, and the second obtaining module is further configured to:
respectively determining each boundary of the image to be recognized and a first boundary distance between each boundary and the first position to obtain a first boundary distance set;
respectively determining each boundary of the image to be recognized and a second boundary distance between each boundary and the second position to obtain a second boundary distance set;
for each boundary, performing the following operations:
calculating a distance difference value between a first boundary distance corresponding to one boundary and a second boundary distance corresponding to the one boundary based on the first boundary distance set and the second boundary distance set;
and determining the relative position relationship between the reaction kettle and the buckle based on the obtained distance difference values.
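The boundary-distance computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the lower-left coordinate origin, and the sample positions and image size are assumptions for the sake of the example.

```python
def boundary_distances(pos, width, height):
    """Distances from a position (x, y) to the four image boundaries,
    with the image coordinate origin at the lower-left corner."""
    x, y = pos
    return {"left": x, "right": width - x, "bottom": y, "top": height - y}

def distance_differences(first_pos, second_pos, width, height):
    """Per-boundary absolute difference between the reaction kettle's
    boundary distances (first set) and the buckle's (second set)."""
    d1 = boundary_distances(first_pos, width, height)
    d2 = boundary_distances(second_pos, width, height)
    return {b: abs(d1[b] - d2[b]) for b in d1}
```

When the kettle position and buckle position coincide, every difference is zero; the farther the buckle drifts from the kettle, the larger the per-boundary differences grow.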
Optionally, after the relative position relationship between the reaction kettle and the buckle is obtained, the second obtaining module is further configured to:
respectively comparing each distance difference value with a corresponding distance threshold value to obtain a comparison result;
if the comparison result indicates that none of the distance difference values is greater than its corresponding distance threshold value, determining that the relative positional relationship satisfies the preset position condition and that the buckle is present in the preset detection area;
if the comparison result indicates that at least one of the distance difference values is greater than its corresponding distance threshold value, determining that the relative positional relationship does not satisfy the preset position condition and that no buckle is present in the preset detection area.
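The threshold comparison just described reduces to a simple all-within-bounds test. A minimal sketch under the same assumptions as the document (per-boundary distance differences and per-boundary thresholds; the names and values are illustrative, not from the patent):

```python
def buckle_in_detection_area(differences, thresholds):
    """The relative positional relationship satisfies the preset position
    condition only if every per-boundary distance difference is no greater
    than its corresponding distance threshold."""
    return all(differences[b] <= thresholds[b] for b in thresholds)
```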
Optionally, after it is determined that the relative position relationship satisfies the preset position condition and the buckle is in the preset detection area, the second obtaining module is further configured to:
and periodically detecting the relative position relation according to a set time interval, when at least one distance difference value is larger than a corresponding distance threshold value, determining that the relative position relation no longer meets the preset position condition, and the buckle no longer exists in the preset detection area, and performing alarm prompt.
Optionally, when it is determined that the relative positional relationship does not satisfy the preset position condition, the unsatisfied condition indicating that no buckle is present in the preset detection area, and after the alarm is given, the alarm module is further configured to:
and recording the alarm information, and presenting the alarm information to the central processing system platform.
Optionally, when it is determined that there is no buckle in the preset detection area, the alarm module is further configured to determine:
in the preset detection area, the buckle does not appear;
or,
in the preset detection area, a complete image of the buckle is not presented.
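The two cases above (the buckle absent, or only a partial image of the buckle visible) can be distinguished by checking how many of the buckle's contour points fall inside the detection area. A hypothetical sketch; the rectangle representation and the three-way result are assumptions for illustration:

```python
def buckle_status_in_area(contour_points, area):
    """area = (x_min, y_min, x_max, y_max) of the preset detection area.
    Returns 'absent' if no contour point is inside, 'incomplete' if only
    some are, and 'complete' if all are."""
    inside = [(x, y) for (x, y) in contour_points
              if area[0] <= x <= area[2] and area[1] <= y <= area[3]]
    if not inside:
        return "absent"
    if len(inside) < len(contour_points):
        return "incomplete"
    return "complete"
```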
In a third aspect, an electronic device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the processor performs any one of the above reaction kettle buckle detection methods.
In a fourth aspect, a computer-readable storage medium is provided, in which a computer program is stored; when the computer program is executed by a processor, the reaction kettle buckle detection method of the first aspect is implemented.
In the embodiment of the application, after the smart camera acquires the image to be recognized, it performs feature extraction on the reaction kettle and on the buckle, determines the first position of the reaction kettle and the second position of the buckle in the image to be recognized, obtains the relative positional relationship between the reaction kettle and the buckle, and gives an alarm when it determines that the relative positional relationship does not satisfy the preset position condition. In this way, reaction kettle buckle detection no longer requires real-time detection by a third party: the smart camera detects directly, and raises an alarm when the buckle does not appear in the detection area or when material leaks out. The detection process is automated, so detection efficiency is greatly improved and the potential safety hazard is greatly reduced.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is an application scenario of a reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 2 is a first schematic flow chart of a method for detecting a reactor buckle provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a first principle of a method for detecting a reactor buckle according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating a second principle of a method for detecting a reactor buckle according to an embodiment of the present disclosure;
fig. 5 is a schematic flow chart illustrating a second method for detecting a reactor buckle according to an embodiment of the present disclosure;
fig. 6 is a third schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 7 is a third schematic flow chart of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 8 is a fourth schematic flow chart of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 9 is a fourth schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 10 is a fifth schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 11 is a sixth schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 12 is a seventh schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 13 is a fifth schematic flow chart of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 14 is a sixth schematic flow chart of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 15 is an eighth schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
fig. 16 is a ninth schematic diagram of the reaction kettle buckle detection method provided in the embodiment of the present application;
FIG. 17 is a schematic structural diagram of a reaction kettle buckle detection device in an embodiment of the present application;
fig. 18 is a schematic structural diagram of an electronic device for detecting a reactor snap in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the technical solutions of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments described in the present application without any creative effort belong to the protection scope of the technical solution of the present application.
Some concepts related to the embodiments of the present application are described below.
The reaction kettle is a stainless steel container in which physical or chemical reactions are carried out. Its structural design and parameter configuration follow the requirements of the given process conditions, and its design, process, inspection, manufacture, and acceptance must conform to the relevant technical standards, so that it can provide the heating, evaporation, cooling, and low- and high-speed mixing functions the process requires. Production must be carried out, tested, and commissioned strictly to the corresponding standards. Stainless steel reaction kettles differ in design structure, parameters, and style according to the production process and operating conditions, and therefore count as non-standard container equipment. The reaction kettle is a comprehensive reaction vessel whose structural functions and accessories are designed according to the reaction conditions.
The buckle is a mechanism used to connect one object to another or to lock them together, and is generally used for connections between objects. Its positioning piece guides the buckle smoothly, correctly, and quickly into the installation position during installation. Its fastening function is usually designed to be detachable: when a certain separation force is applied, the snap catches disengage and the two connected members separate.
The smart camera contains a computing chip loaded with a recognition model. The recognition model consists of different algorithm models built from acquired image data of different types of reaction kettles, buckles, and the like, and can be used to recognize different states of an object. The smart camera can analyze the acquired image data, supports multi-channel algorithms for analyzing the data, and uploads the generated alarm information to the central processing system.
The reaction kettle buckle detection method can be applied to different scenarios according to the material of the reaction kettle. Stainless steel reaction kettles are used for high-temperature, high-pressure chemical reaction tests in petroleum, chemical engineering, medicine, metallurgy, scientific research, universities and colleges, and similar settings, where viscous and granular substances require a strong stirring effect. Steam-heated, multifunctional dispersion, electrically heated, and magnetically stirred reaction kettles are all used in the petroleum, chemical, food, medicine, pesticide, and scientific research industries to complete chemical process steps such as polymerization, condensation, vulcanization, alkylation, and hydrogenation, as well as many process steps for organic dyes and intermediates.
Referring to fig. 1, the specific application scenario in the embodiment of the present application includes one smart camera 101, two reaction kettles 102, and one buckle 103.
In practical application, the two reaction kettles 102 may have the same shape or different shapes. In this embodiment, only two reaction kettles of the same shape are detected; two reaction kettles of different shapes can be detected in the same manner, which is not described again here.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it should be understood that the preferred embodiments described herein are merely for illustrating and explaining the present application, and are not intended to limit the present application, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In the embodiment of the application, during the preprocessing stage of reaction kettle buckle detection, the smart camera remains powered on to ensure that the complete assembly process of the reaction kettle and the buckle is captured and recorded. Before acquiring the image to be identified, the camera can acquire a reference image and perform position detection on the reaction kettle. Referring to fig. 2, the specific flow is as follows:
step 21: a reference image is acquired.
Specifically, the smart camera photographs the reaction kettle from an angle that guarantees a complete image of the reaction kettle. The complete image obtained is taken as the reference image, and later shots from the same angle can capture complete images of the reaction kettle and the buckle.
For example, in the examples of the present application, referring to FIG. 3, a reference image is obtained by a smart camera that includes a complete image of the reaction vessel 102.
Step 22: and in the reference image, carrying out position detection on the reaction kettle, and determining that the distance between the reaction kettle and each boundary of the reference image exceeds a set threshold value.
Specifically, after acquiring the reference image, the smart camera detects the position of the reaction kettle in it, marks the kettle's four boundary lines (upper, lower, left, and right), obtains the distances from these four boundary lines to the corresponding four boundaries of the reference image, and confirms that each of the four distances exceeds its set threshold.
For example, in the embodiment of the present application, referring to fig. 4, the intelligent camera marks four boundary lines of the reaction vessel 102 in the reference image, and obtains that the distance from the upper boundary line to the upper boundary of the reference image is 1 cm and is greater than the set 0.8 cm, the distance from the lower boundary line to the lower boundary of the reference image is 1 cm and is greater than the set 0.8 cm, the distance from the left boundary line to the left boundary of the reference image is 8 cm and is greater than the set 7.8 cm, and the distance from the right boundary line to the right boundary of the reference image is 8 cm and is greater than the set 7.8 cm, as shown in table 1:
TABLE 1
Boundary of reaction kettle    Distance to reference image boundary    Set threshold
Upper boundary line            1 cm                                    0.8 cm
Lower boundary line            1 cm                                    0.8 cm
Left boundary line             8 cm                                    7.8 cm
Right boundary line            8 cm                                    7.8 cm
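The margin check described in Step 22 can be sketched as follows. This is an illustrative sketch only: the bounding-box representation, the image size, and the concrete numbers (chosen to reproduce the 1 cm and 8 cm margins above) are assumptions, not the patent's implementation.

```python
def margins_exceed_thresholds(box, image_w, image_h, thresholds):
    """box = (left, bottom, right, top) of the reaction kettle in image
    units. Returns (ok, margins), where ok is True only if every margin
    between the kettle and the reference image boundary exceeds its
    set threshold."""
    margins = {
        "left": box[0],
        "bottom": box[1],
        "right": image_w - box[2],
        "top": image_h - box[3],
    }
    ok = all(margins[s] > thresholds[s] for s in thresholds)
    return ok, margins
```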
Referring to fig. 5, in the embodiment of the present application, after the pretreatment stage is finished, when the reactor buckle detection is performed, the specific process is as follows:
step 51: and acquiring an image to be identified, wherein the image to be identified comprises a reaction kettle and a buckle.
Specifically, the intelligent camera shoots the reaction kettle and the buckle to obtain an image to be recognized, and the shooting angle of the intelligent camera is the same as the angle of the reference image, so that a complete image of the reaction kettle and the buckle can be obtained.
For example, in the embodiment of the present application, referring to fig. 6, the image to be recognized obtained by the smart camera includes a complete image of the reaction kettle 102 and the buckle 103.
Specifically, referring to fig. 7, if the reference image only includes a reaction kettle, the following steps are specifically executed when the image to be identified is obtained:
step 511: and acquiring an original image to be identified, and giving an alarm when the original image to be identified is determined to be the same as the reference image.
Specifically, the smart camera acquires an original image to be recognized and compares it with the reference image. When it determines that the original image is the same as the reference image and therefore contains only the reaction kettle, it gives an alarm; the alarm message is "only the reaction kettle, no buckle".
Step 512: and continuously acquiring a new image to be identified until the image to be identified is determined to contain the reaction kettle and the buckle.
Specifically, the intelligent camera continuously acquires a new image to be identified, compares the new image to be identified with the reference image until the image to be identified is different from the reference image, and determines that the image to be identified comprises both a reaction kettle and a buckle.
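The same-as-reference comparison in steps 511 and 512 can be illustrated with a deliberately tiny stand-in. Real systems would use a more robust image comparison (or the recognition model itself); the mean-absolute-difference test, the nested-list image format, and the tolerance parameter below are all assumptions for the sketch.

```python
def same_as_reference(img, ref, tol=0):
    """Crude stand-in for the 'image to be identified equals the
    reference image' check: mean absolute pixel difference <= tol."""
    flat_img = [p for row in img for p in row]
    flat_ref = [p for row in ref for p in row]
    diff = sum(abs(a - b) for a, b in zip(flat_img, flat_ref)) / len(flat_img)
    return diff <= tol
```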
Step 52: in the image to be recognized, feature extraction is respectively carried out on the reaction kettle and the buckle, a first position of the reaction kettle in the image to be recognized is determined, and a second position of the buckle in the image to be recognized is determined.
Specifically, the intelligent camera applies an edge detection algorithm to the image to be recognized, respectively extracts the features of the reaction kettle and the buckle, extracts feature contour lines of the reaction kettle and the buckle, and establishes an image coordinate system in the image to be recognized, so that the positions of the reaction kettle and the buckle in the image to be recognized are determined.
Specifically, as shown in fig. 8, when step 52 is executed, the following steps are specifically executed:
step 521: and establishing an image coordinate system based on the image to be identified.
For example, referring to fig. 9, assume that the smart camera establishes an image coordinate system in the image to be recognized, with the lower left vertex as the origin, the lower boundary as the X axis, and the left boundary as the Y axis.
Step 522: based on an edge detection algorithm, respectively extracting characteristic contour lines of the reaction kettle and the buckle from the image to be identified to obtain a first contour coordinate set of the reaction kettle and a second contour coordinate set of the buckle.
Specifically, referring to fig. 10, the smart camera uses a Canny edge detector to extract the feature contour lines of the reaction kettle 102 and the buckle 103 from the image to be identified, obtaining the feature contour line 202 of the reaction kettle and the feature contour line 203 of the buckle. It then determines the coordinates of a set number of points on each contour line, takes the point coordinate set of the feature contour line 202 as the first contour coordinate set of the reaction kettle, and takes the point coordinate set of the feature contour line 203 as the second contour coordinate set of the buckle.
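In practice the Canny step would typically be done with a library such as OpenCV (`cv2.Canny` plus contour extraction). To keep the illustration dependency-free, the sketch below merely shows the kind of output the method works with: given a binary mask (an assumption standing in for the segmented object), it collects the foreground pixels that touch the background as a crude contour coordinate set.

```python
def contour_coordinates(mask):
    """Return (x, y) coordinates of foreground pixels (value 1) that have
    at least one background or out-of-image 4-neighbour, i.e. a crude
    contour of a binary mask. Not a Canny detector, just a stand-in."""
    h, w = len(mask), len(mask[0])
    contour = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            neighbours = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
            if any(nx < 0 or ny < 0 or nx >= w or ny >= h or not mask[ny][nx]
                   for nx, ny in neighbours):
                contour.append((x, y))
    return contour
```

For a solid 4 x 4 block, the 12 perimeter pixels are returned and the 4 interior pixels are not.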
For example, referring to fig. 11, assuming that the smart camera extracts coordinates of 8 points in the characteristic contour line of the reaction vessel, as shown in table 2,
TABLE 2
Point  Coordinates    Point  Coordinates
A1 (8,20) E1 (16,6)
B1 (10,13) F1 (14,13)
C1 (8,6) G1 (16,20)
D1 (12,1) H1 (12,25)
As can be seen from the above table, the first contour coordinate set of the reaction kettle is {(8, 20), (10, 13), (8, 6), (12, 1), (16, 6), (14, 13), (16, 20), (12, 25)}.
Referring to fig. 12, assuming that the smart camera extracts coordinates of 8 points in the feature contour line of the buckle, as shown in table 3,
TABLE 3
Point   Coordinates   Point   Coordinates
A2      (10, 14)      E2      (14, 12)
B2      (10, 13)      F2      (14, 13)
C2      (10, 12)      G2      (14, 14)
D2      (12, 12)      H2      (12, 14)
As can be seen from the above table, the second contour coordinate set of the buckle is { (10, 14), (10, 13), (10, 12), (12, 12), (14, 12), (14, 13), (14, 14), (12, 14) }.
Step 523: a first location is determined based on the first set of contour coordinates and a second location is determined based on the second set of contour coordinates.
Specifically, the smart camera may determine the first position and the second position from the center coordinates of the first and second contour coordinate sets respectively; alternatively, it may determine them from the coordinates of the point with the largest ordinate in each set, taking the center point of the tied points if several points share the largest ordinate.
For example, assume the smart camera calculates, from the first contour coordinate set { (8, 20), (10, 13), (8, 6), (12, 1), (16, 6), (14, 13), (16, 20), (12, 25) } of the reaction vessel, that the center point of the 8 points is P1 (12, 13), and takes P1 (12, 13) as the first position of the reaction vessel; likewise, from the second contour coordinate set { (10, 14), (10, 13), (10, 12), (12, 12), (14, 12), (14, 13), (14, 14), (12, 14) } of the buckle, it calculates the center point of the 8 points as P2 (12, 13) and takes P2 (12, 13) as the second position of the buckle.
In another case, among the 8 points in the first contour coordinate set, the point with the largest ordinate is H1 (12, 25), so H1 (12, 25) is the first position of the reaction kettle; among the 8 points in the second contour coordinate set, the points with the largest ordinate are A2 (10, 14), H2 (12, 14) and G2 (14, 14), so the center point H2 (12, 14) of these three points is taken as the second position of the buckle.
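Both conventions for collapsing a contour coordinate set to a single position can be sketched in a few lines of Python. This is an illustrative reconstruction, using integer division to match the whole-number coordinates of the worked example.

```python
def center_point(coords):
    """Center of a list of (x, y) points, with integer division to match
    the whole-number coordinates used in the patent's example."""
    n = len(coords)
    return (sum(x for x, _ in coords) // n, sum(y for _, y in coords) // n)

def highest_point(coords):
    """Point with the largest ordinate; if several points tie for the
    largest ordinate, the center point of the tied points is taken."""
    y_max = max(y for _, y in coords)
    return center_point([p for p in coords if p[1] == y_max])

# Contour coordinate sets from Tables 2 and 3
kettle = [(8, 20), (10, 13), (8, 6), (12, 1),
          (16, 6), (14, 13), (16, 20), (12, 25)]
buckle = [(10, 14), (10, 13), (10, 12), (12, 12),
          (14, 12), (14, 13), (14, 14), (12, 14)]

print(center_point(kettle))   # (12, 13) -> P1
print(center_point(buckle))   # (12, 13) -> P2
print(highest_point(kettle))  # (12, 25) -> H1
print(highest_point(buckle))  # (12, 14) -> center of A2, H2, G2
```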
Step 53: and acquiring the relative position relation between the reaction kettle and the buckle based on the first position and the second position.
Specifically, the intelligent camera calculates, from the coordinates of the first position and the second position, the distance from each position to the boundaries of the image to be recognized, and determines the relative position relationship from the distance differences.
Specifically, as shown in fig. 13, when step 53 is executed, the following steps are specifically executed:
Step 531: respectively determining, for each boundary of the image to be recognized, a first boundary distance between that boundary and the first position, to obtain a first boundary distance set.
For example, assuming that the first position is the center point coordinate, the smart camera calculates the distance between the upper boundary of the image to be recognized and the first position to be 13 cm, the distance between the lower boundary and the first position to be 13 cm, the distance between the left boundary and the first position to be 12 cm, and the distance between the right boundary and the first position to be 12 cm; the first boundary distance set is then {13, 12}.
In another case, assuming that the first position is the point coordinate with the maximum vertical coordinate, the intelligent camera calculates that the distance between the upper boundary of the image to be recognized and the first position is 1 cm, the distance between the lower boundary of the image to be recognized and the first position is 25 cm, the distance between the left boundary of the image to be recognized and the first position is 12 cm, and the distance between the right boundary of the image to be recognized and the first position is 12 cm, the first boundary distances are set to be {1,25,12}.
Step 532: respectively determining, for each boundary of the image to be recognized, a second boundary distance between that boundary and the second position, to obtain a second boundary distance set.
For example, assuming that the second position is the center point coordinate, the smart camera calculates that the distance between the upper boundary of the image to be recognized and the second position is 13 cm, the distance between the lower boundary and the second position is 13 cm, the distance between the left boundary and the second position is 12 cm, and the distance between the right boundary and the second position is 12 cm; the second boundary distance set is then {13, 12}.
In another case, assuming that the second position is the point coordinate with the largest vertical coordinate, the smart camera calculates that the distance between the upper boundary of the image to be recognized and the second position is 12 centimeters, calculates that the distance between the lower boundary of the image to be recognized and the second position is 14 centimeters, calculates that the distance between the left boundary of the image to be recognized and the second position is 12 centimeters, calculates that the distance between the right boundary of the image to be recognized and the second position is 12 centimeters, and then the set of second boundary distances is {12,14}.
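Steps 531 and 532 reduce to measuring a point's distance to the four image boundaries. Below is a sketch under the coordinate system of fig. 9 (origin at the lower-left vertex); the image size of 24 x 26 units is an assumption chosen so that the numbers reproduce the worked example.

```python
def boundary_distances(pos, width, height):
    """Distances from a position to the four boundaries of an image,
    in an image coordinate system with the origin at the lower left."""
    x, y = pos
    return {"upper": height - y, "lower": y, "left": x, "right": width - x}

# Center-point case, P1 = P2 = (12, 13):
print(boundary_distances((12, 13), width=24, height=26))
# {'upper': 13, 'lower': 13, 'left': 12, 'right': 12}

# Largest-ordinate case, H1 = (12, 25):
print(boundary_distances((12, 25), width=24, height=26))
# {'upper': 1, 'lower': 25, 'left': 12, 'right': 12}
```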
Step 533: for each boundary, the following operations are respectively performed:
based on the first boundary distance set and the second boundary distance set, calculating the distance difference between the first boundary distance and the second boundary distance corresponding to that boundary.
For example, in the center-point case the smart camera calculates that the distance difference between the first and second boundary distances is 0 cm for the upper boundary, 0 cm for the lower boundary, 0 cm for the left boundary, and 0 cm for the right boundary.
In another case, using the largest-ordinate positions, the smart camera calculates that the distance difference is 11 cm for the upper boundary, 11 cm for the lower boundary, 0 cm for the left boundary, and 0 cm for the right boundary.
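Step 533 is then a per-boundary absolute difference between the two distance sets. A minimal sketch, reusing the largest-ordinate numbers from the example:

```python
def distance_differences(first, second):
    """Absolute difference between the first and second boundary
    distances, computed per boundary."""
    return {b: abs(first[b] - second[b]) for b in first}

first = {"upper": 1, "lower": 25, "left": 12, "right": 12}    # H1 (12, 25)
second = {"upper": 12, "lower": 14, "left": 12, "right": 12}  # H2 (12, 14)
print(distance_differences(first, second))
# {'upper': 11, 'lower': 11, 'left': 0, 'right': 0}
```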
Step 534: and determining the relative position relation between the reaction kettle and the buckle based on the obtained distance difference values.
Specifically, the intelligent camera determines the relative position relationship between the reaction kettle and the buckle according to the obtained numerical value of each distance difference.
Referring to fig. 14, after obtaining the relative position relationship between the reaction kettle and the buckle, the intelligent camera determines the relative position relationship, and the specific process is as follows:
step 141: and respectively comparing each distance difference value with a corresponding distance threshold value to obtain a comparison result.
For example, suppose the smart camera compares the distance difference of 0 cm for the upper boundary with the corresponding distance threshold of 0.2 cm, and finds that it is smaller; it likewise finds that the distance difference of 0 cm for the lower boundary is smaller than its threshold of 0.2 cm, that the distance difference of 0 cm for the left boundary is smaller than its threshold of 0.2 cm, and that the distance difference of 0 cm for the right boundary is smaller than its threshold of 0.2 cm, as shown in Table 4,
TABLE 4
Boundary   Distance difference   Distance threshold   Comparison result
Upper      0 cm                  0.2 cm               less than threshold
Lower      0 cm                  0.2 cm               less than threshold
Left       0 cm                  0.2 cm               less than threshold
Right      0 cm                  0.2 cm               less than threshold
In another case, suppose the intelligent camera compares the distance difference of 11 cm for the upper boundary with the corresponding distance threshold of 11.2 cm, and finds that it is smaller; it likewise finds that the distance difference of 11 cm for the lower boundary is smaller than its threshold of 11.2 cm, that the distance difference of 0 cm for the left boundary is smaller than its threshold of 0.2 cm, and that the distance difference of 0 cm for the right boundary is smaller than its threshold of 0.2 cm, as shown in Table 5,
TABLE 5
Boundary   Distance difference   Distance threshold   Comparison result
Upper      11 cm                 11.2 cm              less than threshold
Lower      11 cm                 11.2 cm              less than threshold
Left       0 cm                  0.2 cm               less than threshold
Right      0 cm                  0.2 cm               less than threshold
Step 142: if the comparison result indicates that none of the distance differences is greater than the corresponding distance threshold, determining that the relative position relationship meets the preset position condition.
For example, assuming that none of the 4 distance differences is greater than the corresponding distance threshold, the relative position relationship between the reaction kettle and the buckle meets the preset position condition.
In the other case, likewise, none of the 4 distance differences is greater than the corresponding distance threshold, so the relative position relationship between the reaction kettle and the buckle meets the preset position condition.
Step 143: if the comparison result indicates that at least one of the distance differences is greater than the corresponding distance threshold, determining that the relative position relationship does not meet the preset position condition.
For example, if the first position is the center point coordinate and the distance difference between the first and second boundary distances for the upper boundary is 3 cm, this distance difference is greater than the corresponding set threshold of 0.2 cm, so the relative position relationship between the reaction kettle and the buckle does not meet the preset position condition.
In another case, if the second position is the point coordinate with the largest ordinate and the distance difference between the first and second boundary distances for the upper boundary is 13 cm, this distance difference is greater than the corresponding set threshold of 11.2 cm, so the relative position relationship between the reaction kettle and the buckle does not meet the preset position condition.
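Steps 141 through 143 amount to an all-of comparison against per-boundary thresholds. A sketch, with threshold values taken from Tables 4 and 5:

```python
def position_condition_met(diffs, thresholds):
    """True iff no distance difference exceeds its corresponding
    threshold (steps 142/143); False triggers the no-buckle alarm."""
    return all(diffs[b] <= thresholds[b] for b in diffs)

thresholds = {"upper": 11.2, "lower": 11.2, "left": 0.2, "right": 0.2}
ok = {"upper": 11, "lower": 11, "left": 0, "right": 0}
bad = {"upper": 13, "lower": 11, "left": 0, "right": 0}
print(position_condition_met(ok, thresholds))   # True  -> buckle present
print(position_condition_met(bad, thresholds))  # False -> alarm, no buckle
```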
Step 54: when it is determined that the relative position relationship does not meet the preset position condition, an alarm is given; failure to meet the preset position condition indicates that there is no buckle in the preset detection area.
For example, suppose the smart camera determines that the relative position relationship between the reaction kettle and the buckle does not satisfy the preset position condition, indicating that the preset detection area a has no buckle. Referring to fig. 15, "no buckle in the preset detection area a" may mean that the feature contour line 203 of the buckle does not appear in the preset detection area a at all; or, referring to fig. 16, that the feature contour line 203 of the buckle in the preset detection area a is incomplete, so that no complete image of the buckle is presented.
When the intelligent camera determines that the relative position relationship between the reaction kettle and the buckle does not meet the preset position condition, it generates warning information indicating this and presents the warning information to the central processing system platform.
Based on the same technical concept, the embodiment of the application also provides a reaction kettle buckle detection device, and the reaction kettle buckle detection device can realize the method and the process of the embodiment of the application.
Fig. 17 schematically illustrates a structural diagram of a reaction kettle buckle detection device provided in an embodiment of the present application. As shown in fig. 17, the apparatus includes: a first obtaining module 1701, a determining module 1702, a second obtaining module 1703, an alert module 1704, wherein,
the first obtaining module 1701 is used for obtaining an image to be recognized, wherein the image to be recognized comprises a reaction kettle and a buckle;
a determining module 1702, configured to perform feature extraction on the reaction kettle and the fastener in the image to be recognized, respectively, determine a first position of the reaction kettle in the image to be recognized, and determine a second position of the fastener in the image to be recognized;
a second obtaining module 1703, configured to obtain a relative position relationship between the reaction kettle and the buckle based on the first position and the second position;
and the warning module 1704 is configured to give an alarm when it is determined that the relative position relationship does not meet the preset position condition, where failing to meet the preset position condition indicates that there is no buckle in the preset detection area.
Optionally, before acquiring the image to be recognized, the first acquiring module 1701 is configured to:
acquiring a reference image, wherein the reference image and the image to be identified have the same shooting angle;
and in the reference image, carrying out position detection on the reaction kettle, and determining that the distance between the reaction kettle and each boundary of the reference image exceeds a set threshold value.
Optionally, before acquiring the image to be recognized, the first acquiring module 1701 is further configured to:
shooting and recording the assembling process of the reaction kettle and the buckle.
Optionally, if the reference image only includes a reaction kettle, when the image to be identified is acquired, the first acquiring module 1701 is further configured to:
acquiring an original image to be identified, and giving an alarm when the original image to be identified is determined to be the same as the reference image;
and continuously acquiring a new image to be identified until the image to be identified is determined to contain the reaction kettle and the buckle.
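The retry behaviour of the first acquiring module can be sketched as a simple loop. This is an illustrative reconstruction, not the patent's implementation: frames and the reference image are stand-in comparable objects, and `alarm` is a hypothetical callback.

```python
def acquire_until_buckle(frames, reference, alarm):
    """Keep acquiring frames; alarm on each frame identical to the
    reference image (kettle only, no buckle) and return the first frame
    that differs, i.e. the first in which the buckle appears."""
    for frame in frames:
        if frame == reference:
            alarm("image identical to reference: no buckle in view")
        else:
            return frame
    return None

alarms = []
frame = acquire_until_buckle(["ref", "ref", "ref+buckle"], "ref", alarms.append)
print(frame, len(alarms))  # ref+buckle 2
```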
Optionally, feature extraction is performed on the reaction kettle and the fastener respectively, a first position of the reaction kettle in the image to be recognized is determined, and a second position of the fastener in the image to be recognized is determined, and the determining module 1702 is further configured to:
establishing an image coordinate system based on the image to be identified;
respectively extracting characteristic contour lines of the reaction kettle and the buckle in the image to be identified based on an edge detection algorithm to obtain a first contour coordinate set of the reaction kettle and a second contour coordinate set of the buckle;
a first location is determined based on the first set of profile coordinates and a second location is determined based on the second set of profile coordinates.
Optionally, based on the first position and the second position, the relative position relationship between the reaction kettle and the buckle is obtained, and the second obtaining module 1703 is further configured to:
respectively determining a first boundary distance between each boundary of the image to be recognized and the first position, to obtain a first boundary distance set;
respectively determining each boundary of the image to be recognized and a second boundary distance between each boundary and a second position to obtain a second boundary distance set;
for each boundary, the following operations are respectively performed:
calculating a distance difference value between a first boundary distance corresponding to one boundary and a second boundary distance corresponding to one boundary based on the first boundary distance set and the second boundary distance set;
and determining the relative position relation between the reaction kettle and the buckle based on the obtained distance difference values.
Optionally, after obtaining the relative position relationship between the reaction kettle and the buckle, the second obtaining module 1703 is further configured to:
comparing each distance difference value with a corresponding distance threshold value respectively to obtain a comparison result;
if the comparison result is characterized: if the distance difference values are not larger than the corresponding distance threshold values, determining that the relative position relation meets a preset position condition, and presetting a buckle in a detection area;
if the comparison result is characterized: and if at least one of the distance difference values is greater than the corresponding distance threshold value, determining that the relative position relation does not meet the preset position condition, and presetting a detection area without buckles.
Optionally, it is determined that the relative position relationship satisfies a preset position condition, and after the detection area is preset with a buckle, the second obtaining module 1703 is further configured to:
and periodically detecting the relative position relationship at a set time interval; when at least one distance difference is larger than the corresponding distance threshold, determining that the relative position relationship no longer meets the preset position condition and that there is no buckle in the preset detection area, and giving an alarm.
Optionally, after it is determined that the relative position relationship does not satisfy the preset position condition, which indicates that there is no buckle in the preset detection area, and the alarm is given, the alarm module 1704 is further configured to:
and recording the alarm information and presenting the alarm information to the central processing system platform.
Optionally, when it is determined that there is no buckle in the preset detection area, the alarm module 1704 is further configured to determine:
in the preset detection area, no buckle appears;
or,
in the preset detection area, a complete image of the buckle is not presented.
Based on the same inventive concept as the method embodiment, an embodiment of the present application further provides an electronic device, which is shown in fig. 18 and is a schematic diagram of a hardware component structure of an electronic device to which the embodiment of the present application is applied, and the electronic device includes:
at least one processor 1801 and a memory 1802 connected to the at least one processor 1801. The specific connection medium between the processor 1801 and the memory 1802 is not limited in the embodiments of the present application; fig. 18 takes the case where the processor 1801 and the memory 1802 are connected through a bus 1800 as an example. The bus 1800, shown as a thick line in fig. 18, may be divided into an address bus, a data bus, a control bus, and so on; only one thick line is drawn for ease of illustration, but this does not mean that there is only one bus or one type of bus, and the connections between other components are merely illustrative and not limiting. Optionally, the processor 1801 may also be referred to as a controller; the name is not limiting.
In an embodiment of the present application, the memory 1802 stores instructions executable by the at least one processor 1801, and the at least one processor 1801 may execute the instructions stored in the memory 1802 to perform the reaction kettle buckle detection method discussed above. The processor 1801 may implement the functions of the various modules in the apparatus shown in fig. 17.
The processor 1801 is a control center of the apparatus, and may be connected to various parts of the entire control device through various interfaces and lines, and perform various functions and process data of the apparatus by operating or executing instructions stored in the memory 1802 and calling data stored in the memory 1802, thereby performing overall monitoring of the apparatus.
In one possible design, the processor 1801 may include one or more processing units, and the processor 1801 may integrate an application processor, which handles primarily operating systems, user interfaces, application programs, and the like, and a modem processor, which handles primarily wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 1801. In some embodiments, the processor 1801 and the memory 1802 may be implemented on the same chip, or in some embodiments, they may be implemented separately on separate chips.
The processor 1801 may be a general-purpose processor, such as a CPU (Central Processing Unit), a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the reaction kettle buckle detection method disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The memory 1802, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory 1802 may include at least one type of storage medium, for example, a flash memory, a hard disk, a multimedia card, a card-type memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read Only Memory (PROM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disk, and so on. The memory 1802 may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 1802 in the embodiments of the present application may also be circuitry or any other device capable of performing a storage function, for storing program instructions and/or data.
By programming the processor 1801, the code corresponding to the method for detecting a snap in a reaction kettle described in the foregoing embodiment may be solidified in the chip, so that the chip can perform the steps of the method for detecting a snap in a reaction kettle shown in fig. 5 when running. How to program the processor 1801 is well known to those skilled in the art and will not be described herein.
Based on the same inventive concept, the embodiment of the present application further provides a storage medium, where the storage medium stores computer instructions, and when the computer instructions are run on a computer, the computer executes the method for detecting the reactor snap discussed above.
In some possible embodiments, the various aspects of a method for detecting a reactor snap may also be embodied in the form of a program product comprising program code for causing a control apparatus to perform the steps of a method for detecting a reactor snap according to various exemplary embodiments of the present application, as described above in this specification, when the program product is run on a device.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a server, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (13)

1. A reaction kettle buckle detection method is characterized by comprising the following steps:
acquiring an image to be identified, wherein the image to be identified comprises a reaction kettle and a buckle;
respectively extracting features of the reaction kettle and the buckle in the image to be recognized, determining a first position of the reaction kettle in the image to be recognized, and determining a second position of the buckle in the image to be recognized;
acquiring the relative position relation between the reaction kettle and the buckle based on the first position and the second position;
and when the relative position relation is determined not to meet the preset position condition, giving an alarm, wherein the representation of the unsatisfied preset position condition determines that no buckle exists in the preset detection area.
2. The method of claim 1, wherein prior to acquiring the image to be identified, comprising:
acquiring a reference image, wherein the reference image and the image to be identified have the same shooting angle;
and in the reference image, carrying out position detection on the reaction kettle, and determining that the distance between the reaction kettle and each boundary of the reference image exceeds a set threshold value.
3. The method of claim 1, wherein prior to acquiring the image to be identified, further comprising:
shooting and recording the assembling process of the reaction kettle and the buckle.
4. The method of claim 2, wherein the reference image includes only a reaction kettle, and the acquiring the image to be identified further includes:
acquiring an original image to be identified, and giving an alarm when the original image to be identified is determined to be the same as the reference image;
and continuously acquiring a new image to be identified until the image to be identified is determined to contain the reaction kettle and the buckle.
5. The method of claim 1,2 or 3, wherein the performing feature extraction on the reaction vessel and the clasp, respectively, determining a first position of the reaction vessel in the image to be identified, and determining a second position of the clasp in the image to be identified comprises:
establishing an image coordinate system based on the image to be identified;
respectively extracting characteristic contour lines of the reaction kettle and the buckle from the image to be identified based on an edge detection algorithm to obtain a first contour coordinate set of the reaction kettle and a second contour coordinate set of the buckle;
the first location is determined based on the first set of contour coordinates, and the second location is determined based on the second set of contour coordinates.
6. The method of claim 1,2 or 3, wherein said obtaining a relative positional relationship between said reaction vessel and said clasp based on said first position and said second position comprises:
determining, for each boundary of the image to be identified, a first boundary distance between that boundary and the first position, to obtain a first boundary distance set;
determining, for each boundary of the image to be identified, a second boundary distance between that boundary and the second position, to obtain a second boundary distance set;
for each boundary, performing the following operations:
calculating, based on the first boundary distance set and the second boundary distance set, a distance difference value between the first boundary distance corresponding to the boundary and the second boundary distance corresponding to the same boundary;
and determining the relative position relationship between the reaction kettle and the buckle based on the obtained distance difference values.
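The boundary-distance computation of claim 6 can be sketched as follows (an illustrative reading in Python; the bounding-box position format, the use of an absolute difference, and all image dimensions are assumptions, not specifics from the patent):

```python
def boundary_distances(box, width, height):
    """Distances from a bounding box (x_min, y_min, x_max, y_max)
    to the left, top, right, and bottom image boundaries."""
    x_min, y_min, x_max, y_max = box
    return {"left": x_min, "top": y_min,
            "right": width - x_max, "bottom": height - y_max}

def distance_differences(first, second, width, height):
    """Per-boundary distance difference values between two positions."""
    d1 = boundary_distances(first, width, height)   # first boundary distance set
    d2 = boundary_distances(second, width, height)  # second boundary distance set
    return {edge: abs(d1[edge] - d2[edge]) for edge in d1}

# Hypothetical positions in a 200x200 image to be identified:
diffs = distance_differences((40, 30, 160, 170), (90, 10, 110, 28), 200, 200)
print(diffs)  # {'left': 50, 'top': 20, 'right': 50, 'bottom': 142}
```

The resulting set of difference values encodes the relative position relationship: small differences on every side mean the buckle sits where it should relative to the kettle.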
7. The method of claim 6, wherein, after obtaining the relative position relationship between the reaction kettle and the buckle, the method further comprises:
comparing each distance difference value with its corresponding distance threshold value to obtain a comparison result;
if the comparison result indicates that none of the distance difference values is greater than its corresponding distance threshold value, determining that the relative position relationship satisfies the preset position condition and that the buckle is present in the preset detection area;
and if the comparison result indicates that at least one of the distance difference values is greater than its corresponding distance threshold value, determining that the relative position relationship does not satisfy the preset position condition and that the buckle is absent from the preset detection area.
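The decision rule of claim 7 reduces to an all-sides threshold check; a minimal sketch (Python assumed; the threshold values and the dictionary layout are hypothetical illustrations):

```python
def buckle_in_region(diffs, thresholds):
    """Claim 7's rule: the buckle is deemed present in the preset
    detection area only if no distance difference value exceeds its
    corresponding distance threshold value."""
    return all(diffs[edge] <= thresholds[edge] for edge in thresholds)

# Hypothetical per-boundary thresholds:
thresholds = {"left": 60, "top": 25, "right": 60, "bottom": 150}

present = buckle_in_region(
    {"left": 50, "top": 20, "right": 50, "bottom": 142}, thresholds)
missing = buckle_in_region(
    {"left": 70, "top": 20, "right": 50, "bottom": 142}, thresholds)
print(present, missing)  # True False
```

Claim 8's periodic detection would simply re-run this check at the set time interval and raise an alarm on the first `False` result.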
8. The method of claim 7, wherein, after determining that the relative position relationship satisfies the preset position condition and that the buckle is present in the preset detection area, the method further comprises:
periodically detecting the relative position relationship at a set time interval, and when at least one distance difference value is greater than its corresponding distance threshold value, determining that the relative position relationship no longer satisfies the preset position condition and that the buckle is no longer present in the preset detection area, and issuing an alarm prompt.
9. The method of claim 1, 2 or 3, wherein, after the alerting, the method further comprises:
recording the alarm information, and presenting the alarm information to the central processing system platform.
10. The method of claim 1, 2 or 3, wherein the determining that the buckle is absent from the preset detection area comprises:
the buckle does not appear in the preset detection area;
or,
a complete image of the buckle is not presented in the preset detection area.
11. A reaction kettle buckle detection device, comprising:
a first acquisition module, configured to acquire an image to be identified, wherein the image to be identified comprises a reaction kettle and a buckle;
a determining module, configured to perform feature extraction on the reaction kettle and the buckle in the image to be identified respectively, determine a first position of the reaction kettle in the image to be identified, and determine a second position of the buckle in the image to be identified;
a second acquisition module, configured to obtain a relative position relationship between the reaction kettle and the buckle based on the first position and the second position;
and a warning module, configured to issue an alarm when it is determined that the relative position relationship does not satisfy the preset position condition, wherein failure to satisfy the preset position condition indicates that the buckle is absent from the preset detection area.
12. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 10 when executing the computer program.
13. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-10.
CN202210126449.7A 2022-02-10 2022-02-10 Reaction kettle buckle detection method and related device Active CN114549438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210126449.7A CN114549438B (en) 2022-02-10 2022-02-10 Reaction kettle buckle detection method and related device


Publications (2)

Publication Number Publication Date
CN114549438A CN114549438A (en) 2022-05-27
CN114549438B true CN114549438B (en) 2023-03-17

Family

ID=81673483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210126449.7A Active CN114549438B (en) 2022-02-10 2022-02-10 Reaction kettle buckle detection method and related device

Country Status (1)

Country Link
CN (1) CN114549438B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104316525A (en) * 2014-09-04 2015-01-28 湖北开特汽车电子电器系统股份有限公司 Automobile servo motor production process error-proofing detection system and method
CN112116658A (en) * 2020-09-21 2020-12-22 北京世纪东方通讯设备有限公司 Fastener positioning method and device and readable storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359921B (en) * 2014-11-20 2016-11-23 中南大学 A kind of fastener based on structure light disappearance detection method and device thereof
CN106192634B (en) * 2016-08-31 2018-05-22 武汉汉宁轨道交通技术有限公司 A kind of railroad track elastic bar fastener condition automatic detection device and method
US10953899B2 (en) * 2018-11-15 2021-03-23 Avante International Technology, Inc. Image-based monitoring and detection of track/rail faults
CN109815822A (en) * 2018-12-27 2019-05-28 北京航天福道高技术股份有限公司 Inspection figure components target identification method based on Generalized Hough Transform
CN109784238A (en) * 2018-12-29 2019-05-21 上海依图网络科技有限公司 A kind of method and device of determining object to be identified
CN111311560B (en) * 2020-02-10 2023-09-12 中国铁道科学研究院集团有限公司基础设施检测研究所 Method and device for detecting state of steel rail fastener
CN111539927B (en) * 2020-04-20 2023-07-18 南通大学 Detection method of automobile plastic assembly fastening buckle missing detection device
CN111898610B (en) * 2020-07-29 2024-04-19 平安科技(深圳)有限公司 Card unfilled corner detection method, device, computer equipment and storage medium
CN111921468A (en) * 2020-08-05 2020-11-13 重庆大学 Intelligent homogeneous reaction kettle system for new material synthesis and control method
CN113689392A (en) * 2021-08-18 2021-11-23 北京理工大学 Railway fastener defect detection method and device
CN114004790A (en) * 2021-09-30 2022-02-01 珠海格力电器股份有限公司 Battery cover quality detection method and device, computer equipment and storage medium



Similar Documents

Publication Publication Date Title
CN104680519B (en) Seven-piece puzzle recognition methods based on profile and color
US9501554B2 (en) Image processing system, image processing method, and image processing program
Cho et al. 2D barcode detection using images for drone-assisted inventory management
CN111860060A (en) Target detection method and device, terminal equipment and computer readable storage medium
CN114549438B (en) Reaction kettle buckle detection method and related device
Peng et al. Automated product boundary defect detection based on image moment feature anomaly
CN106960196A (en) Industrial video decimal fractions recognition methods based on template matches and SVM
CN110928889A (en) Training model updating method, device and computer storage medium
CN112966618B (en) Dressing recognition method, apparatus, device and computer readable medium
CN112560779B (en) Method and equipment for identifying overflow of feeding port and feeding control system of stirring station
CN114186933A (en) Cold chain food intelligent supervision platform
Yang et al. A deep learning based method for automatic analysis of high-throughput droplet digital PCR images
CN110458188A (en) Industrial vision detection data processing method, device, storage medium and terminal device
CN115937593A (en) Model method establishing method, object detection method and related equipment
CN113020428B (en) Progressive die machining monitoring method, device, equipment and storage medium
CN115099259A (en) Data identification method and device for civil air defense materials and electronic equipment
CN111047518B (en) Site decontamination strategy selection platform
CN209946948U (en) Intelligent reinspection equipment of PC component
CN113505763A (en) Key point detection method and device, electronic equipment and storage medium
CN113344949A (en) Package detection method, system, medium and terminal based on RGB image
CN112203053A (en) Intelligent supervision method and system for subway constructor behaviors
CN111460767A (en) HMI flow chart generation method and device
CN112800804A (en) Price tag-based out-of-stock detection method and device
US20170004361A1 (en) Method for detecting discrepancies in a part drawing
CN116625243B (en) Intelligent detection method, system and storage medium based on frame coil stock cutting machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant