CN117095007B - PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing - Google Patents


Info

Publication number
CN117095007B
CN117095007B (application CN202311360082.6A)
Authority
CN
China
Prior art keywords
motion
analyzed
assembled
gray level
shielding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311360082.6A
Other languages
Chinese (zh)
Other versions
CN117095007A (en)
Inventor
赵文强
何雷志
裴杰
何美平
李洪波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xingdongtai Electronic Co ltd
Original Assignee
Shenzhen Xingdongtai Electronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xingdongtai Electronic Co ltd filed Critical Shenzhen Xingdongtai Electronic Co ltd
Priority claimed from application CN202311360082.6A
Publication of CN117095007A
Application granted
Publication of CN117095007B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/66: Analysis of geometric attributes of image moments or centre of gravity
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30241: Trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image data processing, and provides a PFA encapsulated fly rod clamping seat assembly monitoring system based on image processing, which comprises the following modules: the data acquisition module acquires the motion speed and the motion direction of each pixel point and acquires a simulated motion track curve; the motion characteristic conformity acquisition module determines the motion characteristic regions and acquires the motion characteristic conformity of each motion region; the shielding coefficient acquisition module acquires shielding coefficients and the assembled gray level images corresponding to the non-shielded mechanical arm; the mechanical arm region extraction module obtains the interest degree of each motion characteristic region and extracts the mechanical arm region; the assembly quality detection module acquires the motion track curve of the mechanical arm from the extracted mechanical arm region and judges the assembly quality of the clamping seat from the comparison of the motion track curve of the mechanical arm with the simulated motion track curve. The invention solves the problem of a poor assembly monitoring effect caused by target shielding during assembly monitoring of the PFA encapsulated fly rod clamping seat.

Description

PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing
Technical Field
The invention relates to the technical field of image data processing, in particular to a PFA encapsulated fly rod clamping seat assembly monitoring system based on image processing.
Background
Usually, the produced clamping seat is placed at a fixed position on a workbench or jig by a mechanical arm so that the fly rods can be aligned correctly. The PFA encapsulated fly rods are positioned on the clamping seat by means of the holes or guide grooves on the clamping seat, pressure is then applied with a suitable tool or piece of equipment to connect the PFA encapsulated fly rods with the clamping seat, and the assembly of the PFA encapsulated fly rod clamping seat is completed by a curing treatment. Therefore, when monitoring the assembly of the PFA encapsulated fly rod clamping seat, a moving target tracking algorithm is often used to monitor the motion trail of the mechanical arm.
When a moving target tracking algorithm is used to monitor the motion trail of the mechanical arm, the extraction precision of the target region directly influences the precision of the tracking algorithm. The assembly environment during assembly of the PFA encapsulated flying lever clamping seat is complex: moving targets are easily shielded, and the motion characteristics of different moving objects are similar, so the detection precision and efficiency of the target tracking algorithm are low and the assembly monitoring effect of the PFA encapsulated flying lever clamping seat is poor.
Disclosure of Invention
The invention provides an image processing-based PFA encapsulation flying lever clamping seat assembly monitoring system, which aims to solve the problem of poor assembly monitoring effect caused by target shielding in the process of assembling and monitoring the PFA encapsulation flying lever clamping seat, and adopts the following specific technical scheme:
the invention provides an image processing-based PFA encapsulated fly rod clamping seat assembly monitoring system, which comprises the following modules:
the data acquisition module acquires videos in the assembling process of the PFA encapsulated flying lever clamping seat, so as to acquire the moving speed and the moving direction of each pixel point in a moving area in an assembled gray level image and acquire a simulated moving track curve;
the motion characteristic conformity acquisition module acquires the motion direction consistency of the motion area according to the motion direction of the pixel points in the motion area, acquires the motion line segment and the speed gradual change difference of the pixel points in the motion area, acquires the motion speed gradual change of the motion area, determines the motion characteristic area and acquires the motion characteristic conformity of the motion area;
the shielding coefficient acquisition module acquires shielding intersection points of the motion area, acquires shielding characteristic moments of the shielding intersection points, acquires third correlation coefficients of the assembled gray level images, further acquires shielding coefficients corresponding to the assembled gray level images, and acquires the assembled gray level images corresponding to the mechanical arms which are not shielded according to the shielding coefficients;
the mechanical arm region extraction module is used for obtaining the interested degree of the motion characteristic region in the assembled gray level image corresponding to the mechanical arm which is not blocked according to the motion characteristic coincidence degree of the motion characteristic region and the blocking coefficient corresponding to the assembled gray level image, and extracting the mechanical arm region according to the interested degree;
the assembly quality detection module is used for acquiring a motion trail curve of the mechanical arm according to the extracted mechanical arm region, and judging the quality of the assembly of the PFA encapsulated fly rod clamping seat according to a comparison result of the motion trail curve of the mechanical arm and the simulated motion trail curve.
Further, the method for obtaining the consistency of the motion direction of the motion area according to the motion direction of the pixel points in the motion area comprises the following steps:
respectively taking each pixel point contained in each motion area in all the assembled gray images as a pixel point to be analyzed;
numbering the pixel points in the motion area in the sequence from top to bottom and from left to right to obtain the numbers of all the pixel points in the motion area;
the pixel point which is one larger than the number of the pixel point to be analyzed is marked as the adjacent pixel point of the pixel point to be analyzed;
the absolute value of the difference value of the motion direction of the pixel to be analyzed and the adjacent pixel of the pixel to be analyzed is recorded as the motion direction difference value of the pixel to be analyzed;
the sum of the motion direction differences of all the pixel points contained in the motion area is recorded as the sum of the motion differences of the motion area;
the reciprocal of the sum of the motion differences for the motion areas is noted as the motion direction consistency for the motion areas.
Further, the method for obtaining the motion line segment and the speed gradual change difference of the pixel points in the motion area comprises the following steps:
taking a window taking the pixel to be analyzed as a central pixel and the side length as a first preset threshold value as a first window of the pixel to be analyzed;
the pixel point with the largest difference between the motion speed in the first window of the pixel point to be analyzed and the motion speed of the pixel point to be analyzed is marked as the first pixel point of the pixel point to be analyzed;
taking a pixel point to be analyzed as an endpoint, marking a ray passing through a first pixel point of the pixel point to be analyzed as a motion direction ray of the pixel point to be analyzed, and marking an intersection point of the motion direction ray and the edge of a motion area where the pixel point to be analyzed is positioned as an edge intersection point of the pixel point to be analyzed;
marking a line segment taking an edge intersection point of the pixel point to be analyzed and the pixel point to be analyzed as an endpoint as a motion line segment of the pixel point to be analyzed;
the absolute value of the difference value of the motion speed between two adjacent pixel points on the motion line segment of the pixel point to be analyzed is recorded as the motion speed difference of the two adjacent pixel points;
and taking the average value of the motion speed differences of all adjacent pixel points on the motion line segment of the pixel point to be analyzed as the speed gradual change difference of the pixel point to be analyzed.
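Assuming the motion line segment has already been constructed as described above, the final computation reduces to a mean of adjacent speed differences; a minimal sketch (the function name and the sampled-speed input convention are ours):

```python
def speed_gradual_change_difference(segment_speeds):
    """Speed gradual-change difference of a pixel to be analyzed.

    `segment_speeds` lists the motion speeds of consecutive pixels
    sampled along the pixel's motion line segment (the ray and edge
    intersection construction of the claim is assumed done already).
    Returns the mean absolute speed difference between adjacent
    pixels on that segment.
    """
    diffs = [abs(segment_speeds[j] - segment_speeds[j + 1])
             for j in range(len(segment_speeds) - 1)]
    return sum(diffs) / len(diffs)
```

A smoothly decreasing speed profile, as expected along a rigid arm toward its fixed end, yields a small value.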
Further, the method for acquiring the motion velocity gradient of the motion region and determining the motion characteristic region comprises the following steps:
the absolute value of the difference between the speed gradient difference of the pixel to be analyzed and the speed gradient difference of the pixel with the serial number larger than that of the pixel to be analyzed is recorded as the absolute value of the speed gradient difference of the pixel to be analyzed;
the product of the absolute value of the velocity gradient difference of the pixel points to be analyzed and the average value of the velocity gradient differences of all the pixel points in the motion area is recorded as a first product of the pixel points to be analyzed;
recording the reciprocal of the sum of the first product of the pixel point to be analyzed and a first minimum positive number (which prevents the denominator from being zero) as the motion speed characteristic value of the pixel point to be analyzed;
the sum of the motion speed characteristic values of all the pixel points contained in the motion area is recorded as the motion speed gradual change of the motion area;
and recording a motion region with the motion speed gradient being greater than or equal to a second preset threshold value as a motion characteristic region.
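Combining the steps above, the motion speed gradual change of a region can be sketched as follows; the function names, the pairing of each pixel with the next-numbered pixel, and the epsilon value reflect our reading of the claim:

```python
def motion_speed_gradual_change(grad_diffs, eps=0.001):
    """Motion speed gradual change of one motion region.

    `grad_diffs[i]` is the speed gradual-change difference of the
    pixel numbered i. For each pixel, the "first product" is the
    absolute difference to the next-numbered pixel times the region
    mean; the motion speed characteristic value is the reciprocal of
    that product plus eps (the "first minimum positive number").
    """
    mean_g = sum(grad_diffs) / len(grad_diffs)
    total = 0.0
    for i in range(len(grad_diffs) - 1):
        first_product = abs(grad_diffs[i] - grad_diffs[i + 1]) * mean_g
        total += 1.0 / (first_product + eps)
    return total


def is_motion_feature_region(grad_diffs, threshold):
    """A region qualifies when its gradual change meets the second
    preset threshold."""
    return motion_speed_gradual_change(grad_diffs) >= threshold
```

Uniform gradual-change differences across the region give a large value, marking it as a likely motion characteristic region.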
Further, the method for obtaining the motion characteristic conformity of the motion area comprises the following steps:
and (5) recording a normalized value of the product of the motion direction consistency and the motion speed gradient of the motion area as the motion characteristic consistency of the motion area.
Further, the method for acquiring the shielding intersection point of the motion area and the shielding characteristic moment of the shielding intersection point comprises the following steps:
extracting centroid points of all the motion areas in all the assembled gray images by using a gray centroid method respectively;
taking a centroid point of the motion area as a starting point, and taking a ray with a motion direction as an angle as a centroid direction ray of the motion area;
when an intersection point exists between the obtained centroid direction ray of the motion area and the centroid direction ray corresponding to the motion feature area, the intersection point is marked as a shielding intersection point of the motion area;
acquiring a barycenter direction ray of a shielding intersection point of a motion area, and recording Euclidean distance between an endpoint of the barycenter direction ray and the shielding intersection point on the barycenter direction ray as a motion distance of the barycenter direction ray;
the ratio between the movement distance of the ray in the centroid direction and the movement speed of the shielding intersection point on the ray in the centroid direction is recorded as the movement time of the ray in the centroid direction;
the absolute value of the difference value of the motion time of each two barycenter direction rays of the occlusion intersection point of the motion area is recorded as the motion time difference of the occlusion intersection point;
recording the motion time corresponding to the minimum value in the motion time difference of the shielding intersection point as a first shielding time;
recording the motion time corresponding to the second small value in the motion time difference of the shielding intersection point as second shielding time;
and recording the mean value of the first shielding time and the second shielding time as the shielding characteristic moment of the shielding intersection point.
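The claim's wording about the first and second shielding times is ambiguous; the sketch below follows one plausible reading, averaging the two motion times whose mutual difference is smallest. The function name and geometry conventions are ours:

```python
import math

def shielding_characteristic_moment(intersection, rays):
    """Shielding characteristic moment of one shielding intersection.

    `rays` is a list of (centroid_xy, speed) pairs for the centroid
    direction rays passing through `intersection`. Each ray's motion
    time is its centroid-to-intersection distance divided by the
    motion speed at the intersection. Under one reading of the claim,
    the moment is the mean of the two motion times whose mutual
    difference is smallest (first and second shielding times).
    """
    times = [math.dist(centroid, intersection) / speed
             for centroid, speed in rays]
    # All pairwise absolute time differences, keeping the pair itself.
    pairs = [(abs(a - b), a, b)
             for i, a in enumerate(times) for b in times[i + 1:]]
    _, t_first, t_second = min(pairs)
    return (t_first + t_second) / 2.0
```

Two regions whose centroids would reach the intersection at nearly the same time are the ones most likely to occlude each other there.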
Further, the method for obtaining the third correlation coefficient of the assembled gray scale image comprises the following steps:
taking each assembly gray level image as an assembly gray level image to be analyzed respectively, and recording other assembly gray level images except the assembly gray level image to be analyzed as comparison assembly gray level images;
taking each motion characteristic region in the assembled gray level image to be analyzed as a motion characteristic region to be analyzed respectively, and taking each motion characteristic region in the compared assembled gray level image as a comparison motion characteristic region respectively;
the method comprises the steps of recording the maximum value of correlation coefficients between a motion characteristic region to be analyzed and a comparison motion characteristic region in a comparison assembly gray level image as a first correlation coefficient of the comparison motion characteristic region;
the average value of the first correlation coefficients of all the comparison motion characteristic areas is recorded as a second correlation coefficient of the motion characteristic area to be analyzed;
and (3) recording the minimum value of all the second correlation coefficients as a third correlation coefficient of the assembled gray scale image to be analyzed.
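The nested max/mean/min structure of the steps above can be sketched directly; the correlation measure itself is left to the caller, since the patent does not specify one:

```python
def third_correlation_coefficient(analyzed_regions, other_images, corr):
    """Third correlation coefficient of the assembled gray level image
    under analysis.

    `analyzed_regions` are its motion feature regions; `other_images`
    is a list of region lists, one per comparison assembled gray level
    image; `corr(a, b)` is a caller-supplied correlation measure
    between two regions (e.g. a normalized cross-correlation).
    """
    second_coeffs = []
    for region in analyzed_regions:
        # First coefficient per comparison image: its best-matching region.
        firsts = [max(corr(region, other) for other in regions)
                  for regions in other_images]
        # Second coefficient: mean of the per-image best matches.
        second_coeffs.append(sum(firsts) / len(firsts))
    # Third coefficient: the worst-matched analyzed region.
    return min(second_coeffs)
```

A low third coefficient means some region of the analyzed frame matches poorly everywhere else, hinting that it is partially occluded in this frame.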
Further, the method for obtaining the assembled gray level image corresponding to the non-occluded mechanical arm according to the occlusion coefficient further comprises the following steps:
in the method, in the process of the invention,the shielding coefficient corresponding to the assembled gray level image; />A third correlation coefficient for assembling the gray scale image;for assembling the number of occlusion intersections contained in the gray scale image; />Is->Shielding characteristic moments of the shielding intersection points; />The minimum value of the motion characteristic coincidence degree of all the motion characteristic areas in the assembled gray level image is set; />A third preset threshold value; />Acquisition time for assembling the grayscale image; />Is an exponential function with a natural constant as a base;
when the assembled gray level image does not contain the motion characteristic region, assigning the shielding coefficient corresponding to the assembled gray level image as a constant;
when the normalized value of the shielding coefficient corresponding to the assembled gray level image is larger than or equal to a fourth preset threshold value, the mechanical arm in the assembled gray level image is considered to be shielded;
and when the normalized value of the shielding coefficient corresponding to the assembled gray level image is smaller than a fourth preset threshold value, the mechanical arm in the assembled gray level image is considered not to be shielded.
Further, the method for extracting the mechanical arm region according to the interested degree of the motion characteristic region in the assembled gray level image corresponding to the mechanical arm which is not blocked according to the motion characteristic coincidence degree of the motion characteristic region and the blocking coefficient corresponding to the assembled gray level image comprises the following steps:
the method for acquiring the interested degree of the motion characteristic region in the assembled gray level image corresponding to the non-shielded mechanical arm comprises the following steps of:
In the above method, the interest degree W_j of motion characteristic region j is computed from the following quantities: S_j, the motion characteristic conformity of motion characteristic region j; Z_f, the shielding coefficient corresponding to the assembled gray level image f in which motion characteristic region j lies; the motion characteristic conformity of the motion characteristic region corresponding to region j in each of the m assembled gray level images near image f, where m is a fifth preset threshold value; and the correlation coefficient between motion characteristic region j and its corresponding motion characteristic region in each of those m assembled gray level images;
and (5) marking a motion characteristic region with the greatest interest degree in the assembled gray level image as a mechanical arm region.
Further, the method for acquiring the motion track curve of the mechanical arm according to the extracted mechanical arm region and judging the quality of the assembly of the PFA encapsulated fly rod clamping seat according to the comparison result of the motion track curve of the mechanical arm and the simulated motion track curve comprises the following steps:
acquiring a centroid point of a mechanical arm region in an assembled gray level image by using a gray level centroid method, taking the centroid point as a moving point, arranging all the moving points according to the acquisition time of the assembled gray level image where the moving points are positioned, and acquiring a motion trail curve of the mechanical arm by using a data fitting algorithm for the arranged moving points;
obtaining the similarity of a motion trail curve and a simulated motion trail curve of the mechanical arm by using a curve similarity algorithm;
when the similarity is smaller than or equal to a risk judgment threshold value, the assembling quality of the PFA encapsulated fly rod clamping seat is considered to be not up to the standard;
and when the similarity is greater than a risk judgment threshold, the assembly quality of the PFA encapsulated fly rod clamping seat is considered to reach the standard.
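The patent names neither the data fitting algorithm nor the curve similarity algorithm; the sketch below substitutes a simple mean point-wise distance similarity and an assumed risk judgment threshold to illustrate the final decision step:

```python
import math

def trajectory_similarity(actual, simulated):
    """Similarity in (0, 1] between two trajectory curves sampled at
    the same number of (x, y) points. Uses the reciprocal of one plus
    the mean point-wise Euclidean distance (our choice of measure)."""
    dists = [math.dist(p, q) for p, q in zip(actual, simulated)]
    return 1.0 / (1.0 + sum(dists) / len(dists))


def assembly_up_to_standard(actual, simulated, risk_threshold=0.8):
    """Quality reaches the standard only when the similarity exceeds
    the risk judgment threshold (threshold value assumed here)."""
    return trajectory_similarity(actual, simulated) > risk_threshold
```

An arm trajectory that drifts away from the simulated curve drives the similarity below the threshold and flags the assembly.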
The beneficial effects of the invention are as follows:
the method comprises the steps of collecting video images of the assembling process of the PFA encapsulated fly rod clamping seat, obtaining a simulated motion track curve through simulation software, adaptively obtaining a motion area in the video images through an image segmentation algorithm and an optical flow method, and completing the construction of a motion characteristic conformity index by combining the motion characteristic of the mechanical arm; then, based on the motion condition and the relativity of the mechanical arm, the clamping seat and the encapsulated flying rod in the assembly process, constructing the shielding coefficient of the image, completing judgment of the shielding condition, and further judging the possibility that the mechanical arm in the assembled gray level image is shielded based on the shielding coefficient and the related coefficient between the images, so as to acquire an accurate motion area; and the interested degree of the motion area is further obtained, the self-adaptive detection extraction of the mechanical arm area is completed, the motion track curve of the mechanical arm area is obtained according to the extracted mechanical arm area, the quality of the assembly of the PFA encapsulated fly rod clamping seat is judged according to the comparison result of the motion track curve and the simulated motion track curve of the mechanical arm, the extraction precision of the target area in a target tracking algorithm is improved, the problem that the assembly monitoring effect is poor due to target shielding in the assembly monitoring process of the PFA encapsulated fly rod clamping seat is solved, and the monitoring efficiency and the monitoring precision of the assembly of the PFA encapsulated fly rod clamping seat are improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
FIG. 1 is a schematic flow diagram of a PFA encapsulated fly rod cartridge assembly monitoring system based on image processing according to one embodiment of the present invention;
fig. 2 is a schematic view of occlusion intersection acquisition.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a flowchart of a PFA encapsulated fly rod cartridge assembly monitoring system based on image processing according to an embodiment of the present invention is shown, the system includes: the device comprises a data acquisition module, a motion characteristic coincidence degree acquisition module, a shielding coefficient acquisition module, a mechanical arm region extraction module and an assembly quality detection module.
The data acquisition module acquires videos of the PFA encapsulated flying lever clamping seat in the assembling process, and further acquires the motion speed and the motion direction of each pixel point in the motion area in the assembled gray level image, and acquires a simulated motion track curve.
And acquiring videos of the PFA encapsulated flying lever clamping seat in the assembling process under the light supplementing environment of a fixed light source by using an industrial camera, and taking each frame in the acquired videos as an assembling image, wherein the assembling image is an RGB image. And carrying out graying treatment on the assembled image by using a weighted graying method to obtain the assembled gray image. The weighted gray scale is a known technique, and will not be described herein.
And arranging the assembly gray level images according to the acquisition time of the assembly images corresponding to the assembly gray level images to acquire the assembly gray level video. And judging whether the moving object exists in the assembled gray level images or not and the positions of the moving object by using a frame difference method for adjacent assembled gray level images in the assembled gray level video, and sequencing the assembled gray level images with the moving object according to the time sequence to obtain a moving video image sequence. The frame difference method is a known technique and will not be described in detail.
The motion speed and the motion direction of each pixel point of the corresponding moving object in each assembled gray level image are obtained by applying an optical flow method to the moving video image sequence. A Graph-based image segmentation algorithm is applied to the assembled gray level image to obtain segmentation regions, and the segmentation regions in which the motion speeds of all the pixel points are nonzero are marked as motion regions.
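The frame difference test of the steps above can be sketched in NumPy as below; the difference threshold is an assumption, and a dense optical flow such as OpenCV's calcOpticalFlowFarneback would then supply the per-pixel speed and direction for the flagged frames:

```python
import numpy as np

def moving_pixel_mask(prev_gray, cur_gray, diff_threshold=25):
    """Frame-difference mask between two grayscale frames (uint8
    arrays of equal shape): True where the intensity change exceeds
    `diff_threshold` (value assumed, not from the patent)."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(cur_gray.astype(np.int16) - prev_gray.astype(np.int16))
    return diff > diff_threshold


def has_moving_object(prev_gray, cur_gray, min_pixels=1):
    """A frame pair contains a moving object when enough pixels move."""
    return int(moving_pixel_mask(prev_gray, cur_gray).sum()) >= min_pixels
```

Frames passing this test are kept, in time order, as the moving video image sequence.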
The V-REP simulation software is used for processing the assembling process of the PFA encapsulated fly rod clamping seat, a simulated motion track curve of the mechanical arm in the assembling process is obtained, and the simulated motion track curve is the track curve of the preset motion of the mechanical arm in the assembling process of the PFA encapsulated fly rod clamping seat.
So far, the motion speed, the motion direction and the simulated motion track curve of each pixel point in the motion area in the assembled gray level image are obtained.
The motion characteristic conformity acquisition module acquires the motion direction consistency of the motion area according to the motion direction of the pixel points in the motion area, acquires the motion line segment and the speed gradual change difference of the pixel points in the motion area, acquires the motion speed gradual change of the motion area, determines the motion characteristic area and acquires the motion characteristic conformity of the motion area.
The environment of the PFA encapsulated flying lever clamping seat assembling process is complex, so the mechanical arm may be blocked in the captured assembly video. If the motion trail of the mechanical arm were extracted directly from such video, the precision of the extracted motion trail curve would be low. The motion regions in the assembled gray level images therefore need to be analyzed to judge whether the mechanical arm is blocked.
When the mechanical arm moves, one end of the mechanical arm is fixed, so the motion directions of the pixel points corresponding to the mechanical arm are the same, their angular speeds are the same, and the linear motion speed decreases as the distance between a pixel point and the fixed end decreases. The more pronounced this feature is, the less likely it is that the mechanical arm is occluded.
And respectively analyzing all pixel points contained in each motion area in all the assembled gray images.
And respectively taking each pixel point contained in each motion area in all the assembled gray level images as a pixel point to be analyzed. And numbering the pixel points in the motion area in the sequence from top to bottom and from left to right, and obtaining the numbers of all the pixel points in the motion area.
And marking the pixel point which is one larger than the number of the pixel point to be analyzed as the adjacent pixel point of the pixel point to be analyzed, and marking the absolute value of the difference value of the motion directions of the pixel point to be analyzed and the adjacent pixel point of the pixel point to be analyzed as the motion direction difference value of the pixel point to be analyzed. And recording the sum of the motion direction difference values of all the pixel points contained in the motion area as the sum of the motion difference values of the motion area. The reciprocal of the sum of the motion differences for the motion areas is noted as the motion direction consistency for the motion areas.
So far, the consistency of the movement direction of the movement area is obtained.
When the motion direction consistency of the motion area is larger, the probability of the problem that the mechanical arm corresponding to the motion area is blocked is smaller.
And marking a window taking the pixel to be analyzed as a central pixel and the side length as a first preset threshold as a first window of the pixel to be analyzed. Wherein the first preset threshold has an empirical value of 3. And marking the pixel point with the largest difference between the motion speed in the first window of the pixel point to be analyzed and the motion speed of the pixel point to be analyzed as the first pixel point of the pixel point to be analyzed. And taking the pixel point to be analyzed as an endpoint, marking the ray passing through the first pixel point of the pixel point to be analyzed as a movement direction ray of the pixel point to be analyzed, and marking the intersection point of the movement direction ray and the edge of the movement area where the pixel point to be analyzed is positioned as the edge intersection point of the pixel point to be analyzed. And marking the line segment taking the edge intersection point of the pixel point to be analyzed and the pixel point to be analyzed as the end point as the motion line segment of the pixel point to be analyzed.
The absolute value of the motion speed difference between two adjacent pixel points on the motion line segment of the pixel point to be analyzed is recorded as the motion speed difference of those two adjacent pixel points, and the mean value of the motion speed differences of all adjacent pixel point pairs on the motion line segment is taken as the speed gradual-change difference of the pixel point to be analyzed.
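As a minimal, non-limiting sketch of this step (names illustrative; the speeds along the motion line segment are assumed to have been sampled already):

```python
import numpy as np

def speed_gradual_change_difference(segment_speeds):
    # segment_speeds: motion speeds of consecutive pixel points along the
    # motion line segment of the pixel point to be analyzed
    adjacent_diffs = np.abs(np.diff(segment_speeds))  # per adjacent pixel pair
    return float(adjacent_diffs.mean())               # mean = speed gradual-change difference
```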
The smaller the speed gradual-change difference of the pixel point to be analyzed, the more clearly the mechanical arm position corresponding to that pixel point exhibits the expected characteristic of a uniform motion direction and a uniform angular speed.
The motion speed gradient of the motion region is acquired from the motion line segments, speed gradual-change differences and motion speeds of all the pixel points contained in the motion region.
The motion speed gradient of the motion region is computed as

$$T=\sum_{i=1}^{N-1}\frac{1}{\left|a_i-a_{i+1}\right|\cdot\bar{a}+\epsilon}$$

In the formula, $T$ is the motion speed gradient of the motion region; $a_i$ and $a_{i+1}$ are the speed gradual-change differences of the pixel points numbered $i$ and $i+1$; $N$ is the number of pixel points contained in the motion region; $\bar{a}$ is the mean value of the speed gradual-change differences of all pixel points in the motion region; $\epsilon$ is the first minimum positive number, which prevents the denominator from being zero, with an empirical value of 0.001.
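For illustration, the motion speed gradient can be sketched as below. The concrete form follows the claim wording (reciprocal of the first product plus the first minimum positive number, summed over the region's numbered pixel points); the function name and input layout are illustrative:

```python
import numpy as np

def motion_speed_gradient(gradual_diffs, eps=0.001):
    # gradual_diffs: speed gradual-change difference of each numbered pixel
    # point of the motion region; eps is the "first minimum positive number"
    # that keeps the denominator nonzero.
    a = np.asarray(gradual_diffs, dtype=float)
    a_mean = a.mean()
    terms = np.abs(np.diff(a)) * a_mean + eps  # first product + eps, per pixel pair
    return float(np.sum(1.0 / terms))          # motion speed gradient of the region
```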
The smaller the speed gradual-change differences of the pixel points in a motion region, the larger the motion speed gradient of the motion region, and the smaller the probability that the mechanical arm corresponding to that motion region is shielded.
The normalized value of the product of the motion direction consistency and the motion speed gradient of the motion region is recorded as the motion characteristic conformity of the motion region.
The larger the motion characteristic conformity of a motion region, the more likely the motion region is an unshielded mechanical arm region.
When the mechanical arm moves while holding the clamping seat or the flying lever, the clamping seat or flying lever is located at the head end of the mechanical arm and moves together with it, so it shares the mechanical arm's motion characteristics during the movement. If the unshielded mechanical arm region were judged only by the analysis above, the region corresponding to the clamping seat or flying lever could easily be misjudged as the mechanical arm region, lowering the precision of the extracted motion trajectory.
The assembly process can be divided into three stages: the mechanical arm grabs the clamping seat and fixes it at the corresponding position; the mechanical arm grabs the flying lever and moves it; and the flying lever is aligned with the clamping seat to complete the assembly. The stage of the motion video image sequence in which the clamping seat and the flying lever appear simultaneously is the alignment and assembly stage, while the mechanical arm appears in the images of all three stages. Extraction of the mechanical arm can therefore be completed based on the correlation coefficients between regions of different images in the motion video image sequence, which facilitates subsequent tracking of the mechanical arm. To extract the position of the mechanical arm, the unshielded mechanical arm region must be extracted first.
A motion region whose motion speed gradient is greater than or equal to a second preset threshold is recorded as a motion feature region, and the analysis continues on these regions; the empirical value of the second preset threshold is 0.75.
Thus, a motion feature region is acquired.
The shielding coefficient acquisition module acquires a shielding intersection point of the motion area, acquires shielding characteristic time of the shielding intersection point, and acquires a third correlation coefficient of the assembled gray level image, so as to acquire a shielding coefficient corresponding to the assembled gray level image, and acquires the assembled gray level image corresponding to the non-shielded mechanical arm according to the shielding coefficient.
For all the motion regions in all the assembled gray level images, the centroid point of each motion region is extracted using the gray-scale centroid method. The ray starting at the centroid point of the motion region and directed along the motion direction of the region is taken as the centroid direction ray of the motion region. When an intersection exists between the centroid direction ray of the motion region and a centroid direction ray corresponding to another motion feature region, the intersection is recorded as a shielding intersection point of the motion region. The gray-scale centroid method is a well-known technique and is not described in detail here. A schematic diagram of shielding intersection acquisition is shown in fig. 2.
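As a non-limiting sketch of the gray-scale centroid method mentioned above (function and parameter names illustrative; the region is assumed to be given as a binary mask over the assembled gray level image and to contain nonzero intensities):

```python
import numpy as np

def gray_centroid(gray, region_mask):
    # Gray-scale centroid: intensity-weighted mean position (x, y) of the
    # region's pixel points in the assembled gray level image
    ys, xs = np.nonzero(region_mask)
    w = gray[ys, xs].astype(float)
    return float(np.sum(xs * w) / np.sum(w)), float(np.sum(ys * w) / np.sum(w))
```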
For each shielding intersection point, the centroid direction rays on which it lies are acquired. The Euclidean distance between the endpoint of a centroid direction ray and the shielding intersection point on that ray is recorded as the motion distance of the centroid direction ray, and the ratio of the motion distance of the centroid direction ray to the motion speed at the shielding intersection point on that ray is recorded as the motion time of the centroid direction ray.
For each pair of centroid direction rays passing through a shielding intersection point, the absolute value of the difference between their motion times is recorded as a motion time difference of the shielding intersection point. The motion time corresponding to the minimum motion time difference is recorded as the first shielding time, the motion time corresponding to the second smallest motion time difference is recorded as the second shielding time, and the mean of the first and second shielding times is recorded as the shielding characteristic moment of the shielding intersection point.
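For illustration only, one plausible reading of the shielding characteristic moment is sketched below: the pair of rays whose motion times differ least supplies the first and second shielding times, whose mean is the characteristic moment. This interpretation and the names are assumptions, not the only possible reading of the text:

```python
import itertools

def shielding_characteristic_moment(motion_times):
    # motion_times: motion time of each centroid direction ray through the
    # shielding intersection point (motion distance / motion speed);
    # the closest pair is taken as the first and second shielding times.
    t1, t2 = min(itertools.combinations(motion_times, 2),
                 key=lambda pair: abs(pair[0] - pair[1]))
    return (t1 + t2) / 2.0  # mean = shielding characteristic moment
```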
The smaller the difference between the shielding characteristic moment of a shielding intersection point and the acquisition time of the assembled gray level image containing it, the more likely the mechanical arm in that assembled gray level image is shielded.
Correlation coefficients between every two motion feature regions contained in all the assembled gray level images are obtained using the NCC (normalized cross-correlation) matching algorithm.
Each assembled gray level image is taken in turn as the assembled gray level image to be analyzed, and the remaining assembled gray level images are recorded as comparison assembled gray level images. Each motion feature region in the assembled gray level image to be analyzed is taken in turn as a motion feature region to be analyzed, and each motion feature region in a comparison assembled gray level image as a comparison motion feature region.
The maximum of the correlation coefficients between the motion feature region to be analyzed and the comparison motion feature regions within a comparison assembled gray level image is recorded as the first correlation coefficient for that comparison assembled gray level image. The mean of the first correlation coefficients over all comparison assembled gray level images is recorded as the second correlation coefficient of the motion feature region to be analyzed, and the minimum of all the second correlation coefficients is recorded as the third correlation coefficient of the assembled gray level image to be analyzed.
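A non-limiting sketch of the first/second/third correlation coefficient computation, under the assumption that the pairwise NCC values have already been computed (one matrix per comparison assembled gray level image; names illustrative):

```python
import numpy as np

def third_correlation_coefficient(ncc_per_comparison_image):
    # ncc_per_comparison_image[h][i, j]: NCC between motion feature region i
    # of the image being analyzed and region j of comparison image h
    n_regions = ncc_per_comparison_image[0].shape[0]
    seconds = []
    for i in range(n_regions):
        firsts = [m[i].max() for m in ncc_per_comparison_image]  # best match per comparison image
        seconds.append(float(np.mean(firsts)))                   # second correlation coefficient
    return min(seconds)                                          # third correlation coefficient
```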
The smaller the third correlation coefficient of an assembled gray level image, the more likely the mechanical arm in that image is shielded.
The shielding coefficient corresponding to the assembled gray level image is acquired from the third correlation coefficient of the image, the motion characteristic conformity of the motion feature regions in the image, the acquisition time of the image, and the shielding characteristic moments of the shielding intersection points in the image.
The shielding coefficient corresponding to the assembled gray level image is computed as

$$Z=e^{-X}\cdot e^{-F_{\min}}\cdot\frac{1}{n}\sum_{j=1}^{n}e^{-\frac{\left|t_0-t_j\right|}{T_3}}$$

In the formula, $Z$ is the shielding coefficient corresponding to the assembled gray level image; $X$ is the third correlation coefficient of the assembled gray level image; $n$ is the number of shielding intersection points contained in the assembled gray level image; $t_j$ is the shielding characteristic moment of the $j$-th shielding intersection point; $F_{\min}$ is the minimum value of the motion characteristic conformity of all the motion feature regions in the assembled gray level image; $T_3$ is the third preset threshold, with an empirical value of 2; $t_0$ is the acquisition time of the assembled gray level image; $e^{(\cdot)}$ is the exponential function with the natural constant as its base.
The smaller the third correlation coefficient of the assembled gray level image, the smaller the minimum motion characteristic conformity of its motion feature regions, and the smaller the difference between its acquisition time and the shielding characteristic moments of its shielding intersection points, the larger the corresponding shielding coefficient, the more likely the mechanical arm in the image is shielded, the lower the precision of the motion regions extracted from the image, and the greater the need to correct those motion regions.
When the assembled gray level image does not contain the motion characteristic region, the shielding coefficient corresponding to the assembled gray level image is assigned as a constant 1.
When the normalized value of the shielding coefficient corresponding to the assembled gray level image is larger than or equal to a fourth preset threshold value, the mechanical arm in the assembled gray level image is considered to be shielded; and when the normalized value of the shielding coefficient corresponding to the assembled gray level image is smaller than a fourth preset threshold value, the mechanical arm in the assembled gray level image is considered not to be shielded. Wherein the empirical value of the fourth preset threshold is 0.5.
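For illustration, the shielding coefficient and the no-region special case can be sketched as below. The multiplicative form is an assumption consistent with the stated monotonic behavior (smaller third correlation coefficient, smaller minimum conformity, and shielding characteristic moments near the acquisition time all increase the coefficient); the names are illustrative:

```python
import numpy as np

def shielding_coefficient(X, F_min, t0, moments, T3=2.0):
    # X: third correlation coefficient; F_min: minimum motion characteristic
    # conformity of the image's motion feature regions; t0: acquisition time;
    # moments: shielding characteristic moments of the shielding intersections
    if not moments:  # image without any motion feature region -> constant 1
        return 1.0
    time_term = np.mean([np.exp(-abs(t0 - tj) / T3) for tj in moments])
    return float(np.exp(-X) * np.exp(-F_min) * time_term)
```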
So far, the shielding coefficient corresponding to the assembled gray level image and the assembled gray level image corresponding to the non-shielded mechanical arm are obtained.
The mechanical arm region extraction module is used for obtaining the interested degree of the motion characteristic region in the assembled gray level image corresponding to the mechanical arm which is not blocked according to the motion characteristic coincidence degree of the motion characteristic region and the blocking coefficient corresponding to the assembled gray level image, and extracting the mechanical arm region according to the interested degree.
Analyzing the motion characteristic region in the assembled gray level image corresponding to the non-shielded mechanical arm, and acquiring the interested degree of the motion characteristic region in the assembled gray level image corresponding to the non-shielded mechanical arm according to the motion characteristic coincidence degree of the motion characteristic region and the shielding coefficient corresponding to the assembled gray level image.
The degree of interest of the motion feature region is computed as

$$I_i=F_i\cdot e^{-Z_i}\cdot\frac{1}{T_5}\sum_{k=1}^{T_5}F_{i,k}\cdot r_{i,k}$$

In the formula, $I_i$ is the degree of interest of motion feature region $i$; $F_i$ is the motion characteristic conformity of motion feature region $i$; $Z_i$ is the shielding coefficient corresponding to the assembled gray level image where motion feature region $i$ is located; $F_{i,k}$ is the motion characteristic conformity of the motion feature region corresponding to region $i$ inside the $k$-th of the $T_5$ assembled gray level images near the image where region $i$ is located; $T_5$ is the fifth preset threshold, with an empirical value of 10; $r_{i,k}$ is the correlation coefficient between the motion feature region corresponding to region $i$ inside the $k$-th nearby assembled gray level image and motion feature region $i$.
The larger the motion characteristic conformity of the analyzed motion feature region, the larger the motion characteristic conformity of its corresponding motion feature regions in the neighboring assembled gray level images, the larger the correlation coefficients between those corresponding regions and the analyzed region, and the smaller the shielding coefficient of the assembled gray level image containing the analyzed region, the larger the degree of interest of the motion feature region, and the more likely the object corresponding to it is the mechanical arm.
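A non-limiting sketch of the degree-of-interest computation, assuming the per-image matching quantities are precomputed; the multiplicative combination is an assumption consistent with the monotonic behavior described above, and all names are illustrative:

```python
import numpy as np

def degree_of_interest(F_i, Z_i, F_near, r_near):
    # F_i: motion characteristic conformity of the region; Z_i: shielding
    # coefficient of its assembled gray level image; F_near[k], r_near[k]:
    # conformity of, and correlation with, the region matched to region i
    # in the k-th neighboring assembled gray level image
    neighbor_term = float(np.mean(np.asarray(F_near) * np.asarray(r_near)))
    return float(F_i * np.exp(-Z_i) * neighbor_term)
```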
The motion feature region with the greatest degree of interest in the assembled gray level image is recorded as the mechanical arm region, completing the extraction of the mechanical arm region.
The assembly quality detection module is used for acquiring a motion trail curve of the mechanical arm according to the extracted mechanical arm region, and judging the quality of the assembly of the PFA encapsulated fly rod clamping seat according to a comparison result of the motion trail curve of the mechanical arm and the simulated motion trail curve.
The centroid point of the mechanical arm region in each assembled gray level image is acquired using the gray-scale centroid method and taken as a moving point. All moving points are ordered by the acquisition time of the assembled gray level images containing them, and a data fitting algorithm such as the least squares method is applied to the ordered moving points to obtain the motion trajectory curve of the mechanical arm.
The similarity between the motion trajectory curve of the mechanical arm and the simulated motion trajectory curve is obtained using a curve similarity algorithm. When the similarity is less than or equal to a risk judgment threshold, the assembly quality of the PFA encapsulated flying lever clamping seat is considered not to meet the standard, and the relevant personnel are alerted so that the equipment can be adjusted in time, improving assembly quality and efficiency. When the similarity is greater than the risk judgment threshold, the assembly quality of the PFA encapsulated flying lever clamping seat is considered to meet the standard.
Wherein, the experience value of the risk judgment threshold value is 0.9.
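For illustration, the trajectory fitting and quality decision can be sketched as below. The least-squares polynomial fit matches the description; the cosine similarity is only one possible curve-similarity choice, since the patent does not fix a specific algorithm, and all names are illustrative:

```python
import numpy as np

def fit_trajectory(times, xs, ys, degree=2):
    # Least-squares polynomial fit of the ordered centroid moving points
    # against acquisition time (one x(t) and one y(t) polynomial)
    return np.poly1d(np.polyfit(times, xs, degree)), np.poly1d(np.polyfit(times, ys, degree))

def assembly_quality_ok(track_pts, sim_pts, risk_threshold=0.9):
    # Cosine similarity of flattened sampled curve points as a simple
    # stand-in for the curve similarity algorithm
    a, b = np.ravel(track_pts), np.ravel(sim_pts)
    similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return similarity > risk_threshold  # True = quality meets the standard
```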
Thus, the assembling and monitoring of the PFA encapsulated fly rod clamping seat are completed.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. The foregoing description of the preferred embodiments of the present invention is not intended to be limiting, but rather, any modifications, equivalents, improvements, etc. that fall within the principles of the present invention are intended to be included within the scope of the present invention.

Claims (3)

1. The PFA encapsulated fly rod clamping seat assembly monitoring system based on image processing is characterized by comprising the following modules:
the data acquisition module acquires videos in the assembling process of the PFA encapsulated flying lever clamping seat, so as to acquire the moving speed and the moving direction of each pixel point in a moving area in an assembled gray level image and acquire a simulated moving track curve;
the motion characteristic conformity acquisition module acquires the motion direction consistency of the motion area according to the motion direction of the pixel points in the motion area, acquires the motion line segment and the speed gradual change difference of the pixel points in the motion area, acquires the motion speed gradual change of the motion area and acquires the motion characteristic conformity of the motion area;
the shielding coefficient acquisition module acquires shielding intersection points of the motion area, acquires shielding characteristic moments of the shielding intersection points, acquires third correlation coefficients of the assembled gray level images, further acquires shielding coefficients corresponding to the assembled gray level images, and acquires the assembled gray level images corresponding to the mechanical arms which are not shielded according to the shielding coefficients;
the mechanical arm region extraction module is used for obtaining the interested degree of the motion characteristic region in the assembled gray level image corresponding to the mechanical arm which is not blocked according to the motion characteristic coincidence degree of the motion characteristic region and the blocking coefficient corresponding to the assembled gray level image, and extracting the mechanical arm region according to the interested degree;
the assembling quality detection module is used for acquiring a motion trail curve of the mechanical arm according to the extracted mechanical arm region and judging the assembling quality of the PFA encapsulated fly rod clamping seat according to a comparison result of the motion trail curve of the mechanical arm and the simulated motion trail curve;
the method for acquiring the consistency of the motion direction of the motion area according to the motion direction of the pixel points in the motion area comprises the following steps: respectively taking each pixel point contained in each motion area in all the assembled gray images as a pixel point to be analyzed; numbering the pixel points in the motion area in the sequence from top to bottom and from left to right to obtain the numbers of all the pixel points in the motion area; the pixel point which is one larger than the number of the pixel point to be analyzed is marked as the adjacent pixel point of the pixel point to be analyzed; the absolute value of the difference value of the motion direction of the pixel to be analyzed and the adjacent pixel of the pixel to be analyzed is recorded as the motion direction difference value of the pixel to be analyzed; the sum of the motion direction differences of all the pixel points contained in the motion area is recorded as the sum of the motion differences of the motion area; the reciprocal of the sum of the motion differences of the motion areas is recorded as the motion direction consistency of the motion areas;
the method for acquiring the motion line segment and the speed gradual change difference of the pixel points in the motion area comprises the following steps: taking a window taking the pixel to be analyzed as a central pixel and the side length as a first preset threshold value as a first window of the pixel to be analyzed; the pixel point with the largest difference between the motion speed in the first window of the pixel point to be analyzed and the motion speed of the pixel point to be analyzed is marked as the first pixel point of the pixel point to be analyzed; taking a pixel point to be analyzed as an endpoint, marking a ray passing through a first pixel point of the pixel point to be analyzed as a motion direction ray of the pixel point to be analyzed, and marking an intersection point of the motion direction ray and the edge of a motion area where the pixel point to be analyzed is positioned as an edge intersection point of the pixel point to be analyzed; marking a line segment taking an edge intersection point of the pixel point to be analyzed and the pixel point to be analyzed as an endpoint as a motion line segment of the pixel point to be analyzed; the absolute value of the difference value of the motion speed between two adjacent pixel points on the motion line segment of the pixel point to be analyzed is recorded as the motion speed difference of the two adjacent pixel points; taking the average value of the motion speed differences of all adjacent pixel points on the motion line segment of the pixel point to be analyzed as the speed gradual change difference of the pixel point to be analyzed;
the method for acquiring the motion velocity gradient of the motion region and determining the motion characteristic region comprises the following steps of: the absolute value of the difference between the speed gradient difference of the pixel to be analyzed and the speed gradient difference of the pixel with the serial number larger than that of the pixel to be analyzed is recorded as the absolute value of the speed gradient difference of the pixel to be analyzed; the product of the absolute value of the velocity gradient difference of the pixel points to be analyzed and the average value of the velocity gradient differences of all the pixel points in the motion area is recorded as a first product of the pixel points to be analyzed; recording the inverse of the sum of the first product of the pixel points to be analyzed and the first minimum positive number as a motion speed characteristic value of the pixel points to be analyzed; the sum of the motion speed characteristic values of all the pixel points contained in the motion area is recorded as the motion speed gradual change of the motion area; recording a motion region with the motion speed gradient being greater than or equal to a second preset threshold value as a motion characteristic region;
the method for acquiring the motion characteristic conformity of the motion area comprises the following steps: the normalized value of the product of the motion direction consistency and the motion speed gradient of the motion area is recorded as the motion characteristic consistency of the motion area;
the method for acquiring the shielding intersection point of the motion area and the shielding characteristic moment of the shielding intersection point comprises the following steps: extracting centroid points of all the motion areas in all the assembled gray images by using a gray centroid method respectively; taking a centroid point of the motion area as a starting point, and taking a ray with a motion direction as an angle as a centroid direction ray of the motion area; when an intersection point exists between the obtained centroid direction ray of the motion area and the centroid direction ray corresponding to the motion feature area, the intersection point is marked as a shielding intersection point of the motion area; acquiring a barycenter direction ray of a shielding intersection point of a motion area, and recording Euclidean distance between an endpoint of the barycenter direction ray and the shielding intersection point on the barycenter direction ray as a motion distance of the barycenter direction ray; the ratio between the movement distance of the ray in the centroid direction and the movement speed of the shielding intersection point on the ray in the centroid direction is recorded as the movement time of the ray in the centroid direction; the absolute value of the difference value of the motion time of each two barycenter direction rays of the occlusion intersection point of the motion area is recorded as the motion time difference of the occlusion intersection point; recording the motion time corresponding to the minimum value in the motion time difference of the shielding intersection point as a first shielding time; recording the motion time corresponding to the second small value in the motion time difference of the shielding intersection point as second shielding time; recording the mean value of the first shielding time and the second shielding time as the shielding characteristic moment of the shielding intersection point;
the method for acquiring the third correlation coefficient of the assembled gray level image comprises the following steps: taking each assembly gray level image as an assembly gray level image to be analyzed respectively, and recording other assembly gray level images except the assembly gray level image to be analyzed as comparison assembly gray level images; taking each motion characteristic region in the assembled gray level image to be analyzed as a motion characteristic region to be analyzed respectively, and taking each motion characteristic region in the compared assembled gray level image as a comparison motion characteristic region respectively; the method comprises the steps of recording the maximum value of correlation coefficients between a motion characteristic region to be analyzed and a comparison motion characteristic region in a comparison assembly gray level image as a first correlation coefficient of the comparison motion characteristic region; the average value of the first correlation coefficients of all the comparison motion characteristic areas is recorded as a second correlation coefficient of the motion characteristic area to be analyzed; the minimum value of all the second correlation coefficients is recorded as a third correlation coefficient of the assembled gray level image to be analyzed;
the method for acquiring the assembled gray level image corresponding to the non-occluded mechanical arm according to the occlusion coefficient comprises the following steps:
$$Z=e^{-X}\cdot e^{-F_{\min}}\cdot\frac{1}{n}\sum_{j=1}^{n}e^{-\frac{\left|t_0-t_j\right|}{T_3}}$$

In the formula, $Z$ is the shielding coefficient corresponding to the assembled gray level image; $X$ is the third correlation coefficient of the assembled gray level image; $n$ is the number of shielding intersection points contained in the assembled gray level image; $t_j$ is the shielding characteristic moment of the $j$-th shielding intersection point; $F_{\min}$ is the minimum value of the motion characteristic conformity of all the motion feature regions in the assembled gray level image; $T_3$ is the third preset threshold; $t_0$ is the acquisition time of the assembled gray level image; $e^{(\cdot)}$ is the exponential function with the natural constant as its base; when the assembled gray level image does not contain a motion feature region, the shielding coefficient corresponding to the assembled gray level image is assigned as a constant; when the normalized value of the shielding coefficient corresponding to the assembled gray level image is greater than or equal to a fourth preset threshold, the mechanical arm in the assembled gray level image is considered to be shielded; and when the normalized value of the shielding coefficient corresponding to the assembled gray level image is smaller than the fourth preset threshold, the mechanical arm in the assembled gray level image is considered not to be shielded.
2. The PFA encapsulated flight bar cassette assembly monitoring system based on image processing according to claim 1, wherein the method for acquiring the interest degree of the motion characteristic region in the assembled gray scale image corresponding to the non-shielded mechanical arm according to the motion characteristic coincidence degree of the motion characteristic region and the shielding coefficient corresponding to the assembled gray scale image comprises the following steps:
the method for acquiring the interested degree of the motion characteristic region in the assembled gray level image corresponding to the non-shielded mechanical arm comprises the following steps of:
$$I_i=F_i\cdot e^{-Z_i}\cdot\frac{1}{T_5}\sum_{k=1}^{T_5}F_{i,k}\cdot r_{i,k}$$

In the formula, $I_i$ is the degree of interest of motion feature region $i$; $F_i$ is the motion characteristic conformity of motion feature region $i$; $Z_i$ is the shielding coefficient corresponding to the assembled gray level image where motion feature region $i$ is located; $F_{i,k}$ is the motion characteristic conformity of the motion feature region corresponding to region $i$ inside the $k$-th of the $T_5$ assembled gray level images near the image where region $i$ is located; $T_5$ is the fifth preset threshold; $r_{i,k}$ is the correlation coefficient between the motion feature region corresponding to region $i$ inside the $k$-th nearby assembled gray level image and motion feature region $i$;
and (5) marking a motion characteristic region with the greatest interest degree in the assembled gray level image as a mechanical arm region.
3. The system for assembling and monitoring the PFA encapsulated flying lever clamping seat based on image processing according to claim 1, wherein the method for acquiring the motion track curve of the mechanical arm according to the extracted mechanical arm area and judging the assembling quality of the PFA encapsulated flying lever clamping seat according to the comparison result of the motion track curve of the mechanical arm and the simulated motion track curve is as follows:
acquiring a centroid point of a mechanical arm region in an assembled gray level image by using a gray level centroid method, taking the centroid point as a moving point, arranging all the moving points according to the acquisition time of the assembled gray level image where the moving points are positioned, and acquiring a motion trail curve of the mechanical arm by using a data fitting algorithm for the arranged moving points;
obtaining the similarity of a motion trail curve and a simulated motion trail curve of the mechanical arm by using a curve similarity algorithm;
when the similarity is smaller than or equal to a risk judgment threshold value, the assembling quality of the PFA encapsulated fly rod clamping seat is considered to be not up to the standard;
and when the similarity is greater than a risk judgment threshold, the assembly quality of the PFA encapsulated fly rod clamping seat is considered to reach the standard.
CN202311360082.6A 2023-10-20 2023-10-20 PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing Active CN117095007B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311360082.6A CN117095007B (en) 2023-10-20 2023-10-20 PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing

Publications (2)

Publication Number Publication Date
CN117095007A CN117095007A (en) 2023-11-21
CN117095007B true CN117095007B (en) 2024-01-30

Family

ID=88780258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311360082.6A Active CN117095007B (en) 2023-10-20 2023-10-20 PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing

Country Status (1)

Country Link
CN (1) CN117095007B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115471530A (en) * 2022-08-31 2022-12-13 广东工业大学 Robot self-positioning precision evaluation method based on laser radar
CN115905774A (en) * 2022-10-27 2023-04-04 广西规亿工程技术集团有限公司 Dynamic shielding space model for visual domain of exit ramp of highway curve
CN115937827A (en) * 2023-02-17 2023-04-07 深圳市蓝鲸智联科技有限公司 Monitoring video processing method for automobile emergency active risk avoidance
CN116309577A (en) * 2023-05-19 2023-06-23 山东晨光胶带有限公司 Intelligent detection method and system for high-strength conveyor belt materials
CN116453062A (en) * 2023-06-12 2023-07-18 青岛义龙包装机械有限公司 Packaging machine assembly risk monitoring method based on robot high-precision compliant assembly

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220371199A1 (en) * 2018-02-21 2022-11-24 Outrider Technologies, Inc. System and method for connection of service lines to trailer fronts by automated trucks

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A moving-object detection method avoiding the influence of occlusions; Ren Shiqing et al.; Computer Knowledge and Technology; Vol. 11, No. 17, pp. 160-162 *
Research on vision-based trajectory optimization methods for robotic arms; Li Dongmin et al.; Machine Tool & Hydraulics; Vol. 51, No. 8, pp. 35-41 *
Fast and stable tracking method for a space manipulator capturing a dynamic target; Wang Xiaoxue et al.; Chinese Space Science and Technology; Vol. 38, No. 1, pp. 18-28 *

Similar Documents

Publication Publication Date Title
CN109800689B (en) Target tracking method based on space-time feature fusion learning
CN108764071B (en) Real face detection method and device based on infrared and visible light images
EP3709216A1 (en) Methods and apparatuses for object detection in a scene represented by depth data of a range detection sensor and image data of a camera
EP3525000A1 (en) Methods and apparatuses for object detection in a scene based on lidar data and radar data of the scene
CN111652085B (en) Object identification method based on combination of 2D and 3D features
CN108921163A (en) A kind of packaging coding detection method based on deep learning
Barath et al. Graph-cut RANSAC: Local optimization on spatially coherent structures
CN112070799A (en) Fish trajectory tracking method and system based on artificial neural network
US20220383525A1 (en) Method for depth estimation for a variable focus camera
CN110021029B (en) Real-time dynamic registration method and storage medium suitable for RGBD-SLAM
CN104615986A (en) Method for utilizing multiple detectors to conduct pedestrian detection on video images of scene change
CN107862713B (en) Camera deflection real-time detection early warning method and module for polling meeting place
CN110443247A (en) A kind of unmanned aerial vehicle moving small target real-time detecting system and method
CN109902576B (en) Training method and application of head and shoulder image classifier
CN110287907A (en) A kind of method for checking object and device
CN115049821A (en) Three-dimensional environment target detection method based on multi-sensor fusion
CN115760893A (en) Single droplet particle size and speed measuring method based on nuclear correlation filtering algorithm
CN111435429B (en) Gesture recognition method and system based on binocular stereo data dynamic cognition
CN108053425B (en) A kind of high speed correlation filtering method for tracking target based on multi-channel feature
CN117095007B (en) PFA encapsulation flying lever clamping seat assembly monitoring system based on image processing
CN111709269B (en) Human hand segmentation method and device based on two-dimensional joint information in depth image
Li et al. Face detection based on depth information using HOG-LBP
CN112329893A (en) Data-driven heterogeneous multi-target intelligent detection method and system
CN112464933A (en) Intelligent recognition method for small dim target of ground-based staring infrared imaging
CN112561885A (en) YOLOv 4-tiny-based gate valve opening detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant