CN112435252B - Warhead fragment perforation and pit detection method - Google Patents

Warhead fragment perforation and pit detection method

Info

Publication number
CN112435252B
CN112435252B (application CN202011405498.1A)
Authority
CN
China
Prior art keywords
target plate
equivalent target
image
perforation
pit
Prior art date
Legal status (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis): Active
Application number
CN202011405498.1A
Other languages
Chinese (zh)
Other versions
CN112435252A (en)
Inventor
李翰山
张晓倩
高俊钗
Current Assignee
Xian Technological University
Original Assignee
Xian Technological University
Priority date
Filing date
Publication date
Application filed by Xian Technological University filed Critical Xian Technological University
Priority to CN202011405498.1A
Publication of CN112435252A
Application granted
Publication of CN112435252B
Status: Active

Classifications

    • G — Physics
    • G06 — Computing; calculating or counting
    • G06T — Image data processing or generation, in general
    • G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T5/70 — Image enhancement or restoration; denoising, smoothing
    • G06T7/13 — Segmentation; edge detection
    • G06T7/136 — Segmentation; edge detection involving thresholding
    • G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
    • Y02P90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method for detecting warhead fragment perforations and pits, belonging to the technical field of image processing and comprising the following steps: for an equivalent target plate image acquired after static explosion of a warhead, perform geometric correction and actual-physical-size recovery based on the homography matrix of the projective transformation; segment the equivalent target plate based on its vertices and the linearity of its boundaries; smooth the equivalent target plate image with guided filtering; combining the variation of the flash illumination over the equivalent target plate, generate a hyperboloid threshold from the image gradients; segment the bright-gray fragment perforation regions and the dark-gray fragment pit regions on the equivalent target plate with the hyperboloid threshold; perform region growing with the fragment perforation and pit regions as seed points having gray-level similarity; and remove pseudo target regions using the area and shape characteristic criteria of the fragment perforations and pits. The method preserves the state information of the fragment perforations and pits, segments the perforation and pit regions accurately, and realizes convenient and rapid detection of fragment perforations and pits.

Description

Warhead fragment perforation and pit detection method
Technical Field
The invention belongs to the technical field of image processing, relates to image processing of equivalent target plates struck by warhead fragments, and particularly relates to a method for detecting warhead fragment perforations and pits.
Background
Obtaining accurate fragment-field parameters under static explosion conditions is the basis for characterizing the power of a warhead and the premise for analyzing the dynamic explosion state of the fragment field. Before static explosion, equivalent target plates are arranged around the warhead at set distances and angles. Because of the uncertainty of fragment dispersion, the number, positions, perforations, and pit areas of the fragments striking each equivalent target plate differ greatly, and the hit condition of each plate must be obtained and counted quickly: for example, the number of fragment perforations on an equivalent target plate is the precondition and basis for computing the fragment dispersion characteristics, which in turn are an important index for checking ammunition power parameters. For the target plate method of measuring fragment-field parameters, a rapid, comprehensive, automatic statistical detection technique for fragment data is therefore urgently needed.
At present, fragment perforations on equivalent target plates are detected mainly by manual inspection. In general, however, the explosion radius of a warhead is large, the equivalent target plates are numerous, and so are the fragment perforations, making the inspection workload heavy. When the perforation conditions are complex, judgment standards are not uniform and multiple people must confirm the results together on site. Moreover, the state information of the perforations is not stored, their specific positions cannot be reproduced, and subsequent review is difficult, all of which affect the objectivity, accuracy, and reliability of detection. It is therefore necessary to study a fragment perforation and pit detection method based on image processing technology.
To detect fragment perforations and pits more objectively, accurately, and reliably, the invention provides a method for detecting warhead fragment perforations and pits.
Disclosure of Invention
To overcome the above defects in the prior art, the invention provides a method for detecting warhead fragment perforations and pits.
In order to achieve the above object, the present invention provides the following technical solutions:
a fragment perforation and pit detection method based on hyperboloid threshold segmentation comprises the following steps:
step 1, collect a first equivalent target plate image after static explosion of the warhead, select its four vertices by manual interaction, and obtain the vertex image coordinates (x′_i, y′_i, 1)^T; combine the prior shape and physical size of the equivalent target plate to obtain the vertex physical coordinates (x_i, y_i, 1)^T, i = 1, ..., 4, and establish and solve the homography matrix H_{3×3} of the projective transformation:

s·(x′_i, y′_i, 1)^T = H_{3×3}·(x_i, y_i, 1)^T  (1)

where s is a scale factor; based on the homography matrix H_{3×3}, perform geometric correction and actual-physical-size recovery on the first equivalent target plate image to obtain a second equivalent target plate image;
step 2, segment the second equivalent target plate image based on the four vertices and the linearity of the boundaries of the equivalent target plate in the second equivalent target plate image to obtain a third equivalent target plate image p_i;
step 3, smooth the third equivalent target plate image p_i based on guided filtering to obtain a fourth equivalent target plate image q_i, and apply guided filtering to the fourth equivalent target plate image q_i;
the guided-filtered fourth equivalent target plate image is:

q′_i = a_k·I_i + b_k,  i ∈ ω_k  (2)

where q′_i is the output fourth equivalent target plate image, I_i is the guidance image, with the third equivalent target plate image p_i used as the guidance image, and a_k and b_k are the constant coefficients of the linear function when the window is centered at k; a_k and b_k minimize the cost function E of the difference between p_i and q′_i:

E(a_k, b_k) = Σ_{i∈ω_k} ((a_k·I_i + b_k − p_i)² + ε·a_k²)  (3)

where ε is a regularization parameter that penalizes large a_k;
step 4, combining the variation of the flash illumination over the equivalent target plate arranged on site, generate a hyperboloid threshold from the gradient of the guided-filtered fourth equivalent target plate image q′_i;
step 5, segment the guided-filtered fourth equivalent target plate image q′_i with the hyperboloid threshold to obtain the bright-gray fragment perforation regions and the dark-gray fragment pit regions;
step 6, perform region growing with the fragment perforation and pit regions as seed points having gray-level similarity;
step 7, remove pseudo target regions using the area and shape characteristic criteria of the fragment perforations and pits.
Preferably, the step 4 specifically includes:
step 4.1, calculate the gray-level gradient of the guided-filtered fourth equivalent target plate image q′_i with the Canny operator; take the absolute value of gradients that change from positive to negative, set gradients that change from negative to positive to 0, and compute the first gradient threshold as the mean value;
step 4.2, obtain edge points with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the first reference threshold;
step 4.3, take the absolute value of gradients that change from negative to positive, set gradients that change from positive to negative to 0, and compute the second gradient threshold from the mean value;
step 4.4, obtain edge points with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the second reference threshold;
step 4.5, for the first equivalent target plate image shot under flash illumination, establish a world coordinate system OXYZ with the upper-left corner of the spatial equivalent target plate as the origin and its two perpendicular edges as the coordinate axes; according to the camera imaging model, combining the prior shape and size of the equivalent target plate, use the homography matrix to solve the rotation matrix R = [r_1, r_2, r_3] and the translation vector t of the camera coordinate system relative to the world coordinate system, then according to the equation:

[x, y, z]^T = R·[X, Y, Z]^T + t  (4)

convert the three-dimensional coordinates of each pixel point on the first equivalent target plate image from the world coordinate system [X Y Z]^T to the camera coordinate system [x y z]^T, thereby obtaining the distance d from each pixel point on the first equivalent target plate to the optical center (x_0, y_0, z_0):

d = √((x − x_0)² + (y − y_0)² + (z − z_0)²)  (5)

the flash light source is small relative to its distance from the photographed object and can be approximated as a point source located at the optical center of the camera, so the imaging gray level f(x, y) of the flash illumination in the image varies with the distance d as:

f(x, y) ∝ 1/d²  (6)
thereby obtaining the gray-level variation ratio of each position of the first equivalent target plate image relative to the center of the imaged object;
step 4.6, generate a first surface threshold and a second surface threshold from the first reference threshold and the second reference threshold, combined with the gray-level variation ratio of the flash illumination.
Preferably, the step 5 specifically includes:
step 5.1, segment the guided-filtered fourth equivalent target plate image q′_i with the first surface threshold: set pixels whose gray level is greater than the first surface threshold to 1 and all others to 0, obtaining a first segmented image of the bright-gray fragment perforation regions;
step 5.2, segment the guided-filtered fourth equivalent target plate image q′_i with the second surface threshold: set pixels whose gray level is smaller than the second surface threshold to 1 and all others to 0, obtaining a second segmented image of the dark-gray fragment pit regions;
step 5.3, combine the first segmented image and the second segmented image by a union (logical OR) operation to obtain a third segmented image containing both the fragment perforation and fragment pit regions.
Preferably, the step 6 specifically includes:
for the third segmented image, taking the fragment perforation and pit area as seed point areas, and performing single pixel expansion pre-operation on the periphery of the fragment perforation and pit area; if the expansion pixel corresponds to the seed point area, guiding the fourth equivalent target plate image after filtering qi Setting the expanded pixel as 1 if the difference between the gray scale of the pixel and the gray average value of the gray scale is within the range of a threshold criterion R, and carrying out region growth, otherwise, not carrying out region growth; the dilation pre-operation and region growing are repeated until no pixel growth stops, and finally a growing image is obtained.
Preferably, the step 7 specifically includes:
for the broken perforation and pit area in the growth image to contain pseudo target, determining the range criterion W of the pixel number contained in the broken perforation and pit n And shape ratio range criterion W of their major axis and minor axis b The number of pixels and the shape ratio of each fragment perforation or pit area are calculated and compared with a range criterion, and if the number of pixels and the shape ratio are out of the range criterion, the number of pixels and the shape ratio are pseudo targets and removed.
The warhead fragment perforation and pit detection method of the invention has the following beneficial effects: for the high-amplitude, dense imaging noise caused by the roughness of the equivalent target plate, guided filtering yields a denoised image with protected edges; for inconsistent background gray levels and differing target attributes, the flash illumination function is incorporated so that perforation and pit regions are segmented and detected accurately at the same time; geometric correction of the equivalent target plate photographed from an arbitrary pose allows the actual position coordinates of the perforations and pits to be computed. The method detects fragment perforations and pits objectively, accurately, and reliably, preserves their state information, segments the perforation and pit regions accurately, and realizes convenient and rapid detection.
Drawings
In order to illustrate the embodiments of the present invention and their design more clearly, the drawings required for the embodiments are briefly described below. The drawings in the following description show only some embodiments of the present invention; those skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a method for detecting perforation and pit of a warhead fragment according to embodiment 1 of the present invention.
Detailed Description
The present invention is described in detail below with reference to the drawings and embodiments so that those skilled in the art can better understand and implement its technical solutions. The following examples are provided only to illustrate the technical solutions of the present invention more clearly and are not intended to limit its protection scope.
Example 1
The invention provides a method for detecting warhead fragment perforations and pits, shown in fig. 1, comprising the following steps:
step 1, acquire a first equivalent target plate image after static explosion of the warhead, select its four vertices by manual interaction, and obtain the vertex image coordinates (x′_i, y′_i, 1)^T; combine the prior shape and physical size of the equivalent target plate to obtain the vertex physical coordinates (x_i, y_i, 1)^T, i = 1, ..., 4, and establish and solve the homography matrix H_{3×3} of the projective transformation:

s·(x′_i, y′_i, 1)^T = H_{3×3}·(x_i, y_i, 1)^T  (1)

where s is a scale factor; based on the homography matrix H_{3×3}, perform geometric correction and actual-physical-size recovery on the first equivalent target plate image to obtain a second equivalent target plate image;
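The homography solve in step 1 can be sketched with a minimal direct linear transform (DLT); this is a generic reconstruction rather than the patent's exact solver, and the plate corner coordinates below are invented for illustration:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    (4 or more correspondences) by the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H is the right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so that H[2, 2] = 1

def warp_point(H, p):
    """Apply H to one (x, y) point using homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Plate vertex physical coordinates (metres) and hypothetical pixel positions.
plate = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]
pixels = [(10.0, 20.0), (400.0, 30.0), (410.0, 250.0), (5.0, 240.0)]
H = homography_dlt(plate, pixels)
```

With H in hand, the inverse mapping warps every image pixel back onto the plate's physical grid, which is the geometric correction and size recovery described in step 1.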
step 2, segment the second equivalent target plate image based on the four vertices and the linearity of the boundaries of the equivalent target plate in the second equivalent target plate image to obtain a third equivalent target plate image p_i;
step 3, smooth the third equivalent target plate image p_i based on guided filtering to obtain a fourth equivalent target plate image q_i, and apply guided filtering to the fourth equivalent target plate image q_i;
the guided-filtered fourth equivalent target plate image is:

q′_i = a_k·I_i + b_k,  i ∈ ω_k  (2)

where q′_i is the output fourth equivalent target plate image, I_i is the guidance image, with the third equivalent target plate image p_i used as the guidance image, and a_k and b_k are the constant coefficients of the linear function when the window is centered at k; a_k and b_k minimize the cost function E of the difference between p_i and q′_i:

E(a_k, b_k) = Σ_{i∈ω_k} ((a_k·I_i + b_k − p_i)² + ε·a_k²)  (3)

where ε is a regularization parameter that penalizes large a_k;
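Equation (2) together with the cost function it minimizes is the standard guided filter; a self-contained, self-guided sketch (radius r and regularization eps are hypothetical parameters, not values from the patent) follows:

```python
import numpy as np

def box_mean(a, r):
    """Mean over (2r+1)x(2r+1) windows, edge-padded, via integral images."""
    k = 2 * r + 1
    p = np.pad(a, r, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    H, W = a.shape
    s = c[k:k + H, k:k + W] - c[:H, k:k + W] - c[k:k + H, :W] + c[:H, :W]
    return s / (k * k)

def guided_filter(I, p, r, eps):
    """Guided filter: output mean(a)*I + mean(b), where a, b minimise the
    windowed cost (a*I + b - p)^2 + eps*a^2 in each window."""
    mI, mp = box_mean(I, r), box_mean(p, r)
    varI = box_mean(I * I, r) - mI * mI
    covIp = box_mean(I * p, r) - mI * mp
    a = covIp / (varI + eps)  # closed-form minimiser per window
    b = mp - a * mI
    return box_mean(a, r) * I + box_mean(b, r)
```

On a flat image the filter is the identity; on a noisy plate image it smooths while edges whose contrast exceeds eps survive, which is why it suits the rough target plate surface.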
step 4, combining the variation of the flash illumination over the equivalent target plate arranged on site, generate a hyperboloid threshold from the gradient of the guided-filtered fourth equivalent target plate image q′_i;
specifically, in this embodiment, step 4 specifically includes:
step 4.1, calculate the gray-level gradient of the guided-filtered fourth equivalent target plate image q′_i with the Canny operator; take the absolute value of gradients that change from positive to negative, set gradients that change from negative to positive to 0, and compute the first gradient threshold as the mean value;
step 4.2, obtain edge points with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the first reference threshold;
step 4.3, take the absolute value of gradients that change from negative to positive, set gradients that change from positive to negative to 0, and compute the second gradient threshold from the mean value;
step 4.4, obtain edge points with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the second reference threshold;
step 4.5, for the first equivalent target plate image shot under flash illumination, establish a world coordinate system OXYZ with the upper-left corner of the spatial equivalent target plate as the origin and its two perpendicular edges as the coordinate axes; according to the camera imaging model, combining the prior shape and size of the equivalent target plate, use the homography matrix to solve the rotation matrix R = [r_1, r_2, r_3] and the translation vector t of the camera coordinate system relative to the world coordinate system, then according to the equation:

[x, y, z]^T = R·[X, Y, Z]^T + t  (4)

convert the three-dimensional coordinates of each pixel point on the first equivalent target plate image from the world coordinate system [X Y Z]^T to the camera coordinate system [x y z]^T, thereby obtaining the distance d from each pixel point on the first equivalent target plate to the optical center (x_0, y_0, z_0):

d = √((x − x_0)² + (y − y_0)² + (z − z_0)²)  (5)

the flash light source is small relative to its distance from the photographed object and can be approximated as a point source located at the optical center of the camera, so the imaging gray level f(x, y) of the flash illumination in the image varies with the distance d as:

f(x, y) ∝ 1/d²  (6)
thereby obtaining the gray-level variation ratio of each position of the first equivalent target plate image relative to the center of the imaged object;
step 4.6, generate a first surface threshold and a second surface threshold from the first reference threshold and the second reference threshold, combined with the gray-level variation ratio of the flash illumination.
step 5, segment the guided-filtered fourth equivalent target plate image q′_i with the hyperboloid threshold to obtain the bright-gray fragment perforation regions and the dark-gray fragment pit regions;
specifically, in this embodiment, step 5 specifically includes:
step 5.1, segment the guided-filtered fourth equivalent target plate image q′_i with the first surface threshold: set pixels whose gray level is greater than the first surface threshold to 1 and all others to 0, obtaining a first segmented image of the bright-gray fragment perforation regions;
step 5.2, segment the guided-filtered fourth equivalent target plate image q′_i with the second surface threshold: set pixels whose gray level is smaller than the second surface threshold to 1 and all others to 0, obtaining a second segmented image of the dark-gray fragment pit regions;
step 5.3, combine the first segmented image and the second segmented image by a union (logical OR) operation to obtain a third segmented image containing both the fragment perforation and fragment pit regions.
step 6, perform region growing with the fragment perforation and pit regions as seed points having gray-level similarity;
specifically, in this embodiment, step 6 specifically includes:
for the third segmented image, taking the fragment perforation and pit area as seed point areas, and performing single pixel expansion pre-operation on the periphery of the fragment perforation and pit area; if the expansion pixel corresponds to the fourth equivalent target plate image q after the guided filtering of the seed point area i Setting the expanded pixel as 1 if the difference between the gray scale of the pixel and the gray average value of the gray scale is within the range of a threshold criterion R, and carrying out region growth, otherwise, not carrying out region growth; the dilation pre-operation and region growing are repeated until no pixel growth stops, and finally a growing image is obtained.
step 7, remove pseudo target regions using the area and shape characteristic criteria of the fragment perforations and pits.
Specifically, in this embodiment, step 7 specifically includes:
Because the fragment perforation and pit regions in the grown image may contain pseudo targets, determine the range criterion W_n for the number of pixels contained in a fragment perforation or pit and the shape-ratio range criterion W_b for the ratio of their major and minor axes; compute the number of pixels and the shape ratio of each fragment perforation or pit region and compare them with the range criteria; a region outside the criteria is a pseudo target and is removed.
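Step 7's area and shape criteria can be sketched with a plain connected-component pass; the axis ratio here is taken from the eigenvalues of the pixel-coordinate covariance, one reasonable reading of the major/minor-axis criterion, and the criterion ranges in the usage are invented:

```python
import numpy as np

def filter_regions(mask, n_range, ratio_range):
    """Keep 4-connected components whose pixel count lies in n_range and
    whose major/minor axis ratio (sqrt of the covariance eigenvalue ratio)
    lies in ratio_range; everything else is removed as a pseudo target."""
    H, W = mask.shape
    seen = np.zeros((H, W), dtype=bool)
    out = np.zeros_like(mask)
    for i in range(H):
        for j in range(W):
            if not mask[i, j] or seen[i, j]:
                continue
            stack, comp = [(i, j)], []    # flood fill over one component
            seen[i, j] = True
            while stack:
                y, x = stack.pop()
                comp.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    v, u = y + dy, x + dx
                    if 0 <= v < H and 0 <= u < W and mask[v, u] and not seen[v, u]:
                        seen[v, u] = True
                        stack.append((v, u))
            n = len(comp)
            if n < 2 or not (n_range[0] <= n <= n_range[1]):
                continue                  # pixel-count criterion W_n
            cov = np.cov(np.array(comp, dtype=float).T) + 1e-9 * np.eye(2)
            ev = np.sort(np.linalg.eigvalsh(cov))
            if ratio_range[0] <= np.sqrt(ev[1] / ev[0]) <= ratio_range[1]:
                for y, x in comp:         # shape-ratio criterion W_b
                    out[y, x] = 1
    return out

m = np.zeros((8, 10), dtype=np.uint8)
m[0:3, 0:3] = 1    # compact blob: kept
m[5, 1:7] = 1      # thin line: rejected by axis ratio
m[7, 9] = 1        # lone pixel: rejected by size
kept = filter_regions(m, n_range=(4, 20), ratio_range=(0.9, 2.0))
```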
The detection method of this embodiment detects fragment perforations and pits objectively, accurately, and reliably. Geometric correction of the equivalent target plate with the homography matrix of the projective transformation ensures consistent results when the plate is photographed from any pose; segmentation of the plate based on the linearity of its vertices and boundaries removes complex background interference; guided filtering denoises while protecting edges; combining the variation of the flash illumination over the plate, a hyperboloid threshold generated from the image gradients accounts simultaneously for the bright-gray perforations and the dark-gray pits; segmenting the bright-gray perforation regions and dark-gray pit regions with the two threshold surfaces extracts effective seed points; region growing from these seed points by gray-level similarity obtains the target regions accurately; and removing pseudo target regions with the area and shape criteria yields the effective target regions, so that perforation and pit regions are segmented accurately and detected conveniently and rapidly.
The above embodiments are merely preferred embodiments of the present invention, and the protection scope of the present invention is not limited thereto. Any simple change or equivalent substitution of the technical solution that those skilled in the art can readily obtain within the technical scope disclosed by the present invention falls within the protection scope of the present invention.

Claims (5)

1. A method for detecting warhead fragment perforations and pits, characterized by comprising the following steps:
step 1, acquire a first equivalent target plate image after static explosion of the warhead, select four vertices of the first equivalent target plate image by manual interaction, and obtain the vertex image coordinates (x′_i, y′_i, 1)^T; combine the prior shape and physical size of the equivalent target plate to obtain the vertex physical coordinates (x_i, y_i, 1)^T, i = 1, ..., 4, and establish and solve the homography matrix H_{3×3} of the projective transformation:

s·(x′_i, y′_i, 1)^T = H_{3×3}·(x_i, y_i, 1)^T  (1)

where s is a scale factor; based on the homography matrix H_{3×3}, perform geometric correction and actual-physical-size recovery on the first equivalent target plate image to obtain a second equivalent target plate image;
step 2, dividing the second equivalent target plate image based on the linearity of four vertexes and boundaries of the equivalent target plate in the second equivalent target plate image to obtain a third equivalent target plate image p i
step 3, smooth the third equivalent target plate image p_i to obtain a fourth equivalent target plate image q_i, and apply guided filtering to the fourth equivalent target plate image q_i;
the guided-filtered fourth equivalent target plate image is:

q′_i = a_k·I_i + b_k,  i ∈ ω_k  (2)

where q′_i is the output fourth equivalent target plate image, I_i is the guidance image, i.e. the third equivalent target plate image p_i, and a_k and b_k are the constant coefficients of the linear function when the window is centered at k;
step 4, combining the change of different positions of the flash lamp illumination on the equivalent target plate arranged on site, and adopting a fourth equivalent target plate image q i ' gradient generation hyperboloid threshold;
step 5, segmenting the fourth equivalent target plate image q by using hyperboloid threshold value i The obtained bright gray fragment perforation area and dark gray fragment pit area;
step 6, taking the broken perforation and the pit area as seed points with gray level similarity for area growth;
and 7, removing the pseudo target area by using the fragment perforation and pit area and shape characteristic criterion.
2. The warhead fragment perforation and pit detection method of claim 1, wherein step 4 specifically comprises:
step 4.1, calculate the gray-level gradient of the fourth equivalent target plate image q′_i with the Canny operator; take the absolute value of gradients that change from positive to negative, set gradients that change from negative to positive to 0, and compute the first gradient threshold as the mean value;
step 4.2, obtain the edge points of the first gradient threshold with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the first reference threshold;
step 4.3, take the absolute value of gradients that change from negative to positive, set gradients that change from positive to negative to 0, and compute the second gradient threshold from the mean value;
step 4.4, obtain the edge points of the second gradient threshold with a thinning algorithm, and take the gray values of the smoothed image at those edge points as the second reference threshold;
step 4.5, for a first equivalent target plate image shot by using flash lamp illumination, establishing a world coordinate system OXYZ by taking the upper left corner of a space equivalent target plate as an original point and two vertical sides as coordinate axes; according to the camera imaging model, combining the prior shape and the prior size of the equivalent target plate, adopting a homography matrix to calculate a rotation matrix R= [ R ] of a camera coordinate system relative to a world coordinate system 1 ,r 2 ,r 3 ]And a translation vector t, then according to the equation:
[x y z]^T = R[X Y Z]^T + t
the three-dimensional coordinates of each pixel point on the first equivalent target plate image are converted from the world coordinate system [X Y Z]^T to the camera coordinate system [x y z]^T, thereby obtaining the distance d from each pixel point on the first equivalent target plate to the optical center (x_0, y_0, z_0):
d = sqrt((x - x_0)^2 + (y - y_0)^2 + (z - z_0)^2)
the relation between the imaging gray scale f (x, y) of the flash light source illumination in the image and the distance d is as follows:
f(x, y) ∝ 1/d^2
thereby obtaining the gray-scale variation ratio of each position of the first equivalent target plate image relative to the center position of the imaged object;
and step 4.6, generating a first curved-surface threshold and a second curved-surface threshold from the first and second reference thresholds combined with the gray-scale variation ratio of the flash illumination.
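The thresholding pipeline of this claim can be sketched in Python as below; the direction handling (treating positive-then-negative gradients as falling edges and negative-then-positive gradients as rising edges), the inverse-square illumination model f ∝ 1/d², and all function names are simplifying assumptions rather than the patented implementation:

```python
import numpy as np

def directional_gradient_thresholds(img):
    """Steps 4.1-4.4 (simplified): split the gray-level gradient into
    falling and rising edge responses and take the mean of the non-zero
    responses as the first and second gradient thresholds."""
    g = np.gradient(img.astype(float), axis=1)   # horizontal gray-level gradient
    falling = np.where(g < 0, -g, 0.0)           # |gradient| where gray falls
    rising = np.where(g > 0, g, 0.0)             # |gradient| where gray rises
    t1 = falling[falling > 0].mean() if (falling > 0).any() else 0.0
    t2 = rising[rising > 0].mean() if (rising > 0).any() else 0.0
    return t1, t2

def curved_surface_thresholds(ref1, ref2, R, t, world_pts, center_world):
    """Steps 4.5-4.6 (simplified): map plate points into the camera frame,
    compute each point's distance d to the optical center (the camera
    origin), and bend the two reference thresholds by the inverse-square
    illumination ratio (d_c / d)^2 of the flash."""
    cam = (R @ world_pts.T).T + t                # [x y z]^T = R [X Y Z]^T + t
    d = np.linalg.norm(cam, axis=1)              # distance to the optical center
    d_c = np.linalg.norm(R @ center_world + t)   # distance of the plate center
    ratio = (d_c / d) ** 2                       # f(x, y) ~ 1/d^2 assumption
    return ref1 * ratio, ref2 * ratio            # per-point curved-surface thresholds
```

With the identity rotation and the camera one unit in front of the plate, a point one unit off-center is sqrt(2) from the optical center, so its thresholds are halved relative to the plate center.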
3. The warhead fragment perforation and pit detection method of claim 2, wherein the step 5 specifically comprises:
step 5.1, segmenting the fourth equivalent target plate image q_i' with the first curved-surface threshold: pixels whose gray level is greater than the first curved-surface threshold are set to 1, otherwise to 0, obtaining a first segmented image of the bright-gray fragment perforation regions;
step 5.2, segmenting the guided-filtered fourth equivalent target plate image q_i' with the second curved-surface threshold: pixels whose gray level is less than the second curved-surface threshold are set to 1, otherwise to 0, obtaining a second segmented image of the dark-gray fragment pit regions;
and step 5.3, combining the first segmented image and the second segmented image (taking their union) to obtain a third segmented image containing both the fragment perforation and the fragment pit regions.
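Steps 5.1-5.3 amount to a dual-threshold segmentation whose two binary masks are merged by union; a minimal numpy sketch (the function name is illustrative, and the threshold arguments may be scalars or per-pixel curved-surface arrays thanks to broadcasting):

```python
import numpy as np

def dual_threshold_segment(img, surface1, surface2):
    """Bright fragment perforations exceed the first curved-surface
    threshold, dark fragment pits fall below the second; the third
    segmented image is the union of the two binary masks."""
    seg1 = (img > surface1).astype(np.uint8)  # first segmented image: bright perforations
    seg2 = (img < surface2).astype(np.uint8)  # second segmented image: dark pits
    return seg1 | seg2                        # third segmented image: perforations and pits
```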
4. The warhead fragment perforation and pit detection method of claim 3, wherein the step 6 specifically comprises:
for the third segmented image, the fragment perforation and pit regions are taken as seed-point regions, and a single-pixel dilation pre-operation is performed around each region; if the difference between the gray level of a dilated pixel in the guided-filtered fourth equivalent target plate image q_i' and the mean gray level of the corresponding seed-point region is within a threshold criterion R, the dilated pixel is set to 1 and the region grows; otherwise the region does not grow; the dilation pre-operation and region growing are repeated until no more pixels grow, finally obtaining the grown image.
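A pure-numpy sketch of this region-growing step; it simplifies the claim by comparing against the mean gray level of the whole current mask rather than per-seed-region means, and the value of the threshold criterion R is an assumed parameter:

```python
import numpy as np

def grow_regions(seed_mask, img, R=20.0):
    """Step 6 (simplified): repeatedly dilate the seed regions by one
    pixel and keep a new pixel only if its gray level is within R of the
    mean gray level of the current region."""
    mask = seed_mask.astype(bool).copy()
    while True:
        mean = img[mask].mean()                      # gray mean of the grown region
        # single-pixel 4-neighbourhood dilation via array shifts
        dil = mask.copy()
        dil[1:, :] |= mask[:-1, :]
        dil[:-1, :] |= mask[1:, :]
        dil[:, 1:] |= mask[:, :-1]
        dil[:, :-1] |= mask[:, 1:]
        new = dil & ~mask & (np.abs(img - mean) <= R)
        if not new.any():
            break                                    # no pixel grew: stop
        mask |= new
    return mask
```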
5. The warhead fragment perforation and pit detection method of claim 4, wherein the step 7 specifically comprises:
since the fragment perforation and pit regions in the grown image may contain pseudo targets, a range criterion W_n for the number of pixels contained in a fragment perforation or pit and a range criterion W_b for the ratio of its major axis to its minor axis are determined; the pixel count and shape ratio of each fragment perforation or pit region are calculated and compared with the range criteria, and any region falling outside either criterion is a pseudo target and is removed.
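A self-contained sketch of the pseudo-target filter; it labels 4-connected regions with a BFS flood fill and uses bounding-box extents as a stand-in for the major/minor axes (the claim does not specify how the axes are measured), and the default Wn and Wb ranges are illustrative assumptions:

```python
import numpy as np
from collections import deque

def remove_pseudo_targets(mask, Wn=(5, 500), Wb=(1.0, 3.0)):
    """Step 7 (simplified): keep only connected regions whose pixel count
    lies in Wn and whose major/minor extent ratio lies in Wb."""
    mask = mask.astype(bool)
    out = np.zeros_like(mask)
    seen = np.zeros_like(mask)
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                # BFS flood fill to collect one 4-connected region
                q, pix = deque([(i, j)]), []
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys, xs = zip(*pix)
                ext_y = max(ys) - min(ys) + 1
                ext_x = max(xs) - min(xs) + 1
                ratio = max(ext_y, ext_x) / min(ext_y, ext_x)
                # keep only regions inside both range criteria
                if Wn[0] <= len(pix) <= Wn[1] and Wb[0] <= ratio <= Wb[1]:
                    for y, x in pix:
                        out[y, x] = True
    return out
```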
CN202011405498.1A 2020-12-04 2020-12-04 Warhead fragment perforation and pit detection method Active CN112435252B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011405498.1A CN112435252B (en) 2020-12-04 2020-12-04 Warhead fragment perforation and pit detection method

Publications (2)

Publication Number Publication Date
CN112435252A CN112435252A (en) 2021-03-02
CN112435252B true CN112435252B (en) 2023-05-09

Family

ID=74691160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011405498.1A Active CN112435252B (en) 2020-12-04 2020-12-04 Warhead fragment perforation and pit detection method

Country Status (1)

Country Link
CN (1) CN112435252B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344990B (en) * 2021-04-27 2022-09-20 成都飞机工业(集团)有限责任公司 Hole site representation projection system and self-adaptive fitting hole site alignment method
CN114240990B (en) * 2021-12-07 2023-04-28 电子科技大学 SAR image point target segmentation method
CN114187289B (en) * 2021-12-23 2022-08-09 武汉市坤瑞塑胶模具制品有限公司 Plastic product shrinkage pit detection method and system based on computer vision

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2519611C1 (en) * 2013-02-26 2014-06-20 Сергей Михайлович Мужичек Method for determining characteristics of fragmentation field of ammunition, and device for its implementation
CN104331554A (en) * 2014-10-30 2015-02-04 中国工程物理研究院总体工程研究所 Method for collecting and analyzing warhead fragment target test data and reconstructing and representing fragment power field
CN104568983A (en) * 2015-01-06 2015-04-29 浙江工业大学 Active-omni-directional-vision-based pipeline inside functional defect detection device and detection method
CN106845372A (en) * 2016-12-31 2017-06-13 华中科技大学 The ship target detection recognition method and system of a kind of space remote sensing optical imagery
CN108896017A (en) * 2018-05-09 2018-11-27 西安工业大学 A kind of closely fried Fragment Group location parameter measurement of bullet and calculation method
CN110095410A (en) * 2019-05-07 2019-08-06 西北核技术研究所 Pattern measurement method, system and ballistic deflection measurement method are injured in target plate perforation
CN111366592A (en) * 2020-04-15 2020-07-03 西北核技术研究院 Automatic fragment detection system based on industrial photogrammetry
CN111488683A (en) * 2020-04-09 2020-08-04 西安工业大学 Fragment flying parameter determination method based on image processing technology

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Calculation Model and Method of Target Damage Efficiency Assessment Based on Warhead Fragment Dispersion; Hanshan Li et al.; IEEE Transactions on Instrumentation and Measurement; 2020-08-17; pp. 1-9 *
Experimental study on the combined damage of multi-layered composite structures subjected to close-range explosion of simulated warheads; Dian Li et al.; Accepted Manuscript; 2017-12-11; pp. 1-32 *
Research status of damage effects of energetic fragments; Lan Xiaoying; Aerodynamic Missile (飞航导弹); 2020-11; pp. 91-94 *
Study on the density distribution and motion law of near-burst projectile fragments; Sang Xiaoyue et al.; Machinery & Electronics (机械与电子); 2017-10; vol. 35, no. 10; pp. 36-39 *

Similar Documents

Publication Publication Date Title
CN112435252B (en) Warhead fragment perforation and pit detection method
US12094152B2 (en) Method for fully automatically detecting chessboard corner points
US11900634B2 (en) Method for adaptively detecting chessboard sub-pixel level corner points
CN109978839B (en) Method for detecting wafer low-texture defects
CN107093205B (en) A kind of three-dimensional space building window detection method for reconstructing based on unmanned plane image
TWI485650B (en) Method and arrangement for multi-camera calibration
CN105261022B (en) PCB board matching method and device based on outer contour
CN111488683B (en) Fragment flying parameter determination method based on image processing technology
CN107808161A (en) A kind of Underwater targets recognition based on light vision
CN114758222B (en) Concrete pipeline damage identification and volume quantification method based on PointNet ++ neural network
CN112258455A (en) Detection method for detecting spatial position of part based on monocular vision
CN106709500A (en) Image feature matching method
CN118279596B (en) Underwater fish sunlight refraction image denoising method and system
CN107680035B (en) Parameter calibration method and device, server and readable storage medium
CN113313116A (en) Vision-based accurate detection and positioning method for underwater artificial target
CN113705564B (en) Pointer type instrument identification reading method
CN114612418A (en) Method, device and system for detecting surface defects of mouse shell and electronic equipment
CN110223356A (en) A kind of monocular camera full automatic calibration method based on energy growth
CN117635874A (en) Fragment field three-dimensional rapid inversion system based on image multi-feature extraction and fusion algorithm
CN111861984B (en) Method and device for determining lung region, computer equipment and storage medium
CN109671084A (en) A kind of measurement method of workpiece shapes
CN112950565A (en) Method and device for detecting and positioning water leakage of data center and data center
CN114511894A (en) System and method for acquiring pupil center coordinates
CN116206140A (en) Crack image matching method based on binocular stereoscopic vision and U-Net neural network
CN107392936B (en) Target tracking method based on meanshift

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant