CN113049184A - Method, device and storage medium for measuring mass center - Google Patents

Method, device and storage medium for measuring mass center

Info

Publication number
CN113049184A
CN113049184A (publication) / CN202110366315.8A (application)
Authority
CN
China
Prior art keywords
point
image
mark
points
coding
Prior art date
Legal status
Pending
Application number
CN202110366315.8A
Other languages
Chinese (zh)
Inventor
关士成
崔爱莲
孙科杰
凌山珊
高洪飞
Current Assignee
UNIT 63853 OF PLA
Original Assignee
UNIT 63853 OF PLA
Priority date
Filing date
Publication date
Application filed by UNIT 63853 OF PLA filed Critical UNIT 63853 OF PLA
Priority to CN202110366315.8A
Publication of CN113049184A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 1/00: Testing static or dynamic balance of machines or structures
    • G01M 1/12: Static balancing; Determining position of centre of gravity
    • G01M 1/122: Determining position of centre of gravity

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a method, a device and a storage medium for measuring the center of mass, wherein the method comprises the following steps: sticking return light reflection marks on the surface of the measured object as measurement characteristic points; shooting the measured object with a professional camera under the irradiation of a light source at a specific position to obtain two or more quasi-binary images; processing the quasi-binary images with a digital photogrammetric system to obtain three-dimensional point cloud data of the object to be measured and establish an object three-dimensional coordinate system; and measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the object three-dimensional coordinate system, and determining the centroid coordinates of the object to be measured according to the coordinate value of the intersection point of two suspension point vertical lines. The method automatically extracts the spatial information of the photographed object, needs no special measuring equipment such as a platform or weighbridge, does not require measuring parameters such as the weight and size of the object, and ensures and improves measurement precision, reliability, measurement efficiency and the degree of automation.

Description

Method, device and storage medium for measuring mass center
Technical Field
The invention relates to the technical field of digital photogrammetry, in particular to a method, equipment and a storage medium for measuring a mass center.
Background
The position of the center of mass of an artillery piece is one of its important performance parameters and is closely related to its driving safety and firing stability. The centroid position has an important influence on the overall design and layout of the artillery, and measuring the height of the centroid is the difficult part of centroid position measurement.
At present, the existing centroid measuring methods mainly comprise the swinging method, the platform support reaction force method, the hoisting method and the suspension method. The swinging method requires complex equipment and is not suitable for vehicles of large mass and volume, which limits its application; the platform support reaction method needs special equipment, so the investment is large and the adoption rate is low; the hoisting method produces only small angle changes and small axial loads from the weight reaction, and therefore large errors; the suspension method requires little equipment, but involves work such as measuring dimensions after suspension, and the centroid is difficult to calculate.
Therefore, how to overcome the excessive measurement parameters, numerous measuring devices and high operational difficulty of conventional centroid height measurement is a technical problem to be solved urgently by those skilled in the art.
Disclosure of Invention
In view of this, the present invention aims to provide a method, a device and a storage medium for measuring a centroid, which can perform high-precision measurement and object space description well, and have high measurement precision, high measurement speed and high automation degree. The specific scheme is as follows:
a method of centroid measurement comprising:
sticking a return light reflection mark on the surface of a measured object as a measurement characteristic point;
shooting the measured object under the irradiation of a light source at a specific position by using a professional camera to obtain more than two quasi-binary images;
processing the quasi-binary image by using a digital photogrammetry system to obtain three-dimensional point cloud data of the object to be measured and establish an object three-dimensional coordinate system;
and measuring the positions of the suspension point vertical lines of the object to be detected at different suspension angles under the three-dimensional coordinate system of the object, and determining the centroid coordinate of the object to be detected according to the coordinate value of the intersection point of the two suspension point vertical lines.
Preferably, in the centroid measuring method provided by the embodiment of the present invention, the return light reflection mark is a mark point and a coded mark made of a return light reflection material; the mark points are positioned at the periphery of the coding mark;
the distance between every two mark points is measured by a reference ruler;
each coding mark has unique digital coding information which is used as a common point between different images so as to enable the quasi-binary image to be automatically spliced.
Preferably, in the centroid measuring method provided by the embodiment of the present invention, the mark point is a circular mark point; the coding mark is a dot coding mark; the coding mark determines the coding through different positions of the coding point in a predesigned coordinate system;
the light source is an annular flash lamp; and the optical axis of the annular flash lamp is coaxial with the optical axis of the professional camera lens.
Preferably, in the centroid measurement method provided in the embodiment of the present invention, the processing of the quasi-binary image to obtain the three-dimensional point cloud data of the object to be measured specifically includes:
identifying and positioning the coding mark from the quasi-binary image, and determining the image point coordinates of the mark point;
performing image matching and splicing on the quasi-binary image according to the image point coordinates of the coding mark and the mark point;
and resolving to obtain the three-dimensional point cloud data of the object to be detected by using a light beam adjustment method.
Preferably, in the centroid measuring method provided by the embodiment of the present invention, before identifying and locating the coding flag from the quasi-binary image, the method further includes:
adopting a Canny operator to carry out edge detection on the quasi-binary image;
performing edge tracking on the quasi-binary image with the boundary information obtained after the edge detection;
and judging the tracked edges, eliminating false edges or non-mark edges, and extracting the image point edges of the coding points and the image point edges of the mark points.
Preferably, in the centroid measuring method provided in the embodiment of the present invention, the identifying the coding flag from the quasi-binary image specifically includes:
finding image points of the template points in the quasi-binary image;
restoring the found image points to the coordinates in the pre-designed coordinate system through affine transformation, and simultaneously solving affine transformation parameters;
restoring the image points of the encoding points around the template point by using the affine transformation parameters;
and decoding the coding points and identifying the coding of the coding marks.
Preferably, in the centroid measuring method provided in the embodiment of the present invention, the locating the coding mark from the quasi-binary image, and determining the image point coordinates of the mark point specifically include:
determining the image point center coordinates of the coding points by adopting a least square method according to the image point edges of the coding points;
and determining the image point center coordinates of the mark points according to the center coordinates of the coding points and by combining the position relationship between the coding marks and the mark points.
Preferably, in the centroid measuring method provided in the embodiment of the present invention, the measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the three-dimensional coordinate system of the object, and determining the centroid coordinate of the object to be measured according to the coordinate value of the intersection point of the two suspension point vertical lines specifically includes:
a plurality of measurement characteristic points are adhered to the suspension point vertical line of the object to be measured;
under different suspension angles, obtaining the space three-dimensional coordinates of the measurement characteristic points pasted on the perpendicular line of the suspension point under the object three-dimensional coordinate system through characteristic extraction and space calculation;
fitting the space three-dimensional coordinates of the measurement characteristic points pasted on the hanging point vertical line by adopting a least square method to obtain a corresponding hanging point vertical line linear equation;
and combining two fitted suspension point vertical line linear equations, and solving to obtain the barycenter coordinate of the object to be detected.
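The line fitting and intersection steps above can be sketched in a few lines of linear algebra. The following is a minimal numpy sketch, not taken from the patent (function names are illustrative): each suspension point vertical line is fitted to its measured feature points by least squares (centroid plus principal direction from an SVD), and the centroid estimate is taken as the midpoint of the common perpendicular of the two fitted lines, since noisy measured lines rarely intersect exactly.

```python
import numpy as np

def fit_line(points):
    """Least-squares 3D line through noisy points: centroid + principal direction."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # First right-singular vector of the centered cloud is the best-fit direction
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def intersect_lines(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of two 3D lines p_i + t_i * d_i."""
    # Normal equations for min over t1, t2 of |(p1 + t1 d1) - (p2 + t2 d2)|^2
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Feature points measured along the plumb line at two different suspension angles
line1_pts = [[0, 2, 3], [1, 2, 3], [2, 2, 3], [3, 2, 3]]
line2_pts = [[1, 0, 3], [1, 1, 3], [1, 3, 3], [1, 4, 3]]
c1, d1 = fit_line(line1_pts)
c2, d2 = fit_line(line2_pts)
centroid_estimate = intersect_lines(c1, d1, c2, d2)
```

The midpoint form degrades gracefully when the two fitted lines are skew, which is the usual case with real measurement noise.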
The embodiment of the present invention further provides a centroid measuring device, which includes a processor and a memory, wherein the centroid measuring method provided by the embodiment of the present invention is implemented when the processor executes a computer program stored in the memory.
Embodiments of the present invention further provide a computer-readable storage medium for storing a computer program, where the computer program is executed by a processor to implement the above centroid measuring method provided by the embodiments of the present invention.
According to the technical scheme, the method for measuring the centroid provided by the invention comprises the following steps: sticking return light reflection marks on the surface of the measured object as measurement characteristic points; shooting the measured object with a professional camera under the irradiation of a light source at a specific position to obtain two or more quasi-binary images; processing the quasi-binary images with a digital photogrammetric system to obtain three-dimensional point cloud data of the object to be measured and establish an object three-dimensional coordinate system; and measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the object three-dimensional coordinate system, and determining the centroid coordinates of the object to be measured according to the coordinate value of the intersection point of two suspension point vertical lines.
The centroid measuring method provided by the invention is based on digital photogrammetry; its principle is simple and its operation is straightforward. The measurement is assisted by adding artificial marks with obvious characteristics to the measured object as measurement characteristic points, and the spatial information of the object to be measured can be extracted automatically. No special measuring equipment such as a platform or weighbridge is needed, and parameters such as the weight and size of the object need not be measured, so few parameters are required. This solves the problem of excessive measuring equipment and measured parameters in previous centroid height measurement, supports high-precision measurement and object space description, and ensures and improves measurement precision, reliability, measurement efficiency and the degree of automation. In addition, the invention provides corresponding equipment and a computer-readable storage medium for the centroid measuring method, giving it greater practicability; the equipment and the computer-readable storage medium have corresponding advantages.
Drawings
In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the related art, the drawings used in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present invention, and that those skilled in the art can obtain other drawings from the provided drawings without creative effort.
FIG. 1 is a flow chart of a centroid measurement method provided by an embodiment of the invention;
FIG. 2 is a schematic diagram of a landmark provided in an embodiment of the invention;
FIG. 3 is a diagram of an encoding flag according to an embodiment of the present invention;
FIG. 4 is a schematic view of a high refractive index glass microsphere provided in an embodiment of the present invention;
FIG. 5 is a diagram illustrating an incident angle and a deviation angle of a light source according to an embodiment of the present invention;
FIG. 6 is an elliptical imaging plot of a circular marker provided by an embodiment of the present invention;
fig. 7a to 7c are schematic diagrams of edge tracking according to embodiments of the present invention;
FIG. 8 is a schematic diagram of a design of dot code mark points according to an embodiment of the present invention;
FIG. 9 is a schematic illustration of an epipolar line provided in accordance with an embodiment of the present invention;
FIG. 10 is a schematic illustration of a corresponding epipolar line for multiple camera stations provided by an embodiment of the present invention;
FIG. 11 is an initial match result provided by an embodiment of the present invention;
FIG. 12 shows the exact match results provided by embodiments of the present invention;
FIG. 13 is a schematic diagram of a centroid measurement test provided by an embodiment of the present invention;
FIG. 14 is a schematic diagram of a first condition measurement provided in accordance with an embodiment of the present invention;
FIG. 15 is a schematic diagram of a second condition measurement provided by the embodiments of the present invention;
fig. 16 is a schematic diagram of a centroid space solution calculation method according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides a method for measuring the mass center, which comprises the following steps as shown in figure 1:
and S101, pasting a light reflection mark on the surface of the measured object as a measurement characteristic point.
It should be noted that most measured objects, especially weapon equipment, are of a single color and lack obvious feature points with good contrast, so the invention adds artificial marks with obvious characteristics to the measured object as measurement characteristic points to assist in completing the measurement process; this allows high-precision measurement and object space description and ensures and improves measurement precision, reliability and measurement efficiency.
S102, shooting a measured object under the irradiation of a light source at a specific position by using a professional camera to obtain more than two quasi-binary images.
In practical application, a professional camera is an ordinary digital camera packaged after high-precision calibration for high-precision three-dimensional measurement; it offers a stable mechanical structure, good optical performance, high resolution and large storage capacity, and is a component of the digital photogrammetric system. During shooting, the high-resolution professional measurement camera photographs the object to be measured from different positions and directions under the irradiation of a light source at a specific position. In the resulting image the model of the measured object is dim while the images of the marked feature points are bright and sharp (a quasi-binary image), so the model can be effectively distinguished from the background environment and the images of the measured feature points can be obtained clearly.
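Because a quasi-binary image contains only bright marker blobs on a dark background, extracting a measured feature point reduces to thresholding followed by grey-weighted centroiding. A minimal numpy sketch under that assumption (single-blob case only; the function name and threshold are illustrative, not from the patent):

```python
import numpy as np

def target_centroid(img, thresh=128):
    """Grey-weighted centroid (x, y) of the bright pixels of a single target
    in a quasi-binary image (dark background, one bright retroreflective blob)."""
    ys, xs = np.nonzero(img > thresh)
    w = img[ys, xs].astype(float)
    return np.array([(xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()])

# Synthetic quasi-binary image: one 3x3 bright blob on a dark background
img = np.zeros((50, 50))
img[10:13, 20:23] = 255
center = target_centroid(img)
```

A real pipeline would first label connected components so that each mark gets its own centroid; the grey weighting is what gives sub-pixel localization.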
And S103, processing the quasi-binary images with a digital photogrammetric system to obtain three-dimensional point cloud data of the object to be measured and establish an object three-dimensional coordinate system.
It should be noted that digital photogrammetry is a non-contact, large-scene measurement method: a professional camera acquires digital images of the measured object, from which the spatial three-dimensional coordinates of the object are obtained, completing the measurement of the object's shape, position, attitude, motion and so on. Photogrammetry's main advantages are high three-dimensional measurement precision, high measurement speed and a high degree of automation. Monocular photogrammetry uses a single photogrammetric sensor to capture images and complete the measurement of the position and attitude of the measured object within a spatial range; its composition is simple and convenient, and its calibration, measurement and calculation are clear. A monocular photogrammetric system is not easily limited by the field of view: by replacing the imaging lens, the spatial position and attitude of an object can be measured at the required measurement range and distance. The digital photogrammetric system adopted in the invention photographs the measured object with one (or several) high-resolution professional cameras, obtains quasi-binary digital images of the object using return light reflection marks, and obtains accurate three-dimensional space coordinates after computer image processing.
And S104, measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the three-dimensional coordinate system of the object, and determining the centroid coordinate of the object to be measured according to the coordinate value of the intersection point of the two suspension point vertical lines.
The centroid measuring method provided by the embodiment of the invention is based on digital photogrammetry; its principle is simple and its operation is straightforward. The measurement is assisted by adding artificial marks with obvious characteristics to the measured object as measurement characteristic points, and the spatial information of the measured object can be extracted automatically. No special measuring equipment such as a platform or weighbridge is needed, and parameters such as the weight and size of the object need not be measured, so few parameters are required. This solves the problem of excessive measuring equipment and measured parameters in previous centroid height measurement, supports high-precision measurement and object space description, and ensures and improves measurement precision, reliability, measurement efficiency and the degree of automation.
Further, in the centroid measuring method according to the embodiment of the present invention, in order to obtain a high-quality digital image, in step S101, as shown in fig. 2 and 3, the retroreflective markers may be marker points and coded markers made of retroreflective material. As shown in fig. 4, one side of the retroreflective material is composed of glass beads or microcrystalline cubes having a diameter of about 50 μm. Each of the microbeads has a cat eye or a reflecting prism function, and the direction of reflected light is the same as that of incident light. The mark points are positioned at the periphery of the coding mark; the distance between every two mark points is measured by a standard ruler, namely, two ends of the standard ruler are respectively fixed with an artificial mark point, the distance between the two mark points is known before measurement, and the standard ruler provides a high-precision length reference in the measurement process. Each coding mark has unique digital coding information and is used as a common point between different images to enable the quasi-binary images to be automatically spliced, namely the coding mark is an artificial mark with digital coding information, and the coding mark can be accurately identified and positioned and the images can be spliced.
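The role of the reference ruler can be illustrated with a small sketch: photogrammetric reconstruction recovers shape only up to an unknown scale, and the known distance between the two mark points fixed on the ruler fixes that scale for the whole point cloud. A hypothetical numpy example (function name and data are illustrative, not from the patent):

```python
import numpy as np

def apply_scale_bar(points, i, j, true_length):
    """Rescale a reconstructed point cloud so that the distance between the
    two scale-bar mark points (row indices i, j) equals the calibrated length."""
    pts = np.asarray(points, dtype=float)
    measured = np.linalg.norm(pts[i] - pts[j])
    return pts * (true_length / measured)

# Reconstruction placed the bar marks 2.0 units apart; calibrated length is 1.0 m
cloud = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [1.0, 5.0, 0.0]])
scaled = apply_scale_bar(cloud, 0, 1, 1.0)
```

Scaling every coordinate by the same factor preserves all angles and ratios; only absolute lengths change, which is exactly the degree of freedom the ruler constrains.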
In the quasi-binary image, the image of the target object is "blanked out" while the images of the return light reflection marks are particularly clear and prominent, forming a dark background containing only a group of generally circular or elliptical bright spots. The quasi-binary image enables quick, accurate and reliable positioning in high-precision mark image measurement.
Preferably, as shown in fig. 2, the mark points used in the present invention may be circular mark points. During measurement the circular mark points are mainly pasted around the coded mark points, and their coordinate information is read with the help of their position relationship to the surrounding coded marks. Because camera movement and offset angle are controlled within a certain range during measurement, the circular mark points appear in essentially all of the captured images, and their position relationship to the coded mark points is essentially unchanged; their matching information can therefore be obtained simply by analyzing the position relationship between a circular mark point and its adjacent coded marks, without any special identity compilation, that is, without coding.
Preferably, as shown in fig. 3, the coding mark used in the present invention may be a dot-shaped coding mark. A dot-shaped coding mark forms a digital code according to the distribution of dots on a plane. The dot-shaped coding mark is designed by introducing a coordinate system: the code is determined by the positions of the coding points in the pre-designed coordinate system, and whole-body translation and rotation of the coding points cannot produce the same code, so the coding principle belongs to absolute coding. Because the dot-shaped coding mark is composed of coding dots of the same size, a more complex algorithm is needed for identification; the identification process uses more conditions, and the identification algorithm is stable.
Preferably, the light source may be an annular flash lamp. As shown in fig. 5, the light reflection ability of the retro-reflective material depends on two angles, namely, the incident angle γ of light and the deviation angle β (or observation angle) of the light source. The light ray incidence angle gamma is the included angle between the axis of the flash lamp light source and the normal of the plane where the return light reflecting material is located; and the light source deviation angle beta is the included angle between the axis of the flash lamp and the optical axis of the camera. The smaller the light source deviation angle β, the stronger the reflectivity of the retro-reflective material. In order to reduce the angle beta, an annular flash lamp usually used for macro photography is selected and sleeved in front of a professional camera lens, so that the optical axis of the annular flash lamp is coaxial with that of the professional camera lens, namely, the angle beta is almost zero.
In specific implementation, in the centroid measurement method provided in the embodiment of the present invention, step S103 of processing the quasi-binary images to obtain the three-dimensional point cloud data of the object to be measured may specifically include: firstly, identifying and locating the coding marks in the quasi-binary images and determining the image point coordinates of the mark points; then performing image matching and stitching of the quasi-binary images according to the image point coordinates of the coding marks and the mark points; and finally resolving the three-dimensional point cloud data of the object to be measured by the bundle adjustment method.
It can be understood that the digital photogrammetry is to obtain two-dimensional image coordinates of a characteristic target (such as a circular return light reflection mark) by processing an image of the characteristic target in an image of a measured object, i.e. to locate the image of the characteristic target, and then to perform the measurement. If the method of software processing can be used for improving the positioning precision of the characteristic target on the image, the method is equivalent to directly improving the measurement precision. One of the key technologies in digital photogrammetry is the identification and positioning of markers, and in high-precision three-dimensional measurement, all subsequent processing depends on the identification and accurate positioning of markers in the initial image processing process. In the process of processing the image of the mark and performing sub-pixel positioning by a computer, two tasks, mark identification and mark positioning, are usually completed. The identification of the marker image requires that the marker image be uniquely detected in the image, and the localization is an accurate centering of the marker image.
Because the invention uses the circular artificial mark, the positioning precision of the center of the circular artificial mark influences the three-dimensional coordinate precision of the detected point in the space. As shown in fig. 6, the circular mark is a portion of an ellipse or cone after being imaged by the lens. To calculate the center of the circular marker, it can be implemented by precisely calculating the center of the elliptical image in the image. In order to achieve high-precision positioning of the center of the ellipse, the edge of the marker image can be firstly extracted, and then the accurate position of the center of the marker can be determined by identification and calculation according to the extracted edge of the marker image.
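The high-precision centering described above can be sketched as a least-squares conic fit to the extracted edge pixels, with the ellipse center obtained by setting the conic's gradient to zero. A minimal numpy sketch, assuming the ellipse does not pass through the image origin so the conic's constant term can be normalized to -1 (this normalization is a simplification, not the patent's stated algorithm):

```python
import numpy as np

def ellipse_center(xs, ys):
    """Fit the conic A x^2 + B xy + C y^2 + D x + E y = 1 to edge pixels by
    least squares, then solve for the center where the conic's gradient vanishes:
        2A xc + B yc + D = 0,   B xc + 2C yc + E = 0
    """
    M = np.column_stack([xs**2, xs * ys, ys**2, xs, ys])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(xs), rcond=None)[0]
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])

# Edge pixels of an ellipse centered at (3, 2) with semi-axes 2 and 1
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
center = ellipse_center(3 + 2 * np.cos(t), 2 + np.sin(t))
```

With noisy edge pixels the same fit averages the error over the whole contour, which is why center localization can reach sub-pixel accuracy.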
Therefore, in a specific implementation, in the centroid measuring method provided in the embodiment of the present invention, before identifying and locating the coding marks in the quasi-binary image, the method may further include: firstly, performing edge detection on the quasi-binary image with a Canny operator; then performing edge tracking on the quasi-binary image using the boundary information obtained from the edge detection; and finally judging the tracked edges, eliminating false or non-mark edges, and extracting the image point edges of the coding points and the image point edges of the mark points.
It should be noted that edges are a basic feature of an image: an edge is a set of pixels whose gray level exhibits a step or roof-like change, and edges widely exist between objects and the background. They manifest as discontinuities in the image (e.g. abrupt changes in gray level or texture). Discontinuities over a large range are called boundaries. The key to identifying the artificial mark image is the extraction of its edges.
Edge detection uses edge points to outline each object so as to analyze whether the image contains objects to be identified; its purpose is to highlight the edges of an image in order to extract image features. If a pixel falls on the boundary of an object in the image, its neighborhood becomes a band of gray-level variation. The two features most useful for characterizing this variation are the rate of change and the direction of the gray level, expressed respectively as the magnitude and direction of the gradient vector. Since edge points generally lie where the gray value changes sharply, that is, where the derivative of the gray value is large or extremal, the classical edge extraction methods examine the change of the gray value within a neighborhood of each pixel and detect edges from the behaviour of the first- or second-order directional derivatives near the edge; these are called gradient methods of edge detection. The present method adopts the Canny operator for edge detection: the Canny operator produces single-pixel edges, is insensitive to noise, and is suitable for extracting the edges of circular artificial marks.
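The gradient idea behind the classical method can be illustrated without implementing the full Canny operator (no Gaussian smoothing, non-maximum suppression or hysteresis here); this simplified numpy sketch just thresholds the gradient magnitude computed by central differences:

```python
import numpy as np

def gradient_edges(img, thresh):
    """Flag pixels whose gray-level gradient magnitude (central differences)
    exceeds a threshold -- only the gradient step of edge detection."""
    gy, gx = np.gradient(img.astype(float))   # gradients along rows, columns
    return np.hypot(gx, gy) > thresh

# Vertical step edge: dark left half, bright right half
img = np.zeros((10, 10))
img[:, 5:] = 255
edges = gradient_edges(img, 100)
```

The Canny operator refines exactly this output: smoothing suppresses noise, and non-maximum suppression thins the two-pixel-wide response above down to a single-pixel edge.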
The edge detection method proposed by Canny is optimal for step-type edges affected by white noise. The optimality of the Canny detector relates to three criteria: important edges must not be lost and false edges must not be generated; the deviation between the actual and detected edge positions must be minimal; and multiple responses to a single edge must be reduced to a single response. The third criterion is partly covered by the first, because when there are two responses to a single edge one of them should be considered false; it addresses edges affected by noise and works against non-smooth edge detection operators.
Canny derives a new detector based on several concepts as follows:
For a 1D signal and the first two optimality criteria, the edge detection operator can be expressed in closed form, and a complete solution can be obtained by the calculus of variations.
If the third criterion (multiple responses) is added, the optimal solution must be obtained by numerical optimization. The resulting optimal filter can be efficiently approximated, with an error of less than 20%, by the first derivative of a Gaussian smoothing filter, which is convenient to implement.
The edge detection operator is generalized to the 2D case. The step edge is determined by the position, direction and possibly magnitude (intensity). It can be shown that convolving the image with a symmetric 2D gaussian and differentiating along the gradient direction (perpendicular to the edge direction) constitutes a simple and efficient direction operator.
Let G be a 2D Gaussian smoothing operator, also called a Gaussian filter, whose expression is:

G(x, y) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²))    (1)
where x, y are the image coordinates and σ is the standard deviation of the associated probability distribution. The standard deviation σ is the only parameter of the Gaussian filter and is proportional to the size of the filter's operating neighborhood. Pixels farther from the operator center have less influence; pixels more than 3σ from the center have negligible influence.
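As a concrete illustration of equation (1) and the 3σ truncation rule, the following sketch (not part of the patent; function name and normalization choice are this example's assumptions) builds a discrete Gaussian filter kernel in Python with NumPy:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Discrete 2D Gaussian G(x, y) truncated at 3*sigma, since pixels
    farther than 3*sigma from the center have negligible influence."""
    r = int(np.ceil(3 * sigma))            # truncation radius
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return g / g.sum()                     # normalize so the weights sum to 1

k = gaussian_kernel(1.0)                   # a 7x7 kernel for sigma = 1
```

The kernel is normalized so that smoothing does not change the overall image brightness.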
The image is convolved with the operator G. Let G_n be the first derivative of G in the direction n:

G_n = ∂G/∂n = n · ∇G    (2)
the direction n should be perpendicular to the edge. Although this direction is not known a priori, a reliable estimate based on the direction of the smoothed gradient is available. If f is an image function, the normal n to the edge can be estimated as follows:
n = ∇(G ∗ f) / |∇(G ∗ f)|    (3)
An edge point is a local maximum, in the direction n, of the image f convolved with the operator G_n, so:

∂(G_n ∗ f)/∂n = 0    (4)
Substituting (2) into (4) gives:

∂²(G ∗ f)/∂n² = 0    (5)
Equation (5) shows how to find local maxima in the direction perpendicular to the edge; this operation is often referred to as non-maximum suppression.
Since convolution and differentiation are associative, equation (5) allows the image f to be convolved first with the symmetric Gaussian G, after which the second directional derivative is computed using the estimate of the direction n from equation (3). The strength of the edge (the gradient magnitude of the image function f) is measured as:

|G_n ∗ f| = |∇(G ∗ f)|    (6)
spurious responses to a single edge due to noise cause a so-called "streak" problem. In general, this problem is very common in edge detection. The output of the edge detection operator is typically thresholded to determine which edges are salient. Striation refers to the situation where the edge profile is broken, caused by fluctuations in the operator output above or below a threshold. The striation can be eliminated by a hysteresis thresholding process. If the edge response exceeds a high threshold, these pixel points constitute the deterministic output of the edge detection operator at a certain scale. Individual weak responses usually correspond to noise, but if these points are connected to some point with a strong response, they are likely to be true edges in the image. These connected pixels are treated as edge pixels when their response exceeds a low threshold. The low and high thresholds here need to be determined based on an estimate of the signal-to-noise ratio.
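The hysteresis thresholding described above can be sketched as follows; the array values, thresholds and function name are illustrative assumptions, not the patent's implementation:

```python
import numpy as np
from collections import deque

def hysteresis(mag, low, high):
    """Double-threshold edge linking: pixels above `high` are definite edges;
    pixels above `low` are kept only if connected (8-neighbourhood, traced
    by BFS) to a definite edge. Everything else is discarded as noise."""
    strong = mag >= high
    weak = mag >= low
    out = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    h, w = mag.shape
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and weak[ni, nj] and not out[ni, nj]:
                    out[ni, nj] = True
                    q.append((ni, nj))
    return out

# toy gradient-magnitude map: one strong response with a weak connected tail
mag = np.array([[0., 0., 0., 0.],
                [0., 9., 4., 0.],
                [0., 0., 4., 0.],
                [0., 0., 0., 3.]])
edges = hysteresis(mag, low=3.0, high=8.0)
```

The weak responses (magnitude 4 and 3) survive only because they chain back to the strong response at magnitude 9.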
The appropriate scale of the operator depends on the object conditions contained in the image. One approach to this problem is to use multiple scales and gather the resulting information. The different scales of canny detection operator are represented by the gaussian filter standard deviation σ. It is possible that there are several scales of operators that give a prominent response to an edge (i.e. the signal-to-noise ratio exceeds a threshold), in which case the operator with the smallest scale is selected because its positioning is most accurate.
The Canny algorithm is a multi-stage process. The image is first smoothed by Gaussian convolution. A simple two-dimensional first-order differential operation is then applied to the smoothed image to obtain the gradient image, candidate edge points are found with the non-maximum suppression algorithm, and finally the edge points are selected by recursive double thresholding, yielding a single-pixel-wide edge image.
The performance of the Canny operator is mainly determined by 3 parameters: the standard deviation σ of the Gaussian filter used in the smoothing stage, and the two thresholds h1 and h2 required in the tracking stage. Increasing the standard deviation of the Gaussian filter reduces the sensitivity of the detection to noise, but at the cost of losing some image detail and blurring the target edge, so the probability of edge detection error also increases slightly as the Gaussian spread parameter grows. The high threshold h2 controls the starting points of edge detection in the gradient map: the smaller h2 is, the more edge information is retained and the finer the resulting target edge, but more false edges are mixed in; as h2 increases, false edges are suppressed effectively, but genuine edge information may be lost. After a point above the high threshold is found, the low threshold h1 controls the end points of the current trace: the smaller h1 is, the more edge information is retained and the more continuous the edge; as h1 increases, fewer edge features of the target remain and the edge breaks more often. Different parameter settings produce different results on different images; σ = 2 is currently a popular choice.
In addition, the image obtained after Canny operator processing is a binary image carrying boundary information. Because of noise in the image, false edges are present. By raising the high threshold h2, some false edges can be eliminated, but some detail is also lost. Therefore, the invention obtains edges by an edge tracking method and then checks the obtained edges to judge whether they conform to the edge of a circular mark.
Edge tracking is mainly accomplished by the following steps:
First, the image is searched from the top left until a pixel of a new region is found. This pixel p0 has the smallest column index among the pixels with the smallest row index in the new region, and is the starting pixel of the region boundary. A variable dir is defined that stores the direction of the previous move along the boundary, from the previous boundary element to the current one; it is initialized as dir = 7. As shown in fig. 7a, 8 directions are defined in the neighborhood of each pixel.
Secondly, the 3 × 3 neighborhood of the current pixel is searched in the counterclockwise direction. As shown in fig. 7b, the search starts at direction (dir + 6) mod 8 when dir is odd, and at (dir + 7) mod 8 when dir is even, proceeding according to the search order of fig. 7c. The first pixel found with the same value as the current pixel is the new boundary element p_n, and the value of dir is updated.
Third, if the current boundary element p_n equals the second boundary element p1 and the previous boundary element p_(n-1) equals p0, stop; otherwise, repeat the second step.
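The three steps above can be sketched in Python as follows (a minimal illustration of the classical inner-boundary tracing algorithm; the direction encoding of fig. 7 is assumed to be counterclockwise with direction 0 pointing east):

```python
import numpy as np

# 8 neighbourhood directions, counter-clockwise; 0 = east (row, col offsets)
OFFS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def trace_boundary(img):
    """Inner-boundary tracing per the three steps above: start at the
    top-left-most region pixel p0 with dir = 7; search the 3x3 neighbourhood
    counter-clockwise from (dir + 7) % 8 (even dir) or (dir + 6) % 8 (odd dir);
    stop when p_n == p1 and p_(n-1) == p0. Assumes one region of True pixels."""
    rows, cols = np.nonzero(img)
    r0 = rows.min()
    c0 = cols[rows == r0].min()
    p0 = (r0, c0)                       # starting pixel of the region boundary
    boundary = [p0]
    cur, dir_ = p0, 7
    while True:
        start = (dir_ + 7) % 8 if dir_ % 2 == 0 else (dir_ + 6) % 8
        for step in range(8):
            d = (start + step) % 8
            nr, nc = cur[0] + OFFS[d][0], cur[1] + OFFS[d][1]
            if 0 <= nr < img.shape[0] and 0 <= nc < img.shape[1] and img[nr, nc]:
                nxt, dir_ = (nr, nc), d
                break
        else:
            return boundary             # isolated single pixel
        if len(boundary) >= 2 and cur == p0 and nxt == boundary[1]:
            break                       # p_(n-1) == p0 and p_n == p1: loop closed
        boundary.append(nxt)
        cur = nxt
    return boundary[:-1]                # drop the duplicate of p0 at the end

square = np.zeros((5, 5), dtype=bool)
square[1:4, 1:4] = True
b = trace_boundary(square)              # the 8 boundary pixels of the region
```

For the 3 × 3 region the trace visits the 8 boundary pixels once and terminates when it revisits p0 followed by p1.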
During shooting, changes of the camera position cause a certain deviation of the image of a circular mark point: the image of a circle is generally an ellipse, and recognizing a mark means extracting an elliptical edge that meets certain requirements. Because the image processed by the Canny operator may contain false edges or edges that do not belong to marks, edge tracking yields the mark edges but may track false edges as well; therefore, the tracked edges and the pixels they contain must be checked in order to recognize the circular marks.
In particular, geometric checks can verify whether an image spot meets the criteria of a mark image. These checks are the basis for locating the mark images. Geometric checks include checks of size, number of edge pixels (perimeter), aspect ratio, area and shape factor (circularity). The size may be taken as the perimeter of the spot edge. The perimeter of a pattern S can be derived by the boundary tracing method. With l the perimeter measured along the pixels of the mark image boundary,

l = s1 + √2 · s2

where s1 is the number of steps between 4-neighborhood pixels in the image edge and s2 is the number of diagonal (8-neighborhood only) steps. An image spot is not considered a mark image if its edge perimeter falls outside a given range. The invention sets a range for the perimeter of a candidate mark edge; if the edge of a candidate region is not within this range, the edge is removed. The ratio of two perpendicular extents of the image spot, called the aspect ratio, can also be used as a check criterion; the maximum extent, or the extent along the scanning direction, may be used. The area of a pattern S in the image is defined as the total number of pixels contained in the closed pattern; when the gray mark image is edge-tracked, the area A enclosed by the edge can be obtained. The shape factor (circularity) evaluates how closely the shape of the target object approaches a circle. The shape factor

k = 4πA / l²

is a parameter describing the geometric characteristics of a planar two-dimensional figure, with 0 < k ≤ 1 and k = 1 for a circle. When the imaging angle of a circular mark reaches 70 degrees, k is about 0.66. The elliptical shape changes with the photographing angle; if the shape factor of a candidate region does not meet the set value, the edge is likewise removed.
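A minimal sketch of the perimeter and shape factor checks, assuming a boundary point list such as the tracing step produces; the example boundary and radius are illustrative:

```python
import numpy as np

def perimeter(boundary):
    """Perimeter l = s1 + sqrt(2) * s2 of a closed boundary, where s1 counts
    steps between 4-neighbour pixels and s2 counts diagonal steps."""
    s1 = s2 = 0
    n = len(boundary)
    for i in range(n):
        (r1, c1), (r2, c2) = boundary[i], boundary[(i + 1) % n]
        if abs(r1 - r2) + abs(c1 - c2) == 1:
            s1 += 1
        else:
            s2 += 1
    return s1 + np.sqrt(2.0) * s2

def shape_factor(area, l):
    """Circularity k = 4*pi*A / l**2; k = 1 for an ideal circle."""
    return 4.0 * np.pi * area / l**2

# boundary of a 3x3 square region: 8 four-neighbour steps, so l = 8
b = [(1, 1), (2, 1), (3, 1), (3, 2), (3, 3), (2, 3), (1, 3), (1, 2)]
l = perimeter(b)
# for an ideal circle (A = pi*r**2, l = 2*pi*r) the factor is exactly 1
k_circle = shape_factor(np.pi * 10.0**2, 2.0 * np.pi * 10.0)
```

On very small discrete regions the discrete perimeter estimate makes k unreliable, which is one reason the checks are applied with tolerance ranges rather than exact values.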
After the geometric checks, most false edges and false mark images in the image are removed. However, when the background of the measurement site is complex, some false marks may remain, so further identification is required. Identification is generally performed using the information of the image itself, i.e., pixel gray level checks. The pixel gray level checks include the black-to-white ratio: the ratio of the minimal closed window area of the mark image containing non-zero (non-background) intensity to a given window area; it can be used as a mark image check criterion, the optimal value for a circular mark being 1/π. However, owing to discrete pixel sampling and perspective projection distortion, the admissible range of this ratio must be determined in advance from image information and experience. The pixel gray level checks also include a brightness criterion: the brightness values of the pixels already included in the light spot are used to decide whether the spot meets the mark image standard. Such a criterion may use the average, maximum or minimum of all pixels in the spot as the check condition; for example, based on the spot average, the spot image may be rejected when it is too dark or too bright.
In order to establish the image dimension checking standard and the ratio standard, the prior knowledge of the actual length of the mark, the image dimension, the expected perspective projection distortion and the like is needed to be known. These inspection standards typically remove the effects of false marks such as background light sources and specular reflections and other noise. The image spot shape factor can be used as a check criterion to eliminate image spots that are incorrectly shaped.
Further, in a specific implementation, in the centroid measuring method provided in the embodiment of the present invention, the identifying a coding flag from the quasi-binary image may specifically include: firstly, finding image points of template points in a quasi-binary image; then, restoring the found image points to the coordinates in a pre-designed coordinate system through affine transformation, and simultaneously solving affine transformation parameters; then, restoring the image points of the coding points around the template point by using the affine transformation parameters; and finally, decoding the coding points and identifying the coding of the coding marks.
It should be noted that the identification of the dot-coded marks can be regarded as a matching problem between point sets in two images. As shown in fig. 8, the five points A, B, C, D, E are the designed template points of the coding mark; these five template points can be regarded as the point set of one image, and the coding mark imaged after photographing is regarded as the point set of the other image. The identification method is to find the image points of the template points in the image, restore these points to the design coordinates through an affine transformation, and obtain the affine transformation parameters at the same time. The image points of the coding points around the template points are then restored with the affine transformation parameters and compared with the design coordinates, and the coding points are decoded to obtain the code of the coding mark. The identification process of a dot-coded mark is: calculate the cross ratio of the design template (as shown in fig. 8, the cross ratio of the pencil of four straight lines through A, C, D, E with point B as the center); search the image for point sets meeting the conditions and match them using the cross-ratio invariant to obtain the point set corresponding to the template points; obtain the affine parameters through the affine transformation; and restore the coding points with the affine parameters while decoding them to obtain the code of the coding mark.
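The cross-ratio invariant that makes this matching possible can be illustrated on four collinear points; the projective map below is an arbitrary example chosen for this sketch, not taken from the patent:

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross ratio (AC/BC)/(AD/BD) of four collinear points given by their
    parameters along the line; it is invariant under projective maps,
    which is what allows template points to be matched after imaging."""
    return ((a - c) / (b - c)) / ((a - d) / (b - d))

# four collinear points and their images under a projective map t -> (2t+1)/(t+3)
pts = np.array([0.0, 1.0, 2.0, 4.0])
proj = (2 * pts + 1) / (pts + 3)
r1 = cross_ratio(*pts)     # cross ratio of the original points
r2 = cross_ratio(*proj)    # identical cross ratio after the projective map
```

The same invariance holds for the pencil of lines through a point, which is the form used with point B as the center of the pencil.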
In a specific implementation, in the centroid measuring method provided in the embodiment of the present invention, locating the coding mark from the quasi-binary image, and determining the image point coordinates of the mark point may specifically include: firstly, determining the image point center coordinates of the coding points by adopting a least square method according to the image point edges of the coding points; then, the center coordinates of the image points of the mark points are determined according to the center coordinates of the coding points and the position relation of the coding marks and the mark points.
It should be noted that the high-precision positioning of the center of the circular artificial marker is based on the correct recognition of the marker image. The invention adopts a least square ellipse fitting method, and the premise of using the fitting method is that the characteristics of the target, such as the gray distribution of an image, the noise of a shadow mode, a measured object and the like, meet the known or assumed function form. The continuous function form of the target can be obtained by fitting the gray scale or the coordinates of the target in the discrete image, so that the sub-pixel positioning of the target by determining each parameter value describing the object can be carried out.
The least squares ellipse fitting method uses the coordinates of the point set on the circumference of the mark image to compute the five parameters of the ellipse by least squares fitting, and then computes the coordinates of the ellipse center from these parameters, thereby locating the center of the mark image. The point set on the circumference is provided by the edge detection and mark recognition algorithms. Five parameters describe the ellipse: the major semi-axis, the minor semi-axis, the two coordinates of the ellipse center, and the angle between the major semi-axis and the x-axis of the image coordinate system. The analytic expression of a planar ellipse is:
Ax2+2Bxy+Cy2+2Dx+2Ey+1=0 (7)
When the number of edge points is large, the coefficients A, B, C, D, E of the ellipse equation can be obtained by least squares fitting, and the 5 parameters of the ellipse can then be calculated from these coefficients. The calculation formulas are:
x0 = (BE − CD)/(AC − B²),  y0 = (BD − AE)/(AC − B²)    (8)

θ = (1/2) · arctan(2B/(A − C))    (9)
In formulas (8) and (9): (x0, y0) are the center coordinates of the fitted ellipse, and θ is the angle between the major semi-axis and the x-axis of the image coordinate system. Combining equations (7) and (9) yields the auxiliary quantity:

F0 = AE² + CD² + B² − 2BDE − AC    (10)

The major and minor semi-axes P1 and P2 of the ellipse can be obtained from the parameters in equation (10), as in formula (11):

P1,2 = √( 2F0 / ((AC − B²) · ((A + C) ∓ √((A − C)² + 4B²))) )    (11)
in order to suppress the influence of image noise and improve positioning accuracy, the boundary may be fitted multiple times. I.e. after the first fit, each boundary point is substituted into the above equation and the residual is calculated. Then, a part of points with larger residual errors are removed, and secondary ellipse fitting is carried out on the rest points. This process may be repeated several times until the mean square error is less than a certain threshold.
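The fitting procedure of equations (7) through (11), including the repeated refitting with residual-based point rejection, can be sketched as follows; the synthetic ellipse, iteration count and rejection fraction are illustrative assumptions:

```python
import numpy as np

def fit_ellipse(x, y, n_iter=3, keep=0.9):
    """Least squares fit of A x^2 + 2B xy + C y^2 + 2D x + 2E y + 1 = 0
    (equation (7)) to edge points, refitting after dropping the points with
    the largest residuals, then recovering centre, axis angle and semi-axes
    from the coefficients (formulas (8)-(11))."""
    for _ in range(n_iter):
        M = np.column_stack([x**2, 2*x*y, y**2, 2*x, 2*y])
        coef, *_ = np.linalg.lstsq(M, -np.ones_like(x), rcond=None)
        res = np.abs(M @ coef + 1.0)            # algebraic residual per point
        idx = np.argsort(res)[:max(6, int(keep * len(x)))]
        x, y = x[idx], y[idx]                   # keep the best-fitting points
    A, B, C, D, E = coef
    den = A*C - B**2
    x0 = (B*E - C*D) / den                      # centre, formula (8)
    y0 = (B*D - A*E) / den
    theta = 0.5 * np.arctan(2*B / (A - C))      # axis angle, formula (9)
    f0 = A*E**2 + C*D**2 + B**2 - 2*B*D*E - A*C # auxiliary quantity (10)
    root = np.sqrt((A - C)**2 + 4*B**2)
    axes = sorted(np.sqrt(2*f0 / (den * (A + C + s*root))) for s in (-1.0, 1.0))
    return x0, y0, theta, axes[1], axes[0]      # major, minor semi-axis (11)

# synthetic test ellipse: centre (3, -2), semi-axes 5 and 2, rotated 30 deg
t = np.linspace(0.0, 2*np.pi, 200, endpoint=False)
ca, sa = np.cos(np.pi/6), np.sin(np.pi/6)
xe = 3 + 5*np.cos(t)*ca - 2*np.sin(t)*sa
ye = -2 + 5*np.cos(t)*sa + 2*np.sin(t)*ca
x0, y0, theta, p1, p2 = fit_ellipse(xe, ye)
```

On noise-free synthetic data the fit recovers the centre, angle and semi-axes essentially exactly; with noisy edge points the refitting loop is what suppresses outliers.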
Next, an automatic matching of the manually coded mark with the mark point is performed. The automation of measurement is always a goal pursued by people, and the automatic matching of the manual coding mark and the mark point is a good shortcut for accelerating the measurement speed and realizing the measurement automation.
In digital industrial photogrammetry, because artificial mark points with regular shapes are used, the image points can be located accurately before matching, and image matching only needs to determine the same-name image points (i.e., the image points of the same mark point on different photos). Given the characteristics of digital photogrammetry, matching based on the epipolar constraint is the most commonly used. The epipolar principle is that one camera (or two cameras) photographs the same object from different positions (camera stations) to obtain two images of the measured object from different angles, called a stereopair (stereo model); in the field of computer vision this process is called binocular stereo vision.
FIG. 9 shows a stereopair in which the object point P is imaged as p1 on picture 1 and p2 on picture 2; p1 and p2 are called corresponding image points (also same-name image points). The object space point P and the projection centers s1 and s2 are coplanar; this plane is called the epipolar plane of the object point P. The intersection lines of the epipolar plane with the image planes (l1 and l2) are called epipolar lines. Obviously, the corresponding image points p1 and p2 must lie on the corresponding epipolar lines l1 and l2.
Epipolar lines are an important concept in photogrammetry. In the digital industrial photogrammetry, after the rough orientation and the image point positioning are finished, the corresponding epipolar lines of the image point on other images can be obtained, the image point matching range is converted from two-dimensional matching to one-dimensional matching, and the matching speed and the matching accuracy are greatly improved.
According to the epipolar principle, if the image point of an object point on one picture is known, the corresponding image points on the other pictures must lie on the corresponding epipolar lines. If exact or approximate values of the picture orientation parameters are known, the corresponding epipolar lines can be calculated.
Taking three camera stations as an example, as shown in FIG. 10, S1, S2 and S3 are the optical centers of the camera lenses, I1, I2 and I3 are the image planes, O1, O2 and O3 are the principal points, p′, p″ and p‴ are the corresponding image points of the object point P, and l12 and l13 are the epipolar lines of the image point p′ on I2 and I3.
Let the coordinates of a point in the image space coordinate system of camera station 1 be (x1, y1, z1) and its coordinates in the image space coordinate system of camera station 2 be (x, y, z); then the following holds:

(x, y, z)ᵀ = M2ᵀ · ( M1 · (x1, y1, z1)ᵀ + (Xs1 − Xs2, Ys1 − Ys2, Zs1 − Zs2)ᵀ )    (12)
In the above formula, (Xsi, Ysi, Zsi) are the translation parameters of camera station i (i = 1, 2) in the object space coordinate system, and M1 and M2 are the rotation matrices of camera station 1 and camera station 2 with respect to the object space coordinate system. In the image space coordinate system of camera station 1, the coordinates of S1 and p′ are the known values (0, 0, 0) and (x1, y1, −f), respectively. Thus, according to the above formula, the coordinates of S1 and p′ in the image space coordinate system of camera station 2 can be obtained; denote them (Xs12, Ys12, Zs12) and (x12, y12, z12).
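A sketch of this coordinate transfer between the two image space systems, assuming object coordinates X = M_i · p_i + T_i for station i; the z-axis rotation below is a simple stand-in for the full three-angle rotation matrix, and all numbers are illustrative:

```python
import numpy as np

def rot_z(a):
    """Rotation about the z-axis (a stand-in for the full three-angle
    rotation matrix M_i of a camera station)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def to_station2(p1, M1, T1, M2, T2):
    """Map a point from the image space system of station 1 into that of
    station 2, assuming object coordinates X = M_i @ p_i + T_i."""
    return M2.T @ (M1 @ p1 + T1 - T2)

M1, T1 = rot_z(0.3), np.array([10.0, 0.0, 0.0])
M2, T2 = rot_z(-0.2), np.array([0.0, 5.0, 0.0])
p1 = np.array([1.0, 2.0, -0.05])       # an image point (x1, y1, -f)
p2 = to_station2(p1, M1, T1, M2, T2)   # same point in station-2 coordinates
back = M1.T @ (M2 @ p2 + T2 - T1)      # round trip recovers the original point
```

The round trip works because the rotation matrices are orthonormal, so transforming into object space and back is lossless.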
Because S1, p′ and S2 are coplanar (the epipolar plane), and S2 is the origin of the image space coordinate system of camera station 2, the plane equation can be expressed as:

| x     y     z    |
| Xs12  Ys12  Zs12 | = 0    (13)
| x12   y12   z12  |
In camera station 2, the equation of the image plane is:
z=-f (14)
Substituting formula (14) into formula (13) gives the equation of the epipolar line l12 of the image point p′ on the image plane I2:

| x     y     −f   |
| Xs12  Ys12  Zs12 | = 0    (15)
| x12   y12   z12  |
Expanding the determinant in the above formula along its first row and denoting the 2 × 2 sub-determinants by k1 = Ys12·z12 − Zs12·y12, k2 = Xs12·z12 − Zs12·x12 and c = Xs12·y12 − Ys12·x12, the above formula can be abbreviated as:
k1x-k2y-c·f=0 (16)
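The coefficients k1, k2 and c are exactly the components of the cross product of S1 and p′ expressed in the camera-2 system, since the epipolar plane passes through the origin S2; a sketch with assumed coordinates:

```python
import numpy as np

def epipolar_line(S1_2, p1_2):
    """Coefficients (k1, k2, c) of the epipolar line k1*x - k2*y - c*f = 0
    on the image plane z = -f of camera 2. The epipolar plane passes through
    the origin S2, so its normal is the cross product of S1 and p' in the
    camera-2 system: n = (Ys*z12 - Zs*y12, Zs*x12 - Xs*z12, Xs*y12 - Ys*x12)."""
    n = np.cross(S1_2, p1_2)
    return n[0], -n[1], n[2]           # k1, k2, c

S1_2 = np.array([8.0, 1.0, 0.5])       # S1 in camera-2 coordinates (assumed)
p1_2 = np.array([0.4, -0.2, -1.5])     # image point p' in camera-2 coordinates (assumed)
f = 0.05
k1, k2, c = epipolar_line(S1_2, p1_2)
```

Any point of the epipolar plane, scaled onto the image plane z = −f, satisfies the line equation (16) by construction.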
In the same way, the equation of the epipolar line l13 of the image point p′ on the image plane I3 can be obtained.
Epipolar matching is carried out in two steps: the first step is initial matching, which determines the range of candidate same-name points; the second step is precise matching, which determines the unique same-name point. The specific process is as follows:
Initial matching: as shown in fig. 11, the same-name points on pictures 2 and 3 should theoretically lie on the corresponding epipolar lines. However, since there are errors in the camera parameters, the orientation parameters and the image point coordinates, in actual measurement the same-name points are usually offset from the epipolar line by a certain distance. Thus, given a threshold ε (related to the accuracy of the initial parameters), all image points whose perpendicular distance to the epipolar line is less than ε are initially treated as candidate same-name points, i.e.

|k1·x − k2·y − c·f| / √(k1² + k2²) < ε    (17)
As shown in FIG. 11, p″, p21, p22, p23 and p‴, p31, p32, p33 are the preliminarily found same-name points.
Precise matching: as shown in fig. 12, to reduce the ambiguity of matching and determine the unique same-name points of p′ on photos I2 and I3, take all the preliminary same-name points p″, p21, p22, p23 on photo I2 and find their corresponding epipolar lines lp3, l213, l223, l233 on photo I3; these intersect l13 at the points p3″, p213, p223, p233, respectively. Then, between the two groups p‴, p31, p32, p33 and p3″, p213, p223, p233, find the two points with the minimum distance; the corresponding points on I2 and I3 are the same-name points of the point p′ on I1. As shown in FIG. 12, the two closest points are p3″ and p‴, and the corresponding image points are p″ on I2 and p‴ on I3, i.e., the same-name image points of p′ are p″ and p‴.
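The initial matching test of formula (17) reduces to a point-to-line distance check; the line coefficients, candidate points and threshold below are illustrative assumptions:

```python
import numpy as np

def near_epipolar(x, y, k1, k2, c, f, eps):
    """Formula (17): keep an image point as a candidate same-name point if
    its perpendicular distance to the epipolar line k1*x - k2*y - c*f = 0
    is below the threshold eps."""
    d = abs(k1 * x - k2 * y - c * f) / np.hypot(k1, k2)
    return bool(d < eps)

# epipolar line x - y = 0 (k1 = 1, k2 = 1, c = 0) and three candidate points
cands = [(1.0, 1.01), (2.0, 2.0), (0.0, 3.0)]
hits = [near_epipolar(x, y, 1.0, 1.0, 0.0, 0.05, eps=0.05) for x, y in cands]
```

The first two candidates pass (distances 0.007 and 0), while the third is rejected, mirroring how points far from the epipolar line are discarded before precise matching.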
The epipolar matching above is based on three camera stations. If there are more than 3 camera stations, they can be grouped so that every 3 adjacent stations form a group and matched group by group; the matching results of all groups are finally combined into the final result.
In addition, in the process of fast resolving the self-checking beam adjustment, the precise coordinates of the mark points on the image can be obtained by processing the digital image of the measured target shot by a professional digital measuring camera. The rough orientation of the picture can be realized by identifying the coded mark, namely, the initial value of the external orientation element (camera parameter) of the picture is determined. On the basis, the accurate coordinates of the object space point can be calculated by using the adjustment of the light beam method as long as the image points with the same name (namely the image points of the same mark point on different images) on each image are determined.
The bundle adjustment method takes the image point coordinates as observations, takes each bundle of rays as the basic unit, forms the basic error equations from the collinearity equations, performs the adjustment over the whole block, and solves in an optimal way for the interior and exterior camera parameters, the space point coordinates, and so on. If the interior parameters and distortion parameters of the camera are added to the error equations as unknowns, so that the camera parameter values are computed together with the three-dimensional point coordinates and the exterior camera parameters, the method is called self-calibrating bundle adjustment. The distortion parameters of the camera can then be calibrated accurately without additional observations, which improves the measurement accuracy.
In a specific implementation, in the centroid measuring method provided in the embodiment of the present invention, step S104, measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the object three-dimensional coordinate system and determining the centroid coordinates of the object from the coordinates of the intersection point of two suspension point vertical lines, may specifically include: pasting a plurality of measurement feature points on a suspension point vertical line of the object to be measured; at different suspension angles, obtaining the spatial three-dimensional coordinates, in the object three-dimensional coordinate system, of the measurement feature points pasted on the vertical line through feature extraction and spatial calculation; fitting these spatial three-dimensional coordinates by the least squares method to obtain the corresponding linear equation of the vertical line; and combining the two fitted vertical line equations and solving them to obtain the centroid coordinates of the object to be measured.
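The last two operations, least squares fitting of a spatial straight line to the feature points and intersecting the two vertical lines, can be sketched as follows. Since measured lines are generally slightly skew, the "intersection" is taken here as the least squares point closest to both lines; all coordinates are synthetic:

```python
import numpy as np

def fit_line(pts):
    """Least squares spatial line through the measured feature points:
    returns the mean point and the unit direction (first principal axis)."""
    pts = np.asarray(pts, dtype=float)
    p0 = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - p0)
    return p0, vt[0]

def intersect_lines(p1, d1, p2, d2):
    """Point minimising the summed squared distance to two lines; for two
    exactly intersecting lines this is their intersection, for slightly
    skew measured lines it is the natural least squares compromise."""
    def perp(d):                       # projector onto the plane normal to d
        return np.eye(3) - np.outer(d, d) / d.dot(d)
    A = perp(d1) + perp(d2)
    b = perp(d1) @ p1 + perp(d2) @ p2
    return np.linalg.solve(A, b)

# synthetic feature points on two vertical lines through a common point (3, 2, 1)
c_true = np.array([3.0, 2.0, 1.0])
line_a = [c_true + t * np.array([0.0, 0.0, 1.0]) for t in (0.5, 1.0, 1.5, 2.0)]
line_b = [c_true + t * np.array([0.6, 0.0, 0.8]) for t in (0.5, 1.0, 1.5, 2.0)]
pa, da = fit_line(line_a)
pb, db = fit_line(line_b)
centroid = intersect_lines(pa, da, pb, db)
```

With four exactly collinear markers per line, the solver recovers the common point (3, 2, 1); with real measurements the same code yields the point closest to both fitted plumb lines.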
Specifically, the centroid measurement method provided by the embodiment of the present invention is described by taking an object to be measured as a certain type of artillery as an example:
firstly, pasting a measuring mark and a coding mark on the surface of the artillery, and placing a standard ruler which is accurately calibrated at a proper position.
And step two, shooting the artillery from different positions by using a professional camera.
And thirdly, resolving the three-dimensional coordinates of the center of the measurement mark by adopting a digital photogrammetry system, and establishing the three-dimensional point cloud data of the artillery.
And step four, establishing the coordinate origin and the coordinate system of the artillery. Specifically, as shown in fig. 13, the origin of coordinates O is determined as the intersection of the front wheel axis and the left-right symmetry plane; X axis (longitudinal): the horizontal straight line through the origin perpendicular to the front wheel axis, pointing rearward; Y axis (transverse): along the front wheel axis, pointing left when facing the traveling direction; Z axis (height): the straight line through the origin perpendicular to the horizontal plane, pointing upward. The position of the centroid in this coordinate system is denoted C. The artillery three-dimensional point cloud data of step three are subjected to scaling, translation, rotation and affine transformation to obtain the three-dimensional coordinates of the artillery in the artillery three-dimensional coordinate system.
And step five, suspending the artillery. Specifically, in order to obtain the suspension point vertical line more intuitively, a line pasted with measuring marks is hung at the starting point of the suspension line, i.e., the suspension point; this line is the suspension point vertical line of the artillery. Four feature points are pasted on the vertical line, so that in the next step a spatial straight line, namely the required suspension point vertical line of the artillery, can be fitted from the spatial coordinates of the feature points. The spatial position of the suspension point is kept unchanged during the whole suspension process. The suspension state of the artillery is shown in fig. 14; changing the length of the side locks gives the suspension state shown in fig. 15.
And step six, repeating steps two to four at different suspension angles to obtain complete images of the feature points on the artillery and of the feature points on the suspension point vertical line, and obtaining, through feature extraction and spatial calculation, the spatial three-dimensional coordinates of the marked feature points on the vertical line in the artillery coordinate system. As shown in fig. 16, after hoisting, the spatial three-dimensional coordinates of the feature points marked on the other vertical line can be obtained in the same way. The spatial point coordinates are fitted by the least squares method to obtain the linear equation of each suspension point vertical line in the artillery three-dimensional coordinate system; the two spatial line equations are combined and solved to obtain the centroid coordinates of the artillery.
The artillery mass center measuring method based on the photogrammetry technology is simple and clear in principle and easy and convenient to operate, can automatically extract the spatial information of the shot artillery, does not need special measuring equipment such as a weighbridge and a platform, does not need to measure the weight, the size and other parameters of the artillery, and can well solve the problems of excessive measuring parameters, more measuring equipment and higher operation difficulty in the traditional artillery mass center measurement. Actual measurement shows that the measuring method can meet the precision requirement of the centroid measurement, is particularly suitable for artillery and equipment provided with a hoisting device, and is a meaningful exploration of the centroid measuring method.
The result of measuring the centroid by the platform support reaction method is used as the reference value for comparison. The comparison of the results of the two measurement methods shows that the centroid coordinate error is within the allowable range.
Correspondingly, the embodiment of the invention also discloses a centroid measuring device, which comprises a processor and a memory; wherein the centroid measurement method disclosed in the aforementioned embodiments is implemented when the processor executes the computer program stored in the memory.
For more specific processes of the above method, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
Further, the present invention also discloses a computer readable storage medium for storing a computer program; the computer program when executed by a processor implements the centroid measurement method disclosed previously.
For more specific processes of the above method, reference may be made to corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another. Since the device and the storage medium disclosed in the embodiments correspond to the method disclosed in the embodiments, their description is brief, and reference may be made to the description of the method for the relevant points.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The centroid measurement method provided by the embodiment of the invention comprises the following steps: sticking return light reflection marks on the surface of a measured object as measurement characteristic points; shooting the measured object with a professional camera, under the irradiation of a light source at a specific position, to obtain two or more quasi-binary images; processing the quasi-binary images with a digital photogrammetric system to obtain three-dimensional point cloud data of the measured object and to establish an object three-dimensional coordinate system; and measuring, under the object three-dimensional coordinate system, the positions of the suspension point vertical lines of the measured object at different suspension angles, and determining the centroid coordinates of the measured object from the coordinate value of the intersection point of the two suspension point vertical lines. This centroid measurement method is based on digital photogrammetry. Its principle is simple and clear and its operation is straightforward: by adding artificial marks with distinctive features on the measured object as measurement characteristic points to assist the measurement process, the spatial information of the photographed object can be extracted automatically. No special measuring equipment such as a platform or a weighbridge is required, parameters such as the weight and size of the object need not be measured, and fewer parameters are needed overall. The method therefore solves the problem of excessive measuring equipment and measurement parameters in previous centroid measurement, achieves high-precision measurement and spatial description of the object, and ensures and improves measurement precision, reliability, efficiency and the degree of automation.
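The least-squares line fitting used for the suspension point vertical lines can be sketched as follows (an illustrative formulation; the patent does not specify the exact computation): the fitted line passes through the mean of the marked points, and its direction is the dominant eigenvector of the 3x3 scatter matrix, obtained here by simple power iteration:

```python
def fit_line_3d(points, iters=50):
    """Least-squares line through a set of 3-D points.

    Returns (point_on_line, direction). The line passes through the
    mean of the points; the direction is the dominant eigenvector of
    the 3x3 scatter matrix, found by power iteration.
    """
    n = len(points)
    mean = tuple(sum(p[i] for p in points) / n for i in range(3))
    centered = [tuple(p[i] - mean[i] for i in range(3)) for p in points]
    # Scatter matrix S[i][j] = sum over points of centered_i * centered_j.
    S = [[sum(c[i] * c[j] for c in centered) for j in range(3)] for i in range(3)]
    v = (1.0, 1.0, 1.0)                       # arbitrary start vector
    for _ in range(iters):
        w = tuple(sum(S[i][j] * v[j] for j in range(3)) for i in range(3))
        norm = sum(x * x for x in w) ** 0.5
        v = tuple(x / norm for x in w)
    return mean, v

# Collinear points along direction (1, 2, 2) starting from (0, 0, 1):
pts = [(t, 2 * t, 1 + 2 * t) for t in range(5)]
origin, direction = fit_line_3d(pts)
print(origin, direction)  # origin (2.0, 4.0, 5.0); direction ≈ (1/3, 2/3, 2/3)
```

With noisy measured marker coordinates, the same routine returns the best-fit line in the total-least-squares sense, which is what the plumb-line fitting step requires.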
In addition, the invention also provides corresponding equipment and a computer readable storage medium aiming at the mass center measuring method, so that the mass center measuring method has higher practicability, and the equipment and the computer readable storage medium have corresponding advantages.
Finally, it is also noted that, in the present invention, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The centroid measurement method, device and storage medium provided by the invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A centroid measurement method, comprising:
sticking a return light reflection mark on the surface of a measured object as a measurement characteristic point;
shooting the measured object under the irradiation of a light source at a specific position by using a professional camera to obtain two or more quasi-binary images;
processing the quasi-binary image by using a digital photogrammetry system to obtain three-dimensional point cloud data of the object to be measured and establish an object three-dimensional coordinate system;
and measuring the positions of the suspension point vertical lines of the object to be measured at different suspension angles under the three-dimensional coordinate system of the object, and determining the centroid coordinate of the object to be measured according to the coordinate value of the intersection point of the two suspension point vertical lines.
2. The centroid measurement method as claimed in claim 1, wherein said return light reflection mark comprises mark points and coding marks made of a return light reflection material; the mark points are positioned at the periphery of the coding mark;
the distance between every two mark points is measured by a reference ruler;
each coding mark has unique digital coding information which is used as a common point between different images so as to enable the quasi-binary image to be automatically spliced.
3. The centroid measurement method according to claim 2, wherein said mark points are circular mark points; the coding marks are dot-type coding marks; and the code of each coding mark is determined by the positions of its coding points in a pre-designed coordinate system;
the light source is an annular flash lamp; and the optical axis of the annular flash lamp is coaxial with the optical axis of the professional camera lens.
4. The centroid measurement method according to claim 3, wherein the processing of the quasi-binary image to obtain the three-dimensional point cloud data of the object to be measured specifically comprises:
identifying and positioning the coding mark from the quasi-binary image, and determining the image point coordinates of the mark point;
performing image matching and splicing on the quasi-binary image according to the image point coordinates of the coding mark and the mark point;
and solving for the three-dimensional point cloud data of the object to be measured by using the bundle adjustment method.
5. The centroid measurement method as claimed in claim 4, wherein before identifying and locating said coding mark from said quasi-binary image, the method further comprises:
adopting a Canny operator to carry out edge detection on the quasi-binary image;
performing edge tracking on the quasi-binary image with the boundary information obtained after the edge detection;
and judging the tracked edges, eliminating false edges or non-mark edges, and extracting the image point edges of the coding points and the image point edges of the mark points.
6. The centroid measurement method according to claim 5, wherein identifying the coding mark from the quasi-binary image specifically comprises:
finding image points of the template points in the quasi-binary image;
restoring the found image points to the coordinates in the pre-designed coordinate system through affine transformation, and simultaneously solving affine transformation parameters;
restoring the image points of the encoding points around the template point by using the affine transformation parameters;
and decoding the coding points and identifying the coding of the coding marks.
7. The centroid measurement method according to claim 6, wherein locating the coding mark from the quasi-binary image and determining the image point coordinates of the mark point specifically comprises:
determining the image point center coordinates of the coding points by adopting a least square method according to the image point edges of the coding points;
and determining the image point center coordinates of the mark points according to the center coordinates of the coding points and by combining the position relationship between the coding marks and the mark points.
8. The centroid measurement method according to claim 7, wherein the positions of the suspension point vertical lines of the object to be measured at different suspension angles in the three-dimensional coordinate system of the object are measured, and the centroid coordinate of the object to be measured is determined according to the coordinate value of the intersection point of the two suspension point vertical lines, and specifically comprises:
adhering a plurality of measurement characteristic points to the suspension point vertical line of the object to be measured;
under different suspension angles, obtaining the space three-dimensional coordinates of the measurement characteristic points pasted on the perpendicular line of the suspension point under the object three-dimensional coordinate system through characteristic extraction and space calculation;
fitting the space three-dimensional coordinates of the measurement characteristic points pasted on the hanging point vertical line by adopting a least square method to obtain a corresponding hanging point vertical line linear equation;
and combining two fitted suspension point vertical line linear equations, and solving to obtain the barycenter coordinate of the object to be detected.
9. A centroid measurement device, comprising a processor and a memory, wherein said processor, when executing a computer program stored in said memory, implements the centroid measurement method as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium for storing a computer program, wherein the computer program, when executed by a processor, implements the centroid measurement method as claimed in any one of claims 1 to 8.
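The least-squares center determination of claim 7 can be illustrated with the algebraic (Kåsa) circle fit, shown here only as a sketch of one common least-squares formulation rather than the patent's specified procedure: the circle through the extracted edge points is written as x² + y² + Dx + Ey + F = 0, the overdetermined linear system for D, E, F is solved in the least-squares sense, and the center is (-D/2, -E/2):

```python
import math

def fit_circle_center(points):
    """Algebraic (Kasa) least-squares circle fit to 2-D edge points.

    Solves x^2 + y^2 + D*x + E*y + F = 0 in the least-squares sense via
    the 3x3 normal equations; the circle center is (-D/2, -E/2).
    """
    Sxx = sum(x * x for x, y in points)
    Sxy = sum(x * y for x, y in points)
    Syy = sum(y * y for x, y in points)
    Sx = sum(x for x, y in points)
    Sy = sum(y for x, y in points)
    n = len(points)
    # Right-hand side: rows of A are [x, y, 1], b = -(x^2 + y^2).
    b = [-(x * x + y * y) for x, y in points]
    tx = sum(bi * x for bi, (x, y) in zip(b, points))
    ty = sum(bi * y for bi, (x, y) in zip(b, points))
    t1 = sum(b)
    A = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    rhs = [tx, ty, t1]
    # Solve the 3x3 normal equations by Gaussian elimination with pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    sol = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        sol[r] = (rhs[r] - sum(A[r][c] * sol[c] for c in range(r + 1, 3))) / A[r][r]
    D, E, F = sol
    return (-D / 2, -E / 2)

# Points sampled on a circle of radius 5 centered at (3, 4):
pts2d = [(3 + 5 * math.cos(a), 4 + 5 * math.sin(a)) for a in (0, 1, 2, 3, 4, 5)]
cx, cy = fit_circle_center(pts2d)
print(round(cx, 6), round(cy, 6))  # ≈ 3.0 4.0
```

For real edge data, iterative geometric fits can refine this algebraic estimate, but the closed-form solution above is usually accurate enough for sub-pixel center localization of circular marks.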
CN202110366315.8A 2021-04-06 2021-04-06 Method, device and storage medium for measuring mass center Pending CN113049184A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110366315.8A CN113049184A (en) 2021-04-06 2021-04-06 Method, device and storage medium for measuring mass center

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110366315.8A CN113049184A (en) 2021-04-06 2021-04-06 Method, device and storage medium for measuring mass center

Publications (1)

Publication Number Publication Date
CN113049184A true CN113049184A (en) 2021-06-29

Family

ID=76517602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110366315.8A Pending CN113049184A (en) 2021-04-06 2021-04-06 Method, device and storage medium for measuring mass center

Country Status (1)

Country Link
CN (1) CN113049184A (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5034896A (en) * 1990-01-26 1991-07-23 The Boeing Company Method and apparatus for real time estimation of aircraft center of gravity
EP0488292A2 (en) * 1990-11-29 1992-06-03 Matsushita Electric Industrial Co., Ltd. Three-dimensional shape data reading system
JPH0564749U (en) * 1992-02-10 1993-08-27 本田技研工業株式会社 Three-dimensional center of gravity measuring device
JP2009270915A (en) * 2008-05-07 2009-11-19 Kagawa Univ Method and device for measuring three-dimensional shape
CN101995219A (en) * 2010-11-05 2011-03-30 天津工业大学 Three-point coding mark point based method for measuring key points of vehicle frame
JP2012002558A (en) * 2010-06-15 2012-01-05 Yamato Scale Co Ltd Gravity center position measurement method and device therefor
CN102359847A (en) * 2011-09-01 2012-02-22 中联重科股份有限公司 Determining method of height of gravity center of object and determining device
CN102359846A (en) * 2011-09-01 2012-02-22 中联重科股份有限公司 Object gravity-center height measuring method and measuring device thereof
CN103310215A (en) * 2013-07-03 2013-09-18 天津工业大学 Detecting and identifying method for annular coding mark point
JP2013228334A (en) * 2012-04-26 2013-11-07 Topcon Corp Three-dimensional measuring system, three-dimensional measuring method and three-dimensional measuring program
CN104154877A (en) * 2014-09-03 2014-11-19 中国人民解放军国防科学技术大学 Three-dimensional reconstruction and size measurement method of complex convex-surface object
CN104458124A (en) * 2014-11-27 2015-03-25 江西洪都航空工业集团有限责任公司 Barycenter measuring method
CN104930985A (en) * 2015-06-16 2015-09-23 大连理工大学 Binocular vision three-dimensional morphology measurement method based on time and space constraints
CN110715769A (en) * 2019-10-23 2020-01-21 浙江理工大学 Method for calibrating stress point position of weighing sensor of multi-point method centroid measuring equipment
CN112362238A (en) * 2020-11-13 2021-02-12 中国第一汽车股份有限公司 Gravity center measuring method


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
上官文斌; 贺良勇; 田子龙; 黄兴; 徐驰: "Development of a test rig for measuring the center of mass and inertial parameters of automotive powertrains", Journal of Vibration Engineering, no. 02, 15 April 2010 (2010-04-15) *
李沛; 罗武胜; 李冠章: "A method for measuring aircraft spin attitude based on binocular vision", Journal of National University of Defense Technology, no. 02 *
王志军; 马凯: "Research on the extraction accuracy of feature-point centroids based on imaging angle", Laser Journal, no. 04 *
申茂盛: "Research on a centroid localization algorithm based on binocular vision", Software, no. 12, pages 69-74 *
马开锋: "Close-range photogrammetry and data processing for thermal deformation of satellite antenna surfaces", 30 November 2018, Beijing: Geological Publishing House, page 34 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113358325A (en) * 2021-07-02 2021-09-07 中国空气动力研究与发展中心低速空气动力研究所 Wind tunnel object throwing position and posture measuring method, device, equipment and storage medium
CN113884234A (en) * 2021-09-06 2022-01-04 中国科学院合肥物质科学研究院 Complementary single-pixel centroid detection system and method
CN113884234B (en) * 2021-09-06 2023-06-09 中国科学院合肥物质科学研究院 Complementary single-pixel centroid detection system and method
US11940348B1 (en) 2021-09-06 2024-03-26 Hefei Institutes Of Physical Science, Chinese Academy Of Sciences System and method for detecting centroid of complementary single pixel
CN114001860A (en) * 2021-10-13 2022-02-01 中信重工机械股份有限公司 Method for measuring mass center of large plate component in non-contact manner
CN114001860B (en) * 2021-10-13 2023-09-15 中信重工机械股份有限公司 Non-contact type method for measuring mass center of large plate member
CN114440834A (en) * 2022-01-27 2022-05-06 中国人民解放军战略支援部队信息工程大学 Object space and image space matching method of non-coding mark
CN114440834B (en) * 2022-01-27 2023-05-02 中国人民解放军战略支援部队信息工程大学 Object space and image space matching method of non-coding mark
CN118134995A (en) * 2024-04-30 2024-06-04 成都飞机工业(集团)有限责任公司 Method, device, equipment and storage medium for arranging measurement points of long and narrow patches

Similar Documents

Publication Publication Date Title
CN113049184A (en) Method, device and storage medium for measuring mass center
CN106651942B (en) Three-dimensional rotating detection and rotary shaft localization method based on characteristic point
CN109801333B (en) Volume measurement method, device and system and computing equipment
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
US20080205748A1 (en) Structural light based depth imaging method and system using signal separation coding, and error correction thereof
CN111709985B (en) Underwater target ranging method based on binocular vision
CN113324478A (en) Center extraction method of line structured light and three-dimensional measurement method of forge piece
CN104376328B (en) Coordinate-based distributed coding mark identification method and system
CN112184765A (en) Autonomous tracking method of underwater vehicle based on vision
CN116563377A (en) Mars rock measurement method based on hemispherical projection model
CN113963067B (en) Calibration method for calibrating large-view-field visual sensor by using small target
CN113888641A (en) Stumpage breast diameter measurement method based on machine vision and deep learning
CN115790539B (en) Cooperative target underwater photogrammetry method
CN114596355B (en) High-precision pose measurement method and system based on cooperative targets
CN109815966A (en) A kind of mobile robot visual odometer implementation method based on improvement SIFT algorithm
CN107131889B (en) Full-angle imaging reference ruler
Frangione et al. Multi-step approach for automated scaling of photogrammetric micro-measurements
CN110345866B (en) Measuring device and method for hole measurement of handheld scanner
CN115330832A (en) Computer vision-based transmission tower full-freedom displacement monitoring system and method
CN113095324A (en) Classification and distance measurement method and system for cone barrel
JP4546155B2 (en) Image processing method, image processing apparatus, and image processing program
CN113192029A (en) Welding seam identification method based on ToF
Zhang et al. Automatic Extrinsic Parameter Calibration for Camera-LiDAR Fusion using Spherical Target
CN114897968B (en) Method and device for determining vehicle vision, computer equipment and storage medium
Bernat et al. Automation of measurements of selected targets of photopoints in application to photogrammetric reconstruction of road accidents

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination