CN116051792A - Product graph generation method, device, computer equipment and storage medium - Google Patents

Product graph generation method, device, computer equipment and storage medium

Info

Publication number
CN116051792A
CN116051792A (application number CN202310106894.1A)
Authority
CN
China
Prior art keywords
measurement
detection frame
depth image
detection
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310106894.1A
Other languages
Chinese (zh)
Inventor
肖寒
何苗
莫宇
刘枢
吕江波
沈小勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Smartmore Technology Co Ltd
Original Assignee
Shenzhen Smartmore Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Smartmore Technology Co Ltd filed Critical Shenzhen Smartmore Technology Co Ltd
Priority to CN202310106894.1A priority Critical patent/CN116051792A/en
Publication of CN116051792A publication Critical patent/CN116051792A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
                • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
                    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
                    • G01B 11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 19/00 Manipulating 3D models or images for computer graphics
                • G06T 7/00 Image analysis
                    • G06T 7/0002 Inspection of images, e.g. flaw detection
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10028 Range image; Depth image; 3D point clouds

Abstract

The application discloses a product map generation method, an apparatus, a computer device, and a storage medium. The method includes: acquiring a first depth image of a calibration block at a first measurement view angle and a second depth image of the calibration block at a second measurement view angle, the first measurement view angle being different from the second; performing first frame selection processing on the first depth image according to the first measurement view angle to obtain a first detection frame, and performing second frame selection processing on the second depth image according to the second measurement view angle to obtain a second detection frame, the two frame selection processes being different; and determining a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame. With this method and apparatus, a product can be measured at multiple measurement view angles with high measurement accuracy.

Description

Product graph generation method, device, computer equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a product map generation method and apparatus, a computer device, and a storage medium.
Background
With the development of product measurement technology, a method for measuring attribute characteristics of a product to be measured by using 3D measurement is widely applied to actual production. Wherein the attribute features may be 3D size, flatness, etc.
In practical applications, 3D measurement of the product to be measured is usually performed with a calibration block and a camera together. The calibration block is a standard sample designed in advance for the product to be measured. In 3D measurement, a reference coordinate system is established with the camera as a reference, and the camera measures the coordinate parameters of a given measurement position in that coordinate system. Typically, multiple acquisition cameras are needed to capture multiple depth images of the calibration block, so the depth images captured by the different cameras must be converted into a unified world coordinate system to obtain the coordinate parameters of each measurement position and thereby determine the attribute features of the product to be measured.
However, the 3D measurement method in the related art has low measurement accuracy and cannot be adapted to diversified 3D measurement items.
Disclosure of Invention
To solve the above technical problems, the present application provides a product map generation method, an apparatus, a computer device, and a storage medium, which can achieve high measurement accuracy in application scenarios where a product is measured at multiple measurement view angles.
In a first aspect, an embodiment of the present application provides a method for generating a product map, including:
acquiring a first depth image corresponding to a calibration block at a first measurement view angle and a second depth image corresponding to the calibration block at a second measurement view angle, respectively; the first measurement view angle is different from the second measurement view angle;
performing first frame selection processing on the first depth image according to the first measurement view angle to obtain a first detection frame, and performing second frame selection processing on the second depth image according to the second measurement view angle to obtain a second detection frame; the first frame selection processing is different from the second frame selection processing;
and determining a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
In a second aspect, an embodiment of the present application provides a product map generating apparatus, including:
the acquisition unit is used for respectively acquiring a first depth image corresponding to the calibration block under a first measurement view angle and a second depth image corresponding to the calibration block under a second measurement view angle; the first measurement viewing angle is different from the second measurement viewing angle;
the processing unit is used for performing first frame selection processing on the first depth image according to the first measurement view angle to obtain a first detection frame, and performing second frame selection processing on the second depth image according to the second measurement view angle to obtain a second detection frame; the first frame selection processing is different from the second frame selection processing;
the determining unit is used for determining a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
In a third aspect, an embodiment of the present application provides a computer device including a processor and a memory, where the memory stores a computer program and the processor implements the product map generation method described above when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the product map generation method described above.
It can be seen that the embodiment of the present application provides a product map generation method. First, a first depth image of the calibration block at a first measurement view angle and a second depth image at a second measurement view angle are acquired, the two measurement view angles being different. Because the calibration block exhibits different characteristics at different measurement view angles, a different frame selection process is chosen for the depth image at each measurement view angle, so that the detection frame for each view angle matches the characteristics of the calibration block at that view angle. The detection frames can therefore identify the region of the calibration block at each measurement view angle more accurately, which benefits the subsequent measurement process and yields high measurement accuracy when a product is measured at multiple measurement view angles. Finally, a point cloud measurement map of the product to be measured corresponding to the calibration block is determined from the point set in the first detection frame and the point set in the second detection frame, so that the attribute features of the product to be measured can be determined from the point cloud measurement map.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for generating a product graph according to an embodiment of the present application;
fig. 2a is a schematic diagram of a binary image corresponding to a first depth image according to an embodiment of the present application;
fig. 2b is a schematic diagram of a first detection frame according to an embodiment of the present application;
fig. 3a is a schematic diagram of a binarized image corresponding to a second depth image according to an embodiment of the present application;
FIG. 3b is a schematic diagram of a pending binarized image according to an embodiment of the present application;
FIG. 3c is a schematic diagram of a second detection frame according to an embodiment of the present disclosure;
FIG. 4a is a schematic diagram of a top view of a calibration block according to an embodiment of the present application;
FIG. 4b is a schematic diagram of a side view of a calibration block provided in an embodiment of the present application;
FIG. 4c is a schematic diagram of a partial view of a calibration assembly in a calibration block according to an embodiment of the present application;
fig. 5a is a schematic layout diagram of a shooting position according to an embodiment of the present application;
FIG. 5b is a schematic diagram of an original depth image according to an embodiment of the present disclosure;
fig. 5c is a schematic diagram of a point cloud measurement diagram of a full view angle of a product according to an embodiment of the present application;
FIG. 6 is a block diagram of a product graph generating device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer readable storage medium according to an embodiment of the present application.
Detailed Description
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
In 3D measurement, measurement of the product to be measured is typically completed with a calibration block. Generally, multiple measurement view angles are set, and an acquisition camera is installed at each measurement view angle; depth images at the different measurement view angles can then be captured by these cameras. After the depth images are processed and their coordinates converted, a point cloud measurement map of the product to be measured is obtained and the 3D measurement is complete. In practice, the point cloud measurement map can be used to determine the attribute features of the product to be measured, such as its 3D size and flatness. The point cloud measurement map may be an image formed by the ordered points obtained after coordinate transformation of the depth images.
In practical applications, the calibration block exhibits different characteristics at different measurement view angles. For example, at a first measurement view angle such as a top measurement view angle, the calibration block generally exhibits relatively regular features, whereas at a second measurement view angle such as an inner or outer measurement view angle it exhibits less regular features. This is mainly because each surface of the calibration block can generally include both planes and convex surfaces: the planes appear regular, while the convex surfaces do not.
However, in an application scenario where a product is measured at multiple measurement view angles, the processing in the related art yields detection frames, determined from the depth images acquired at the different view angles, that do not conform well to the characteristics exhibited at each view angle. The detection frames therefore cannot accurately identify the region of the calibration block at each measurement view angle, and the point cloud measurement map obtained by processing the depth images and converting coordinates with these detection frames has low accuracy.
Therefore, the product map generation method, apparatus, computer device, and storage medium provided herein select different frame selection processes for the depth images at different measurement view angles, so that the detection frame at each measurement view angle matches the characteristics of the calibration block at that view angle. The detection frames can thus accurately identify the region of the calibration block at each measurement view angle, improving measurement accuracy.
The product map generation method provided by the embodiments of the present application can be implemented by a computer device, which may be a terminal device or a server. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing cloud computing services. Terminal devices include, but are not limited to, mobile phones, computers, intelligent voice interaction devices, smart home appliances, and vehicle-mounted terminals. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in this application.
The following examples are provided to illustrate the invention:
Fig. 1 is a flowchart of a product map generation method according to an embodiment of the present application. Taking a server as an example of the aforementioned computer device, the method includes:
s101: and respectively acquiring a first depth image corresponding to the calibration block under a first measurement view angle and a second depth image corresponding to the calibration block under a second measurement view angle.
In practical applications, 3D measurement of a product to be measured is generally completed with the help of an assembled calibration block. In one possible implementation, the calibration block may be constructed as follows: first, several target calibration components are selected from a set of components to be calibrated according to the morphological properties and measurement requirements of the product to be measured; the target calibration components are then assembled to obtain the calibration block. A component to be calibrated may be a pre-produced standard sample. The morphological properties of the product to be measured reflect its appearance shape (such as a prismatic-table shape), and the measurement requirement is determined by the business, for example measuring only the 3D size, or measuring both the 3D size and the flatness. On this basis, for different products to be measured, a calibration block matching the current product can be assembled directly from pre-produced standard samples, so a dedicated calibration block need not be produced for each product. This improves the reusability of the calibration components and reduces cost.
In an application scenario where a product is measured at multiple measurement view angles, a first measurement view angle and a second measurement view angle may first be determined, the two being different. A measurement view angle at which the calibration block exhibits relatively regular features may be taken as the first measurement view angle, and a measurement view angle at which it exhibits irregular features may be taken as the second measurement view angle. For example, the first measurement view angle may be a top measurement view angle, i.e., the acquisition camera for the first measurement view angle is located above the calibration block; the second measurement view angle may be an outer measurement view angle, i.e., the acquisition camera for the second measurement view angle is located outside the calibration block.
It should be noted that the number of measurement view angles included in the first measurement view angle and the second measurement view angle is not limited in any way. For example, the second measurement view angle may include an inner measurement view angle in addition to the outer measurement view angle, and may further include inner and outer arc-angle measurement view angles. The specific view angles and their number can be determined according to the current measurement requirements and the characteristics of the calibration block at each measurement view angle.
In practical applications, a depth image is also called a range image: the value of each point refers to the distance between that point in the scene and the acquisition camera, and can be used as the Z coordinate of the point in the current camera coordinate system; the depth image can be converted into point cloud data by coordinate transformation. Thus, a first depth image of the calibration block at the first measurement view angle and a second depth image at the second measurement view angle may first be acquired. The first depth image may be captured by a first acquisition camera installed at the first measurement view angle, and the second depth image by a second acquisition camera installed at the second measurement view angle. Both acquisition cameras may be depth cameras, such as line-scan depth cameras.
On this basis, the depth values of the points in the first depth image represent the distances between points on the surface of the calibration block, as seen from the first measurement view angle, and the first acquisition camera; these distances can be used as the Z coordinates of the points in the camera coordinate system of the first acquisition camera. The second depth image is analogous. The first and second depth images can then be used to determine the point cloud measurement map of the product to be measured.
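The back-projection from a depth image to ordered point cloud data described above can be sketched as follows (a minimal illustration assuming a pinhole camera model; `fx`, `fy`, `cx`, `cy` and the extrinsics `R`, `t` are hypothetical parameters for illustration, not values given in this application):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Convert a depth image (Z in camera coordinates) to an ordered
    point set under a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx               # back-project pixel column to X
    y = (v - cy) * z / fy               # back-project pixel row to Y
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3), ordered points

def to_world(points, R, t):
    """Transform camera-frame points into a unified world frame."""
    return points @ R.T + t

# tiny demonstration: a flat 2x2 depth map at Z = 100
depth = np.full((2, 2), 100.0)
pts = depth_to_points(depth, fx=50.0, fy=50.0, cx=0.5, cy=0.5)
world = to_world(pts.reshape(-1, 3), np.eye(3), np.zeros(3))
```

In practice the extrinsics of each acquisition camera would come from calibration, so that points from both measurement view angles land in the same world coordinate system.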
S102: and performing first frame selection processing on the first depth image according to the first measurement visual angle to obtain a first detection frame, and performing second frame selection processing on the second depth image according to the second measurement visual angle to obtain a second detection frame.
It can be understood that the first depth image and the second depth image are the original depth images captured by the first and second acquisition cameras, respectively. During actual capture, because the acquisition camera is at some distance from the calibration block, the camera's field of view contains not only the calibration block but also part of the surrounding area. A detection frame therefore needs to be determined from each original depth image to identify the region where the calibration block is located. In a specific implementation, first frame selection processing may be performed on the first depth image according to the first measurement view angle to obtain the first detection frame, and second frame selection processing on the second depth image according to the second measurement view angle to obtain the second detection frame, the two frame selection processes being different. In this way, a different frame selection process is chosen for the depth image at each measurement view angle, so that the detection frame at each view angle matches the characteristics of the calibration block at that view angle and represents the region of the calibration block more accurately.
In the original depth image, the pixel values of the points lie within a certain interval and generally differ from point to point. In practice, the value range of the points in the original depth image mainly depends on the type of the acquisition camera and the storage mode of the depth image; different storage modes correspond to different value ranges. Common storage modes include Uint16, Int16, Clout32, Double, etc.; for example, Int16 corresponds to the range [-2^15, 2^15 - 1], and Clout32 corresponds to the range [-2^31, 2^31 - 1]. The camera model and storage mode can be selected flexibly according to the actual acquisition requirements (such as the required acquisition precision). In a binarized image, the pixel value of each point is only 0 or 255, which makes determining a detection frame simpler than working on the original depth image. Therefore, to facilitate determination of the detection frames, in one possible implementation the first depth image may first be binarized; because the calibration block's features are relatively regular at the first measurement view angle, the first detection frame can then be determined directly from the pixel values of the points in the binarized image corresponding to the first depth image.
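The quoted storage-mode ranges can be checked quickly (note that the label "Clout32" in the text appears garbled in translation; the range quoted for it is that of a signed 32-bit integer, shown here as Int32):

```python
import numpy as np

# Value ranges implied by common depth-image storage modes.
i16 = np.iinfo(np.int16)    # signed 16-bit: [-2**15, 2**15 - 1]
u16 = np.iinfo(np.uint16)   # unsigned 16-bit: [0, 2**16 - 1]
i32 = np.iinfo(np.int32)    # signed 32-bit: [-2**31, 2**31 - 1]

ranges = {
    "Uint16": (u16.min, u16.max),
    "Int16": (i16.min, i16.max),
    "Int32": (i32.min, i32.max),
}
```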
For the second depth image, binarization processing can be performed according to the second measurement view angle to obtain a corresponding binarized image. Because the features of the calibration block at the second measurement view angle are irregular, concave-region detection can be performed on this binarized image to obtain a target concave region; the target concave region is then removed from the binarized image to obtain a pending binarized image, and finally the second detection frame is determined from the pixel values of the points in the pending binarized image. In this way, the target concave region that appears in the binarized image because of the irregular features of the calibration block is removed, avoiding its interference with determining the second detection frame and improving the accuracy of the second detection frame.
It should be noted that the specific implementation of the binarization processing is not limited in any way. For example, a threshold-based binarization may be used: a binarization threshold is determined from the original depth image; points whose pixel value is greater than or equal to the threshold are considered part of the valid region, so their pixel values are converted to 255, while points whose pixel value is below the threshold are considered part of the invalid region, so their pixel values are converted to 0. In the embodiments of the present application, the valid region may be the region where the calibration block is located, and the invalid region may be the other areas outside the calibration block that appear in the camera's field of view.
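The threshold binarization just described can be sketched as follows (a minimal NumPy sketch; the threshold value and array contents are illustrative, not values from the application):

```python
import numpy as np

def binarize(depth, threshold):
    """Threshold binarization: points at or above the threshold become
    the valid region (255), all other points the invalid region (0)."""
    out = np.zeros(depth.shape, dtype=np.uint8)
    out[depth >= threshold] = 255
    return out

depth = np.array([[10, 200, 250],
                  [ 5, 180,   0]], dtype=np.uint16)
mask = binarize(depth, threshold=100)
```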
Based on this binarization idea, in one possible implementation, removing the target concave region from the binarized image corresponding to the second depth image to obtain the pending binarized image may be done by performing pixel value inversion on the points in the target concave region. Since the pixel value of the points in the target concave region of the binarized image is 255 but the region is actually invalid, the inversion converts those pixel values from 255 to 0; once complete, the pending binarized image is obtained.
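The pixel value inversion step can be sketched as follows (a minimal sketch; the application does not specify how the concave region is detected, so `region_mask` is assumed here to be the boolean output of some concave-region detector):

```python
import numpy as np

def remove_concave_region(binary, region_mask):
    """Flip the pixels of a detected concave region from 255 to 0,
    yielding the pending binarized image."""
    pending = binary.copy()
    pending[region_mask] = 0   # the concave region is actually invalid
    return pending

binary = np.array([[255, 255],
                   [255, 255]], dtype=np.uint8)
concave = np.array([[False, True],
                    [False, False]])   # hypothetical detector output
pending = remove_concave_region(binary, concave)
```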
It will be appreciated that multiple points of the same pixel value distributed contiguously form a block region: a region formed by points with pixel value 0 is an invalid region, and a region formed by points with pixel value 255 is a valid region. In one possible implementation, when determining the first detection frame from the pixel values of the points in the binarized image corresponding to the first depth image, several first regions to be determined may first be generated from the points whose pixel value equals the target pixel value, where the target pixel value is the 255 used in the binarization processing. For example, taking the first measurement view angle as the top measurement view angle, the binarized image corresponding to the first depth image may be as shown in Fig. 2a: the white areas are the first regions to be determined, whose points have pixel value 255, and the black area is the invalid region, whose points have pixel value 0. In practice there may be noise points and the like, so some of the first regions to be determined may be interference regions, whose area is generally smaller than that of the real valid regions. Therefore, target first regions to be determined can be selected from the first regions according to their areas, for example by selecting the rectangular regions whose areas rank Top-k. The target first regions obtained after filtering out the interference regions are the final valid regions, and the first detection frame can finally be determined from the region boundaries of the target first regions to be determined.
Similarly, when determining the second detection frame from the pixel values of the points in the pending binarized image, several second regions to be determined may first be generated from the points whose pixel value equals the target pixel value. For example, taking the second measurement view angle as the outer measurement view angle, the binarized image corresponding to the second depth image may be as shown in Fig. 3a, and the pending binarized image obtained after concave-region detection and removal of the target concave region may be as shown in Fig. 3b: the white areas are the second regions to be determined, whose points have pixel value 255, and the black area is the invalid region, whose points have pixel value 0. Target second regions to be determined can then be screened from the second regions according to their areas, for example by selecting the rectangular regions whose areas rank Top-r. The target second regions obtained after removing the interference regions are the final valid regions, and the second detection frame can finally be determined from their region boundaries.
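The region generation and Top-k screening described above can be sketched as follows (a minimal BFS sketch of 4-connected component labeling with area-based filtering; a production system would likely use a library routine such as OpenCV's connected-components function instead):

```python
import numpy as np
from collections import deque

def connected_regions(binary, value=255):
    """Collect 4-connected components of pixels equal to `value`."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] != value or seen[sy, sx]:
                continue
            q, pts = deque([(sy, sx)]), []
            seen[sy, sx] = True
            while q:
                y, x = q.popleft()
                pts.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and \
                       binary[ny, nx] == value and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            regions.append(pts)
    return regions

def top_k_boxes(binary, k):
    """Keep the k largest regions (filtering small interference areas)
    and return their bounding boxes as (y0, x0, y1, x1)."""
    regions = sorted(connected_regions(binary), key=len, reverse=True)[:k]
    boxes = []
    for pts in regions:
        ys, xs = zip(*pts)
        boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

img = np.zeros((5, 6), dtype=np.uint8)
img[1:4, 1:4] = 255      # large valid region
img[0, 5] = 255          # single-pixel interference region
boxes = top_k_boxes(img, k=1)
```

Here the interference region is discarded because its area ranks below Top-1, leaving one bounding box around the valid region.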
It should be noted that the present application does not limit the specific settings of k and r. For example, for a calibration block assembled from a plurality of target calibration components, k and r may be set according to the number of target calibration components that the calibration block presents at the corresponding measurement view angle.
It should further be noted that the present application does not limit the specific manner of determining the first detection frame according to the region boundary of the target first region to be determined, or the second detection frame according to the region boundary of the target second region to be determined. For ease of understanding, the present application provides the following two determination manners as examples:
Because the target first region to be determined and the target second region to be determined represent the real effective regions, in one possible implementation the region boundary of the target first region to be determined may be used directly as the first detection frame, and the region boundary of the target second region to be determined directly as the second detection frame. On this basis, the detection frame at each measurement view angle can be determined quickly.
In another possible implementation, the region range can be enlarged appropriately beyond the region boundaries of the target first and second regions to be determined, ensuring that the calibration block is completely contained within the detection frames. In a specific implementation, the region boundary of the target first region to be determined can be subjected to expansion processing, and the expanded boundary used as the first detection frame; similarly, the region boundary of the target second region to be determined is expanded and the expanded boundary used as the second detection frame. On this basis, the first detection frame identifies a region larger than the target first region to be determined, and the second detection frame a region larger than the target second region to be determined, so that the calibration block is completely contained within the detection frames.
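As an illustrative sketch of the expansion processing, a fixed pixel margin chosen empirically can be applied to a bounding box and clamped to the image bounds; the function name `expand_box` and the box representation are hypothetical:

```python
def expand_box(box, margin, img_h, img_w):
    """Expand a (row_min, col_min, row_max, col_max) box by `margin` pixels
    on every side, clamped to the image bounds, so that the calibration
    block is fully contained in the detection frame."""
    r0, c0, r1, c1 = box
    return (max(r0 - margin, 0), max(c0 - margin, 0),
            min(r1 + margin, img_h - 1), min(c1 + margin, img_w - 1))
```

The clamping reflects the caution noted below: the margin should enlarge the frame without pulling large invalid areas into it.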
In practical applications, the measurement engineer may choose the magnitude of the expansion according to empirical values, while avoiding excessive expansion that would bring invalid regions into the detection frame.
It should be noted that the present application does not limit the number of first detection frames and second detection frames; in general, the number depends on the calibration block and the measurement view angle. For example, taking the foregoing fig. 2a as an example, the first detection frames obtained after expansion may be as shown in fig. 2b, comprising three first detection frames; taking the foregoing fig. 3b as an example, the second detection frame obtained after expansion may be as shown in fig. 3c, comprising one second detection frame. That is, three detection frames are determined at the top measurement view angle and one detection frame at the outer measurement view angle, so in this embodiment k = 3 and r = 1.
S103: determining a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
The point set included in the first detection frame represents surface points of the calibration block at the first measurement view angle, and the point set included in the second detection frame represents surface points of the calibration block at the second measurement view angle.
In practical application, plane detection can be performed on the point set included in the first detection frame to obtain M first fitting planes, and on the point set included in the second detection frame to obtain M second fitting planes, where M is a positive integer. Further, corner calculation can be performed using the M first fitting planes to obtain N first corner points, and using the M second fitting planes to obtain N second corner points, where N is a positive integer. The N first corner points can be regarded as the vertices of the calibration block at the first measurement view angle, and the N second corner points as its vertices at the second measurement view angle. Finally, the point cloud measurement map of the product to be measured corresponding to the calibration block is determined using the N first corner points and the N second corner points.
It should be noted that the present application does not limit the specific manner of performing plane detection on the point set included in a detection frame. For ease of understanding, the embodiments of the present application provide a two-stage plane detection algorithm as an example:
In the first stage, plane detection can be performed on the point set included in the first detection frame to obtain M first detection planes. In a specific implementation, a RANSAC plane detection algorithm may be used to detect a plurality of first planes formed by the point set included in the first detection frame; detection stops when too few points remain in the point set or no further plane can be detected, and the planes whose areas rank in the first M positions are then selected from the detected first planes as the first detection planes. Similarly, plane detection can be performed on the point set included in the second detection frame to obtain M second detection planes.
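For ease of understanding, the first stage may be sketched as below. This is an assumed minimal NumPy implementation: the function names, the iteration count, the distance threshold, and the use of inlier count as a stand-in for plane area are illustrative choices, not the disclosed parameters.

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.01, rng=None):
    """Fit one plane n.p + d = 0 to an (N, 3) point set by RANSAC.
    Returns (normal, d, inlier_mask)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best = None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n.dot(p0)
        inliers = np.abs(points @ n + d) < dist_thresh
        if best is None or inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

def detect_planes(points, m, min_points=50, **kw):
    """First stage: repeatedly run RANSAC and remove inliers; stop when too
    few points remain or no further plane is supported, then keep the m
    planes with the most supporting points."""
    found, remaining = [], points
    while len(remaining) >= min_points:
        n, d, mask = ransac_plane(remaining, **kw)
        if mask.sum() < min_points:
            break
        found.append((mask.sum(), n, d))
        remaining = remaining[~mask]
    found.sort(key=lambda t: -t[0])
    return [(n, d) for _, n, d in found[:m]]
```

For two parallel point clusters, `detect_planes(points, 2)` recovers both supporting planes in turn.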
In the second stage, an erosion operation is performed on each of the M first detection planes detected in the first stage to obtain M first planes to be determined. The erosion operation removes the plane boundary and peripheral outliers, achieving a denoising and smoothing effect, so the first planes to be determined have higher precision than the first detection planes. In a specific implementation, the erosion operation may first be performed on each first detection plane to obtain a corresponding mask; the plane formed by the point set included in the mask is the corresponding first plane to be determined. To further improve precision, fitting processing can be performed using the point set included in each of the M first planes to be determined to obtain M first fitting planes; that is, a corresponding first fitting plane is obtained by re-fitting the points of each first plane to be determined. Similarly, each of the M second detection planes may be eroded to obtain M second planes to be determined, and fitting processing performed on the point set of each to obtain M second fitting planes. The fitting process minimizes the average vertical distance between the points in each plane to be determined and the corresponding fitting plane; the resulting fitting planes have higher flatness, which improves subsequent measurement precision.
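The second stage, erosion of each detected plane's mask followed by a re-fit, may be sketched as follows. The 4-connected erosion and the least-squares fit of z = a·x + b·y + c (which minimises the squared vertical distances, a common proxy for the average-vertical-distance criterion described above) are assumptions for illustration.

```python
import numpy as np

def erode4(mask):
    """Binary erosion with a 4-connected structuring element (zero padding):
    a pixel survives only if it and all four neighbours are set, stripping
    plane-boundary pixels and peripheral outliers from the mask."""
    m = mask.astype(bool)
    core = m.copy()
    core[1:, :] &= m[:-1, :]; core[:-1, :] &= m[1:, :]
    core[:, 1:] &= m[:, :-1]; core[:, :-1] &= m[:, 1:]
    core[0, :] = core[-1, :] = False   # image border counts as background
    core[:, 0] = core[:, -1] = False
    return core

def refit_plane_vertical(points):
    """Re-fit a plane z = a*x + b*y + c to the eroded plane's (N, 3) points
    by least squares on the vertical distances. Returns (a, b, c)."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coef
```

In practice the mask would come from projecting each detection plane's inliers back into the depth image before eroding.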
It should be noted that the present application does not limit the settings of M and N. For example, M = 6 and N = 8 may be set according to the actual measurement requirements, the morphological properties of the product to be measured, and the like.
It will be appreciated that the first corner points may be regarded as the vertices of the calibration block at the first measurement view angle and the second corner points as its vertices at the second measurement view angle, a vertex being the intersection of any three surfaces of the calibration block. For ease of understanding, the embodiment of the present application takes M = 6 and N = 8 as an example to describe a specific implementation of calculating N corner points from M fitting planes:
Cosine values can be calculated from the normal vectors of any two of the 6 fitting planes, and the two fitting planes with the largest cosine value taken as the top surface and the bottom surface, so that the fitting planes most likely to be the top and bottom are screened out of the 6. The remaining 4 fitting planes can then be ordered clockwise or counterclockwise according to the coordinates of the center point of each fitting plane, on which basis the 6 fitting planes form a cube. Further, the intersection points of three fitting planes can be obtained sequentially in this order, yielding 8 ordered corner points. The coordinates of the center point of a fitting plane refer to coordinates in the camera coordinate system of the corresponding acquisition camera.
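For ease of understanding, the corner calculation at its core reduces to intersecting three fitting planes, i.e. solving a 3×3 linear system; the top/bottom screening and ordering logic are omitted in this hypothetical sketch:

```python
import numpy as np

def plane_intersection(planes):
    """Corner point of three planes given as (normal, d) with n.p + d = 0:
    stack the normals into N and solve N @ p = -d."""
    N = np.array([n for n, _ in planes], dtype=float)
    d = np.array([dd for _, dd in planes], dtype=float)
    return np.linalg.solve(N, -d)
```

For a cube, calling this for each ordered triple of adjacent faces yields the 8 ordered corner points.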
It can be understood that the processes of obtaining the N first corner points from the M first fitting planes and the N second corner points from the M second fitting planes may both refer to the above manner of obtaining corner points, and will not be described in detail here.
It will be appreciated that different measurement view angles have different camera coordinate systems; to evaluate from an overall perspective, the points must be converted from the camera coordinate systems at the different measurement view angles into a unified world coordinate system, so that the point cloud measurement map of the product to be measured can be determined through coordinate conversion. In a specific implementation, the camera coordinates of the first corner points in a first camera coordinate system and of the second corner points in a second camera coordinate system can be obtained respectively; the first camera coordinate system is that of the first acquisition camera installed at the first measurement view angle, and the second camera coordinate system that of the second acquisition camera installed at the second measurement view angle. Further, affine transformation matrices are calculated from the first corner points, the second corner points and the plurality of vertices of the calibration block in the world coordinate system. Finally, the point cloud measurement map of the product to be measured can be determined based on the affine transformation matrices and the acquired plurality of depth images of the product to be measured. The world coordinate system is established with the calibration block as reference; for example, the center of the calibration block may be taken as the origin, with the axes parallel to the long side and the wide side of the calibration block as the y axis and the x axis respectively.
It should be noted that in the world coordinate system the three-dimensional coordinates of each vertex of the calibration block are known; for example, they may be determined from a CAD three-dimensional drawing of the calibration block.
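A least-squares affine fit between the measured corner points (camera coordinates) and the known vertex coordinates (world coordinates) may be sketched as below; the homogeneous-coordinate formulation and the function name `fit_affine` are assumptions for illustration.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 3-D affine transform mapping camera-frame corner points
    `src` onto the known world-frame vertices `dst` (both (N, 3), N >= 4).
    Returns a 4x4 matrix T with dst ~= (T @ [src; 1])[:3]."""
    n = len(src)
    src_h = np.hstack([src, np.ones((n, 1))])        # homogeneous coordinates
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)  # (4, 3) solution
    T = np.eye(4)
    T[:3, :] = M.T                                   # linear part + translation
    return T
```

With 8 corner points per view angle the system is overdetermined, so residual measurement noise is averaged out by the least-squares solution.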
It can be seen that the embodiment of the application provides a product map generating method. First, a first depth image corresponding to the calibration block at a first measurement view angle and a second depth image corresponding to the calibration block at a second measurement view angle are obtained respectively, the first and second measurement view angles being different. Because the calibration block presents different features at different measurement view angles, different frame selection processing is applied to the depth images at different measurement view angles, so that the detection frame corresponding to each measurement view angle matches the features of the calibration block at that view angle; the detection frames can thus identify the regions of the calibration block more accurately, facilitating the subsequent measurement process. On this basis, the product can be measured at multiple measurement view angles with higher measurement precision in the application scenario. Finally, the point cloud measurement map of the product to be measured corresponding to the calibration block is determined using the point sets included in the first and second detection frames, so that the attribute characteristics of the product to be measured can be determined from the point cloud measurement map.
For ease of understanding, in the embodiment of the application, for a product A to be measured, a calibration block corresponding to product A is constructed from calibration assemblies. Specifically, the top view of the calibration block may be as shown in fig. 4a, the lateral view as shown in fig. 4b, and a partial view of a calibration assembly as shown in fig. 4c, the view angle of the partial view being that of the lateral view of the calibration block.
Corresponding to fig. 4a, it is determined according to the actual measurement requirements that the current product A to be measured completes full view angle measurement at five measurement view angles: the inner measurement view angle, the outer measurement view angle, the top measurement view angle, the inner arc angle measurement view angle and the outer arc angle measurement view angle. The top measurement view angle is the first measurement view angle; the inner, outer, inner arc angle and outer arc angle measurement view angles are second measurement view angles. Since only a part of the product can be photographed at one time, the photographing positions shown in fig. 5a are provided, 20 in total: four at the outer measurement view angle, namely outer (1), outer (2), outer (3) and outer (4); four at the inner measurement view angle, namely inner (1), inner (2), inner (3) and inner (4); four at the top measurement view angle, namely top (1), top (2), top (3) and top (4); four at the inner arc angle measurement view angle, namely inner arc (1), inner arc (2), inner arc (3) and inner arc (4); and four at the outer arc angle measurement view angle, namely outer arc (1), outer arc (2), outer arc (3) and outer arc (4).
Because the inner measurement view angle and the inner arc angle measurement view angle lie on the same side, and the outer measurement view angle and the outer arc angle measurement view angle lie on the same side, in practical application three acquisition cameras can be installed, responsible respectively for image acquisition at the top measurement view angle, at the inner and inner arc angle measurement view angles, and at the outer and outer arc angle measurement view angles. It can be understood that once the three acquisition cameras are installed, their relative positions remain unchanged. In practical application, the whole acquisition device can be mounted on a motion platform so that the acquisition cameras can conveniently be moved about the axes: one part is photographed at a time, the device is then moved to photograph other parts, and finally all 20 photographing positions are covered. The motion platform can be a high-precision five-axis motion platform that moves the acquisition device about the X, Y and Z axes, making it convenient for the acquisition cameras to reach the different photographing positions.
Corresponding to this installation of the acquisition cameras, in actual photographing the 20 photographing positions can be divided into eight groups: the first group may include outer (1), inner (1) and top (1); the second group outer (2), inner (2) and top (2); the third group outer (3), inner (3) and top (3); the fourth group outer (4), inner (4) and top (4); the fifth group outer arc (1) and inner arc (1); the sixth group outer arc (2) and inner arc (2); the seventh group outer arc (3) and inner arc (3); and the eighth group outer arc (4) and inner arc (4), as shown in fig. 5a. On this basis, 20 original depth images can be obtained in eight shots, the corresponding 20 original depth images being shown in fig. 5b.
Further, the method provided by the embodiment of the application can be used to process the binarized images corresponding to the 20 original depth images respectively, and the results are finally converted into a unified world coordinate system to obtain the complete full view angle map of product A. It should be noted that, through coordinate conversion, the full view angle map is a point cloud measurement map in the world coordinate system, as shown in fig. 5c. In practical application, the point cloud measurement map may be rendered as a color map for better distinction.
In practical application, attribute characteristics of product A such as 3D dimensions and flatness can be determined using the full view angle point cloud measurement map shown in fig. 5c.
It is to be understood that, since the following apparatus embodiment substantially corresponds to the method embodiments, reference may be made to the description of the method embodiments for relevant details.
Fig. 6 is a block diagram of a product graph generating device according to an embodiment of the present application, including:
an obtaining unit 601, configured to obtain a first depth image corresponding to the calibration block under a first measurement view angle and a second depth image corresponding to the calibration block under a second measurement view angle respectively; the first measurement viewing angle is different from the second measurement viewing angle;
The processing unit 602 is configured to perform a first frame selection process on the first depth image according to the first measurement view angle to obtain a first detection frame, and perform a second frame selection process on the second depth image according to the second measurement view angle to obtain a second detection frame; the first framing process is different from the second framing process;
the determining unit 603 is configured to determine a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
In some embodiments, in performing a first frame selection process on the first depth image according to the first measurement view angle to obtain a first detection frame, the processing unit 602 is specifically configured to:
performing binarization processing on the first depth image according to the first measurement visual angle to obtain a binarization image corresponding to the first depth image;
determining a first detection frame according to pixel values of each point in the binarized image corresponding to the first depth image;
in terms of performing a second frame selection process on the second depth image according to the second measurement view angle to obtain a second detection frame, the processing unit 602 is specifically configured to:
performing binarization processing on the second depth image according to the second measurement view angle to obtain a binarization image corresponding to the second depth image;
performing concave region detection on the binarized image corresponding to the second depth image to obtain a target concave region;

removing the target concave region in the binarized image corresponding to the second depth image to obtain a pending binarized image;

and determining the second detection frame according to the pixel values of the points in the pending binarized image.
In some embodiments, in determining the first detection frame according to the pixel values of each point in the binarized image corresponding to the first depth image, the processing unit 602 is specifically configured to:
generating a plurality of first regions to be determined from the points in the binarized image corresponding to the first depth image whose pixel value is the target pixel value;

screening a target first region to be determined from the plurality of first regions to be determined according to the area sizes of the plurality of first regions to be determined;

determining the first detection frame according to the region boundary of the target first region to be determined;
in determining the second detection frame according to the pixel values of the points in the pending binarized image, the processing unit 602 is specifically configured to:

generating a plurality of second regions to be determined from the points in the pending binarized image whose pixel value is the target pixel value;

screening a target second region to be determined from the plurality of second regions to be determined according to the area sizes of the plurality of second regions to be determined;

and determining the second detection frame according to the region boundary of the target second region to be determined.
In some embodiments, in determining the first detection frame according to the region boundary of the target first region to be determined, the processing unit 602 is specifically configured to:

performing expansion processing on the region boundary of the target first region to be determined to obtain the first detection frame;

in determining the second detection frame according to the region boundary of the target second region to be determined, the processing unit 602 is specifically configured to:

performing expansion processing on the region boundary of the target second region to be determined to obtain the second detection frame.
In some embodiments, in removing the target concave region in the binarized image corresponding to the second depth image to obtain the pending binarized image, the processing unit 602 is specifically configured to:

performing pixel value inversion processing on the pixel values of the points in the target concave region to obtain the pending binarized image.
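The pixel value inversion can be sketched as follows (a hypothetical NumPy illustration; `concave_mask` is assumed to be a boolean mask of the target concave region):

```python
import numpy as np

def remove_concave_region(binary_img, concave_mask):
    """Invert the pixel values inside the detected target concave region
    (255 -> 0, 0 -> 255) to obtain the pending binarized image; pixels
    outside the region are left unchanged."""
    pending = binary_img.copy()
    pending[concave_mask] = 255 - pending[concave_mask]
    return pending
```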
In some embodiments, in determining the point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame, the determining unit 603 is specifically configured to:
performing plane detection on the point set included in the first detection frame to obtain M first fitting planes, and performing plane detection on the point set included in the second detection frame to obtain M second fitting planes; m is a positive integer;
Performing corner calculation by using M first fitting planes to obtain N first corner points, and performing corner calculation by using M second fitting planes to obtain N second corner points; n is a positive integer;
and determining a point cloud measuring graph of the product to be measured, which corresponds to the calibration block, by utilizing the N first corner points and the N second corner points.
In some embodiments, in performing plane detection on the point set included in the first detection frame to obtain M first fitting planes, the determining unit 603 is specifically configured to:
performing plane detection on a point set included in the first detection frame to obtain M first detection planes;
performing an erosion operation on each of the M first detection planes to obtain M first planes to be determined;

performing fitting processing using the point set included in each of the M first planes to be determined to obtain M first fitting planes;
in terms of performing plane detection on the point set included in the second detection frame to obtain M second fitting planes, the determining unit 603 is specifically configured to:
performing plane detection on the point set included in the second detection frame to obtain M second detection planes;
performing an erosion operation on each of the M second detection planes to obtain M second planes to be determined;

and performing fitting processing using the point set included in each of the M second planes to be determined to obtain M second fitting planes.
In some embodiments, in determining the point cloud measurement map of the product to be measured corresponding to the calibration block by using the N first corner points and the N second corner points, the determining unit 603 is specifically configured to:
respectively acquiring camera coordinates of a first corner point in a first camera coordinate system and camera coordinates of a second corner point in a second camera coordinate system; the first camera coordinate system is a camera coordinate system of a first acquisition camera installed at a first measurement view angle, and the second camera coordinate system is a camera coordinate system of a second acquisition camera installed at a second measurement view angle;
calculating affine transformation matrixes for the first angular point, the second angular point and a plurality of vertexes of the calibration block in a world coordinate system respectively; the world coordinate system is established by taking the calibration block as a reference;
and determining a point cloud measurement map of the product to be measured based on the affine transformation matrix and the acquired multiple depth images of the product to be measured.
In some embodiments, the calibration block is constructed by:
determining a plurality of target calibration assemblies from a plurality of to-be-calibrated assemblies according to morphological properties and measurement requirements of the product to be measured;
And assembling the plurality of target calibration assemblies to obtain a calibration block.
It can be seen that the embodiment of the application provides a product map generating method. First, a first depth image corresponding to the calibration block at a first measurement view angle and a second depth image corresponding to the calibration block at a second measurement view angle are obtained respectively, the first and second measurement view angles being different. Because the calibration block presents different features at different measurement view angles, different frame selection processing is applied to the depth images at different measurement view angles, so that the detection frame corresponding to each measurement view angle matches the features of the calibration block at that view angle; the detection frames can thus identify the regions of the calibration block more accurately, facilitating the subsequent measurement process. On this basis, the product can be measured at multiple measurement view angles with higher measurement precision in the application scenario. Finally, the point cloud measurement map of the product to be measured corresponding to the calibration block is determined using the point sets included in the first and second detection frames, so that the attribute characteristics of the product to be measured can be determined from the point cloud measurement map.
The embodiment of the present application provides a computer device, the schematic structural diagram of which may be seen in fig. 7, where a computer device 700 includes a processor 701 and a memory 702, where the memory 702 stores a computer program, and the processor 701 implements the product graph generating method provided in the foregoing embodiment when executing the computer program.
The computer device may comprise a terminal device or a server, and the aforementioned product map generating means may be configured in the computer device.
An embodiment of the present application provides a computer readable storage medium, and a schematic structural diagram of the computer readable storage medium may be referred to in fig. 8, where a computer program is stored on the computer readable storage medium 800, and the computer program when executed by a processor implements a product graph generating method provided in the foregoing embodiment.
Those of ordinary skill in the art will appreciate that: all or part of the steps of implementing the above-described method embodiments may be performed by hardware associated with computer program instructions, and the computer program may be stored in the computer-readable storage medium 800. The present application is not limited in any way to the form of computer-readable storage medium 800. For example, the computer-readable storage medium 800 may be at least one of the following storage media: a Read-only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk or an optical disk, or the like, which can store computer program codes. For example, the random access memory may be a static random access memory (Static Random Access Memory, SRAM) or a dynamic random access memory (Dynamic Random Access Memory, DRAM).
It should be noted that, since the apparatus embodiment basically corresponds to the method embodiment, reference should be made to the description of the method embodiment for relevant details. The apparatus embodiments described above are merely illustrative: components described as separate may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which those of ordinary skill in the art can understand and implement without undue burden.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The method, apparatus, computer device, and storage medium for generating a product map provided in the embodiments of the present application have been described in detail above, and specific examples have been applied herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the method of the present application. Meanwhile, for those of ordinary skill in the art, there may be changes in the specific implementations and the application scope according to the idea of the present application.
In view of the foregoing, the content of this specification should not be construed as limiting the present application, and any changes or substitutions that could readily occur to those skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. The implementations provided in the above aspects may be further combined to provide further implementations.

Claims (12)

1. A method of generating a product map, comprising:
respectively acquiring a first depth image of a calibration block at a first measurement view angle and a second depth image of the calibration block at a second measurement view angle; the first measurement view angle is different from the second measurement view angle;
performing a first frame selection process on the first depth image according to the first measurement view angle to obtain a first detection frame, and performing a second frame selection process on the second depth image according to the second measurement view angle to obtain a second detection frame; the first frame selection process is different from the second frame selection process;
and determining a point cloud measurement map of a product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
2. The method according to claim 1, wherein the performing a first frame selection process on the first depth image according to the first measurement view angle to obtain a first detection frame includes:
performing binarization processing on the first depth image according to the first measurement view angle to obtain a binarized image corresponding to the first depth image;
determining the first detection frame according to the pixel values of the points in the binarized image corresponding to the first depth image;
and the performing a second frame selection process on the second depth image according to the second measurement view angle to obtain a second detection frame includes:
performing binarization processing on the second depth image according to the second measurement view angle to obtain a binarized image corresponding to the second depth image;
performing concave region detection on the binarized image corresponding to the second depth image to obtain a target concave region;
removing the target concave region from the binarized image corresponding to the second depth image to obtain a pending binarized image;
and determining the second detection frame according to the pixel values of the points in the pending binarized image.
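Purely as an illustrative sketch (not part of the claimed subject matter), the binarization and concave-region removal steps of claim 2 could be expressed with NumPy as follows; the depth working range `z_min`/`z_max` and the externally supplied concave-region mask are assumptions for illustration:

```python
import numpy as np

def binarize_depth(depth, z_min, z_max):
    # Points whose depth lies in the working range become foreground (255),
    # everything else background (0) -- a simple range binarization.
    return np.where((depth >= z_min) & (depth <= z_max), 255, 0).astype(np.uint8)

def remove_concave_region(binary, concave_mask):
    # Clear the detected target concave region to obtain the
    # pending binarized image used for the second detection frame.
    pending = binary.copy()
    pending[concave_mask] = 0
    return pending
```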
3. The method according to claim 2, wherein the determining the first detection frame according to the pixel values of the points in the binarized image corresponding to the first depth image includes:
generating a plurality of first pending regions from the points in the binarized image corresponding to the first depth image whose pixel values are the target pixel value;
screening a target first pending region from the plurality of first pending regions according to the region sizes of the plurality of first pending regions;
determining the first detection frame according to the region boundary of the target first pending region;
and the determining the second detection frame according to the pixel values of the points in the pending binarized image includes:
generating a plurality of second pending regions from the points in the pending binarized image whose pixel values are the target pixel value;
screening a target second pending region from the plurality of second pending regions according to the region sizes of the plurality of second pending regions;
and determining the second detection frame according to the region boundary of the target second pending region.
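As an illustrative sketch of claim 3 (not part of the claimed subject matter), pending regions can be generated as connected components of target-valued pixels and then screened by region size; the 4-connectivity and the "largest region wins" screening rule are assumptions for illustration:

```python
import numpy as np
from collections import deque

def connected_regions(binary, target=255):
    # Collect 4-connected regions of pixels equal to the target pixel
    # value; each region is returned as a list of (row, col) points.
    h, w = binary.shape
    labels = np.full((h, w), -1, dtype=int)
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] == target and labels[sy, sx] == -1:
                labels[sy, sx] = len(regions)
                queue, points = deque([(sy, sx)]), []
                while queue:
                    y, x = queue.popleft()
                    points.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] == target
                                and labels[ny, nx] == -1):
                            labels[ny, nx] = len(regions)
                            queue.append((ny, nx))
                regions.append(points)
    return regions

def largest_region(regions):
    # Screen the target pending region by region size (pixel count).
    return max(regions, key=len)
```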
4. The method according to claim 3, wherein the determining the first detection frame according to the region boundary of the target first pending region includes:
performing expansion processing on the region boundary of the target first pending region to obtain the first detection frame;
and the determining the second detection frame according to the region boundary of the target second pending region includes:
performing expansion processing on the region boundary of the target second pending region to obtain the second detection frame.
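One plausible reading of the expansion processing in claim 4 (sketched here for illustration only; the fixed pixel `margin` and the axis-aligned box form are assumptions) is to take the bounding box of the target pending region and grow it by a margin on every side, clamped to the image:

```python
def detection_frame(region_points, margin, image_shape):
    # Expand the bounding box of the target pending region by `margin`
    # pixels on each side, clamped to the image, to form a detection frame
    # given as (row0, col0, row1, col1).
    ys = [y for y, _ in region_points]
    xs = [x for _, x in region_points]
    y0 = max(min(ys) - margin, 0)
    x0 = max(min(xs) - margin, 0)
    y1 = min(max(ys) + margin, image_shape[0] - 1)
    x1 = min(max(xs) + margin, image_shape[1] - 1)
    return (y0, x0, y1, x1)
```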
5. The method according to claim 2, wherein the removing the target concave region from the binarized image corresponding to the second depth image to obtain a pending binarized image includes:
performing pixel value inversion processing on the pixel values of the points in the target concave region to obtain the pending binarized image.
6. The method according to any one of claims 1-5, wherein the determining a point cloud measurement map of the product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame includes:
performing plane detection on the point set included in the first detection frame to obtain M first fitting planes, and performing plane detection on the point set included in the second detection frame to obtain M second fitting planes; M is a positive integer;
performing corner calculation by using the M first fitting planes to obtain N first corner points, and performing corner calculation by using the M second fitting planes to obtain N second corner points; N is a positive integer;
and determining the point cloud measurement map of the product to be measured corresponding to the calibration block by using the N first corner points and the N second corner points.
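To illustrate the plane fitting and corner calculation of claim 6 (a sketch only, not the claimed implementation; the SVD-based fit and the three-plane intersection rule are assumptions), a plane can be fitted to each detected point set by least squares, and a corner point recovered as the intersection of three fitted planes:

```python
import numpy as np

def fit_plane(points):
    # Least-squares plane through a 3-D point set, returned as
    # (unit normal n, offset d) with the plane satisfying n . x = d.
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                       # direction of least variance
    return normal, float(normal @ centroid)

def corner_from_planes(planes):
    # A corner point is the common intersection of three fitted planes:
    # stack the normals and solve the 3x3 linear system N x = d.
    normals = np.array([n for n, _ in planes])
    offsets = np.array([d for _, d in planes])
    return np.linalg.solve(normals, offsets)
```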
7. The method according to claim 6, wherein the performing plane detection on the point set included in the first detection frame to obtain M first fitting planes includes:
performing plane detection on the point set included in the first detection frame to obtain M first detection planes;
performing an erosion operation on each of the M first detection planes to obtain M first pending planes;
performing fitting processing by using the point sets included in the M first pending planes respectively to obtain the M first fitting planes;
and the performing plane detection on the point set included in the second detection frame to obtain M second fitting planes includes:
performing plane detection on the point set included in the second detection frame to obtain M second detection planes;
performing an erosion operation on each of the M second detection planes to obtain M second pending planes;
and performing fitting processing by using the point sets included in the M second pending planes respectively to obtain the M second fitting planes.
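The erosion operation of claim 7 trims the unreliable border of each detected plane before fitting. A minimal NumPy sketch of binary erosion (illustrative only; the 4-neighbourhood structuring element is an assumption) could look like:

```python
import numpy as np

def erode(binary, iterations=1):
    # 4-neighbourhood binary erosion: a pixel survives only if it and
    # all four cross-shaped neighbours are foreground, which shrinks
    # each detected plane mask inward by one pixel per iteration.
    mask = binary.astype(bool)
    for _ in range(iterations):
        padded = np.pad(mask, 1, constant_values=False)
        mask = (padded[1:-1, 1:-1] & padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask
```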
8. The method according to claim 6, wherein the determining the point cloud measurement map of the product to be measured corresponding to the calibration block by using the N first corner points and the N second corner points includes:
respectively acquiring the camera coordinates of the first corner points in a first camera coordinate system and the camera coordinates of the second corner points in a second camera coordinate system; the first camera coordinate system is the camera coordinate system of a first acquisition camera installed at the first measurement view angle, and the second camera coordinate system is the camera coordinate system of a second acquisition camera installed at the second measurement view angle;
calculating affine transformation matrices by respectively using the first corner points, the second corner points, and a plurality of vertices of the calibration block in a world coordinate system; the world coordinate system is established with the calibration block as a reference;
and determining the point cloud measurement map of the product to be measured based on the affine transformation matrices and a plurality of acquired depth images of the product to be measured.
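As an illustrative sketch of the affine transformation calculation in claim 8 (not the claimed implementation; the homogeneous least-squares formulation is an assumption), a 3x4 affine matrix can be estimated from matched 3-D points, such as detected corner points versus the known calibration block vertices in the world coordinate system:

```python
import numpy as np

def estimate_affine(src, dst):
    # Least-squares affine transform dst ~= A @ src + t from matched
    # 3-D point pairs; returns the 3x4 matrix [A | t].
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_h = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coords
    m, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return m.T

def apply_affine(matrix, points):
    # Map points from the source frame into the destination frame.
    points = np.asarray(points, dtype=float)
    return points @ matrix[:, :3].T + matrix[:, 3]
```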
9. The method according to claim 1, wherein the calibration block is constructed by:
determining a plurality of target calibration assemblies from a plurality of assemblies to be calibrated according to a morphological attribute of the product to be measured and a measurement requirement;
and assembling the plurality of target calibration assemblies to obtain the calibration block.
10. A product map generating apparatus, comprising:
an acquisition unit, configured to respectively acquire a first depth image of a calibration block at a first measurement view angle and a second depth image of the calibration block at a second measurement view angle; the first measurement view angle is different from the second measurement view angle;
a processing unit, configured to perform a first frame selection process on the first depth image according to the first measurement view angle to obtain a first detection frame, and perform a second frame selection process on the second depth image according to the second measurement view angle to obtain a second detection frame; the first frame selection process is different from the second frame selection process;
and a determining unit, configured to determine a point cloud measurement map of a product to be measured corresponding to the calibration block based on the point set included in the first detection frame and the point set included in the second detection frame.
11. A computer device, characterized in that it comprises a processor and a memory, the memory storing a computer program, the processor implementing the method according to any of claims 1-9 when executing the computer program.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the method according to any of claims 1-9.
CN202310106894.1A 2023-01-16 2023-01-16 Product graph generation method, device, computer equipment and storage medium Pending CN116051792A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310106894.1A CN116051792A (en) 2023-01-16 2023-01-16 Product graph generation method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116051792A true CN116051792A (en) 2023-05-02

Family

ID=86119957


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination