CN110319899A - Volume measuring method, device and system - Google Patents

Volume measuring method, device and system

Info

Publication number
CN110319899A
CN110319899A
Authority
CN
China
Prior art keywords
depth
object under test
value
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910741396.8A
Other languages
Chinese (zh)
Inventor
高松山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen JWIPC Technology Co Ltd
Original Assignee
Shenzhen JWIPC Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen JWIPC Technology Co Ltd filed Critical Shenzhen JWIPC Technology Co Ltd
Priority to CN201910741396.8A priority Critical patent/CN110319899A/en
Publication of CN110319899A publication Critical patent/CN110319899A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01F: MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F 17/00: Methods or apparatus for determining the capacity of containers or cavities, or the volume of solid bodies

Abstract

The invention discloses a volume measuring method, device and system. The volume measuring method comprises the following steps: receiving a depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane; receiving a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test; obtaining a target depth corresponding to the target detection point from the depth map of the scene; obtaining a compensation factor and compensating the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values; and obtaining compensated point cloud coordinates from the compensated depth values and obtaining the volume of the object under test from the compensated point cloud coordinates. By compensating the depth values of the depth map with an accurately measured distance value, the invention can correct the error present in the depth values of the depth map and thereby improve the accuracy of the obtained volume of the object under test.

Description

Volume measuring method, device and system
Technical field
The invention belongs to the technical field of object volume measurement, and in particular relates to a volume measuring method, device and system.
Background art
With the popularity of the Internet, more and more scenes require object volumes, typified by logistics goods, to be measured in factories, warehouses and the like. To sort and assemble objects quickly and dispatch them precisely, object volumes must be measured and recorded.
Volume measurement methods based on depth cameras have already appeared. Referring to Fig. 1, a depth camera photographs the object under test from above and obtains a depth map of the scene containing the object under test. A depth map, i.e. a set of depth (Z-axis) coordinates, is also called a range image; it is an image whose pixel values are the distances (depths) from the image acquisition device to the points in the scene, and it directly reflects the geometry of the visible surfaces of the objects in the scene. The depth map of the scene contains depth information corresponding to each pixel; taking point A in Fig. 1 as an example, the corresponding depth is d. The point cloud coordinates of the scene can be obtained from the depth information corresponding to each pixel. The coordinate point set corresponding to the object under test is then filtered out of the point cloud coordinates of the scene. Finally, the volume of the object under test can be calculated from the coordinate point set corresponding to the object under test.
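The conversion from a depth map to point cloud coordinates described above can be sketched as follows; this is a minimal Python example assuming a pinhole camera model with known intrinsics fx, fy, cx and cy (the names and the model are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (one depth value per pixel, in metres)
    into an N x 3 point cloud under a pinhole camera model.

    fx, fy, cx, cy are the intrinsics of the depth camera, assumed known
    from calibration; each pixel (u, v) with depth z maps to
    x = (u - cx) * z / fx and y = (v - cy) * z / fy.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)
```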
However, the accuracy of the depth information obtained by existing depth cameras is not high enough, and the error becomes large especially when the camera is affected by the environment (such as the operating temperature). Taking point A in Fig. 1 as an example, the depth that the depth camera obtains for point A is d, but this value usually contains a certain error. The error is carried into the subsequent calculation of the point cloud coordinates of the scene and of the volume of the object under test, which makes the error of the resulting volume value large.
Summary of the invention
The technical problem to be solved by the embodiments of the present invention is to overcome the defect in the prior art that volume measurement methods based on depth cameras are not accurate enough, and to provide a volume measuring method, device and system.
The embodiments of the present invention solve the above technical problem by the following technical solutions.
A volume measuring method of an embodiment of the present invention comprises the following steps:
receiving a depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane;
receiving a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test;
obtaining a target depth corresponding to the target detection point from the depth map of the scene;
obtaining a compensation factor and compensating the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values, the compensation factor being the ratio of the reference distance value to the target depth;
obtaining compensated point cloud coordinates from the compensated depth values, and obtaining the volume of the object under test from the compensated point cloud coordinates.
Optionally, the single-point ranging device comprises a laser rangefinder; the laser rangefinder is arranged on the preset first horizontal plane, and the detection beam of the laser rangefinder is projected vertically onto the upper surface of the object under test.
Optionally, the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene comprises:
extracting, according to the position information of the laser rangefinder, the depth value corresponding to the target detection point from the depth map of the scene as the target depth.
Optionally, the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene comprises:
obtaining depth data corresponding to the upper surface of the object under test from the depth map of the scene;
obtaining an average depth value and using the average depth value as the target depth, the average depth value being the average of the depth data.
Optionally, the object under test is a cuboid and is placed horizontally;
the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene then comprises:
obtaining the point cloud coordinates of the upper surface of the object under test from the depth map of the scene, and fitting a target plane to the point cloud coordinates by least squares;
obtaining the distance between the target plane and the preset first horizontal plane as the target depth.
Optionally, the volume measuring method further comprises the following steps:
obtaining an image of the scene containing the object under test, obtaining the outline of the object under test in the image and marking it.
An embodiment of the present invention also provides a volume measurement device comprising a depth map receiving unit, a reference distance value receiving unit, a target depth acquiring unit, a compensating unit and a volume output unit.
The depth map receiving unit is configured to receive a depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane.
The reference distance value receiving unit is configured to receive a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test.
The target depth acquiring unit is configured to obtain a target depth corresponding to the target detection point from the depth map of the scene.
The compensating unit is configured to obtain a compensation factor and to compensate the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values; the compensation factor is the ratio of the reference distance value to the target depth.
The volume output unit is configured to obtain compensated point cloud coordinates from the compensated depth values and to obtain the volume of the object under test from the compensated point cloud coordinates.
Optionally, the single-point ranging device comprises a laser rangefinder; the laser rangefinder is arranged on the preset first horizontal plane, and the detection beam of the laser rangefinder is projected vertically onto the upper surface of the object under test.
The target depth acquiring unit is then configured to extract, according to the position information of the laser rangefinder, the depth value corresponding to the target detection point from the depth map of the scene as the target depth.
Optionally, the target depth acquiring unit is also configured to obtain depth data corresponding to the upper surface of the object under test from the depth map of the scene.
The target depth acquiring unit is also configured to obtain an average depth value and to use the average depth value as the target depth, the average depth value being the average of the depth data.
Optionally, the object under test is a cuboid and is placed horizontally.
The target depth acquiring unit is also configured to obtain the point cloud coordinates of the upper surface of the object under test from the depth map of the scene and to fit a target plane to the point cloud coordinates by least squares.
The target depth acquiring unit is also configured to obtain the distance between the target plane and the preset first horizontal plane as the target depth.
An embodiment of the present invention also provides a volume measuring system comprising a depth camera module, a single-point ranging device and the volume measurement device of the embodiment of the present invention.
The depth camera module is configured to obtain the depth map of the scene.
The single-point ranging device is configured to obtain the reference distance value.
Optionally, the volume measuring system further comprises an image acquiring unit and a marking unit.
The image acquiring unit is configured to obtain an image of the scene containing the object under test.
The marking unit is configured to obtain the outline of the object under test in the image and to mark it.
The positive effect of the embodiments of the present invention is that, by compensating the depth values of the depth map with an accurately measured distance value, the error present in the depth values of the depth map can be corrected, thereby improving the accuracy of the obtained volume of the object under test.
Brief description of the drawings
The drawings illustrate several embodiments of the invention and, together with the description, serve to explain the principle of the invention. Those skilled in the art will recognize that the specific embodiments illustrated in the drawings are exemplary and are not intended to limit the scope of the invention.
Fig. 1 is a schematic structural diagram of the volume measurement device of Embodiment 1 of the present invention.
Fig. 2 is a schematic structural diagram of the volume measuring system of Embodiment 1 of the present invention.
Fig. 3 is a schematic diagram of the volume measuring system of Embodiment 1 of the present invention measuring the volume of an object under test.
Fig. 4 is a flow chart of the volume measuring method of Embodiment 1 of the present invention.
Fig. 5 is a schematic diagram of the volume measuring system of Embodiment 2 of the present invention measuring the volume of an object under test.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.
It should be appreciated that, when used in this specification and the appended claims, the terms "include" and "comprise" indicate the presence of the described features, wholes, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, wholes, operations, elements and/or components.
It should also be understood that the terms used in this description of the invention are for the purpose of describing specific embodiments only and are not intended to limit the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in the description of the invention and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
Embodiment 1
This embodiment provides a volume measurement device. Referring to Fig. 1, the device comprises a depth map receiving unit 101, a reference distance value receiving unit 102, a target depth acquiring unit 103, a compensating unit 104 and a volume output unit 105.
The depth map receiving unit 101 is configured to receive the depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane. The reference distance value receiving unit 102 is configured to receive a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test. The target depth acquiring unit 103 is configured to obtain a target depth corresponding to the target detection point from the depth map of the scene. The compensating unit 104 is configured to obtain a compensation factor and to compensate the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values; the compensation factor is the ratio of the reference distance value to the target depth. The volume output unit 105 is configured to obtain compensated point cloud coordinates from the compensated depth values and to obtain the volume of the object under test from the compensated point cloud coordinates.
This embodiment also provides a volume measuring system. Referring to Fig. 2, the system comprises a depth camera module 2, a single-point ranging device 3 and the volume measurement device 1 of this embodiment. The depth camera module 2 is configured to obtain the depth map of the scene. The single-point ranging device 3 is configured to obtain the reference distance value.
As an alternative embodiment, the single-point ranging device 3 comprises a laser rangefinder 11. Referring to Fig. 3, the laser rangefinder 11 is arranged on the preset first horizontal plane 12, and the detection beam 14 of the laser rangefinder 11 is projected vertically downwards onto the upper surface of the object under test 13; the point at which the detection beam 14 of the laser rangefinder 11 is aimed is the target detection point 15. The depth camera module 2 is also arranged on the preset first horizontal plane 12, and its shooting range is illustrated with a dotted line in the figure. The target depth acquiring unit 103 is configured to extract, according to the position information of the laser rangefinder 11, the depth value corresponding to the target detection point from the depth map of the scene as the target depth.
During volume measurement, the depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. The depth map receiving unit 101 receives the depth map from the depth camera module 2. The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. The reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
Once set up, the position of the laser rangefinder 11 is known, and from the position at which the laser rangefinder 11 is arranged, the position of its corresponding target detection point 15 in the depth map can be determined. The target depth acquiring unit 103 extracts, according to the position information of the laser rangefinder 11, the depth value corresponding to the target detection point (denoted h2) from the depth map of the scene as the target depth.
Next, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h2. The compensating unit 104 then compensates the depth values of the depth map of the scene according to the compensation factor c1 to obtain compensated depth values; that is, the depth value corresponding to each point in the depth map is multiplied by the compensation factor c1 to obtain the compensated depth value corresponding to each point.
Next, the volume output unit 105 obtains compensated point cloud coordinates from the compensated depth values, filters the coordinate point set corresponding to the object under test 13 out of the compensated point cloud coordinates, and obtains the volume of the object under test 13 from the coordinate point set corresponding to the object under test 13.
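A minimal Python sketch of this compensation and volume step follows, assuming the depth map is referenced to the first horizontal plane and that the pixel (u0, v0) hit by the rangefinder beam, the depth of the base carrying the object and the pinhole focal lengths fx, fy are known; the threshold and all names are illustrative assumptions:

```python
import numpy as np

def compensated_volume(depth, h1, u0, v0, base_depth, fx, fy, margin=0.01):
    """Compensate the depth map with the reference distance h1 from the
    single-point rangefinder and estimate the object's volume.

    depth      : depth map in metres, referenced to the first horizontal plane
    h1         : reference distance measured by the laser rangefinder
    (u0, v0)   : pixel onto which the rangefinder beam projects (assumed known)
    base_depth : depth of the base carrying the object (assumed known)
    fx, fy     : pinhole focal lengths in pixels (assumed known)
    """
    h2 = depth[v0, u0]                   # target depth read from the depth map
    c1 = h1 / h2                         # compensation factor
    comp = depth * c1                    # compensated depth values

    obj = comp < (base_depth - margin)   # pixels belonging to the object
    heights = base_depth - comp[obj]     # height of the column under each pixel
    # Approximate footprint of one pixel at range z under the pinhole model.
    pixel_area = (comp[obj] / fx) * (comp[obj] / fy)
    return float(np.sum(heights * pixel_area))
```

In this sketch the volume is obtained by summing one vertical column per object pixel; the embodiment only requires that the volume be computed from the compensated point cloud coordinates, so any point-cloud-based volume estimate could replace the last two lines.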
As an alternative embodiment, the volume measurement device of this embodiment is implemented with a chip, and the depth map receiving unit 101, the reference distance value receiving unit 102, the target depth acquiring unit 103, the compensating unit 104 and the volume output unit 105 are integrated in the chip. In another alternative embodiment, the volume measurement device is implemented with a processor, and the functions corresponding to the depth map receiving unit 101, the reference distance value receiving unit 102, the target depth acquiring unit 103, the compensating unit 104 and the volume output unit 105 are all implemented by computer programs.
As an alternative embodiment, an image acquiring unit obtains an image of the scene containing the object under test 13. The image acquiring unit comprises an RGB (red, green, blue) camera module; the RGB camera module acquires a colour image of the scene containing the object under test 13, providing the user with a visual measurement experience. A marking unit is configured to obtain the outline of the object under test 13 in the image and to mark it. The marking unit identifies the edge outline of the object under test 13 in the image obtained by the RGB camera module according to at least one of a bilateral filtering algorithm, a Kalman filtering algorithm, a Gaussian filtering algorithm and the like, and marks the outline in an eye-catching manner so that the user can conveniently locate the object under test 13.
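A minimal Python sketch of such outline marking, assuming OpenCV is available; bilateral filtering followed by Canny edge detection is used here as one possible realisation of the filtering approaches mentioned above, and all thresholds are illustrative:

```python
import cv2

def mark_object_outline(bgr_image):
    """Smooth the colour image, detect edges and draw the largest outer
    contour in an eye-catching colour as the outline of the object under
    test. The thresholds and the choice of Canny edges are illustrative.
    """
    smoothed = cv2.bilateralFilter(bgr_image, 9, 75, 75)
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        cv2.drawContours(bgr_image, [largest], -1, (0, 0, 255), 3)
    return bgr_image
```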
This embodiment also provides a volume measuring method. Referring to Fig. 4, the method comprises the following steps.
Step S301: receive the depth map of the scene containing the object under test 13; the depth map of the scene is referenced to the preset first horizontal plane.
Step S302: receive a reference distance value from the single-point ranging device; the reference distance value is the distance between the target detection point and the preset first horizontal plane, and the target detection point is located on the upper surface of the object under test 13.
Step S303: obtain a target depth corresponding to the target detection point from the depth map of the scene.
Step S304: obtain a compensation factor and compensate the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values; the compensation factor is the ratio of the reference distance value to the target depth.
Step S305: obtain compensated point cloud coordinates from the compensated depth values, and obtain the volume of the object under test 13 from the compensated point cloud coordinates.
In an alternative embodiment, the volume measuring method is implemented by the volume measurement device of this embodiment.
In another alternative embodiment, the volume measuring method is implemented by the volume measuring system of this embodiment. The depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. In step S301, the depth map receiving unit 101 receives the depth map from the depth camera module 2.
The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. In step S302, the reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
Once set up, the position of the laser rangefinder 11 is known, and from the position at which the laser rangefinder 11 is arranged, the position of its corresponding target detection point 15 in the depth map can be determined. In step S303, the target depth acquiring unit 103 extracts, according to the position information of the laser rangefinder 11, the depth value corresponding to the target detection point (denoted h2) from the depth map of the scene as the target depth.
Next, in step S304, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h2. The compensating unit 104 then compensates the depth values of the depth map of the scene according to the compensation factor c1 to obtain compensated depth values; that is, the depth value corresponding to each point in the depth map is multiplied by the compensation factor c1 to obtain the compensated depth value corresponding to each point.
Next, in step S305, the volume output unit 105 obtains compensated point cloud coordinates from the compensated depth values, filters the coordinate point set corresponding to the object under test 13 out of the compensated point cloud coordinates, and obtains the volume of the object under test 13 from the coordinate point set corresponding to the object under test 13.
As an alternative embodiment, to make it easy for the user to check the result, the volume measuring method of this embodiment further comprises the following steps: obtaining an image of the scene containing the object under test 13; obtaining the outline of the object under test 13 in the image and marking it.
By compensating the depth values of the depth map with an accurately measured distance value, this embodiment can correct the error present in the depth values of the depth map, thereby improving the accuracy of the obtained volume of the object under test 13. Using the depth value of a single point in the depth map as the target depth can effectively reduce the amount of computation.
Embodiment 2
This embodiment provides a volume measurement device that is basically the same as the volume measurement device of Embodiment 1; the difference is that the target depth acquiring unit 103 of the volume measurement device of this embodiment is different. In this embodiment, referring to Fig. 5, the object under test 13 is a cuboid and is placed horizontally. The target depth acquiring unit 103 obtains the pixel set corresponding to the object under test 13 from the depth map of the scene. Because the depth values corresponding to the object under test 13 differ significantly from the depth values of the background (the base carrying the object under test 13), the object under test 13 can be separated from the background according to the depth values, so that depth data corresponding to the upper surface of the object under test are obtained from the depth map of the scene. The depth data comprise the depth value corresponding to each pixel of the upper surface of the object under test in the depth map. The target depth acquiring unit 103 then obtains an average depth value as the target depth; the average depth value is the average of the depth data, i.e. the average of the depth values of all pixels corresponding to the object under test 13 (the depth values of all pixels corresponding to the object under test 13 are summed and then divided by the number of pixels corresponding to the object under test 13). Because the object under test 13 is a cuboid and is placed horizontally, the average depth value can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12. Using the average depth value as the target depth reduces the influence of the random error in the depth value of each individual point and improves accuracy.
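A minimal Python sketch of this averaged target depth, assuming the object can be separated from the base by a simple depth threshold; the base depth and the margin are illustrative assumptions:

```python
import numpy as np

def target_depth_by_average(depth, base_depth, margin=0.01):
    """Separate the object's upper-surface pixels from the background by
    their depth values and return their mean depth as the target depth h3.

    depth      : depth map in metres, referenced to the first horizontal plane
    base_depth : depth of the base carrying the object (assumed known)
    margin     : minimum depth difference used to split object from base
    """
    object_mask = depth < (base_depth - margin)
    if not np.any(object_mask):
        raise ValueError("no pixels found above the base")
    return float(depth[object_mask].mean())
```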
The other parts of the volume measurement device of this embodiment are basically the same as those of the volume measurement device of Embodiment 1 and are not repeated here.
This embodiment also provides a volume measuring system that is basically the same as the volume measuring system of Embodiment 1; the difference is that the volume measuring system of this embodiment includes the volume measurement device of this embodiment, that is, the target depth acquiring unit 103 differs.
During volume measurement, the depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. The depth map receiving unit 101 receives the depth map from the depth camera module 2. The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. The reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
The target depth acquiring unit 103 obtains the pixel set corresponding to the object under test 13 from the depth map of the scene. Because the depth values corresponding to the object under test 13 differ significantly from the depth values of the background (the base carrying the object under test 13), the object under test 13 can be separated from the background according to the depth values. The target depth acquiring unit 103 then obtains an average depth value h3 as the target depth; the average depth value h3 is the average of the depth values of all pixels corresponding to the object under test 13, i.e. the depth values of all pixels corresponding to the object under test 13 are summed and then divided by the number of pixels corresponding to the object under test 13. Because the object under test 13 is a cuboid and is placed horizontally, the average depth value can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12.
Next, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h3. The compensating unit 104 then compensates the depth values corresponding to the object under test 13 according to the compensation factor c1 to obtain first compensated depth values; that is, each depth value corresponding to the object under test 13 in the depth map is multiplied by the compensation factor c1 to obtain the first compensated depth values corresponding to the object under test 13.
Next, the volume output unit 105 obtains the compensated point cloud coordinates corresponding to the object under test 13 from the first compensated depth values corresponding to the object under test 13, and obtains the volume of the object under test 13 from the compensated point cloud coordinates corresponding to the object under test 13.
This embodiment also provides a volume measuring method that is basically the same as the volume measuring method of Embodiment 1; the difference lies in the step of obtaining the target depth.
In an alternative embodiment, the volume measuring method is implemented by the volume measurement device of this embodiment.
In another alternative embodiment, the volume measuring method is implemented by the volume measuring system of this embodiment. The depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. In step S301, the depth map receiving unit 101 receives the depth map from the depth camera module 2.
The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. In step S302, the reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
In step S303, the target depth acquiring unit 103 obtains the pixel set corresponding to the object under test 13 from the depth map of the scene. Because the depth values corresponding to the object under test 13 differ significantly from the depth values of the background (the base carrying the object under test 13), the object under test 13 can be separated from the background according to the depth values. The target depth acquiring unit 103 then obtains an average depth value h3 as the target depth; the average depth value h3 is the average of the depth values of all pixels corresponding to the object under test 13, i.e. the depth values of all pixels corresponding to the object under test 13 are summed and then divided by the number of pixels corresponding to the object under test 13. Because the object under test 13 is a cuboid and is placed horizontally, the average depth value can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12.
Next, in step S304, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h3. The compensating unit 104 then compensates the depth values corresponding to the object under test 13 according to the compensation factor c1 to obtain first compensated depth values; that is, each depth value corresponding to the object under test 13 in the depth map is multiplied by the compensation factor c1 to obtain the first compensated depth values corresponding to the object under test 13.
Next, in step S305, the volume output unit 105 obtains the compensated point cloud coordinates corresponding to the object under test 13 from the first compensated depth values corresponding to the object under test 13, and obtains the volume of the object under test 13 from the compensated point cloud coordinates corresponding to the object under test 13.
Embodiment 3
This embodiment provides a volume measurement device that is basically the same as the volume measurement device of Embodiment 1; the difference is that the target depth acquiring unit 103 of the volume measurement device of this embodiment is different. In this embodiment, referring to Fig. 5, the object under test 13 is a cuboid and is placed horizontally. The target depth acquiring unit 103 obtains the point cloud coordinates of the scene from the depth map of the scene. The coordinate point set corresponding to the object under test 13 is then filtered out of the point cloud coordinates of the scene; that is, the target depth acquiring unit 103 obtains the coordinates of each point corresponding to the upper surface of the object under test 13. Next, from the coordinates of the points corresponding to the upper surface of the object under test 13, the target depth acquiring unit 103 fits, by least squares, the function of the plane corresponding to the upper surface of the object under test 13, which is called the function of the target plane. The target depth acquiring unit 103 then obtains the distance h4 between the target plane and the preset first horizontal plane 12. Once the preset first horizontal plane 12 has been set, its corresponding function is known; the distance between the target plane and the preset first horizontal plane 12 can therefore be obtained. The target depth acquiring unit 103 takes the distance h4 as the target depth. Because the object under test 13 is a cuboid and is placed horizontally, the target plane fits the plane containing the upper surface of the object under test 13 closely; the target plane can therefore characterize the plane containing the upper surface of the object under test 13, and in turn the distance h4 can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12.
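A minimal Python sketch of this least-squares fit, assuming the upper-surface points have already been filtered out of the scene point cloud and expressing the target plane as z = a*x + b*y + c; the plane model and distance formula are standard, not quoted from the patent:

```python
import numpy as np

def target_depth_by_plane_fit(surface_points):
    """Fit the plane z = a*x + b*y + c to the upper-surface points by least
    squares and return the distance from the first horizontal plane (z = 0
    in this coordinate frame) to the fitted target plane as h4.

    surface_points : (N, 3) array of [x, y, z] coordinates of the points
                     corresponding to the object's upper surface
    """
    A = np.column_stack([surface_points[:, 0],
                         surface_points[:, 1],
                         np.ones(len(surface_points))])
    coeffs, *_ = np.linalg.lstsq(A, surface_points[:, 2], rcond=None)
    a, b, c = coeffs
    # Perpendicular distance from the origin of the reference plane to the
    # plane a*x + b*y - z + c = 0; for a horizontal upper surface a and b
    # are close to zero, so this is essentially |c|.
    return abs(c) / np.sqrt(a * a + b * b + 1.0)
```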
The other parts of the volume measurement device of this embodiment are basically the same as those of the volume measurement device of Embodiment 1 and are not repeated here.
This embodiment also provides a volume measuring system that is basically the same as the volume measuring system of Embodiment 1; the difference is that the volume measuring system of this embodiment includes the volume measurement device of this embodiment, that is, the target depth acquiring unit 103 differs.
During volume measurement, the depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. The depth map receiving unit 101 receives the depth map from the depth camera module 2. The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. The reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
The target depth acquiring unit 103 obtains the point cloud coordinates of the scene from the depth map of the scene. The coordinate point set corresponding to the object under test 13 is then filtered out of the point cloud coordinates of the scene; that is, the target depth acquiring unit 103 obtains the coordinates of each point corresponding to the upper surface of the object under test 13. Next, from the coordinates of the points corresponding to the upper surface of the object under test 13, the target depth acquiring unit 103 fits, by least squares, the function of the plane corresponding to the upper surface of the object under test 13, which is called the function of the target plane. The target depth acquiring unit 103 then obtains the distance h4 between the target plane and the preset first horizontal plane 12. Once the preset first horizontal plane 12 has been set, its corresponding function is known; the distance between the target plane and the preset first horizontal plane 12 can therefore be obtained. The target depth acquiring unit 103 takes the distance h4 as the target depth. Because the object under test 13 is a cuboid and is placed horizontally, the target plane can characterize the plane containing the upper surface of the object under test 13, and in turn the distance h4 can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12.
Next, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h4. The compensating unit 104 then compensates the depth value corresponding to each point of the object under test 13 according to the compensation factor c1 to obtain first compensated depth values; that is, each depth value corresponding to the object under test 13 in the depth map is multiplied by the compensation factor c1 to obtain the first compensated depth values corresponding to the object under test 13.
Next, the volume output unit 105 obtains the compensated point cloud coordinates corresponding to the object under test 13 from the first compensated depth values corresponding to the object under test 13, and obtains the volume of the object under test 13 from the compensated point cloud coordinates corresponding to the object under test 13.
This embodiment also provides a volume measuring method that is basically the same as the volume measuring method of Embodiment 1; the difference lies in the step of obtaining the target depth.
In an alternative embodiment, the volume measuring method is implemented by the volume measurement device of this embodiment.
In another alternative embodiment, the volume measuring method is implemented by the volume measuring system of this embodiment. The depth camera module 2 obtains the depth map of the scene containing the object under test 13; the depth map uses the depth value of each point as its pixel value. In step S301, the depth map receiving unit 101 receives the depth map from the depth camera module 2.
The laser rangefinder 11 obtains the distance h1 between itself and the target detection point 15. Because the detection beam 14 of the laser rangefinder 11 is vertical, the distance h1 is the distance between the target detection point 15 and the preset first horizontal plane 12. Because the distance measured by the laser rangefinder 11 has high accuracy, the distance h1 is taken as the reference distance value. In step S302, the reference distance value receiving unit 102 receives the reference distance value from the laser rangefinder 11.
In step S303, the target depth acquiring unit 103 obtains the point cloud coordinates of the scene from the depth map of the scene. The coordinate point set corresponding to the object under test 13 is then filtered out of the point cloud coordinates of the scene; that is, the target depth acquiring unit 103 obtains the coordinates of each point corresponding to the upper surface of the object under test 13. Next, from the coordinates of the points corresponding to the upper surface of the object under test 13, the target depth acquiring unit 103 fits, by least squares, the function of the plane corresponding to the upper surface of the object under test 13, which is called the function of the target plane. The target depth acquiring unit 103 then obtains the distance h4 between the target plane and the preset first horizontal plane 12. Once the preset first horizontal plane 12 has been set, its corresponding function is known; the distance between the target plane and the preset first horizontal plane 12 can therefore be obtained. The target depth acquiring unit 103 takes the distance h4 as the target depth. Because the object under test 13 is a cuboid and is placed horizontally, the target plane can characterize the plane containing the upper surface of the object under test 13, and in turn the distance h4 can characterize the distance between the upper surface of the object under test 13 obtained by the depth camera module 2 and the preset first horizontal plane 12.
Next, in step S304, the compensating unit 104 obtains the compensation factor c1, where c1 = h1/h4. The compensating unit 104 then compensates the depth values corresponding to the object under test 13 according to the compensation factor c1 to obtain first compensated depth values; that is, each depth value corresponding to the object under test 13 in the depth map is multiplied by the compensation factor c1 to obtain the first compensated depth values corresponding to the object under test 13.
Next, in step S305, the volume output unit 105 obtains the compensated point cloud coordinates corresponding to the object under test 13 from the first compensated depth values corresponding to the object under test 13, and obtains the volume of the object under test 13 from the coordinate point set corresponding to the object under test 13.
Although specific embodiments of the present invention have been described above, it will be appreciated by those skilled in the art that they are only examples, and the protection scope of the present invention is defined by the appended claims. Those skilled in the art can make many changes and modifications to these embodiments without departing from the principle and substance of the present invention, and such changes and modifications all fall within the protection scope of the present invention.

Claims (10)

1. A volume measuring method, characterized by comprising the following steps:
receiving a depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane;
receiving a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test;
obtaining a target depth corresponding to the target detection point from the depth map of the scene;
obtaining a compensation factor and compensating the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values, the compensation factor being the ratio of the reference distance value to the target depth;
obtaining compensated point cloud coordinates from the compensated depth values, and obtaining the volume of the object under test from the compensated point cloud coordinates.
2. The volume measuring method according to claim 1, characterized in that the single-point ranging device comprises a laser rangefinder, the laser rangefinder is arranged on the preset first horizontal plane, and the detection beam of the laser rangefinder is projected vertically onto the upper surface of the object under test.
3. The volume measuring method according to claim 2, characterized in that the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene comprises:
extracting, according to the position information of the laser rangefinder, the depth value corresponding to the target detection point from the depth map of the scene as the target depth.
4. The volume measuring method according to claim 1 or 2, characterized in that the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene comprises:
obtaining depth data corresponding to the upper surface of the object under test from the depth map of the scene;
obtaining an average depth value and using the average depth value as the target depth, the average depth value being the average of the depth data.
5. The volume measuring method according to claim 1 or 2, characterized in that the object under test is a cuboid and is placed horizontally;
the step of obtaining a target depth corresponding to the target detection point from the depth map of the scene then comprises:
obtaining the point cloud coordinates of the upper surface of the object under test from the depth map of the scene, and fitting a target plane to the point cloud coordinates by least squares;
obtaining the distance between the target plane and the preset first horizontal plane as the target depth.
6. A volume measurement device, characterized by comprising a depth map receiving unit, a reference distance value receiving unit, a target depth acquiring unit, a compensating unit and a volume output unit;
the depth map receiving unit is configured to receive a depth map of a scene containing an object under test, the depth map of the scene being referenced to a preset first horizontal plane;
the reference distance value receiving unit is configured to receive a reference distance value from a single-point ranging device, the reference distance value being the distance between a target detection point and the preset first horizontal plane, the target detection point being located on the upper surface of the object under test;
the target depth acquiring unit is configured to obtain a target depth corresponding to the target detection point from the depth map of the scene;
the compensating unit is configured to obtain a compensation factor and to compensate the depth values of the depth map of the scene according to the compensation factor to obtain compensated depth values, the compensation factor being the ratio of the reference distance value to the target depth;
the volume output unit is configured to obtain compensated point cloud coordinates from the compensated depth values and to obtain the volume of the object under test from the compensated point cloud coordinates.
7. The volume measurement device according to claim 6, characterized in that the single-point ranging device comprises a laser rangefinder, the laser rangefinder is arranged on the preset first horizontal plane, and the detection beam of the laser rangefinder is projected vertically onto the upper surface of the object under test;
the target depth acquiring unit is then configured to extract, according to the position information of the laser rangefinder, the depth value corresponding to the target detection point from the depth map of the scene as the target depth.
8. The volume measurement device according to claim 6 or 7, characterized in that the target depth acquiring unit is also configured to obtain depth data corresponding to the upper surface of the object under test from the depth map of the scene;
the target depth acquiring unit is also configured to obtain an average depth value and to use the average depth value as the target depth, the average depth value being the average of the depth data.
9. The volume measurement device according to claim 6 or 7, characterized in that the object under test is a cuboid and is placed horizontally;
the target depth acquiring unit is also configured to obtain the point cloud coordinates of the upper surface of the object under test from the depth map of the scene and to fit a target plane to the point cloud coordinates by least squares;
the target depth acquiring unit is also configured to obtain the distance between the target plane and the preset first horizontal plane as the target depth.
10. A volume measuring system, characterized in that the volume measuring system comprises a depth camera module, a single-point ranging device and the volume measurement device according to any one of claims 6 to 9;
the depth camera module is configured to obtain the depth map of the scene;
the single-point ranging device is configured to obtain the reference distance value;
the volume measuring system further comprises an image acquiring unit and a marking unit;
the image acquiring unit is configured to obtain an image of the scene containing the object under test;
the marking unit is configured to obtain the outline of the object under test in the image and to mark it.
CN201910741396.8A 2019-08-12 2019-08-12 Volume measuring method, device and system Pending CN110319899A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910741396.8A CN110319899A (en) 2019-08-12 2019-08-12 Volume measuring method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910741396.8A CN110319899A (en) 2019-08-12 2019-08-12 Volume measuring method, device and system

Publications (1)

Publication Number Publication Date
CN110319899A true CN110319899A (en) 2019-10-11

Family

ID=68126004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910741396.8A Pending CN110319899A (en) 2019-08-12 2019-08-12 Volume measuring method, device and system

Country Status (1)

Country Link
CN (1) CN110319899A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101533529A (en) * 2009-01-23 2009-09-16 北京建筑工程学院 Range image-based 3D spatial data processing method and device
CN102609941A (en) * 2012-01-31 2012-07-25 北京航空航天大学 Three-dimensional registering method based on ToF (Time-of-Flight) depth camera
CN103927782A (en) * 2014-01-06 2014-07-16 河南科技大学 Method for depth image surface fitting
US20150365652A1 (en) * 2014-06-13 2015-12-17 Lips Corporation Depth camera system
US20180093781A1 (en) * 2016-09-30 2018-04-05 Sony Interactive Entertainment Inc. Unmanned aerial vehicle movement via environmental interactions
CN106767933A (en) * 2017-02-10 2017-05-31 深圳奥比中光科技有限公司 The measuring system of depth camera error, measuring method, evaluation method and compensation method
CN107255821A (en) * 2017-06-07 2017-10-17 旗瀚科技有限公司 A kind of method for splicing simulated laser radar data based on many depth cameras
CN107727220A (en) * 2017-10-11 2018-02-23 上海展扬通信技术有限公司 A kind of human body measurement method and body measurement system based on intelligent terminal
CN108010071A (en) * 2017-12-01 2018-05-08 中国人民解放军后勤工程学院 A kind of Luminance Distribution measuring system and method using 3D depth surveys

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
商佳尚: "差值法与比值法的误差修正效果比较研究及应用" (Comparative study and application of the error correction effects of the difference method and the ratio method), 《计测技术》 (Metrology & Measurement Technology) *

Similar Documents

Publication Title
CN106969706A (en) Workpiece sensing and three-dimension measuring system and detection method based on binocular stereo vision
CN108986070B (en) Rock crack propagation experiment monitoring method based on high-speed video measurement
CN106392304B (en) A kind of laser assisted weld seam Intelligent tracing system and method
US20180108143A1 (en) Height measuring system and method
CN108733053A (en) A kind of Intelligent road detection method based on robot
CN107044821A (en) A kind of measuring method and system of contactless tubing object
CN102063718A (en) Field calibration and precision measurement method for spot laser measuring system
US10290117B2 (en) System for extracting position information of object in point cloud data by using component
CN107869954B (en) Binocular vision volume weight measurement system and implementation method thereof
CN110672020A (en) Stand tree height measuring method based on monocular vision
CN107101582A (en) Axial workpiece run-out error On-line Measuring Method based on structure light vision
CN104034733A (en) Service life prediction method based on binocular vision monitoring and surface crack image recognition
CN103196370A (en) Measuring method and measuring device of conduit connector space pose parameters
CN107356202A (en) A kind of laser scanning measurement system target sights method automatically
KR20130033374A (en) Surveying method
CN105405126A (en) Multi-scale air-ground parameter automatic calibration method based on monocular vision system
EP1459035B1 (en) Method for determining corresponding points in stereoscopic three-dimensional measurements
CN107345789A (en) A kind of pcb board hole location detecting device and method
CN105894511A (en) Calibration target setting method and device and parking auxiliary system
CN107504917B (en) Three-dimensional size measuring method and device
CN107271445B (en) Defect detection method and device
CN103175512B (en) Shooting measurement method of attitude of tail end of boom of concrete pump truck
CN109035343A (en) A kind of floor relative displacement measurement method based on monitoring camera
CN108871185A (en) Method, apparatus, equipment and the computer readable storage medium of piece test
CN110517314A (en) Method, apparatus and computer readable storage medium are determined based on the pallet pose of TOF camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20191011