CN116538926A - Handheld dimension measuring device and dimension measuring method - Google Patents


Info

Publication number
CN116538926A
Authority
CN
China
Prior art keywords
distance
physical
auxiliary line
calculation mode
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310602563.7A
Other languages
Chinese (zh)
Inventor
丁超
林喆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sunmi Technology Group Co Ltd
Shenzhen Michelangelo Technology Co Ltd
Original Assignee
Shanghai Sunmi Technology Group Co Ltd
Shenzhen Michelangelo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sunmi Technology Group Co Ltd, Shenzhen Michelangelo Technology Co Ltd filed Critical Shanghai Sunmi Technology Group Co Ltd
Priority to CN202310602563.7A priority Critical patent/CN116538926A/en
Publication of CN116538926A publication Critical patent/CN116538926A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/08: Measuring arrangements characterised by the use of optical techniques for measuring diameters

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a handheld dimension measuring device and a dimension measuring method. The device comprises a camera for acquiring images of an object to be measured; a TOF sensor for acquiring the physical distance from the camera to the object to be measured; an interaction module for determining a start point and an end point of the measurement according to user operation on a target image of the object to be measured and for acquiring a calculation mode, the calculation mode comprising a length calculation mode and a diameter calculation mode; and a processor for identifying the target image of the object to be measured from the image, calculating the pixel distance from the start point to the end point, and calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance and the calculation mode. The invention calculates the outer dimension or the inner diameter of the object to be measured by combining the camera's intrinsic parameters with geometric relationships; no 3D image needs to be generated, the calculation is simple, and the computational load is small.

Description

Handheld dimension measuring device and dimension measuring method
Technical Field
The invention relates generally to the technical field of measurement, and in particular to a handheld dimension measuring device and a dimension measuring method.
Background
With the development of intelligent technology, there is an increasing need to measure the geometric dimensions of objects from images taken with handheld devices. For example, in a measurement scenario for agricultural and livestock products, a quality inspector may wish to photograph fruit, vegetables or poultry with a handheld device to measure their geometry. A common dimension measurement method is to generate 3D depth information with a monocular camera, build a 3D image from that depth information, and calculate the distance between two points in space from the 3D image. The computational cost of generating a 3D map on a handheld monocular device is too high, cloud service resources are generally needed, and the method is therefore unsuitable for scenarios without Internet access.
Thus, there is a need for a handheld dimension measuring device and a dimension measuring method that are computationally inexpensive and do not require networking.
Disclosure of Invention
The invention aims to solve the technical problems that the current size measurement method needs to generate a 3D map, is too high in calculation cost and is not suitable for handheld equipment.
In order to solve the above technical problems, the present invention provides a handheld dimension measuring device, including: the camera is used for collecting images of the object to be detected; the TOF sensor is used for acquiring the physical distance from the camera to the object to be detected; the interaction module is used for determining a starting point and an end point of the measurement of the object to be measured according to the user operation on the target image and acquiring a calculation mode, wherein the calculation mode comprises a length calculation mode and a diameter calculation mode; and the processor is used for identifying a target image of the object to be detected from the image, calculating the pixel distance from the starting point to the end point, and calculating the physical size of the object to be detected according to the focal length of the image, the pixel distance, the physical distance and the calculation mode.
Optionally, the processor is further configured to: when the calculation mode is the length calculation mode, calculate the physical length of the object to be measured as the pixel distance multiplied by the ratio of the physical distance to the focal length.
Optionally, the processor is further configured to calculate the physical length using the following formula:
P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.
Optionally, the processor is further configured to: when the calculation mode is the diameter calculation mode, acquire the deviation distance between the midpoint of the start point and the end point and the photosensitive center of the camera, and calculate the physical diameter of the cross section of the object to be measured through the start point and the end point according to the focal length, the physical distance, the deviation distance and the pixel distance.
Optionally, the processor is further configured to: calculating a first angle and a second angle according to the focal length, the offset distance and the pixel distance; calculating a first auxiliary line distance, a second auxiliary line distance and a third auxiliary line distance according to the physical distance, the first angle and the second angle; the physical diameter is calculated from the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
Optionally, the processor is further configured to calculate the first auxiliary line distance, the second auxiliary line distance, and the third auxiliary line distance by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance.
Optionally, the processor is further configured to calculate the physical diameter using the following formula:
c² = M1² + M3² + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is the physical radius, the physical diameter is 2r, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ1 is the first angle, and θ2 is the second angle.
Optionally, the processor includes: the semantic segmentation model is used for carrying out semantic segmentation on the image and outputting target image pixels and background area pixels; the processor is further configured to import the target image pixel into a newly created layer to obtain the target image.
Optionally, the interaction module is further configured to: acquiring a ranging straight line on the target image; and determining the starting point and the end point according to the intersection point of the ranging straight line and the target image boundary.
In order to solve the above technical problems, the present invention provides a dimension measurement method, including: acquiring an image of an object to be measured through a camera, and acquiring the physical distance from the camera to the object to be measured based on a TOF technology; identifying a target image of the object to be detected from the image; determining a starting point and an end point of the measurement of the object to be measured according to the user operation on the target image, and calculating the pixel distance from the starting point to the end point; acquiring a calculation mode, wherein the calculation mode comprises a length calculation mode and a diameter calculation mode; and calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance and the calculation mode.
Optionally, calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance, and the calculation mode includes: when the calculation mode is the length calculation mode, calculating the physical length of the object to be measured as the pixel distance multiplied by the ratio of the physical distance to the focal length.
Alternatively, the physical length is calculated using the following formula:
P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.
Optionally, calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance, and the calculation mode includes: when the calculation mode is the diameter calculation mode, acquiring the deviation distance between the midpoint of the start point and the end point and the photosensitive center of the camera, and calculating the physical diameter of the cross section of the object to be measured through the start point and the end point according to the focal length, the physical distance, the deviation distance and the pixel distance.
Optionally, calculating the physical diameter of the cross section of the object to be measured at the start point and the end point according to the focal length, the physical distance, the offset distance and the pixel distance includes: calculating a first angle and a second angle according to the focal length, the offset distance and the pixel distance; calculating a first auxiliary line distance, a second auxiliary line distance and a third auxiliary line distance according to the physical distance, the first angle and the second angle; the physical diameter is calculated from the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
Optionally, the first auxiliary line distance, the second auxiliary line distance, and the third auxiliary line distance are calculated by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance.
Alternatively, the physical diameter is calculated using the following formula:
c² = M1² + M3² + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is the physical radius, the physical diameter is 2r, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ1 is the first angle, and θ2 is the second angle.
Optionally, identifying the target image of the object to be measured from the image includes: inputting the image into a trained semantic segmentation model and outputting target image pixels and background region pixels; and importing the target image pixels into a newly created layer to obtain the target image.
Optionally, acquiring the starting point and the end point of the measurement of the object to be measured according to the target image includes: acquiring a ranging straight line on the target image; and determining the starting point and the end point according to the intersection point of the ranging straight line and the target image boundary.
Compared with the prior art, the invention has the following advantages:
the invention provides a handheld dimension measuring device and a dimension measuring method, which are used for calculating the outer dimension or the inner diameter of an object to be measured by combining camera internal parameters and mathematical geometric relations, and are simple in calculation and do not need networking, and a 3D image is not required to be generated.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the principles of the invention. In the accompanying drawings:
FIG. 1 is a system block diagram of a handheld dimensional measurement device according to an embodiment of the invention.
FIG. 2 is an application scenario diagram of the handheld dimension measurement device of one embodiment of FIG. 1.
Fig. 3 is a schematic diagram of a first state of an interaction module according to an embodiment of the invention.
Fig. 4 is a schematic diagram of the geometrical relationship corresponding to fig. 3.
Fig. 5 is a second state diagram of an interaction module according to an embodiment of the invention.
Fig. 6 is a schematic diagram of calculation of the object to be measured corresponding to fig. 5.
Fig. 7 is a schematic diagram of the geometrical relationship corresponding to fig. 6.
Fig. 8 is a flow chart of a TOF-based dimension measurement method in accordance with an embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application may be applied to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
As used in this application and in the claims, the terms "a," "an," and/or "the" are not specific to the singular and may include the plural unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the steps and elements are explicitly identified; they do not constitute an exclusive list, as a method or apparatus may include other steps or elements.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present application unless it is specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description. Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but should be considered part of the specification where appropriate. In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
In the description of the present application, it should be understood that, where azimuth terms such as "front, rear, upper, lower, left, right", "transverse, vertical, horizontal", and "top, bottom", etc., indicate azimuth or positional relationships generally based on those shown in the drawings, only for convenience of description and simplification of the description, these azimuth terms do not indicate and imply that the apparatus or elements referred to must have a specific azimuth or be constructed and operated in a specific azimuth, and thus should not be construed as limiting the scope of protection of the present application; the orientation word "inner and outer" refers to inner and outer relative to the contour of the respective component itself.
Spatially relative terms, such as "above … …," "above … …," "upper surface at … …," "above," and the like, may be used herein for ease of description to describe one device or feature's spatial location relative to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "above" or "over" other devices or structures would then be oriented "below" or "beneath" the other devices or structures. Thus, the exemplary term "above … …" may include both orientations of "above … …" and "below … …". The device may also be positioned in other different ways (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
In addition, the terms "first", "second", etc. are used to define the components, and are merely for convenience of distinguishing the corresponding components, and unless otherwise stated, the terms have no special meaning, and thus should not be construed as limiting the scope of the present application. Furthermore, although terms used in the present application are selected from publicly known and commonly used terms, some terms mentioned in the specification of the present application may be selected by the applicant at his or her discretion, the detailed meanings of which are described in relevant parts of the description herein. Furthermore, it is required that the present application be understood, not simply by the actual terms used but by the meaning of each term lying within.
Flowcharts are used in this application to describe the operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in order precisely. Rather, the various steps may be processed in reverse order or simultaneously. At the same time, other operations are added to or removed from these processes.
As mentioned in the background art, a common size measurement method is to generate 3D depth information using a monocular camera, create a 3D image based on the 3D depth information, and calculate the distance between two points in space from the 3D image. The computational cost of generating a 3D map on a handheld monocular device is too high, cloud service resources are generally needed, and the method is unsuitable for scenarios without Internet access. In view of this problem, the invention provides a handheld dimension measuring device and a dimension measuring method that calculate the outer dimension or inner diameter of an object to be measured by combining the camera's intrinsic parameters with geometric relationships; they do not require generating a 3D image, are computationally simple, and do not require networking.
FIG. 1 is a system block diagram of a handheld dimensional measurement device according to an embodiment of the invention. As shown in fig. 1, the handheld dimensional measurement device 100 includes a TOF sensor 11, a camera 12, an interaction module 13, and a processor 14.
The TOF sensor 11 is used to acquire the physical distance from the camera to the object to be measured. TOF is an abbreviation of Time-of-Flight. A TOF sensor actively emits infrared light: it obtains the distance to a target object by emitting light pulses, receiving the pulse signals reflected by the object, and calculating the flight time of the light pulses. The camera 12 is used for capturing images of the object to be measured; it may be an RGB camera capturing color images. The processor 14 is configured to obtain an image of the object to be measured from the camera 12, obtain the focal length of the image, and identify a target image of the object to be measured from the image. The interaction module 13 is configured to obtain a start point and an end point of the measurement from the target image and to acquire a calculation mode, the calculation mode comprising a length calculation mode and a diameter calculation mode. The processor 14 is further configured to calculate the pixel distance from the start point to the end point and to calculate the physical size of the object to be measured according to the focal length, the pixel distance, the physical distance, and the calculation mode.
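The time-of-flight principle described above reduces to one line of arithmetic: the distance is half the round-trip path of the light pulse. A minimal sketch (the function name and example timing are illustrative, not from the patent):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target given the measured round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way distance is
    half of speed * time.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0
```

For example, a round trip of about 4 ns corresponds to a distance of roughly 0.6 m.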
FIG. 2 is an application scenario diagram of the handheld dimension measurement device of the embodiment of FIG. 1. In this scenario, the handheld dimension measuring device 100 measures the physical dimension of the object A to be measured; the physical dimension includes an outer dimension or an inner diameter. Taking the object A to be a cucumber as an example, the device 100 measures the length of the cucumber and the diameter of its central cross section. As shown in FIG. 2, the handheld dimension measurement device 100 includes a TOF sensor 11, a camera 12, an interaction module 13, and a processor 14 (refer to FIG. 1). The camera 12 acquires an image of the object A. When the image is acquired, the TOF center alignment mark Z in the picture is aligned as closely as possible with the center of the object A, and the current picture is then photographed and locked. The TOF sensor 11 acquires the physical distance D2 from the camera 12 to the object A. The processor 14 identifies a target image A1 of the object A from the image acquired by the camera 12 and sends the identified target image A1 to the interaction module 13. The interaction module 13 comprises a display component for displaying the target image A1.
Optionally, the processor includes a semantic segmentation model for semantically segmenting the image and outputting target image pixels and background region pixels. The processor is also used to import the target image pixels into a newly created layer to obtain the target image. Semantic segmentation is the pixel-level classification of semantic content in a picture, so that the same object is expressed with the same pixel value. The semantic segmentation model can be trained in advance. Taking the agricultural-product measurement scenario as an example, photos of different agricultural and livestock products are taken with the camera of the handheld dimension measuring device or the cameras of other devices. When shooting, a single object serves as the subject and occupies more than 50% of the picture. For example, apples are photographed as single apples; bananas may be photographed as a single banana or as a bunch; pigs, cattle, sheep and the like are photographed individually. After shooting, the subject target is segmented and annotated with a labeling tool, dividing the pixels into two classes: subject and background. For example, the subject is framed with a polygon; pixels inside the closed frame belong to the subject class, and pixels outside it belong to the background class. The annotated data is fed into an untrained semantic segmentation model for training, yielding a model that can segment the product subject from the picture. The model is quantized and pruned for optimized embedded deployment, saved to disk, and deployed on the handheld dimension measuring device.
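The "import the subject pixels into a newly created layer" step amounts to masking: keep the pixels the model labeled as subject, and leave everything else blank. A small illustrative sketch with plain Python lists (the function name and data layout are assumptions, not from the patent):

```python
def extract_target_layer(image, mask, background=0):
    """Copy pixels labeled as subject (mask value 1) into a newly created
    layer of the same size; all other positions keep the background value."""
    height, width = len(image), len(image[0])
    layer = [[background] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if mask[y][x] == 1:
                layer[y][x] = image[y][x]
    return layer
```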
The semantic segmentation model may be U-Net, FCN, SegNet, PSPNet, DeepLab v1/v2/v3/v3+, or similar; the application is not limited in this respect. Taking U-Net as an example, the model comprises convolution layers, ReLU layers, copy-and-crop connections, max-pooling layers, and up-convolution layers. U-Net can be divided into two parts: an encoder and a decoder. The encoder extracts features from the original image and compresses the feature map to a small fixed size. It consists of 4 pooling stages and 4 convolution stages; each pooling stage is a max-pooling layer followed by a ReLU activation, and each convolution stage is a 3×3 convolution with stride 1 and padding 1 followed by a ReLU activation. The decoder upsamples the features extracted by the encoder and restores the feature map to the original input size for pairing with the target mask. It consists of 4 convolution stages and 4 upsampling stages; each convolution stage is again a 3×3 convolution with stride 1 and padding 1 followed by a ReLU activation, and each upsampling stage is an upsampling layer followed by a ReLU activation.
The interaction module 13 includes an input component for determining a start point and an end point of measurement of the object to be measured according to a user operation on a target image of the object to be measured and acquiring a calculation mode including a length calculation mode and a diameter calculation mode. In one example, the interaction module 13 is a touch display screen. Fig. 3 is a schematic diagram of a first state of an interaction module according to an embodiment of the invention. As shown in fig. 3, the interaction module 13 displays the target image A1, the calculation mode 1, and the calculation mode 2. Calculation mode 1 may be a length calculation mode and calculation mode 2 may be a diameter calculation mode. The user can slide out the ranging straight line N1 with a finger on the target image A1 and then select the calculation mode 1 or the calculation mode 2. Taking the selection of the calculation mode 1 as an example, the interaction module 13 acquires a ranging straight line N1 on the target image A1; the start point S1 and the end point E1 are determined from the intersection point of the ranging straight line N1 and the boundary of the target image A1. The interaction module 13 sends the start point S1, the end point E1 and the calculation mode 1 to the processor.
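Determining the start and end points from the intersection of the drawn line with the target image boundary can be sketched by sampling the segmentation mask along the line: the first and last subject pixels are the two intersections. (The function name and sampling scheme here are illustrative assumptions.)

```python
def line_endpoints(samples):
    """samples: 0/1 mask values sampled along the drawn ranging line.
    Returns the indices of the first and last subject pixels, i.e. where
    the line enters and leaves the target image, or None if it misses."""
    inside = [i for i, v in enumerate(samples) if v == 1]
    if not inside:
        return None
    return inside[0], inside[-1]
```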
The processor 14 is further configured to calculate the pixel distance from the start point to the end point, obtain the focal length of the image, and calculate the physical size of the object to be measured according to the focal length, the pixel distance, the physical distance, and the calculation mode. Fig. 4 is a schematic diagram of the geometric relationship corresponding to Fig. 3. As shown in Fig. 4, the processor calculates the pixel distance L1 from the start point S1 to the end point E1 and obtains the focal length D1 at the time of image capture and the physical distance D2 acquired from the TOF sensor 11. The physical length P1 of the object A to be measured can then be calculated from the similar-triangle geometry. Specifically, the physical length is calculated using the following formula:

P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.

The length of the object A to be measured (the cucumber) is thus obtained.
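The length formula is the similar-triangle relation of the pinhole model, so it is a single multiplication and division. A sketch (units must be consistent, e.g. the focal length expressed in pixels so that the pixel distance cancels; names are illustrative):

```python
def physical_length(pixel_distance, physical_distance, focal_length):
    """P1 = L1 * D2 / D1: scale the measured pixel span by the ratio of
    object distance to focal length (both in consistent units)."""
    return pixel_distance * physical_distance / focal_length
```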
Fig. 5 is a second state diagram of an interaction module according to an embodiment of the invention. As shown in fig. 5, the interaction module 13 displays the target image A1, the calculation mode 1, and the calculation mode 2. Calculation mode 1 may be a length calculation mode and calculation mode 2 may be a diameter calculation mode. The user may slide out the ranging straight line N2 with a finger on the target image A1 and then select the calculation mode 1 or the calculation mode 2. Taking the selection of the calculation mode 2 as an example, the interaction module 13 acquires a ranging straight line N2 on the target image A1; the start point S2 and the end point E2 are determined from the intersection point of the ranging straight line N2 and the boundary of the target image A1. The interaction module 13 sends the start point S2, the end point E2 and the calculation mode 2 to the processor 14.
Fig. 6 is a schematic diagram of the calculation for the object to be measured corresponding to Fig. 5. When the calculation mode is calculation mode 2, i.e. the diameter calculation mode, the processor calculates the diameter D of the circular cross section of the object A to be measured cut by the plane defined by the drawn ranging straight line and the TOF sensor.
Fig. 7 is a schematic diagram of the geometric relationship corresponding to Fig. 6. As shown in Fig. 7, when the calculation mode is calculation mode 2, i.e. the diameter calculation mode, an error always exists because the operator can only align the TOF center alignment mark Z with the center of the object A to be measured as closely as possible. The processor is therefore further configured to obtain the deviation distance h between the midpoint C1 of the start point S2 and the end point E2 and the photosensitive center O of the camera, where C2 is the center of the object A to be measured. The processor calculates the pixel distance L2 from the start point S2 to the end point E2; from the offset distance h, the pixel distance L2 and the focal length D1, the first angle θ1 and the second angle θ2 can be calculated using triangle geometry.
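The triangle-geometry formulas for θ1 and θ2 themselves are not reproduced in the extracted text. One plausible reconstruction, offered purely as an assumption, takes θ2 as the angle from the optical axis to the chord midpoint at offset h, and θ1 as the additional angle out to the chord endpoint, both recovered with arctangents:

```python
import math

def view_angles(offset_h, pixel_distance, focal_length):
    """Hypothetical reconstruction (the source omits these formulas):
    theta2 spans from the optical axis to the chord midpoint at offset h;
    theta1 spans from the midpoint to the chord endpoint half a chord away.
    All inputs are in the same units (e.g. pixels on the sensor plane)."""
    theta2 = math.atan2(offset_h, focal_length)
    theta1 = math.atan2(offset_h + pixel_distance / 2.0, focal_length) - theta2
    return theta1, theta2
```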
The first auxiliary line distance M1, the second auxiliary line distance M2, and the third auxiliary line distance M3 are the distances of geometric auxiliary lines; M1 and M2 lie on a line perpendicular to the target field of view 1. From the physical distance D_CF, the first angle θ1, and the second angle θ2, the first auxiliary line distance M1, the second auxiliary line distance M2, and the third auxiliary line distance M3 are calculated.
Optionally, the processor is further configured to calculate the first auxiliary line distance, the second auxiliary line distance, and the third auxiliary line distance by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance, D_CF = D2.
Finally, a physical diameter is calculated based on the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
Alternatively, the physical diameter is calculated using the following formula:
c^2 = M1^2 + M3^2 + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is the physical radius and the physical diameter d = 2r; M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, and θ1 is the first angle.
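The auxiliary-line and chord formulas above can be collected into a short sketch. The patent gives the M3 formula and the final radius extraction only as images, so M3 is taken here as an input and only the squared chord c^2 is returned; function and variable names are ours.

```python
import math

def auxiliary_lines(theta1, theta2, d_cf):
    """M1 and M2 per the patent's stated formulas."""
    m2 = math.sin(theta2) * d_cf
    m1 = math.cos(theta2) * d_cf * math.tan(theta1 + theta2) - m2
    return m1, m2

def chord_squared(m1, m3, theta1, theta2):
    """Law-of-cosines step as printed:
    c^2 = M1^2 + M3^2 + 2*M1*M3*cos(theta1 + theta2 + pi/2)."""
    return m1 ** 2 + m3 ** 2 + 2.0 * m1 * m3 * math.cos(theta1 + theta2 + math.pi / 2.0)
```

A quick sanity check on the stated formulas: with θ1 = 0 the two edge rays coincide, and M1 collapses to zero, since cos(θ2)·tan(θ2) = sin(θ2).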
The invention provides a handheld dimension measuring device with a TOF sensor, which combines the camera's intrinsic parameters with mathematical geometry to calculate the external dimensions or diameter of agricultural products. The handheld dimensional measurement device does not need to generate 3D images, is computationally simple, and does not need to be networked.
In some embodiments, the handheld dimensional measurement device further includes program storage units of different forms, such as read-only memory (ROM) and random-access memory (RAM), capable of storing various data files used for computer processing and/or communication, as well as program instructions executed by the processor. The processor executes these instructions to implement the main parts of the method, and results produced by the processor are transmitted through the communication port to the interaction module, where they are displayed. The handheld dimensional measurement device also includes an internal communication bus and a communication port: the internal communication bus enables data communication among the device's components, and the communication port enables data communication with the outside. The device further includes a power supply module for powering the handheld dimension measuring equipment.
Fig. 8 is a flow chart of a TOF-based dimension measurement method in accordance with an embodiment of the present invention. As shown in fig. 8, the size measurement method includes the steps of:
step S81: and acquiring an image of the object to be measured through the camera, and acquiring the physical distance from the camera to the object to be measured based on the TOF technology.
The camera can be an RGB camera, through which color images of the object to be measured are acquired.
TOF (time-of-flight) technology obtains the distance to a target object by transmitting light pulses, receiving the pulse signals reflected by the object, and calculating the flight time of the light pulses. A TOF sensor may be placed near the camera, and the physical distance from the camera to the object to be measured is obtained through the TOF sensor.
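The distance-from-flight-time relationship described above is simple: the pulse travels to the object and back, so the one-way distance is half the round trip. A minimal sketch (names are ours):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_time_s):
    """One-way distance from a measured pulse round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a round trip of about 6.67 nanoseconds corresponds to an object one meter away.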
Step S82: a target image of the object to be measured is identified from the image.
Optionally, identifying the object to be measured from the image includes inputting the image into a trained semantic segmentation model, which outputs target image pixels and background region pixels, and importing the target image pixels into a newly created layer to obtain the target image.
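The "import the target image pixels into a newly built layer" step amounts to copying subject pixels onto a blank layer using the segmentation mask. A minimal pure-Python sketch, assuming the model yields a boolean mask (names are ours):

```python
def extract_target_layer(image, mask):
    """Copy subject pixels to a newly created (all-zero) layer.

    image: rows of pixel values; mask: same-shaped rows of booleans
    from the segmentation model (True = target pixel).
    """
    return [
        [pix if keep else 0 for pix, keep in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]
```

Background pixels are zeroed out, so only the segmented subject remains on the new layer for the subsequent distance measurement.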
The semantic segmentation model can be trained in advance. Taking the measurement of farm and livestock products as an example, photographs of different products are taken with a camera. When shooting, a single object serves as the subject and occupies more than 50% of the frame. For example, apples are photographed as single apples; bananas may be photographed as a single banana or as a bunch; pigs, cattle, sheep, and the like are photographed individually. After shooting, the subject is segmented and annotated with a labeling tool, dividing the image into two pixel classes: subject and background. For example, the subject is outlined with a polygon; pixels inside the closed outline belong to the subject class, and pixels outside it belong to the background class. The labeled data are then fed into an untrained semantic segmentation model for training, yielding a model that can segment the farm or livestock product subject from a picture.
Step S83: determining a starting point and an end point of the measurement of the object to be measured according to the user operation on the target image, and calculating the pixel distance from the starting point to the end point.
Optionally, determining the starting point and the end point of the measurement of the object to be measured according to the user operation on the target image includes obtaining a ranging line on the target image; and determining a starting point and an end point according to the intersection point of the ranging straight line and the boundary of the target image.
Step S84: acquiring the calculation mode, where the calculation mode includes a length calculation mode and a diameter calculation mode.
The user may select the calculation mode through the interaction module, which receives the user's selection. The length calculation mode calculates the physical length of the object to be measured, and the diameter calculation mode calculates its physical diameter.
Step S85: calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance, and the calculation mode.
Optionally, step S85 includes: when the calculation mode is the length calculation mode, calculating the physical length of the object to be measured according to the ratio of the physical distance to the focal length and the pixel distance. The physical length can be calculated using the following formula:

P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.
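The length mode is the standard pinhole similar-triangles relation: the pixel distance is scaled by the ratio of object distance to focal length. A minimal sketch (names are ours; the focal length is assumed to be expressed in pixels):

```python
def physical_length(pixel_distance, physical_distance, focal_length_px):
    """Similar triangles in the pinhole model: P1 = L1 * D2 / D1."""
    return pixel_distance * physical_distance / focal_length_px
```

For instance, a line spanning 100 pixels, seen at 2 m with a 1000-pixel focal length, corresponds to 0.2 m.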
Optionally, step S85 includes: when the calculation mode is the diameter calculation mode, obtaining the deviation distance between the midpoint of the starting point and the ending point and the photosensitive center of the camera, and calculating the physical diameter of the cross section of the object to be measured at the starting point and the ending point according to the focal length, the physical distance, the deviation distance, and the pixel distance. A specific manner of calculation is shown in fig. 7.
first, according to focal length D 1 A deviation distance h and a pixel distance L 2 Calculate the first angle theta 1 And a second angle theta 2 . The first angle θ can be calculated by the following triangle geometry formula 1 And a second angle theta 2
Then, from the physical distance D_CF, the first angle θ1, and the second angle θ2, the first auxiliary line distance M1, the second auxiliary line distance M2, and the third auxiliary line distance M3 are calculated. These are the distances of geometric auxiliary lines; M1 and M2 lie on a line perpendicular to the target field of view 1. The first, second, and third auxiliary line distances may be calculated by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance, D_CF = D2.
Finally, a physical diameter is calculated based on the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
Alternatively, the physical diameter is calculated using the following formula:
c^2 = M1^2 + M3^2 + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is the physical radius and the physical diameter d = 2r; M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, and θ1 is the first angle.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the above disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this application and thus fall within the spirit and scope of its exemplary embodiments.
Meanwhile, the present application uses specific words to describe embodiments of the present application. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present application. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present application may be combined as suitable.
Some aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." The processor may be one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or a combination thereof. Furthermore, aspects of the present application may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media. For example, computer-readable media can include, but are not limited to, magnetic storage devices (e.g., hard disk, floppy disk, tape), optical disks (e.g., compact disc (CD), digital versatile disc (DVD)), smart cards, and flash memory devices (e.g., card, stick, key drive).
The computer readable medium may comprise a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer readable medium can be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer readable medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, radio frequency signals, or the like, or a combination of any of the foregoing.
Likewise, it should be noted that, in order to simplify the presentation disclosed herein and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not intended to imply that the subject application requires more features than are recited in the claims. Indeed, the claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers used in the description of embodiments are modified in some examples by the qualifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ general rounding. Although the numerical ranges and parameters used in some embodiments herein to confirm the breadth of the ranges are approximations, in particular embodiments such numerical values are set as precisely as possible.
While the present application has been described with reference to the present specific embodiments, those of ordinary skill in the art will recognize that the above embodiments are for illustrative purposes only, and that various equivalent changes or substitutions can be made without departing from the spirit of the present application, and therefore, all changes and modifications to the embodiments described above are intended to be within the scope of the claims of the present application.

Claims (18)

1. A handheld dimensional measurement device, comprising:
the camera is used for collecting images of the object to be detected;
the TOF sensor is used for acquiring the physical distance from the camera to the object to be detected;
the interaction module is used for determining a starting point and an end point of the measurement of the object to be measured according to the user operation on the target image of the object to be measured, and acquiring a calculation mode, wherein the calculation mode comprises a length calculation mode and a diameter calculation mode;
and the processor is used for identifying a target image of the object to be detected from the image, sending the target image to the interaction module, calculating the pixel distance from the starting point to the end point, and calculating the physical size of the object to be detected according to the focal length of the image, the pixel distance, the physical distance and the calculation mode.
2. The device of claim 1, wherein the processor is further configured to:
and when the calculation mode is the length calculation mode, calculating the physical length of the object to be measured according to the ratio of the physical distance to the focal length and the pixel distance.
3. The apparatus of claim 2, wherein the processor is further configured to calculate the physical length using the formula:
P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.
4. The device of claim 1, wherein the processor is further configured to:
and when the calculation mode is the diameter calculation mode, acquiring the deviation distance between the midpoints of the starting point and the ending point and the photosensitive center of the camera, and calculating the physical diameters of the cross sections of the object to be measured at the starting point and the ending point according to the focal length, the physical distance, the deviation distance and the pixel distance.
5. The device of claim 4, wherein the processor is further configured to:
calculating a first angle and a second angle according to the focal length, the offset distance and the pixel distance;
calculating a first auxiliary line distance, a second auxiliary line distance and a third auxiliary line distance according to the physical distance, the first angle and the second angle;
the physical diameter is calculated from the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
6. The apparatus of claim 5, wherein the processor is further configured to calculate a first auxiliary line distance, a second auxiliary line distance, and a third auxiliary line distance by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance.
7. The apparatus of claim 6, wherein the processor is further configured to calculate the physical diameter using the formula:
c^2 = M1^2 + M3^2 + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is a physical radius, the physical diameter is 2r, M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, and θ1 is the first angle.
8. The apparatus of any one of claims 1-7, wherein the processor comprises: the semantic segmentation model is used for carrying out semantic segmentation on the image and outputting target image pixels and background area pixels; the processor is further configured to import the target image pixel into a newly created layer to obtain the target image.
9. The apparatus of any of claims 1-7, wherein the interaction module is further to:
acquiring a ranging straight line on the target image;
and determining the starting point and the end point according to the intersection point of the ranging straight line and the target image boundary.
10. A method of dimensional measurement, comprising:
acquiring an image of an object to be measured through a camera, and acquiring the physical distance from the camera to the object to be measured based on a TOF technology;
identifying a target image of the object to be detected from the image;
determining a starting point and an end point of the measurement of the object to be measured according to the user operation on the target image, and calculating the pixel distance from the starting point to the end point;
acquiring a calculation mode, wherein the calculation mode comprises a length calculation mode and a diameter calculation mode;
and calculating the physical size of the object to be measured according to the focal length of the image, the pixel distance, the physical distance and the calculation mode.
11. The method of claim 10, wherein calculating the physical size of the object under test from the focal length of the image, the pixel distance, the physical distance, and the calculation mode comprises:
and when the calculation mode is the length calculation mode, calculating the physical length of the object to be measured according to the ratio of the physical distance to the focal length and the pixel distance.
12. The method of claim 11, wherein the physical length is calculated using the formula:
P1 = L1 * D2 / D1

wherein P1 is the physical length, L1 is the pixel distance, D2 is the physical distance, and D1 is the focal length.
13. The method of claim 10, wherein calculating the physical size of the object under test from the focal length of the image, the pixel distance, the physical distance, and the calculation mode comprises:
and when the calculation mode is the diameter calculation mode, acquiring the deviation distance between the midpoints of the starting point and the ending point and the photosensitive center of the camera, and calculating the physical diameters of the cross sections of the object to be measured at the starting point and the ending point according to the focal length, the physical distance, the deviation distance and the pixel distance.
14. The method of claim 13, wherein calculating the physical diameter of the cross section of the object to be measured at the start point and the end point based on the focal length, the physical distance, the offset distance, and the pixel distance comprises:
calculating a first angle and a second angle according to the focal length, the offset distance and the pixel distance;
calculating a first auxiliary line distance, a second auxiliary line distance and a third auxiliary line distance according to the physical distance, the first angle and the second angle;
the physical diameter is calculated from the first auxiliary line distance, the second auxiliary line distance, the third auxiliary line distance, the first angle, and the second angle.
15. The method of claim 14, wherein the first auxiliary line distance, the second auxiliary line distance, and the third auxiliary line distance are calculated by the following formula:
M2 = sin(θ2) * D_CF

M1 = cos(θ2) * D_CF * tan(θ1 + θ2) - M2

wherein M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, θ1 is the first angle, and D_CF is the physical distance.
16. The method of claim 15, wherein the physical diameter is calculated using the formula:
c^2 = M1^2 + M3^2 + 2 * M1 * M3 * cos(θ1 + θ2 + π/2)

wherein r is a physical radius, the physical diameter is 2r, M2 is the second auxiliary line distance, M1 is the first auxiliary line distance, M3 is the third auxiliary line distance, θ2 is the second angle, and θ1 is the first angle.
17. The method of claim 10, wherein identifying the target image of the object to be measured from the image comprises:
inputting the image into a trained semantic segmentation model, and outputting a target image pixel and a background area pixel;
and importing the target image pixels into a newly built layer to obtain the target image.
18. The method of claim 10, wherein determining a start point and an end point of the object under test measurement based on user operation on the target image comprises:
acquiring a ranging straight line on the target image;
and determining the starting point and the end point according to the intersection point of the ranging straight line and the target image boundary.
CN202310602563.7A 2023-05-25 2023-05-25 Handheld dimension measuring device and dimension measuring method Pending CN116538926A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310602563.7A CN116538926A (en) 2023-05-25 2023-05-25 Handheld dimension measuring device and dimension measuring method

Publications (1)

Publication Number Publication Date
CN116538926A true CN116538926A (en) 2023-08-04

Family

ID=87443488

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310602563.7A Pending CN116538926A (en) 2023-05-25 2023-05-25 Handheld dimension measuring device and dimension measuring method

Country Status (1)

Country Link
CN (1) CN116538926A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117593355A (en) * 2023-11-23 2024-02-23 云途信息科技(杭州)有限公司 Pavement element area calculation method, device, computer equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination