CN112146628B - Measurement method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112146628B
CN112146628B (application CN202010901187.8A)
Authority
CN
China
Prior art keywords: image, point, processed, distance, measured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010901187.8A
Other languages
Chinese (zh)
Other versions
CN112146628A (en)
Inventor
薛地
郭玉京
周杨
林君仪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TetrasAI Technology Co Ltd
Original Assignee
Shenzhen TetrasAI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TetrasAI Technology Co Ltd filed Critical Shenzhen TetrasAI Technology Co Ltd
Priority to CN202010901187.8A priority Critical patent/CN112146628B/en
Publication of CN112146628A publication Critical patent/CN112146628A/en
Application granted granted Critical
Publication of CN112146628B publication Critical patent/CN112146628B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04: Interpretation of pictures
    • G01C11/06: Interpretation of pictures by comparison of two or more pictures of the same area

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a measurement method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: acquiring a first image to be processed and a second image to be processed, where the first image to be processed contains a first surface of an object to be measured, the second image to be processed contains a second surface of the object to be measured, and at least one edge of the object to be measured lies between the first surface and the second surface; determining the distance between the first surface and the object point corresponding to a first pixel point in the first image to be processed to obtain a first distance, where the first pixel point belongs to the object to be measured; determining the distance between the second surface and the object point corresponding to a second pixel point in the second image to be processed to obtain a second distance, where the second pixel point and the first pixel point are same-name points; and obtaining the size of the object to be measured according to the first distance and the second distance.

Description

Measurement method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer vision, and in particular, to a measurement method and apparatus, an electronic device, and a storage medium.
Background
In daily life, people often need to measure the size of an object. In conventional methods, the measurer typically uses a length measuring tool (e.g., a tape measure, a ruler, or a vernier caliper) to measure the size of the object. However, this conventional approach is time-consuming and labor-intensive for the measurer and has low measurement efficiency. Therefore, how to measure the size of an object efficiently and accurately is of great importance.
Disclosure of Invention
The application provides a measuring method and device, electronic equipment and a storage medium.
In a first aspect, there is provided a measurement method comprising:
Acquiring a first image to be processed and a second image to be processed; the first image to be processed comprises a first surface of an object to be measured, and the second image to be processed comprises a second surface of the object to be measured; at least one edge of the object to be measured is arranged between the first surface and the second surface;
determining the distance between an object point corresponding to a first pixel point in the first image to be processed and the first surface to obtain a first distance; the first pixel point belongs to the object to be measured;
Determining the distance between an object point corresponding to a second pixel point in the second image to be processed and the second surface to obtain a second distance; the second pixel point and the first pixel point are the same name points;
And obtaining the size of the object to be measured according to the first distance and the second distance.
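The four steps above can be sketched as follows. The combination rule, summing the two distances, is my reading of the method (the homonymous point's object point lies on an edge that runs from the first face to the second face), and the function name is mine:

```python
def measure_size(first_distance: float, second_distance: float) -> float:
    """Combine the two per-face distances into the size of the object.

    Assumption (mine, not stated verbatim in the patent): the object
    point of the same-name pixel pair lies on an edge between the first
    and second faces, so the edge length is its distance to the first
    face plus its distance to the second face.
    """
    return first_distance + second_distance


# e.g. the shared object point is 2.0 m from the first face and
# 3.0 m from the second face, so the measured edge is 5.0 m long
print(measure_size(2.0, 3.0))  # → 5.0
```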
In combination with any one of the embodiments of the present application, the measurement method is applied to a terminal, the terminal includes a two-dimensional imaging device, and the acquiring a first image to be processed and a second image to be processed includes:
Under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, shooting the object to be measured by using the two-dimensional imaging equipment to obtain at least two images to be confirmed;
And carrying out object surface detection processing on the at least two images to be confirmed, selecting one image from the images containing the first surface as the first image to be processed, and selecting one image from the images containing the second surface as the second image to be processed.
In combination with any one of the embodiments of the present application, the object plane detection processing is performed on the at least two images to be confirmed, and selecting one image from the images including the first plane as the first image to be processed, and selecting one image from the images including the second plane as the second image to be processed includes:
Performing object surface detection processing on the at least two images to be confirmed to obtain a first surface image to be confirmed containing the first surface and a second surface image to be confirmed containing the second surface;
And under the condition that the same name points belonging to the object to be measured exist in the first surface to-be-confirmed image and the second surface to-be-confirmed image through image matching processing of the first surface to-be-confirmed image and the second surface to-be-confirmed image, the first surface to-be-confirmed image is taken as the first to-be-processed image, and the second surface to-be-confirmed image is taken as the second to-be-processed image.
In combination with any one of the embodiments of the present application, before determining the distance between the object point corresponding to the first pixel point in the first image to be processed and the first surface to obtain the first distance, the method further includes:
acquiring a first depth map of the first image to be processed;
the determining the distance between the object point corresponding to the first pixel point in the first image to be processed and the first surface to obtain a first distance includes:
and obtaining the first distance according to the first image to be processed and the first depth map.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured;
the obtaining the first distance according to the first image to be processed and the first depth map includes:
Acquiring a first depth value of the first pixel point and a second depth value of the first corner point from the first depth map; the first corner point belongs to the first face, and the first corner point is an endpoint of the first edge;
Obtaining a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinates of the first pixel point in the first image to be processed;
obtaining a second image point of the first corner point under the camera coordinate system according to the second depth value and the coordinates of the first corner point in the first image to be processed;
determining a distance between the first image point and the second image point to obtain the first distance.
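A minimal sketch of this embodiment, assuming a standard pinhole camera model; the intrinsics `fx`, `fy`, `cx`, `cy` and all function names are my placeholders, not values given by the patent:

```python
import math


def back_project(u, v, depth, fx, fy, cx, cy):
    """Lift pixel (u, v) with the given depth value into the camera
    coordinate system of a pinhole model, yielding its image point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)


def first_distance(px, corner, d_px, d_corner, intrinsics):
    """Distance between the image points of the first pixel point and
    the first corner point, i.e. the first distance."""
    p1 = back_project(*px, d_px, *intrinsics)
    p2 = back_project(*corner, d_corner, *intrinsics)
    return math.dist(p1, p2)


# toy intrinsics: fx = fy = 500, principal point at (320, 240)
K = (500.0, 500.0, 320.0, 240.0)
d = first_distance((320, 240), (320, 740), 2.0, 2.0, K)
print(d)  # both points at depth 2.0, 500 px apart vertically → 2.0
```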
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the first edge belongs to a third surface of the object to be measured;
the obtaining the first distance according to the first image to be processed and the first depth map includes:
Acquiring a projection plane of the third face under a camera coordinate system of the two-dimensional imaging device;
Determining a first image point corresponding to the first pixel point and a second image point corresponding to the first corner point from the projection plane; the first corner point is an endpoint of the first edge;
a distance between the first image point and the second image point is determined as the first distance.
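One way to realize the projection-plane variant is to intersect each pixel's viewing ray with the plane obtained for the third face in the camera coordinate system. The plane representation (normal `n` and offset `d` with `n·p = d`), the toy intrinsics, and all names below are my assumptions:

```python
import math


def ray_plane_point(u, v, plane, fx, fy, cx, cy):
    """Intersect the viewing ray of pixel (u, v) through the optical
    centre with a plane (nx, ny, nz, d), returning the 3-D point."""
    nx, ny, nz, d = plane
    # ray direction through the optical centre and the pixel
    rx, ry, rz = (u - cx) / fx, (v - cy) / fy, 1.0
    t = d / (nx * rx + ny * ry + nz * rz)  # solve n · (t * r) = d
    return (t * rx, t * ry, t * rz)


# the plane z = 2 (normal (0, 0, 1), offset 2) and toy intrinsics
K = (500.0, 500.0, 320.0, 240.0)
p1 = ray_plane_point(320, 240, (0, 0, 1, 2.0), *K)
p2 = ray_plane_point(320, 740, (0, 0, 1, 2.0), *K)
print(math.dist(p1, p2))  # 500 px apart on the z = 2 plane → 2.0
```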
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the obtaining the first distance according to the first image to be processed and the first depth map includes:
Acquiring a third depth value of the first line segment from the first depth map; the two endpoints of the first line segment are the first pixel point and the first corner point respectively; the first corner point is an endpoint of the first edge; the third depth value is positively correlated with the depth value of the first pixel point, and/or the third depth value is positively correlated with the depth value of the first corner point;
Obtaining a mapping relation between the depth value and the transmission factor; the transmission factor characterizes the conversion relation between the size in the image and the real size; the transmission factor is positively correlated with the depth value;
Obtaining a transmission factor of the first line segment according to the mapping relation and the third depth value;
And obtaining the first distance according to the transmission factor of the first line segment and the length of the first line segment.
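The transmission-factor variant can be sketched like this. The concrete mapping `factor = depth / fx` (the pinhole relation, which is positively correlated with depth as required) and the choice of the endpoint mean as the segment's third depth value are my assumptions; the patent only constrains the correlations:

```python
def transmission_factor(depth, fx=500.0):
    """Map a depth value to a transmission factor that converts pixel
    lengths to real lengths. For a pinhole camera a length L at depth
    z spans L * fx / z pixels, so the factor is z / fx (positively
    correlated with depth)."""
    return depth / fx


def segment_real_length(pixel_length, depth_a, depth_b, fx=500.0):
    # third depth value: here the mean of the two endpoint depths,
    # which is positively correlated with both (one allowed choice)
    seg_depth = 0.5 * (depth_a + depth_b)
    return pixel_length * transmission_factor(seg_depth, fx)


# a 500-pixel first line segment whose endpoints both sit at depth 2.0
print(segment_real_length(500, 2.0, 2.0))  # → 2.0
```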
In combination with any one of the embodiments of the present application, before the determining the distance between the object point corresponding to the second pixel point and the second surface, the method further includes:
acquiring a second depth map of the second image to be processed;
the determining the distance between the object point corresponding to the second pixel point and the second surface to obtain a second distance includes:
and obtaining the second distance according to the second image to be processed and the second depth map.
In combination with any one of the embodiments of the present application, the second pixel belongs to the first edge;
The obtaining the second distance according to the second image to be processed and the second depth map includes:
acquiring a fourth depth value of the second pixel point and a fifth depth value of the second corner point from the second depth map; the second corner point belongs to the second face, and the second corner point is an endpoint of the first edge;
Obtaining a third image point of the second pixel point under a camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed;
Obtaining a fourth image point of the second corner under the camera coordinate system according to the fifth depth value and the coordinates of the second corner in the second image to be processed;
and determining the distance between the third image point and the fourth image point to obtain the second distance.
In a second aspect, there is provided a measurement device, the device comprising:
The acquisition unit is used for acquiring a first image to be processed and a second image to be processed; the first image to be processed comprises a first surface of an object to be measured, and the second image to be processed comprises a second surface of the object to be measured; at least one edge of the object to be measured is arranged between the first surface and the second surface;
the first processing unit is used for determining the distance between the object point corresponding to the first pixel point in the first image to be processed and the first surface to obtain a first distance; the first pixel point belongs to the object to be measured;
The second processing unit is used for determining the distance between the object point corresponding to the second pixel point in the second image to be processed and the second surface to obtain a second distance; the second pixel point and the first pixel point are the same name points;
and the third processing unit is used for obtaining the size of the object to be measured according to the first distance and the second distance.
In combination with any one of the embodiments of the present application, the measuring apparatus further includes a two-dimensional imaging device, and the acquiring unit is configured to:
Under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, shooting the object to be measured by using the two-dimensional imaging equipment to obtain at least two images to be confirmed;
And carrying out object surface detection processing on the at least two images to be confirmed, selecting one image from the images containing the first surface as the first image to be processed, and selecting one image from the images containing the second surface as the second image to be processed.
In combination with any one of the embodiments of the present application, the obtaining unit is configured to:
Performing object surface detection processing on the at least two images to be confirmed to obtain a first surface image to be confirmed containing the first surface and a second surface image to be confirmed containing the second surface;
And under the condition that the same name points belonging to the object to be measured exist in the first surface to-be-confirmed image and the second surface to-be-confirmed image through image matching processing of the first surface to-be-confirmed image and the second surface to-be-confirmed image, the first surface to-be-confirmed image is taken as the first to-be-processed image, and the second surface to-be-confirmed image is taken as the second to-be-processed image.
In combination with any one of the embodiments of the present application, the obtaining unit is further configured to obtain a first depth map of the first image to be processed before determining a distance between an object point corresponding to a first pixel point in the first image to be processed and the first surface to obtain a first distance;
the first processing unit is used for:
and obtaining the first distance according to the first image to be processed and the first depth map.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured;
the first processing unit is used for:
Acquiring a first depth value of the first pixel point and a second depth value of the first corner point from the first depth map; the first corner point belongs to the first face, and the first corner point is an endpoint of the first edge;
Obtaining a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinate of the first pixel point in the first image to be processed;
obtaining a second image point of the first angle point under the camera coordinate system according to the second depth value and the coordinates of the first angle point in the first image to be processed;
determining a distance between the first image point and the second image point to obtain the first distance.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the first edge belongs to a third surface of the object to be measured;
the first processing unit is used for:
Acquiring a projection plane of the third face under a camera coordinate system of the two-dimensional imaging device;
Determining a first image point corresponding to the first pixel point and a second image point corresponding to the first corner point from the projection plane; the first corner point is an endpoint of the first edge;
a distance between the first image point and the second image point is determined as the first distance.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the first processing unit is used for:
Acquiring a third depth value of the first line segment from the first depth map; the two endpoints of the first line segment are the first pixel point and the first corner point respectively; the first corner point is an endpoint of the first edge; the third depth value is positively correlated with the depth value of the first pixel point, and/or the third depth value is positively correlated with the depth value of the first corner point;
Obtaining a mapping relation between the depth value and the transmission factor; the transmission factor characterizes the conversion relation between the size in the image and the real size; the transmission factor is positively correlated with the depth value;
Obtaining a transmission factor of the first line segment according to the mapping relation and the third depth value;
And obtaining the first distance according to the transmission factor of the first line segment and the length of the first line segment.
In combination with any one of the embodiments of the present application, the obtaining unit is configured to obtain a second depth map of the second image to be processed before determining a distance between the object point corresponding to the second pixel point and the second surface to obtain a second distance;
the second processing unit is used for:
and obtaining the second distance according to the second image to be processed and the second depth map.
In combination with any one of the embodiments of the present application, the second pixel belongs to the first edge;
the second processing unit is used for:
acquiring a fourth depth value of the second pixel point and a fifth depth value of the second corner point from the second depth map; the second corner point belongs to the second face, and the second corner point is an endpoint of the first edge;
Obtaining a third image point of the second pixel point under a camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed;
Obtaining a fourth image point of the second corner under the camera coordinate system according to the fifth depth value and the coordinates of the second corner in the second image to be processed;
and determining the distance between the third image point and the fourth image point to obtain the second distance.
In a third aspect, a processor is provided for performing the method of the first aspect and any one of its possible implementation manners described above.
In a fourth aspect, there is provided an electronic device comprising: a processor, transmission means, input means, output means and memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to carry out the method as described in the first aspect and any one of its possible implementations.
In a fifth aspect, there is provided a computer readable storage medium having stored therein a computer program comprising program instructions which, when executed by a processor, cause the processor to carry out a method as in the first aspect and any one of its possible implementations.
In a sixth aspect, a computer program product is provided, the computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of the first aspect and any one of the possible implementations thereof.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
In order to more clearly describe the embodiments of the present application or the technical solutions in the background art, the following description will describe the drawings that are required to be used in the embodiments of the present application or the background art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a homonymy point provided in an embodiment of the application;
FIG. 2 is a schematic diagram of an image pixel coordinate system according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a measurement method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an object to be measured according to an embodiment of the present application;
FIG. 5 is a schematic view of another object to be measured according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a measuring device according to an embodiment of the present application;
fig. 7 is a schematic hardware structure of a measurement device according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
It should be understood that, in the present application, "at least one (item)" means one or more and "a plurality" means two or more. The term "and/or" describes an association between objects and covers three relationships: for example, "A and/or B" may mean that only A exists, that only B exists, or that both A and B exist, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the surrounding objects. "At least one of the following" and similar expressions mean any combination of the listed items, including any combination of single items or plural items; for example, at least one of a, b, or c may represent a; b; c; "a and b"; "a and c"; "b and c"; or "a and b and c", where a, b, and c may each be single or plural. The character "/" may also represent division in mathematical operations, e.g., a/b means a divided by b, and 6/3 = 2.
For convenience of description, pixels that correspond to the same object point in different images are referred to as same-name points (homonymous points). As shown in fig. 1, pixel point A and pixel point C are same-name points, and pixel point B and pixel point D are same-name points.
In the embodiment of the application, positions in an image refer to positions in the pixel coordinate system of the image. In this pixel coordinate system, the abscissa represents the column number of a pixel point and the ordinate represents its row number. For example, in the image shown in fig. 2, a pixel coordinate system XOY is constructed with the upper left corner of the image as the origin of coordinates O, the direction parallel to the rows of the image as the direction of the X axis, and the direction parallel to the columns of the image as the direction of the Y axis. The units of the abscissa and the ordinate are pixel points. For example, in fig. 2, the coordinates of pixel a11 are (1, 1), the coordinates of pixel a23 are (3, 2), the coordinates of pixel a42 are (2, 4), and the coordinates of pixel a34 are (4, 3).
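Note that this convention is the transpose of the usual array-index order: a pixel's coordinate is (column, row). A tiny sketch, with a helper name of my own choosing:

```python
def pixel_coord(row, col):
    """Convert the 1-based (row, col) position of a pixel into the
    patent's pixel coordinate system, where the abscissa is the
    column number and the ordinate is the row number."""
    return (col, row)


# the four examples from fig. 2
assert pixel_coord(1, 1) == (1, 1)  # a11
assert pixel_coord(2, 3) == (3, 2)  # a23
assert pixel_coord(4, 2) == (2, 4)  # a42
assert pixel_coord(3, 4) == (4, 3)  # a34
print("all fig. 2 examples match")
```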
In the embodiment of the application, an image point of a pixel point in a two-dimensional image under a camera coordinate system is a projection point of the pixel point under the camera coordinate system, the distance between the projection point and the optical center of the two-dimensional imaging device is the distance between an object point corresponding to the pixel point and the two-dimensional imaging device, and the projection point, the pixel point and the optical center of the two-dimensional imaging device are on the same straight line.
In the embodiment of the application, a projection plane of a pixel point plane in a two-dimensional image under a camera coordinate system is a plane containing projection points of pixel points in the pixel point plane under the camera coordinate system.
In daily life, people often need to measure the size of an object. In conventional methods, the measurer typically uses a length measuring tool (e.g., a tape measure, a ruler, or a vernier caliper) to measure the size of the object. However, this conventional approach is time-consuming and labor-intensive for the measurer and has low measurement efficiency. Therefore, how to measure the size of an object efficiently and accurately is of great importance.
With the development of computer vision technology, measurement methods based on computer vision technology are being applied. In the method, the electronic equipment obtains the three-dimensional coordinates of each point in the object to be measured under the camera coordinate system by processing the RGB image and the depth map of the RGB image containing the object to be measured, and further obtains the size of the object to be measured according to the three-dimensional coordinates of each point in the object to be measured under the camera coordinate system, so that the measurement efficiency is improved.
However, when the size of the object to be measured is large, one RGB image cannot contain the entire object to be measured, and at this time, the accuracy of the size of the object to be measured obtained by this method is low.
Based on the above, the embodiment of the application provides a technical scheme for measuring the size of the object to be measured, which can improve the accuracy of the size of the object to be measured. The execution main body of the embodiment of the application is a measuring device. Alternatively, the measuring device may be one of the following: cell phone, computer, server, panel computer.
Embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. Referring to fig. 3, fig. 3 is a flow chart of a measurement method according to an embodiment of the application.
301. And acquiring a first image to be processed and a second image to be processed.
In the embodiment of the application, the first image to be processed and the second image to be processed both comprise a part of the object to be measured, wherein the first image to be processed comprises a first surface of the object to be measured, and the second image to be processed comprises a second surface of the object to be measured. There is at least one edge of the object to be measured between the first face and the second face.
For example, in the case where the object to be measured is a rectangular parallelepiped, the first surface is an upper bottom surface, and the second surface is a lower bottom surface, 4 height sides (height side refers to a side whose length is used to indicate the height of the object to be measured) exist between the first surface and the second surface.
As another example, consider the object to be measured shown in fig. 4. In the case where the first face is the plane ABCD and the second face is the plane EFG, there are three edges between the first face and the second face: AE, BF, and CG.
In one implementation of acquiring a first image to be processed and a second image to be processed, a measurement device receives two images input by a user through an input assembly as the first image to be processed and the second image to be processed, respectively, wherein the input assembly includes: a keyboard, a mouse, a touch screen, a touch pad, an audio input device, and the like.
In another implementation manner of acquiring the first image to be processed and the second image to be processed, the measuring device receives the two images sent by the first control device and respectively uses the two images as the first image to be processed and the second image to be processed, wherein the first control device comprises a mobile phone, a computer, a tablet personal computer, a server and the like.
In yet another implementation of acquiring the first image to be processed and the second image to be processed, the measuring device acquires the first image to be processed and the second image to be processed using a two-dimensional imaging device. For example, the measuring device is a mobile phone, and the two-dimensional imaging device is an RGB camera on the mobile phone. The mobile phone shoots an object to be measured by using the RGB camera, and a first image to be processed and a second image to be processed can be acquired.
302. And determining the distance between the object point corresponding to the first pixel point in the first image to be processed and the first surface to obtain a first distance.
In the embodiment of the application, the first distance is a distance from an object point corresponding to the first pixel point to the first surface. For example, the first surface is an upper bottom surface of the object to be measured, and the object point corresponding to the first pixel point is an object point a. At this time, the first distance is the distance from the object point a to the upper bottom surface of the object to be measured.
In one possible implementation, the first pixel point is located on a first edge of the object to be measured, and the corner point located on the first edge and belonging to the first surface is referred to as a first target corner point. The measuring device obtains the distance between the first pixel point and the first target corner point (hereinafter referred to as a first reference distance) according to the coordinates of the first pixel point in the first image to be processed and the coordinates of the first target corner point in the first image to be processed. The measuring device then obtains, according to the first reference distance, the distance between the object point corresponding to the first pixel point and the object point corresponding to the first target corner point as the first distance.
Optionally, the first image to be processed further includes a reference object, where the reference object is an object with a fixed size. For example, since the height of the table is fixed, the table may be used as a reference object. For another example, since the height of the cup is fixed, the cup can be used as a reference object. The measuring device obtains the true dimensions of the reference object before performing step 302. And obtaining the size of the reference object in the first image to be processed by performing object detection processing on the first image to be processed. The measuring device calculates the ratio of the real size of the reference object to the size of the reference object in the first image to be processed to obtain a first ratio, and takes the product of the first ratio and the first reference distance as the first distance.
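This reference-object scaling can be sketched as follows; the function name and all sample sizes are illustrative assumptions, not values from the application:

```python
def first_distance_via_reference(ref_real_size, ref_image_size, first_reference_distance):
    """Scale an in-image pixel distance to a physical distance using a
    reference object of known, fixed real size visible in the same image.

    The first ratio is (real size of reference) / (size of reference in
    the image); multiplying it by the first reference distance gives the
    first distance, as described above."""
    first_ratio = ref_real_size / ref_image_size   # physical units per pixel
    return first_ratio * first_reference_distance

# Illustrative: a 75 cm high table spans 150 px in the first image to be
# processed, and the first reference distance is 300 px.
d1_cm = first_distance_via_reference(75.0, 150.0, 300.0)
```

With these assumed numbers the first ratio is 0.5 cm per pixel, so the 300 px first reference distance corresponds to a 150 cm first distance.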
In another possible implementation, the measuring device acquires a first depth map of the first image to be processed before performing step 302. And acquiring a depth value of the first pixel point and a depth value of the first target corner point from the first depth map. The measuring device obtains a first image point of the first pixel point under a camera coordinate system (the camera coordinate system in the embodiment of the application is a camera coordinate system of an imaging device for acquiring the first image to be processed) according to the coordinate of the first pixel point in the first image to be processed and the depth value of the first pixel point. And the measuring device obtains a first target image point of the first target angular point under a camera coordinate system according to the coordinate of the first target angular point in the first image to be processed and the depth value of the first target angular point. The distance between the first image point and the first target image point is calculated as the first distance.
In a further possible implementation, the first pixel point is not on the side of the first object to be measured. The measuring device acquires a first depth map of a first image to be processed before executing step 302. And obtaining a projection plane of the first surface under the camera coordinate system and a first image point of the first pixel point under the camera coordinate system according to the first image to be processed and the first depth map. The measuring device calculates a distance from the first image point to the projection plane as a first distance.
303. And determining the distance between the object point corresponding to the second pixel point in the second image to be processed and the second surface to obtain a second distance.
In the embodiment of the application, the second pixel point and the first pixel point are corresponding points (same-name points), that is, projections of the same object point. For the implementation by which the measuring device obtains the second distance, reference may be made to the implementation of obtaining the first distance in step 302.
304. And obtaining the size of the object to be measured according to the first distance and the second distance.
In an embodiment of the present application, the size of the object to be measured includes one of the following: length, width, height. Assume that the first distance is d1, the second distance is d2, and the size of the object to be measured is H. In one possible implementation, d1, d2, and H satisfy the following formula:
H = d1 + d2 + α1 … Formula (1)
where α1 is a real number. Optionally, α1 = 0.
In another possible implementation, d1, d2, and H satisfy the following formula:
H = β1 × (d1 + d2) … Formula (2)
where β1 is a positive number. Optionally, β1 = 1.
In yet another possible implementation, d1, d2, and H satisfy the following formula:
H = β1 × (d1 + d2) + α1 … Formula (3)
where α1 is a real number and β1 is a positive number. Optionally, α1 = 0 and β1 = 1.
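As a sketch, the three formulas can be combined into one helper, since Formula (3) reduces to Formula (1) when β1 = 1 and to Formula (2) when α1 = 0; the numeric distances below are illustrative assumptions:

```python
def object_size(d1, d2, alpha1=0.0, beta1=1.0):
    """Formula (3): H = beta1 * (d1 + d2) + alpha1.

    With beta1 = 1 this is Formula (1); with alpha1 = 0 it is
    Formula (2); with both defaults it is simply d1 + d2."""
    return beta1 * (d1 + d2) + alpha1

# Illustrative first and second distances (e.g. in metres):
H = object_size(0.8, 1.2)                                  # d1 + d2
H_adjusted = object_size(0.8, 1.2, alpha1=0.1, beta1=0.9)  # calibrated variant
```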
Under the condition that the size of the object to be measured is large, one image cannot contain the whole object to be measured, and the measuring device cannot obtain the size of the object to be measured from a single image. In the embodiment of the application, since the first pixel point and the second pixel point are corresponding points, the measuring device can obtain the size of the object to be measured according to the first distance and the second distance, where the first distance is obtained based on the first image to be processed and the second distance is obtained based on the second image to be processed.
For example, in the case that the first side is a height side, the measuring device obtains the height of the object to be measured according to the first distance and the second distance; in the case that the first side is a long side (i.e., a side whose length represents the length of the object to be measured), the measuring device obtains the length of the object to be measured according to the first distance and the second distance; in the case where the first side is a width side (i.e., a side whose length represents the width of the object to be measured), the measuring device obtains the width of the object to be measured based on the first distance and the second distance.
As an alternative embodiment, the measuring device comprises a two-dimensional imaging device. In the embodiment of the application, the two-dimensional imaging device can be an RGB imaging device or a YUV imaging device, wherein 'Y' represents brightness (namely gray scale value), 'U' and 'V' represent chromaticity. The measuring device performs the following steps in performing step 301:
1. And under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, shooting the object to be measured by using the two-dimensional imaging equipment to obtain at least two images to be confirmed.
In an embodiment of the present application, the large-size object measurement instruction includes at least one of the following: voice, text, click operation, touch operation.
Under the condition that a measuring device receives a large-size object measuring instruction aiming at an object to be measured, a two-dimensional imaging device is used for shooting the object to be measured at least twice, and at least two images to be confirmed are obtained.
In one possible implementation manner, the measuring device uses the two-dimensional imaging device to carry out video shooting on the object to be measured to obtain at least two images to be confirmed under the condition that a large-size object measuring instruction for the object to be measured is received.
2. And selecting one image from the images containing the first surface as the first image to be processed, and selecting one image from the images containing the second surface as the second image to be processed by carrying out object surface detection processing on the at least two images to be confirmed.
In the embodiment of the application, the object surface detection processing is used for detecting which surfaces of the object to be measured are contained in an image. For example, the surfaces of the object to be measured contained in the image may include: the upper bottom surface and the left side surface.
In one possible implementation, the object plane detection process may be implemented by an object plane detection neural network, where the object plane detection neural network is obtained by training a convolutional neural network with a plurality of images with labeling information as training data, where the labeling information of the images in the training data is a plane type (e.g., top, bottom, left, right, front).
In another possible implementation manner, the measuring device obtains the corner point in the image to be confirmed by performing corner point detection processing on the image to be confirmed. The measuring device can then determine the object plane in the image to be confirmed according to the corner points in the image to be confirmed. For example, the measuring device can determine the position of the upper bottom surface depending on the position of the corner points of the at least three upper bottom surfaces.
Optionally, the corner detection processing may be implemented by a corner detection algorithm; the application does not limit which corner detection algorithm is used for realizing the corner detection processing.
The measuring device determines the object surfaces contained in each image by carrying out object surface detection processing on each of the at least two images to be confirmed. It then selects one image from the images containing the first surface as the first image to be processed, and selects one image from the images containing the second surface as the second image to be processed.
For example, in the case where the first surface is the upper bottom surface and the second surface is the lower bottom surface, the measuring apparatus selects one image from the images including the upper bottom surface of the object to be measured as the first image to be processed, and selects one image from the images including the lower bottom surface of the object to be measured as the second image to be processed.
As an alternative embodiment, the measuring device performs the following steps in the course of step 2:
3. And carrying out object surface detection processing on the at least two images to be confirmed to obtain a first surface image to be confirmed containing the first surface and a second surface image to be confirmed containing the second surface.
The implementation of this step can be seen in step 2. The measuring device respectively carries out object surface detection processing on each of the at least two images to be confirmed, determines at least one image containing the first surface and at least one image containing the second surface, takes an image containing the first surface as the first surface image to be confirmed, and takes an image containing the second surface as the second surface image to be confirmed.
4. And under the condition that it is confirmed that the first surface image to be confirmed and the second surface image to be confirmed contain a corresponding point belonging to the object to be measured, taking the first surface image to be confirmed as the first image to be processed, and taking the second surface image to be confirmed as the second image to be processed.
The existence, in both the first surface image to be confirmed and the second surface image to be confirmed, of a corresponding point belonging to the object to be measured indicates that a side length of the object to be measured can be obtained from these two images. When it is determined that such a corresponding point exists, and the corresponding point belongs to neither the first surface nor the second surface, the first surface image to be confirmed is taken as the first image to be processed, and the second surface image to be confirmed is taken as the second image to be processed.
As an alternative embodiment, the measuring device further performs the following steps before performing step 302:
5. and acquiring a first depth map of the first image to be processed.
In the embodiment of the application, the first depth map carries the depth information of the pixel points in the first image to be processed. In one implementation of acquiring the first depth map, the measurement device receives the first depth map input by a user through an input component, wherein the input component comprises: a keyboard, a mouse, a touch screen, a touch pad, an audio input device, and the like.
In another implementation manner of obtaining the first depth map, the measuring device receives the first depth map sent by the second control device, where the second control device includes a mobile phone, a computer, a tablet computer, a server, and the like. Optionally, the second control means is identical to the first control means.
In yet another implementation of acquiring the first depth map, the measuring device acquires the first depth map using a depth imaging device. For example, the measurement device is a mobile phone and the depth imaging device is a depth camera on the mobile phone. The mobile phone can acquire the first depth map by using the depth camera. The depth camera may be any one of the following: a structured light camera, a time-of-flight (TOF) camera, or a binocular stereo vision camera.
After obtaining the first depth map, the measuring device performs the following steps in the process of performing step 302:
6. and obtaining the first distance according to the first image to be processed and the first depth map.
In step 302, the measurement device may obtain a first target image point and a first image point according to a first image to be processed and a first depth map. The distance between the first target image point and the first image point is calculated as the first distance. The measuring device can obtain a projection plane and a first image point according to the first image to be processed and the first depth map. The distance between the first image point and the projection plane is calculated as the first distance.
As an alternative embodiment, the first pixel point belongs to a first side of the object to be measured. The measuring device performs the following steps in the process of performing step 6:
7. and acquiring a first depth value of the first pixel point and a second depth value of the first corner point from the first depth map.
In the embodiment of the application, the first corner belongs to the first surface, and the first corner is an endpoint of the first side. For example, taking the object to be measured as shown in fig. 5 as an example, it is assumed that the first face is a plane ABCD. The first pixel point is H, and the first side is AE. At this time, the first corner point is a.
The measuring device acquires a depth value of a first pixel point from the first depth map as a first depth value, and acquires a depth value of a first corner point from the first depth map as a second depth value.
8. And obtaining a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinates of the first pixel point in the first image to be processed.
The measuring device converts the depth value (namely the first depth value) of the first pixel point and the coordinate of the first pixel point in the first image to be processed according to the internal parameters of the two-dimensional imaging equipment, so that the coordinate of the image point of the first pixel point under the camera coordinate system can be obtained. The measuring device can obtain a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinate of the first pixel point in the first image to be processed.
Optionally, the measuring device acquires the internal parameters of the two-dimensional imaging device before step 8 is performed. In the embodiment of the application, the internal parameters of the two-dimensional imaging device include: the focal length of the two-dimensional imaging device and the coordinates of the optical center of the two-dimensional imaging device, where the optical center is the intersection point of the optical axis of the two-dimensional imaging device and the image plane.
9. And obtaining a second image point of the first angle point under the camera coordinate system according to the second depth value and the coordinates of the first angle point in the first image to be processed.
The measuring device converts the depth value of the first corner point (namely the second depth value) and the coordinate of the first corner point in the first image to be processed according to the internal parameters of the two-dimensional imaging equipment, and an image point of the first corner point under a camera coordinate system, namely the second image point, can be obtained.
10. And determining the distance between the first image point and the second image point to obtain the first distance.
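Steps 7 to 10 amount to back-projecting the two pixels with their depth values through a pinhole camera model and taking the Euclidean distance between the resulting image points. A minimal sketch; the intrinsics and pixel coordinates below are assumed for illustration:

```python
import math

def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a depth value into the camera
    coordinate system using the pinhole model (intrinsics fx, fy, cx, cy)."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def first_distance(pixel, corner, pixel_depth, corner_depth, intrinsics):
    """Unproject the first pixel point and the first corner point, then
    return the distance between the first image point and the second
    image point."""
    a = unproject(*pixel, pixel_depth, *intrinsics)
    b = unproject(*corner, corner_depth, *intrinsics)
    return math.dist(a, b)

intrinsics = (500.0, 500.0, 320.0, 240.0)   # fx, fy, cx, cy (assumed)
d = first_distance((320.0, 240.0), (320.0, 440.0), 2.0, 2.0, intrinsics)
```

With these assumed values both points lie at depth 2, 200 px apart vertically, giving a camera-space distance of 0.8 in the same units as the depth values.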
As an alternative embodiment, the first pixel belongs to a first edge of the object to be measured, and the first edge belongs to the third surface of the object to be measured. The measuring device performs the following steps in the process of performing step 6:
11. And acquiring a projection plane of the third surface under a camera coordinate system of the two-dimensional imaging device.
In the embodiment of the present application, the projection plane includes the projection points of all points of the third surface under the camera coordinate system. For example, the third surface includes a point A, a point B, and a point C, where the projection point of point A under the camera coordinate system is point a, the projection point of point B is point b, and the projection point of point C is point c. In this case, the projection plane contains point a, point b, and point c.
In one implementation of acquiring a projection plane, the measurement device receives a plane equation under a camera coordinate system input by a user through an input component, and further determines the projection plane according to the plane equation, where the input component includes: a keyboard, a mouse, a touch screen, a touch pad, an audio input device, and the like.
In another implementation manner of obtaining the projection plane, the measuring device receives a plane equation under the camera coordinate system sent by the fourth control device, and further determines the projection plane according to the plane equation, where the fourth control device includes a mobile phone, a computer, a tablet computer, a server, and the like. Optionally, the fourth control means is the same as the first control means.
In a further implementation, the measuring device selects at least three points from the third surface as at least three first points. It obtains the coordinates of the at least three first points in the first image to be processed, and obtains the depth values of the at least three first points from the first depth map. The measuring device converts the coordinates of each first point in the first image to be processed and the depth value of that first point according to the internal parameters of the two-dimensional imaging device, obtaining an image point of the first point under the camera coordinate system of the two-dimensional imaging device. After obtaining the image point of each first point under the camera coordinate system, the measuring device performs plane fitting processing on the image points of the at least three first points under the camera coordinate system, thereby obtaining the projection plane of the third surface under the camera coordinate system.
Owing to the imaging limitations of the depth imaging device, the accuracy of the depth values, obtained from the depth map, of points on edges of the object to be measured is low. As an alternative embodiment, therefore, none of the at least three first points selected by the measuring device from the third surface belongs to an edge of the third surface in the first image to be processed.
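The plane fitting processing of step 11 can be sketched with a least-squares fit via SVD; the sample points below are assumed, not from the application:

```python
import numpy as np

def fit_projection_plane(points):
    """Fit a plane n . p + d = 0 through >= 3 camera-space image points
    (least squares); returns (unit normal n, offset d)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right-singular vector for the smallest singular value of the
    # centred point set is the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, float(-n @ centroid)

# Three assumed non-collinear image points lying on the plane z = 2:
n, d = fit_projection_plane([(0.0, 0.0, 2.0), (1.0, 0.0, 2.0), (0.0, 1.0, 2.0)])
```

With more than three (noisy) first points the same SVD fit returns the best-fitting plane in the least-squares sense, which matches the motivation for avoiding edge points with unreliable depth values.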
12. A first image point corresponding to the first pixel point and a second image point corresponding to the first angle point are determined from the projection plane.
In the embodiment of the application, the first corner point belongs to the second surface of the object to be measured, and the first corner point is an endpoint of the first edge. Since the projection plane is the projection plane of the third surface under the camera coordinate system, and the first pixel point and the first corner point both belong to the third surface, the projection plane contains an image point corresponding to the first pixel point and an image point corresponding to the first corner point, and these image points can be determined from the projection plane. The image point corresponding to the first pixel point in the projection plane is taken as the first image point, and the image point corresponding to the first corner point is taken as the second image point.
In one possible implementation, the measurement device acquires coordinates of the optical center of the two-dimensional imaging device in the camera coordinate system, and obtains a straight line (hereinafter referred to as a first straight line) passing through the first pixel point and the optical center according to the coordinates of the first pixel point in the image coordinate system and the coordinates of the optical center in the camera coordinate system. The measuring device determines the intersection point of the first straight line and the projection plane as a first image point. Similarly, the measuring device obtains a straight line (hereinafter referred to as a second straight line) passing through the first corner point and the optical center, based on the coordinates of the first corner point in the image coordinate system and the coordinates of the optical center in the camera coordinate system. The measuring device takes the intersection point of the second straight line and the projection plane as a second image point.
13. And determining a distance between the first image point and the second image point as the first distance.
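One way to read steps 12 and 13: each pixel defines a ray from the optical centre (the origin of the camera coordinate system), and the corresponding image point is that ray's intersection with the projection plane. A sketch under an assumed plane and assumed intrinsics:

```python
import numpy as np

def pixel_ray(u, v, fx, fy, cx, cy):
    """Direction of the ray from the optical centre through pixel (u, v)."""
    return np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

def intersect_ray_plane(direction, n, d):
    """Intersect the ray p = t * direction with the plane n . p + d = 0."""
    n = np.asarray(n, dtype=float)
    t = -d / float(n @ direction)
    return t * direction

fx = fy = 500.0
cx, cy = 320.0, 240.0                         # assumed intrinsics
n, d = np.array([0.0, 0.0, 1.0]), -2.0        # assumed projection plane z = 2
first_image_point = intersect_ray_plane(pixel_ray(320.0, 240.0, fx, fy, cx, cy), n, d)
second_image_point = intersect_ray_plane(pixel_ray(320.0, 440.0, fx, fy, cx, cy), n, d)
first_distance = float(np.linalg.norm(first_image_point - second_image_point))
```

The first straight line and second straight line of the text are exactly the two rays here; the plane can come from the SVD fit of step 11 or from a plane equation supplied externally.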
As an alternative embodiment, the first pixel belongs to a first edge of the object to be measured, and the measuring device performs the following steps in performing step 6:
14. And obtaining a third depth value of the first line segment from the depth map.
In the embodiment of the application, two end points of the first line segment are a first pixel point and a first corner point respectively, wherein the first corner point is an end point of the first side. The depth value of the first line segment is a third depth value, wherein the third depth value is positively correlated with the depth value of the first pixel point, and/or the third depth value is positively correlated with the depth value of the first angle point.
In one possible implementation, the depth value of the first line segment is the depth value of the midpoint of the first line segment. For example, the midpoint of the first line segment is a first reference midpoint. The measuring device acquires a depth value of a first reference midpoint from the first depth map, and takes the depth value of the first reference midpoint as a depth value of a first line segment, namely a third depth value.
In another possible implementation manner, the depth value of the first line segment is a mean value of the depth value of the first pixel point and the depth value of the first corner point.
15. And obtaining a mapping relation between the depth value and the transmission factor.
In the embodiment of the application, the transmission factor characterizes the conversion relation between the size and the real size in the image. For example, the transmission factor of a pixel characterizes the conversion relationship between the size of the pixel and the size of the object point to which the pixel corresponds. For another example, the transmission factor of the stool in the image characterizes the conversion relationship between the length of the stool in the image and the true length of the stool.
The ratio between the size in the image and the true size is called transmittance, and the transmission factor is inversely related to the transmittance. For example, the image includes a pixel a and a pixel B, where the transmission factor of the pixel a is greater than the transmission factor of the pixel B, the object point corresponding to the pixel a is the object point a, and the object point corresponding to the pixel B is the object point B. Since the transmission factor of the pixel a is larger than that of the pixel B, the transmittance of the pixel a and the object point a is smaller than that of the pixel B and the object point B. Since the size of the pixel a is the same as that of the pixel B, the size of the object point a is larger than that of the object point B.
In the embodiment of the application, the transmission factor and the depth value are positively correlated. For example, the image includes a pixel a and a pixel b, wherein the depth value of the pixel a is greater than the depth value of the pixel b. At this time, the size of the object point corresponding to the pixel point a is larger than the size of the object point corresponding to the pixel point b.
16. And obtaining the transmission factor of the first line segment according to the mapping relation and the third depth value.
In the embodiment of the application, the transmission factor of the first line segment characterizes a conversion relationship between the length of the first line segment and the length of the second line segment, wherein the second line segment is a physical line segment corresponding to the first line segment, that is, the length of the second line segment is a physical distance of the length of the first line segment (that is, the real length of the first line segment).
17. And obtaining the first distance according to the transmission factor of the first line segment and the length of the first line segment.
In one possible implementation, the transmission factor of the first line segment characterizes the ratio of the length of the first line segment to the length of the second line segment. The measuring device calculates the quotient of the length of the first line segment and the transmission factor of the first line segment to obtain the length of the second line segment, namely the first distance.
In another possible implementation, the transmission factor of the first line segment characterizes the ratio of the length of the second line segment to the length of the first line segment. The measuring device calculates the product of the length of the first line segment and the transmission factor of the first line segment to obtain the length of the second line segment, namely the first distance.
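Steps 14 to 17 can be sketched as follows for the product branch. The linear depth-to-factor mapping and the constant k are illustrative assumptions; the application only requires the transmission factor to be positively correlated with the depth value:

```python
def transmission_factor(depth, k=0.001):
    """Assumed mapping from the third depth value of the first line
    segment to its transmission factor (steps 15-16); the linear form
    and the constant k are illustrative, not from the application."""
    return k * depth

def first_distance_from_segment(pixel_length, depth):
    """Step 17 (product branch): the transmission factor is the ratio of
    the physical length of the second line segment to the pixel length
    of the first line segment, so their product gives the first distance."""
    return pixel_length * transmission_factor(depth)

# A 400 px first line segment whose third depth value is 2.5 (assumed):
d1 = first_distance_from_segment(400.0, 2.5)
```

For the quotient branch one would instead define the factor as the image-to-physical length ratio and divide the pixel length by it.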
As an alternative embodiment, the measuring device further performs the following steps before performing step 303:
18. And acquiring a second depth map of the second image to be processed.
In the embodiment of the application, the second depth map carries the depth information of the pixel points in the second image to be processed. In one implementation of acquiring the second depth map, the measurement device receives the second depth map input by the user through an input component, wherein the input component comprises: a keyboard, a mouse, a touch screen, a touch pad, an audio input device, and the like.
In another implementation of obtaining the second depth map, the measuring device receives the second depth map sent by a third control device, where the third control device includes a mobile phone, a computer, a tablet computer, a server, and the like. Optionally, the third control device is identical to the second control device.
In yet another implementation of acquiring the second depth map, the measuring device acquires the second depth map using a depth imaging device. For example, the measurement device is a mobile phone and the depth imaging device is a depth camera on the mobile phone. The mobile phone can acquire the second depth map by using the depth camera. The depth camera may be any one of the following: a structured light camera, a time-of-flight (TOF) camera, or a binocular stereo vision camera.
Optionally, a depth camera is loaded on the measuring device, and the measuring device uses the depth camera to shoot the object to be measured to obtain a first depth map and a second depth map.
After obtaining the second depth map, the measuring device performs the following steps in performing step 303:
19. and obtaining the second distance according to the second image to be processed and the second depth map.
In one possible implementation, the second pixel point is located on the first edge of the object to be measured, and the corner point located on the first edge and belonging to the second surface is referred to as a second target corner point. The measuring device acquires the depth value of the second pixel point and the depth value of the second target corner point from the second depth map. The measuring device obtains a second image point of the second pixel point under the camera coordinate system according to the coordinates of the second pixel point in the second image to be processed and the depth value of the second pixel point. The measuring device obtains a second target image point of the second target corner point under the camera coordinate system according to the coordinates of the second target corner point in the second image to be processed and the depth value of the second target corner point. The distance between the second image point and the second target image point is calculated as the second distance.
In another possible implementation manner, the measurement device may obtain a projection plane of the second surface under the camera coordinate system and an image point (i.e., a second image point) of the second pixel point under the camera coordinate system according to the second image to be processed and the second depth map. The distance between the second image point and the projection plane of the second surface under the camera coordinate system is calculated as the second distance.
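This second implementation reduces to a point-to-plane distance in the camera coordinate system. A sketch with an assumed plane and an assumed image point:

```python
import numpy as np

def point_to_plane_distance(point, n, d):
    """Distance from a camera-space image point to the projection plane
    n . p + d = 0 (the plane of the second surface in the text above)."""
    point = np.asarray(point, dtype=float)
    n = np.asarray(n, dtype=float)
    return float(abs(n @ point + d) / np.linalg.norm(n))

# Assumed: projection plane z = 1 (n = (0, 0, 1), d = -1), and a second
# image point at depth 2.3 in the camera coordinate system.
second_distance = point_to_plane_distance((0.4, -0.2, 2.3), (0.0, 0.0, 1.0), -1.0)
```

The same helper serves the analogous first-distance computation against the projection plane of the first surface.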
As an alternative embodiment, the second pixel belongs to the first side. The measuring device performs the following steps in the course of performing step 19:
20. and acquiring a fourth depth value of the second pixel point and a fifth depth value of the second corner point from the second depth map.
In the embodiment of the application, the second corner point belongs to the first surface, and the second corner point is an endpoint of the first edge. For example, taking the object to be measured shown in fig. 5 as an example, it is assumed that the first face is a planar EFG. The second pixel point is H, and the first side is AE. At this time, the second corner point is E.
The measuring device acquires a depth value of the second pixel point from the second depth map as a fourth depth value, and acquires a depth value of the second corner point from the second depth map as a fifth depth value.
21. And obtaining a third image point of the second pixel point under the camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed.
The measuring device converts the depth value of the second pixel point (namely, the fourth depth value) and the coordinates of the second pixel point in the second image to be processed according to the internal parameters of the two-dimensional imaging device, so as to obtain the coordinates of the image point of the second pixel point under the camera coordinate system. In this way, the measuring device obtains the third image point of the second pixel point under the camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed.
22. And obtaining a fourth image point of the second corner point under the camera coordinate system according to the fifth depth value and the coordinates of the second corner point in the second image to be processed.
The measuring device converts the depth value (namely, the fifth depth value) of the second corner point and the coordinates of the second corner point in the second image to be processed according to the internal parameters of the two-dimensional imaging device, and an image point of the second corner point under a camera coordinate system, namely, a fourth image point, can be obtained.
23. And determining the distance between the third image point and the fourth image point to obtain the second distance.
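Steps 20 to 23 amount to back-projecting two pixels through the pinhole model and taking a Euclidean distance. A minimal sketch of this (the intrinsics fx, fy, cx, cy, all pixel/depth values and the function name below are illustrative assumptions, not from the source):

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Convert pixel (u, v) with a depth value into camera coordinates
    using the pinhole model and the intrinsics (fx, fy, cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# hypothetical intrinsics of the two-dimensional imaging device
fx = fy = 500.0
cx, cy = 320.0, 240.0

# second pixel point (e.g. H) and second corner point (e.g. E), with the
# fourth and fifth depth values read from the second depth map
third_image_point = backproject(300, 100, 2.0, fx, fy, cx, cy)
fourth_image_point = backproject(310, 400, 2.1, fx, fy, cx, cy)

# step 23: the second distance is the distance between the two image points
second_distance = float(np.linalg.norm(third_image_point - fourth_image_point))
```

A pixel at the principal point back-projects straight along the optical axis, which is a quick sanity check for the intrinsics.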
As an alternative embodiment, the second pixel point belongs to the first edge of the object to be measured, and the first edge belongs to the third surface. The measuring device performs the following steps in the course of performing step 19:
24. And obtaining the projection plane according to the coordinates, in the second image to be processed, of at least three second points in the third surface and the depth values of the at least three second points acquired from the second depth map.
In the embodiment of the present application, the coordinates in the second image to be processed refer to coordinates in a pixel coordinate system of the second image to be processed (hereinafter, will be referred to as a second reference pixel coordinate system). The measuring device converts the coordinate of the pixel point under the second reference pixel coordinate system and the depth value of the pixel point according to the internal parameters of the two-dimensional imaging equipment, and the coordinate of the image point of the pixel point under the camera coordinate system of the two-dimensional imaging equipment can be obtained.
Therefore, the measuring device can obtain the coordinates of the image point of each second point under the camera coordinate system according to the second image to be processed, the second depth map and the internal parameters of the two-dimensional imaging device. After obtaining the coordinates of the image points of at least three second points under the camera coordinate system, the measuring device can fit a plane to these coordinates, and this plane is the projection plane of the third surface under the camera coordinate system.
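The plane fit in step 24 can be done by least squares over the back-projected second points. A sketch, assuming three hypothetical camera-space points on the third surface (the function name and all values are illustrative, not from the source):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 camera-space points.
    Returns (normal, d) with normal . p + d = 0 for points p on the plane."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # the right singular vector with the smallest singular value of the
    # centred point cloud is the plane normal
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = -float(normal.dot(centroid))
    return normal, d

# three second points (chosen away from any edge) in camera coordinates
normal, d = fit_plane([[0.0, 0.0, 2.0], [1.0, 0.0, 2.0], [0.0, 1.0, 2.0]])
# the fitted projection plane is z = 2, so the normal is parallel to the z-axis
```

Using more than three points makes the fit robust to the depth noise the next paragraph warns about.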
Limited by the imaging capability of the depth imaging device, the accuracy of the depth values, obtained from the depth map, of points on the edges of the object to be measured is low. As an alternative embodiment, none of the at least three second points selected by the measuring device from the third surface belongs to an edge of the third surface in the second image to be processed.
25. And determining a third image point corresponding to the second pixel point and a fourth image point corresponding to the second corner point from the projection plane.
In the embodiment of the application, the second corner point belongs to the second face of the object to be measured, and the second corner point is an endpoint of the first edge. Since the projection plane is the projection plane of the third surface under the camera coordinate system, and the second pixel point and the second corner point both belong to the third surface, the projection plane contains the image point corresponding to the second pixel point and the image point corresponding to the second corner point, and both can be determined from the projection plane. The image point corresponding to the second pixel point in the projection plane is taken as the third image point, and the image point corresponding to the second corner point is taken as the fourth image point.
In one possible implementation, the measuring device acquires the coordinates of the optical center of the two-dimensional imaging device under the camera coordinate system, and obtains a straight line (hereinafter referred to as a second straight line) passing through the second pixel point and the optical center according to the coordinates of the second pixel point in the image coordinate system and the coordinates of the optical center under the camera coordinate system. The measuring device determines the intersection point of the second straight line and the projection plane as the third image point. Similarly, the measuring device obtains a straight line (hereinafter referred to as a third straight line) passing through the second corner point and the optical center according to the coordinates of the second corner point in the image coordinate system and the coordinates of the optical center under the camera coordinate system. The measuring device takes the intersection point of the third straight line and the projection plane as the fourth image point.
26. And determining a distance between the third image point and the fourth image point as the second distance.
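The ray construction in steps 25 and 26 (a line from the optical center through the pixel, intersected with the projection plane) can be sketched as follows; the plane, intrinsics and pixel values are hypothetical, reused for illustration:

```python
import numpy as np

def pixel_ray_plane_intersection(u, v, fx, fy, cx, cy, normal, d):
    """Intersect the ray from the optical center (the origin of the camera
    coordinate system) through pixel (u, v) with the plane n . p + d = 0."""
    # ray direction through the pixel at unit depth (pinhole model)
    direction = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    denom = float(np.dot(normal, direction))
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the projection plane")
    t = -d / denom
    return t * direction

# hypothetical projection plane z = 2 and intrinsics
plane_normal, plane_d = np.array([0.0, 0.0, 1.0]), -2.0
fx = fy = 500.0
cx, cy = 320.0, 240.0

# the ray through the principal point meets the plane z = 2 at (0, 0, 2)
image_point = pixel_ray_plane_intersection(320, 240, fx, fy, cx, cy,
                                           plane_normal, plane_d)
```

The same routine yields both the third image point (from the second pixel point) and the fourth image point (from the second corner point); the second distance is then the norm of their difference.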
Based on the technical scheme provided by the embodiments of the application, the embodiments of the application further provide a possible application scenario. In daily life, people often need to measure the size of objects such as cartons, tables and cabinets, and measuring these dimensions with a ruler is time-consuming and labor-intensive. With the development of technology, the hardware configuration of terminals has become increasingly powerful, and a terminal can measure the size of an object by using the technical scheme disclosed by the embodiments of the application.
For example, Xiao Ming has contacted a moving company to help him move, but the moving company needs to know the sizes of the items to be moved in advance, so Xiao Ming needs to measure them. Because the wardrobe is too tall and the space around it is limited, the whole wardrobe cannot be captured in a single image. At this time, Xiao Ming can input a large-size object measurement instruction for the wardrobe into his mobile phone, so that the mobile phone obtains the height of the wardrobe using the technical scheme disclosed above.
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the written order of the steps does not imply a strict order of execution; the specific execution order of the steps should be determined by their functions and possible internal logic.
The foregoing describes the method of the embodiments of the present application in detail; the apparatus of the embodiments of the present application is provided below.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a measuring device according to an embodiment of the present application, where the measuring device 1 includes: an acquisition unit 11, a first processing unit 12, a second processing unit 13, a third processing unit 14, a two-dimensional imaging device 15, wherein:
an acquisition unit 11 for acquiring a first image to be processed and a second image to be processed; the first image to be processed comprises a first surface of an object to be measured, and the second image to be processed comprises a second surface of the object to be measured; at least one edge of the object to be measured is arranged between the first surface and the second surface;
a first processing unit 12, configured to determine a distance between an object point corresponding to a first pixel point in the first image to be processed and the first surface, so as to obtain a first distance; the first pixel point belongs to the object to be measured;
a second processing unit 13, configured to determine a distance from an object point corresponding to a second pixel point in the second image to be processed to the second surface, so as to obtain a second distance; the second pixel point and the first pixel point are the same name points;
and a third processing unit 14, configured to obtain the size of the object to be measured according to the first distance and the second distance.
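The passage does not spell out how the third processing unit combines the two distances. A minimal sketch, under the assumption (mine, not stated in the source) that the homonymous point lies on the shared edge so that the full dimension is simply the sum of the two partial distances:

```python
def object_size(first_distance, second_distance):
    """Combine the distance from the homonymous point to the first surface
    (measured in the first image) with its distance to the second surface
    (measured in the second image).  Assumes the two distances partition
    the measured dimension, e.g. the height of a wardrobe too tall to fit
    in a single image."""
    return first_distance + second_distance

# e.g. 1.2 m from the first image plus 0.8 m from the second image
size = object_size(1.2, 0.8)
```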
In combination with any of the embodiments of the present application, the measuring device 1 further comprises a two-dimensional imaging apparatus 15, the acquisition unit 11 being configured to:
shooting the object to be measured by using the two-dimensional imaging device 15 under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, so as to obtain at least two images to be confirmed;
And carrying out object surface detection processing on the at least two images to be confirmed, selecting one image from the images containing the first surface as the first image to be processed, and selecting one image from the images containing the second surface as the second image to be processed.
In combination with any embodiment of the present application, the obtaining unit 11 is configured to:
Performing object surface detection processing on the at least two images to be confirmed to obtain a first surface image to be confirmed containing the first surface and a second surface image to be confirmed containing the second surface;
And under the condition that the same name points belonging to the object to be measured exist in the first surface to-be-confirmed image and the second surface to-be-confirmed image through image matching processing of the first surface to-be-confirmed image and the second surface to-be-confirmed image, the first surface to-be-confirmed image is taken as the first to-be-processed image, and the second surface to-be-confirmed image is taken as the second to-be-processed image.
In combination with any one of the embodiments of the present application, the obtaining unit 11 is further configured to obtain a first depth map of the first image to be processed before determining a distance between an object point corresponding to a first pixel point in the first image to be processed and the first surface to obtain a first distance;
The first processing unit 12 is configured to:
and obtaining the first distance according to the first image to be processed and the first depth map.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured;
The first processing unit 12 is configured to:
Acquiring a first depth value of the first pixel point and a second depth value of the first corner point from the first depth map; the first corner point belongs to the first face, and the first corner point is an endpoint of the first edge;
Obtaining a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinate of the first pixel point in the first image to be processed;
obtaining a second image point of the first angle point under the camera coordinate system according to the second depth value and the coordinates of the first angle point in the first image to be processed;
determining a distance between the first image point and the second image point to obtain the first distance.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the first edge belongs to a third surface of the object to be measured;
The first processing unit 12 is configured to:
Acquiring a projection plane of the third face under a camera coordinate system of the two-dimensional imaging device;
Determining a first image point corresponding to a first pixel point and a second image point corresponding to a first angle point from the projection plane; the first corner point is an endpoint of the first edge;
a distance between the first image point and the second image point is determined as the first distance.
In combination with any one of the embodiments of the present application, the first pixel belongs to a first edge of the object to be measured; the first processing unit 12 is configured to:
Acquiring a third depth value of the first line segment from the depth map; the two endpoints of the first line segment are the first pixel point and the first corner point respectively; the first corner point is an endpoint of the first edge; the third depth value is positively correlated with the depth value of the first pixel point, and/or the third depth value is positively correlated with the depth value of the first corner point;
Obtaining a mapping relation between the depth value and the transmission factor; the transmission factor characterizes the conversion relation between the size in the image and the real size; the transmission factor is positively correlated with the depth value;
Obtaining a transmission factor of the first line segment according to the mapping relation and the third depth value;
And obtaining the first distance according to the transmission factor of the first line segment and the length of the first line segment.
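The transmission-factor formulation above (first distance = transmission factor of the first line segment × the segment's pixel length, with the factor positively correlated with depth) is consistent with the pinhole relation, where one pixel at depth z spans z / f real-world units. A sketch under that assumed mapping (the focal length and all values are hypothetical):

```python
def transmission_factor(depth, focal_length_px=500.0):
    """Map a depth value to a transmission factor.  Under the pinhole
    model one pixel at depth z covers z / f real-world units, so the
    factor is positively correlated with depth, as the text requires."""
    return depth / focal_length_px

def segment_real_length(pixel_length, depth, focal_length_px=500.0):
    # real size = transmission factor of the segment * its pixel length
    return transmission_factor(depth, focal_length_px) * pixel_length

# a 100-pixel first line segment whose third depth value is 2.0 m, with a
# hypothetical focal length of 500 px, measures about 0.4 m
first_distance = segment_real_length(100.0, 2.0)
```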
In combination with any one of the embodiments of the present application, the obtaining unit 11 is configured to obtain a second depth map of the second image to be processed before determining a distance between the second pixel point and the second surface to obtain a second distance;
The second processing unit 13 is configured to:
and obtaining the second distance according to the second image to be processed and the second depth map.
In combination with any one of the embodiments of the present application, the second pixel belongs to the first edge;
The second processing unit 13 is configured to:
acquiring a fourth depth value of the second pixel point and a fifth depth value of the second corner point from the second depth map; the second corner point belongs to the second face, and the second corner point is an endpoint of the first edge;
Obtaining a third image point of the second pixel point under a camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed;
Obtaining a fourth image point of the second corner under the camera coordinate system according to the fifth depth value and the coordinates of the second corner in the second image to be processed;
and determining the distance between the third image point and the fourth image point to obtain the second distance.
In some embodiments, the functions or modules included in the apparatus provided by the embodiments of the present application may be used to perform the methods described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
Fig. 7 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present application. The terminal 2 comprises a processor 21, a memory 22, an input device 23 and an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by connectors, which include various interfaces, transmission lines, buses, etc.; this is not limited in the embodiments of the present application. It should be appreciated that, in the various embodiments of the application, "coupled" means interconnected in a particular manner, including being directly connected or being indirectly connected through other devices, for example through various interfaces, transmission lines, buses, etc.
The processor 21 may be one or more graphics processing units (GPUs). Where the processor 21 is a GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group formed by a plurality of GPUs coupled to each other through one or more buses. The processor may alternatively be another type of processor, which is not limited in the embodiments of the present application.
Memory 22 may be used to store computer program instructions as well as various types of computer program code for performing aspects of the present application. Optionally, the memory includes, but is not limited to, random access memory (random access memory, RAM), read-only memory (ROM), erasable programmable read-only memory (erasable programmable read only memory, EPROM), or portable read-only memory (compact disc read-only memory, CD-ROM) for associated instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It will be appreciated that in the embodiment of the present application, the memory 22 may be used to store not only related instructions, but also related data, for example, the memory 22 may be used to store a first image to be processed and a second image to be processed acquired through the input device 23, or the memory 22 may be used to store a size of an object to be measured obtained through the processor 21, etc., and the embodiment of the present application is not limited to the data specifically stored in the memory.
It will be appreciated that fig. 7 shows only a simplified design of a terminal. In practical applications, the terminal may also include other necessary elements, including but not limited to any number of input/output devices, processors, memories, etc., and all terminals that can implement the embodiments of the present application are within the scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein. It will be further apparent to those skilled in the art that the descriptions of the various embodiments of the present application are provided with emphasis, and that the same or similar parts may not be described in detail in different embodiments for convenience and brevity of description, and thus, parts not described in one embodiment or in detail may be referred to in description of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted via a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center containing an integration of one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital versatile disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: a read-only memory (ROM) or a random-access memory (random access memory, RAM), a magnetic disk or an optical disk, or the like.

Claims (9)

1. A measurement method, characterized in that the measurement method is applied to a terminal comprising a two-dimensional imaging device, the method comprising:
Acquiring a first depth map of a first image to be processed;
Acquiring a first image to be processed and a second image to be processed; the first image to be processed comprises a first surface of an object to be measured, and the second image to be processed comprises a second surface of the object to be measured; at least one edge of the object to be measured is arranged between the first surface and the second surface; the acquiring the first image to be processed and the second image to be processed includes: under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, shooting the object to be measured by using the two-dimensional imaging equipment to obtain at least two images to be confirmed; selecting an image from the images containing the first surface as the first image to be processed and selecting an image from the images containing the second surface as the second image to be processed by carrying out object surface detection processing on the at least two images to be confirmed;
determining the distance between an object point corresponding to a first pixel point in the first image to be processed and the first surface to obtain a first distance; the first pixel point belongs to the object to be measured; the determining the distance from the first pixel point in the image to be processed to the first surface to obtain a first distance includes: obtaining the first distance according to the first image to be processed and the first depth map, wherein the first pixel point belongs to a first edge of the object to be measured;
The obtaining the first distance according to the first image to be processed and the first depth map includes: acquiring a first depth value of the first pixel point and a second depth value of the first corner point from the first depth map; the first corner point belongs to the first face, and the first corner point is an endpoint of the first edge; obtaining a first image point of the first pixel point under a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinate of the first pixel point in the first image to be processed; obtaining a second image point of the first angle point under the camera coordinate system according to the second depth value and the coordinates of the first angle point in the first image to be processed; determining a distance between the first image point and the second image point to obtain the first distance;
Determining the distance between an object point corresponding to a second pixel point in the second image to be processed and the second surface to obtain a second distance; the second pixel point and the first pixel point are the same name points;
And obtaining the size of the object to be measured according to the first distance and the second distance.
2. The method according to claim 1, wherein selecting one image from the images including the first face as the first image to be processed and one image from the images including the second face as the second image to be processed by performing object face detection processing on the at least two images to be confirmed, comprises:
Performing object surface detection processing on the at least two images to be confirmed to obtain a first surface image to be confirmed containing the first surface and a second surface image to be confirmed containing the second surface;
And under the condition that the same name points belonging to the object to be measured exist in the first surface to-be-confirmed image and the second surface to-be-confirmed image through image matching processing of the first surface to-be-confirmed image and the second surface to-be-confirmed image, the first surface to-be-confirmed image is taken as the first to-be-processed image, and the second surface to-be-confirmed image is taken as the second to-be-processed image.
3. The method according to claim 1, wherein the first pixel belongs to a first side of the object to be measured; the first edge belongs to a third surface of the object to be measured;
the obtaining the first distance according to the first image to be processed and the first depth map includes:
Acquiring a projection plane of the third face under a camera coordinate system of the two-dimensional imaging device;
Determining a first image point corresponding to a first pixel point and a second image point corresponding to a first angle point from the projection plane; the first corner point is an endpoint of the first edge;
a distance between the first image point and the second image point is determined as the first distance.
4. The method according to claim 1, wherein the first pixel belongs to a first side of the object to be measured; the obtaining the first distance according to the first image to be processed and the first depth map includes:
Acquiring a third depth value of the first line segment from the depth map; the two endpoints of the first line segment are the first pixel point and the first corner point respectively; the first corner point is an endpoint of the first edge; the third depth value is positively correlated with the depth value of the first pixel point, and/or the third depth value is positively correlated with the depth value of the first corner point;
Obtaining a mapping relation between the depth value and the transmission factor; the transmission factor characterizes the conversion relation between the size in the image and the real size; the transmission factor is positively correlated with the depth value;
Obtaining a transmission factor of the first line segment according to the mapping relation and the third depth value;
And obtaining the first distance according to the transmission factor of the first line segment and the length of the first line segment.
5. The method of any one of claims 1, 3, 4, wherein prior to said determining a distance of the second pixel point to the second face, the method further comprises:
acquiring a second depth map of the second image to be processed;
the determining the distance between the second pixel point and the second surface to obtain a second distance includes:
and obtaining the second distance according to the second image to be processed and the second depth map.
6. The method of claim 5, wherein the second pixel belongs to the first edge;
The obtaining the second distance according to the second image to be processed and the second depth map includes:
acquiring a fourth depth value of the second pixel point and a fifth depth value of the second corner point from the second depth map; the second corner point belongs to the second face, and the second corner point is an endpoint of the first edge;
Obtaining a third image point of the second pixel point under a camera coordinate system of the two-dimensional imaging device according to the fourth depth value and the coordinates of the second pixel point in the second image to be processed;
Obtaining a fourth image point of the second corner under the camera coordinate system according to the fifth depth value and the coordinates of the second corner in the second image to be processed;
and determining the distance between the third image point and the fourth image point to obtain the second distance.
7. A measurement apparatus, the apparatus comprising a two-dimensional imaging device, the apparatus comprising:
an acquisition unit configured to acquire a first depth map of a first image to be processed;
The acquisition unit is used for acquiring a first image to be processed and a second image to be processed; the first image to be processed comprises a first surface of an object to be measured, and the second image to be processed comprises a second surface of the object to be measured; at least one edge of the object to be measured is arranged between the first surface and the second surface; the acquiring the first image to be processed and the second image to be processed includes: under the condition that a large-size object measurement instruction aiming at the object to be measured is detected, shooting the object to be measured by using the two-dimensional imaging equipment to obtain at least two images to be confirmed; selecting an image from the images containing the first surface as the first image to be processed and selecting an image from the images containing the second surface as the second image to be processed by carrying out object surface detection processing on the at least two images to be confirmed;
a first processing unit configured to determine the distance between the object point corresponding to a first pixel point in the first image to be processed and the first surface to obtain a first distance; the first pixel point belongs to the object to be measured; determining the distance to obtain the first distance comprises: obtaining the first distance according to the first image to be processed and the first depth map, wherein the first pixel point belongs to a first edge of the object to be measured;
obtaining the first distance according to the first image to be processed and the first depth map comprises: acquiring a first depth value of the first pixel point and a second depth value of a first corner point from the first depth map, wherein the first corner point belongs to the first face and is an endpoint of the first edge; obtaining a first image point of the first pixel point in a camera coordinate system of the two-dimensional imaging device according to the first depth value and the coordinates of the first pixel point in the first image to be processed; obtaining a second image point of the first corner point in the camera coordinate system according to the second depth value and the coordinates of the first corner point in the first image to be processed; and determining the distance between the first image point and the second image point to obtain the first distance;
a second processing unit configured to determine the distance between the object point corresponding to a second pixel point in the second image to be processed and the second surface to obtain a second distance; the second pixel point and the first pixel point are homologous points, i.e. they correspond to the same object point;
and a third processing unit configured to obtain the size of the object to be measured according to the first distance and the second distance.
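The three processing units can be sketched end to end. This claim does not spell out how the third processing unit combines the two distances, so the sketch below simply sums them under the assumption that each distance measures the object's extent along the shared edge on its respective face; the function names, the intrinsics, the numeric values, and the summation rule are all illustrative assumptions, not from the patent.

```python
import math

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    # Pinhole back-projection of a pixel with its depth value.
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def face_distance(pixel_uv, corner_uv, pixel_depth, corner_depth, intrinsics):
    # Distance between the image points of a pixel and a corner point
    # (the first/second processing units' computation).
    p = pixel_to_camera(*pixel_uv, pixel_depth, *intrinsics)
    q = pixel_to_camera(*corner_uv, corner_depth, *intrinsics)
    return math.dist(p, q)

K = (500.0, 500.0, 320.0, 240.0)  # assumed intrinsics fx, fy, cx, cy
first = face_distance((320, 240), (420, 240), 1.0, 1.0, K)   # on the first face
second = face_distance((320, 240), (320, 340), 1.0, 1.0, K)  # on the second face
size = first + second  # assumed combination rule for the third processing unit
```

Using the homologous pixel point as a common anchor in both images is what lets the two per-face measurements be chained into one dimension without registering the two images against each other.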
8. An electronic device, comprising: a processor and a memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 6.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1 to 6.
CN202010901187.8A 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium Active CN112146628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010901187.8A CN112146628B (en) 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112146628A CN112146628A (en) 2020-12-29
CN112146628B (en) 2024-04-19

Family

ID=73890356

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010901187.8A Active CN112146628B (en) 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112146628B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102650518A (en) * 2011-02-25 2012-08-29 Ricoh Co Ltd Measuring method and equipment
CN106247951A (en) * 2016-08-29 2016-12-21 Shanghai Jiao Tong University Object measuring method based on depth image


Also Published As

Publication number Publication date
CN112146628A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
CN112771573B (en) Depth estimation method and device based on speckle images and face recognition system
US10924729B2 (en) Method and device for calibration
JP6657214B2 (en) Accuracy measurement of image-based depth detection system
US11816810B2 (en) 3-D reconstruction using augmented reality frameworks
CN113240769B (en) Spatial link relation identification method and device and storage medium
EP3351001A1 (en) Method for encoding a light field content
CN114640833A (en) Projection picture adjusting method and device, electronic equipment and storage medium
CN112197708B (en) Measuring method and device, electronic device and storage medium
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN107067441B (en) Camera calibration method and device
US20220222842A1 (en) Image reconstruction for virtual 3d
CN112634366B (en) Method for generating position information, related device and computer program product
WO2014093218A1 (en) Techniques for rectification of camera arrays
CN112102391A (en) Measuring method and device, electronic device and storage medium
CN112146628B (en) Measurement method and device, electronic equipment and storage medium
CN111275611B (en) Method, device, terminal and storage medium for determining object depth in three-dimensional scene
US10593054B2 (en) Estimation of 3D point candidates from a location in a single image
CN113628286B (en) Video color gamut detection method, device, computing equipment and computer storage medium
CN116053549A (en) Battery cell positioning method, device and system
CN112150527B (en) Measurement method and device, electronic equipment and storage medium
CN113436269B (en) Image dense stereo matching method, device and computer equipment
CN112615993A (en) Depth information acquisition method, binocular camera module, storage medium and electronic equipment
CN112102390A (en) Measuring method and device, electronic device and storage medium
CN113379826A (en) Method and device for measuring volume of logistics piece
CN112150527A (en) Measuring method and device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant