CN112197708A - Measuring method and device, electronic device and storage medium - Google Patents

Measuring method and device, electronic device and storage medium Download PDF

Info

Publication number
CN112197708A
CN112197708A (application CN202010901189.7A; granted as CN112197708B)
Authority
CN
China
Prior art keywords
point
measured
camera
image
dimensional image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010901189.7A
Other languages
Chinese (zh)
Other versions
CN112197708B (en)
Inventor
薛地
周杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TetrasAI Technology Co Ltd
Original Assignee
Shenzhen TetrasAI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TetrasAI Technology Co Ltd filed Critical Shenzhen TetrasAI Technology Co Ltd
Priority to CN202210630730.4A priority Critical patent/CN115031635A/en
Priority to CN202010901189.7A priority patent/CN112197708B/en
Publication of CN112197708A publication Critical patent/CN112197708A/en
Application granted granted Critical
Publication of CN112197708B publication Critical patent/CN112197708B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/022 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness; e.g. of sheet material
    • G01B11/0608 Height gauges

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a measuring method and device, an electronic device and a storage medium. The method is applied to a terminal that comprises a two-dimensional camera and a depth camera, and comprises the following steps: shooting an object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shooting the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image; acquiring the shooting angle at which the two-dimensional camera obtained the two-dimensional image; and obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.

Description

Measuring method and device, electronic device and storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a measurement method and apparatus, an electronic device, and a storage medium.
Background
In daily life, people often need to measure the size of an object. Conventionally, people use length-measuring tools (such as a tape measure, a ruler or a vernier caliper) to measure the size of an object. However, this conventional approach is time-consuming and labor-intensive for the measurer, and its measurement efficiency is low. How to measure the size of an object efficiently and accurately is therefore of great significance.
Disclosure of Invention
The application provides a measurement method and device, an electronic device and a storage medium.
In a first aspect, a measurement method is provided, where the method is applied to a terminal, where the terminal includes a two-dimensional camera and a depth camera, and the method includes:
shooting an object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shooting the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image;
acquiring a shooting angle of the two-dimensional image obtained by the two-dimensional camera;
and obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
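As an illustration of how the two captures relate, the sketch below indexes an aligned depth map at a pixel of the two-dimensional image. The pixel-for-pixel registration of the depth map to the 2D image, and the name `depth_at`, are assumptions made for illustration, not part of the application:

```python
import numpy as np

def depth_at(depth_map, u, v):
    """Return the depth value for pixel (u, v) of the two-dimensional image.

    Assumes the depth map is registered pixel-for-pixel to the 2D image
    (real devices may need an extrinsic alignment step first).
    """
    # The row index is the ordinate v; the column index is the abscissa u.
    return float(depth_map[v, u])

# A toy 2x2 depth map (values in meters)
depth = np.array([[1.0, 1.2],
                  [1.1, 1.3]])
print(depth_at(depth, 1, 0))  # 1.2
```

With the depth map aligned this way, the later steps can look up a depth for any corner point detected in the two-dimensional image.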
With reference to any one of the embodiments of the present application, obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image, and the depth map includes:
under the condition that the shooting angle is a depression angle, acquiring a first image point of a first point under a camera coordinate system of the two-dimensional camera and a first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system; the first point is a point whose depth value is smaller than the minimum depth value of the points of the first type, and the first point does not belong to the upper bottom surface; the first point is a point, other than the points of the first type, in the plane to which the lower bottom surface of the object to be measured belongs; the first type of points comprises the points in the lower bottom surface;
determining the distance from the first image point to the first projection plane to obtain a first distance;
and obtaining the height of the object to be measured according to the first distance.
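The first distance above is a point-to-plane distance in the camera coordinate system. A minimal sketch of that computation, assuming the plane is represented by any point on it plus a normal vector (all names here are illustrative, not from the application):

```python
import numpy as np

def point_to_plane_distance(point, plane_point, plane_normal):
    """Unsigned distance from a 3-D point to the plane through plane_point
    with normal vector plane_normal."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)  # ensure a unit normal
    diff = np.asarray(point, dtype=float) - np.asarray(plane_point, dtype=float)
    return abs(float(np.dot(diff, n)))

# Plane z = 2 (normal along z); a point at z = 5 lies 3 units away
d = point_to_plane_distance([0.0, 0.0, 5.0], [1.0, 1.0, 2.0], [0.0, 0.0, 1.0])
print(d)  # 3.0
```

For an object standing on its lower bottom surface, this distance from the first image point to the first projection plane directly gives the height.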
In combination with any embodiment of the present application, before the acquiring the first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system, the method further includes:
obtaining internal parameters of the two-dimensional camera, and obtaining depth values of at least three second points in the upper bottom surface from the depth map;
obtaining image points of the at least three second points in the camera coordinate system according to the internal parameters, the depth values of the at least three second points and the coordinates of the at least three second points in the pixel coordinate system of the two-dimensional image;
and performing plane fitting processing on the image points of the at least three second points in the camera coordinate system to obtain the first projection plane.
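The plane fitting through at least three image points can be done by least squares. One common choice, sketched below under the assumption that an SVD-based fit matches the intent of the "plane fitting processing" above, takes the plane normal as the right singular vector of the centered points with the smallest singular value:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 non-collinear 3-D points.

    Returns (centroid, unit_normal); the plane passes through the centroid.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centered points is the direction of least variance: the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

# Four points on the plane z = 1
c, n = fit_plane([[0, 0, 1], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
# n is parallel to the z axis (up to sign); c lies on the fitted plane
```

Using more than three second points makes the fit robust to noise in the individual depth values.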
With reference to any one of the embodiments of the present application, the bottom surface includes a first corner point and a second corner point adjacent to the first corner point;
before the acquiring a first image point of the first point in the camera coordinate system of the two-dimensional camera, the method further includes:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the reference longitudinal axis under the condition that the first line segment is positioned on the front surface of the object to be measured to obtain a first straight line; the first line segment passes through the first corner point and the second corner point; the reference longitudinal axis is a longitudinal axis of a pixel coordinate system of the two-dimensional image;
and selecting a point on the first straight line as the first point.
In combination with any embodiment of the present application, the method further comprises:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the horizontal axis of the pixel coordinate system under the condition that the first line segment is positioned on the side surface of the object to be measured to obtain a second straight line;
and selecting a point on the second straight line as the first point.
With reference to any one of the embodiments of the present application, obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image, and the depth map includes:
under the condition that the shooting angle is a depression angle, acquiring a first depth value of a second line segment from the depth map; two end points of the second line segment are respectively a third corner point of the object to be measured and a fourth corner point of the object to be measured; the second line segment is a height edge of the object to be measured; the first depth value is positively correlated with the depth value of the third corner point, and/or the first depth value is positively correlated with the depth value of the fourth corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the second line segment according to the mapping relation and the first depth value;
and obtaining the height of the object to be measured according to the transmission factor of the second line segment and the length of the second line segment.
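The last two steps reduce to multiplying the pixel-space length of the height edge by its transmission factor. The sketch below uses a linear mapping `factor = k * depth` purely as a stand-in: the application only states that the factor is positively correlated with depth, and `k` would come from the mapping relation obtained above.

```python
def height_from_transmission_factor(pixel_length, depth, k):
    """Convert a pixel-space edge length into a physical height.

    The linear depth-to-factor mapping below is an illustrative assumption;
    the actual mapping is the one acquired in the steps above.
    """
    factor = k * depth  # positively correlated with depth, as stated
    return pixel_length * factor

# A 200-pixel edge at 1.5 m depth with an assumed k of 0.001 m/(px*m)
h = height_from_transmission_factor(pixel_length=200.0, depth=1.5, k=0.001)
# h is about 0.3 (meters) with these illustrative numbers
```

The same conversion applies to any edge whose endpoints share (approximately) one depth value, which is why the first depth value is derived from the two corner depths.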
In combination with any embodiment of the present application, the method further comprises:
acquiring a coordinate of a fifth corner point in the two-dimensional image and a coordinate of a sixth corner point in the two-dimensional image, and acquiring a second depth value of the fifth corner point and a third depth value of the sixth corner point from the depth map; the fifth corner point belongs to the upper bottom surface, and the sixth corner point is a corner point adjacent to the fifth corner point in the upper bottom surface;
obtaining a second image point of the fifth corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the fifth corner point in the two-dimensional image and the second depth value;
obtaining a third image point of the sixth corner point under the camera coordinate system according to the coordinate of the sixth corner point in the two-dimensional image and the third depth value;
and determining the distance between the second image point and the third image point as the side length of the object to be measured.
With reference to any one of the embodiments of the present application, obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image, and the depth map includes:
under the condition that the shooting angle is a non-overhead shooting angle, acquiring a coordinate of a seventh corner point in the two-dimensional image and a coordinate of an eighth corner point in the two-dimensional image, and acquiring a fourth depth value of the seventh corner point and a fifth depth value of the eighth corner point from the depth map; the seventh corner point belongs to the upper bottom surface, the eighth corner point belongs to the lower bottom surface of the object to be measured, and a connecting line between the seventh corner point and the eighth corner point is a height edge of the object to be measured;
obtaining a fourth image point of the seventh corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value;
obtaining a fifth image point of the eighth corner point under the camera coordinate system according to the coordinate of the eighth corner point in the two-dimensional image and the fifth depth value;
and determining the distance between the fourth image point and the fifth image point as the height of the object to be measured.
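For the non-overhead case, the three steps above amount to back-projecting the two corner points with the camera intrinsics and taking their Euclidean distance. A sketch under the standard pinhole model (the intrinsic values are illustrative assumptions, not from the application):

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) at depth z into camera coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def edge_length(pix_a, depth_a, pix_b, depth_b, intrinsics):
    """Euclidean distance between two back-projected corner points."""
    pa = backproject(*pix_a, depth_a, *intrinsics)
    pb = backproject(*pix_b, depth_b, *intrinsics)
    return float(np.linalg.norm(pa - pb))

# fx = fy = 500 px, principal point (320, 240): illustrative intrinsics
intr = (500.0, 500.0, 320.0, 240.0)
# Two vertically aligned corners, both at 1 m depth, 200 px apart
h = edge_length((320, 140), 1.0, (320, 340), 1.0, intr)  # about 0.4 m
```

The same routine also yields the side length of the upper bottom surface when applied to the fifth and sixth corner points.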
In combination with any embodiment of the present application, the terminal further includes an angular velocity sensor, the acquiring the shooting angle of the two-dimensional image obtained by the two-dimensional camera includes:
the terminal obtains the shooting angle through the angular velocity sensor in the process of obtaining the two-dimensional image;
under the condition that the shooting angle is within a depression angle interval, determining that the shooting angle is a depression angle;
and under the condition that the shooting angle is outside the depression angle interval, determining that the shooting angle is a non-overhead shooting angle.
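The interval test above can be sketched as a simple comparison on the sensor-derived pitch; the interval bounds below are assumed example values, since the application does not specify them:

```python
def classify_shooting_angle(angle_deg, depression_interval=(30.0, 90.0)):
    """Classify a sensor-derived pitch angle as a depression angle or not.

    The (30, 90) degree interval is an illustrative assumption, not a
    value taken from the application.
    """
    lo, hi = depression_interval
    return "depression" if lo <= angle_deg <= hi else "non-overhead"

print(classify_shooting_angle(45.0))  # depression
print(classify_shooting_angle(5.0))   # non-overhead
```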
With reference to any one of the embodiments of the present application, the acquiring the shooting angle of the two-dimensional image obtained by the two-dimensional camera includes:
under the condition that the number of the surfaces of the two-dimensional image containing the object to be measured exceeds a threshold value, determining that the shooting angle of the two-dimensional image obtained by the two-dimensional camera is a depression angle;
and under the condition that the number of the surfaces of the two-dimensional image containing the object to be measured does not exceed the threshold value, determining that the shooting angle of the two-dimensional image obtained by the two-dimensional camera is a non-overhead shooting angle.
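This sensor-free alternative counts how many surfaces of the object are visible in the two-dimensional image. A minimal sketch, where the threshold of 2 (i.e. the top face plus two sides implies an overhead view) is an assumed example value:

```python
def classify_by_visible_faces(num_visible_faces, threshold=2):
    """Face-count heuristic: more visible faces than the threshold is
    treated as a depression (overhead) angle, otherwise non-overhead.

    The threshold value is an illustrative assumption.
    """
    return "depression" if num_visible_faces > threshold else "non-overhead"

print(classify_by_visible_faces(3))  # depression
print(classify_by_visible_faces(2))  # non-overhead
```

In practice the visible faces would come from a segmentation or edge-detection step on the two-dimensional image, which the application leaves unspecified.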
In a second aspect, a measurement device is provided, the measurement device comprising a two-dimensional camera and a depth camera;
the measuring device shoots an object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shoots the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image;
the measuring device further includes:
the acquisition unit is used for acquiring the shooting angle of the two-dimensional image obtained by the two-dimensional camera;
and the first processing unit is used for obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
under the condition that the shooting angle is a depression angle, acquiring a first image point of a first point under a camera coordinate system of the two-dimensional camera and a first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system; the first point is a point whose depth value is smaller than the minimum depth value of the points of the first type, and the first point does not belong to the upper bottom surface; the first point is a point, other than the points of the first type, in the plane to which the lower bottom surface of the object to be measured belongs; the first type of points comprises the points in the lower bottom surface;
determining the distance from the first image point to the first projection plane to obtain a first distance;
and obtaining the height of the object to be measured according to the first distance.
With reference to any embodiment of the present application, the obtaining unit is configured to:
before the first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system is obtained, obtaining internal parameters of the two-dimensional camera, and obtaining depth values of at least three second points in the upper bottom surface from the depth map;
the first processing unit is configured to:
obtaining image points of the at least three second points in the camera coordinate system according to the internal parameters, the depth values of the at least three second points and the coordinates of the at least three second points in the pixel coordinate system of the two-dimensional image;
and performing plane fitting processing on the image points of the at least three second points in the camera coordinate system to obtain the first projection plane.
With reference to any one of the embodiments of the present application, the bottom surface includes a first corner point and a second corner point adjacent to the first corner point;
the first processing unit is used for determining a straight line which passes through the midpoint of the first line segment and is parallel to the reference longitudinal axis to obtain a first straight line under the condition that the first line segment is positioned on the front surface of the object to be measured before the first image point of the first point under the camera coordinate system of the two-dimensional camera is obtained; the first line segment passes through the first corner point and the second corner point; the reference longitudinal axis is a longitudinal axis of a pixel coordinate system of the two-dimensional image;
and selecting a point on the first straight line as the first point.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the horizontal axis of the pixel coordinate system under the condition that the first line segment is positioned on the side surface of the object to be measured to obtain a second straight line;
and selecting a point on the second straight line as the first point.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
under the condition that the shooting angle is a depression angle, acquiring a first depth value of a second line segment from the depth map; two end points of the second line segment are respectively a third corner point of the object to be measured and a fourth corner point of the object to be measured; the second line segment is a height edge of the object to be measured; the first depth value is positively correlated with the depth value of the third corner point, and/or the first depth value is positively correlated with the depth value of the fourth corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the second line segment according to the mapping relation and the first depth value;
and obtaining the height of the object to be measured according to the transmission factor of the second line segment and the length of the second line segment.
With reference to any embodiment of the present application, the obtaining unit is further configured to:
acquiring a coordinate of a fifth corner point in the two-dimensional image and a coordinate of a sixth corner point in the two-dimensional image, and acquiring a second depth value of the fifth corner point and a third depth value of the sixth corner point from the depth map; the fifth corner point belongs to the upper bottom surface, and the sixth corner point is a corner point adjacent to the fifth corner point in the upper bottom surface;
the first processing unit is further configured to:
obtaining a second image point of the fifth corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the fifth corner point in the two-dimensional image and the second depth value;
obtaining a third image point of the sixth corner point under the camera coordinate system according to the coordinate of the sixth corner point in the two-dimensional image and the third depth value;
and determining the distance between the second image point and the third image point as the side length of the object to be measured.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
under the condition that the shooting angle is a non-overhead shooting angle, acquiring a coordinate of a seventh corner point in the two-dimensional image and a coordinate of an eighth corner point in the two-dimensional image, and acquiring a fourth depth value of the seventh corner point and a fifth depth value of the eighth corner point from the depth map; the seventh corner point belongs to the upper bottom surface, the eighth corner point belongs to the lower bottom surface of the object to be measured, and a connecting line between the seventh corner point and the eighth corner point is a height edge of the object to be measured;
obtaining a fourth image point of the seventh corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value;
obtaining a fifth image point of the eighth corner point under the camera coordinate system according to the coordinate of the eighth corner point in the two-dimensional image and the fifth depth value;
and determining the distance between the fourth image point and the fifth image point as the height of the object to be measured.
In combination with any embodiment of the present application, the measuring apparatus further includes an angular velocity sensor, and the obtaining unit is configured to:
acquiring the shooting angle through the angular velocity sensor in the process of obtaining the two-dimensional image;
under the condition that the shooting angle is within a depression angle interval, determining that the shooting angle is a depression angle;
and under the condition that the shooting angle is outside the depression angle interval, determining that the shooting angle is a non-overhead shooting angle.
With reference to any embodiment of the present application, the obtaining unit is configured to:
under the condition that the number of the surfaces of the two-dimensional image containing the object to be measured exceeds a threshold value, determining that the shooting angle of the two-dimensional image obtained by the two-dimensional camera is a depression angle;
and under the condition that the number of the surfaces of the two-dimensional image containing the object to be measured does not exceed the threshold value, determining that the shooting angle of the two-dimensional image obtained by the two-dimensional camera is a non-overhead shooting angle.
In a third aspect, a processor is provided, which is configured to perform the method according to the first aspect and any one of the possible implementations thereof.
In a fourth aspect, an electronic device is provided, comprising: a processor, transmitting means, input means, output means, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In a fifth aspect, there is provided a computer-readable storage medium having stored therein a computer program comprising program instructions which, if executed by a processor, cause the processor to perform the method of the first aspect and any one of its possible implementations.
A sixth aspect provides a computer program product comprising a computer program or instructions which, when run on a computer, causes the computer to perform the method of the first aspect and any of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic view of points of the same name provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of an image pixel coordinate system according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a measurement method according to an embodiment of the present application;
fig. 4 is a schematic view of an object to be measured according to an embodiment of the present disclosure;
FIG. 5 is a schematic view of another object to be measured according to an embodiment of the present disclosure;
FIG. 6 is a schematic view of another object to be measured according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a two-dimensional image according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a measurement apparatus according to an embodiment of the present disclosure;
fig. 9 is a schematic hardware structure diagram of a measurement apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more, and "at least two" means two or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A is present, only B is present, or both A and B are present, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects, meaning any combination of the items, including a single item or multiple items. For example, "at least one (item) of a, b, or c" may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c may be singular or plural. The character "/" may also represent division in a mathematical operation; for example, a/b means a divided by b, so that 6/3 = 2.
Some concepts that will appear below are first defined. In the embodiment of the present application, the object point refers to a point in the real world, the physical distance refers to a distance in the real world, and the physical size refers to a size in the real world.
The object points correspond to pixel points in the image. For example, the table is photographed using a camera to obtain an image a. The table comprises an object point a, a pixel point b in the image A is obtained by imaging the object point a, and the object point a corresponds to the pixel point b.
For convenience of description, pixels corresponding to the same object point in different images are referred to as points of the same name. As shown in FIG. 1, pixel A and pixel C are points of the same name, and pixel B and pixel D are points of the same name. In addition, the following will use [a, b] to represent the closed value range from a to b, (c, d) to represent the open value range from c to d, and [e, f) to represent the half-open value range from e to f.
In the embodiment of the present application, positions in an image all refer to positions in the pixel coordinate system of the image. The abscissa of the pixel coordinate system indicates the column in which a pixel is located, and the ordinate indicates the row in which the pixel is located. For example, in the image shown in fig. 2, a pixel coordinate system XOY is constructed with the upper left corner of the image as the coordinate origin O, the direction parallel to the rows of the image as the direction of the X axis, and the direction parallel to the columns of the image as the direction of the Y axis. The units of the abscissa and the ordinate are pixels. For example, pixel A11 in FIG. 2 has coordinates (1, 1), pixel A23 has coordinates (3, 2), pixel A42 has coordinates (2, 4), and pixel A34 has coordinates (4, 3).
In this embodiment of the application, the image point of a pixel of a two-dimensional image under the camera coordinate system is the projection point of that pixel in the camera coordinate system: the distance from the projection point to the optical center of the two-dimensional camera equals the distance from the object point corresponding to the pixel to the two-dimensional camera, and the projection point, the pixel and the optical center of the two-dimensional camera lie on the same straight line.
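Under this definition the image point lies on the ray from the optical center through the pixel, scaled so that its range (not its z coordinate) equals the object distance. A sketch under the standard pinhole model, with illustrative intrinsic values:

```python
import numpy as np

def image_point(u, v, object_distance, fx, fy, cx, cy):
    """Point on the ray through pixel (u, v) whose distance to the optical
    center equals the measured object distance, per the definition above."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray = ray / np.linalg.norm(ray)  # unit direction from the optical center
    return object_distance * ray

# The principal-point pixel projects straight along the optical axis
p = image_point(320, 240, 2.0, 500.0, 500.0, 320.0, 240.0)
print(p)  # [0. 0. 2.]
```

The projection point, the pixel and the optical center are collinear by construction, matching the definition in the text.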
In the embodiment of the present application, the projection plane of a plane of pixels in the two-dimensional image under the camera coordinate system is the plane containing the projection points, in the camera coordinate system, of the pixels in that plane.
In daily life, people often need to measure the size of an object. Conventionally, people use length-measuring tools (such as a tape measure, a ruler or a vernier caliper) to measure the size of an object. However, this conventional approach is time-consuming and labor-intensive for the measurer, and its measurement efficiency is low. How to measure the size of an object efficiently and accurately is therefore of great significance.
Based on this, this application embodiment provides a technical scheme to accurate, efficient measurement object's size. The embodiments of the present application will be described below with reference to the drawings. Referring to fig. 3, fig. 3 is a schematic flow chart of a measurement method according to an embodiment of the present disclosure.
The execution subject of the embodiments of the present application is a measuring device (hereinafter also referred to as the terminal). The terminal is loaded with a two-dimensional camera and a depth camera. The two-dimensional camera is used for shooting two-dimensional images (such as RGB images and YUV images, where Y represents luminance (i.e. gray-scale value) and U and V represent chrominance), and may be an RGB camera or a YUV camera. The depth camera is used for shooting a depth map, and may be one of the following: a structured light camera, a time of flight (TOF) camera, or a binocular stereo vision camera.
Optionally, the measuring device may be one of the following: a cell phone, a computer, a server, or a tablet computer.
301. And shooting the object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shooting the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image.
In the embodiment of the present application, the object to be measured may be an object of any shape. For example, the object to be measured may be a rectangular parallelepiped; the object to be measured may be a triangular pyramid; the object to be measured may be a trapezoid.
In the embodiment of the present application, the two-dimensional image may be an RGB image or a YUV image, which is not limited in this application. The depth map carries depth information of pixel points in the two-dimensional image, namely the depth map carries depth information of an object to be measured.
In a possible implementation manner, the terminal uses the two-dimensional camera to shoot the object to be measured, and simultaneously uses the depth camera to shoot the object to be measured, so as to obtain a two-dimensional image and a depth map of the two-dimensional image.
In another possible implementation manner, after the terminal uses the two-dimensional camera to shoot the object to be measured to obtain the two-dimensional image, the terminal uses the depth camera to shoot the object to be measured to obtain the depth map of the two-dimensional image.
In another possible implementation manner, after the terminal uses the depth camera to shoot the object to be measured to obtain the depth map, the terminal uses the two-dimensional camera to shoot the object to be measured to obtain the two-dimensional image.
Alternatively, the terminal executes step 301 in the case of receiving a size measurement instruction for the object to be measured. For example, in the case where the terminal is a mobile phone, the user inputs an instruction to measure the size of the object to be measured to the mobile phone by clicking a screen of the mobile phone. And when the mobile phone receives the instruction, the two-dimensional camera is used for shooting the object to be measured to obtain a two-dimensional image, and the depth camera is used for shooting the object to be measured to obtain a depth map of the two-dimensional image.
302. And acquiring the shooting angle of the two-dimensional image obtained by the two-dimensional camera.
In the embodiments of the present application, the shooting angle of the two-dimensional camera refers to the angle between the optical axis direction of the two-dimensional camera and the direction of gravity, where the optical axis direction of the two-dimensional camera is the direction parallel to the optical axis of the two-dimensional camera and pointing toward the object to be measured.
In one implementation of acquiring the shooting angle at which the two-dimensional camera captured the two-dimensional image, the terminal receives a shooting angle input by the user through an input component and uses it as the shooting angle at which the two-dimensional camera captured the two-dimensional image.
In another implementation, the terminal receives a shooting angle sent by a first control device and uses it as the shooting angle at which the two-dimensional camera captured the two-dimensional image. The first control device may be one of the following: a cell phone, a computer, a server, a tablet computer, or a wearable device.
In yet another implementation, the terminal is loaded with an angular velocity sensor, which is used to acquire the shooting angle of the two-dimensional camera. While shooting the object to be measured with the two-dimensional camera to obtain the two-dimensional image, the terminal obtains the shooting angle of the two-dimensional camera from the angular velocity sensor and uses it as the shooting angle at which the two-dimensional camera captured the two-dimensional image.
303. And obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
The terminal can obtain, from the depth map, the depth values of pixel points in the two-dimensional image, where these pixel points include pixel points on the object to be measured. That is, the terminal can obtain the depth value of any point on the object to be measured from the depth map, and can then obtain the coordinates of that point in the camera coordinate system of the two-dimensional camera by combining its coordinates in the two-dimensional image. The terminal obtains the height of the object to be measured according to the coordinates (hereinafter referred to as three-dimensional coordinates) of points on the object to be measured in the camera coordinate system of the two-dimensional camera.
For example, in the case that the object to be measured is a rectangular parallelepiped, the terminal obtains three-dimensional coordinates of two corner points located on the height side of the object to be measured according to the two-dimensional image and the depth map, and further obtains the height of the object to be measured.
In the embodiment of the present application, the side of the object to be measured whose length can represent the height of the object to be measured is referred to as a height side, for example, in the object to be measured shown in fig. 4, the lengths of three sides AE, BF, and CG can all be used to represent the height of the object to be measured, that is, AE, BF, and CG are all height sides.
Under the condition of a larger shooting angle, the farther the object is from the imaging device, the larger the perspective projection error of the object in the imaging process is, and further the larger error exists in the size of the object obtained according to the two-dimensional image and the depth map.
For example, it is assumed that the object to be measured is small in size and the object to be measured is placed on the ground. When the imaging device is required to photograph the object to be measured, the object to be measured is usually photographed in a downward shooting manner. Because the shooting angle of the overhead shooting is large, the perspective projection error generated in the imaging process of the part of the object to be measured, which is farther away from the imaging equipment, is large, the obtained error of the size of the object to be measured is large according to the two-dimensional image containing the object to be measured and the depth map containing the two-dimensional image, and the error of the size of the part of the object to be measured, which is closer to the ground, is large.
Taking the object to be measured shown in fig. 5 as an example, assume that the terminal takes a picture of the object to be measured shown in fig. 5 in a top-down manner, and obtains a two-dimensional image and a depth map. And the terminal can obtain the real length corresponding to HF, the real length corresponding to JH and the real length corresponding to BJ according to the two-dimensional image and the depth map. At this time, the error of the real length corresponding to BJ is smaller than the error of the real length corresponding to JH, and the error of the real length corresponding to JH is smaller than the error of the real length corresponding to HF.
Based on this, in the embodiment of the application, the terminal judges whether the two-dimensional camera shoots in a downward shooting mode according to the shooting angle of the two-dimensional camera, and then selects a corresponding method to measure the size of the object to be measured according to the judgment result, so as to improve the accuracy of the size of the object to be measured.
In a possible implementation, when the shooting angle is in the top-down angle interval, the terminal determines that the two-dimensional camera shot the object to be measured in a top-down manner to obtain the two-dimensional image; when the shooting angle is in the level-shooting angle interval, the terminal determines that the two-dimensional camera shot the object to be measured in a level manner to obtain the two-dimensional image.
Optionally, the top-down angle interval is (0 degrees, 30 degrees), and the level-shooting angle interval is (30 degrees, 180 degrees).
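As a minimal sketch (not part of the patent text), the interval check described above can be expressed as follows; the function name and the handling of the interval boundaries are assumptions, since the text gives the intervals only as optional values:

```python
def classify_shooting_mode(angle_deg):
    """Classify the shooting mode from the angle (in degrees) between the
    optical axis of the two-dimensional camera and the direction of gravity,
    using the optional intervals given in the text: (0, 30) degrees for
    top-down shooting, (30, 180) degrees for level shooting."""
    if 0 < angle_deg < 30:
        return "top-down"
    if 30 < angle_deg < 180:
        return "level"
    raise ValueError("shooting angle outside both intervals")

print(classify_shooting_mode(20))  # top-down
print(classify_shooting_mode(90))  # level
```

How the boundary value of exactly 30 degrees is handled is not specified by the text; the sketch rejects it.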
As an optional implementation manner, when the shooting angle of the two-dimensional camera is in the top-down angle interval, the perspective projection error of the part of the object to be measured closer to the ground is larger, and the perspective projection error of the part farther from the ground is smaller. When the lower bottom surface of the object to be measured and the ground are located in the same plane, the terminal may therefore obtain the height of the object to be measured by calculating the distance from the ground to the upper bottom surface of the object to be measured.
Under the condition that the shooting angle of the two-dimensional camera is in the flat shooting angle interval, the terminal can obtain the height of the object to be measured by calculating the length of the height edge of the object to be measured.
In this embodiment, because the depth values of points on the ground in the depth map are more accurate than the depth values of points on the lower bottom surface of the object to be measured, the terminal obtains the height of the object to be measured by calculating the distance from the ground to the upper bottom surface of the object to be measured when the shooting angle of the two-dimensional camera is in the top-down angle interval, thereby improving the accuracy of the height of the object to be measured.
In the embodiment of the application, the terminal selects the corresponding method to measure the height of the object to be measured according to the shooting angle of the two-dimensional camera, so that the accuracy of the height of the object to be measured can be improved.
As an alternative implementation, the terminal performs the following steps in the process of performing step 303:
1. and acquiring a first image point of a first point in a camera coordinate system of the two-dimensional camera and a first projection plane of the upper bottom surface of the object to be measured in the camera coordinate system when the shooting angle is a top-down shooting angle.
In the embodiments of the present application, when the shooting angle is within the top-down angle interval, the shooting angle is a top-down angle, which represents that the two-dimensional camera shot the object to be measured in a top-down manner to obtain the two-dimensional image. The top-down angle interval can be set according to actual demand. Optionally, the top-down angle interval is (0 degrees, 30 degrees).
In the embodiments of the present application, the first type points are the points included in the lower bottom surface. For example, if the lower bottom surface of the object to be measured includes point a, point b, and point c, then point a, point b, and point c are all first type points.
In a possible implementation manner, the first point is a point on the object to be measured whose depth value is smaller than the minimum depth value of the first type points, and the first point does not belong to the upper bottom surface. The meaning of the minimum depth value of the first type points can be seen in the following example: assume the first type points include point a with depth value A, point b with depth value B, and point c with depth value C, where A is smaller than B and B is smaller than C. At this time, the minimum depth value of the first type points is A.
In another possible implementation, the first point is a point other than the first type of point in a plane in which the lower bottom surface of the object to be measured belongs. For example, when the object to be measured is placed on a horizontal surface (e.g., the ground, a table top), the plane to which the lower bottom surface belongs includes the horizontal surface and the lower bottom surface. In this case, the first point is any point in the horizontal plane.
In this embodiment, a first image point of a first point in a camera coordinate system of a two-dimensional camera is a projection point of the first point in the camera coordinate system, a distance from the projection point to an optical center of the two-dimensional camera is a distance from an object point corresponding to the first point to the two-dimensional camera, and the projection point, the first point and the optical center of the two-dimensional camera are on a straight line.
In the embodiment of the present application, the first projection plane is a projection plane of the upper bottom surface under the camera coordinate system, that is, the first projection plane includes projection points of all points in the upper bottom surface under the camera coordinate system. For example, the upper bottom surface includes a point a, a point B, and a point C, where a projected point of the point a in the camera coordinate system is a point a, a projected point of the point B in the camera coordinate system is a point B, and a projected point of the point C in the camera coordinate system is a point C. At this time, the first projection plane includes a point a, a point B, and a point C.
In one implementation of obtaining the first image point, the measuring device receives coordinates of a point input by a user through an input component in a camera coordinate system, and determines the first image point according to the coordinates, where the input component includes: keyboard, mouse, touch screen, touch pad, audio input device, etc.
In another implementation manner of acquiring the first image point, the measuring device receives coordinates of a point in a camera coordinate system sent by the first terminal, and then determines the first image point according to the coordinates.
In one implementation of obtaining the first projection plane, the measuring apparatus receives a plane equation in a camera coordinate system input by a user through an input component, and determines the first projection plane according to the plane equation, where the input component includes: keyboard, mouse, touch screen, touch pad, audio input device, etc.
In another implementation manner of obtaining the first projection plane, the measuring device receives a plane equation in a camera coordinate system sent by the first terminal, and further determines the first projection plane according to the plane equation.
2. And determining the distance from the first image point to the first projection plane to obtain a first distance.
3. And obtaining the height of the object to be measured according to the first distance.
In the case where the first point is a point other than the first type points in the plane to which the lower bottom surface of the object to be measured belongs, assume the first distance is d1 and the height of the object to be measured is G. In one possible implementation, d1 and G satisfy the following formula:
G = α1 × d1 … formula (1)
where α1 is a positive number. Optionally, α1 = 1.
In another possible implementation, d1 and G satisfy the following formula:
G = α1 × d1 + β1 … formula (2)
where α1 is a positive number and β1 is a real number. Optionally, α1 = 1, β1 = 0.
In yet another possible implementation, d1 and G satisfy the following formula:
G = (α1 × d1 + β1)^n … formula (3)
where α1 is a positive number, and β1 and n are real numbers. Optionally, α1 = 1, β1 = 0, n = 1.
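Steps 2 and 3 for this case can be sketched as follows, assuming the first projection plane is given as coefficients (a, b, c, d) of a plane equation ax + by + cz + d = 0 in the camera coordinate system and taking formula (1) with α1 = 1; the function names and sample values are illustrative only:

```python
import math

def point_plane_distance(point, plane):
    """Distance from a 3-D point to the plane a*x + b*y + c*z + d = 0."""
    x, y, z = point
    a, b, c, d = plane
    return abs(a * x + b * y + c * z + d) / math.sqrt(a * a + b * b + c * c)

def height_from_first_distance(d1, alpha1=1.0, beta1=0.0, n=1.0):
    """Formulas (1)-(3): G = (alpha1 * d1 + beta1) ** n; with the optional
    values alpha1 = 1, beta1 = 0, n = 1 this reduces to G = d1."""
    return (alpha1 * d1 + beta1) ** n

# A horizontal first projection plane z = 1.2 in the camera frame, and a
# first image point lying 0.5 below it:
plane = (0.0, 0.0, 1.0, -1.2)
first_image_point = (0.3, 0.4, 0.7)
d1 = point_plane_distance(first_image_point, plane)
print(height_from_first_distance(d1))  # 0.5
```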
In the case where the first point is a point on the object to be measured whose depth value is smaller than the minimum depth value of the first type of point, and the first point does not belong to the upper bottom surface, the measuring apparatus determines the length of the height side and the distance from the first point to the upper bottom surface in the two-dimensional image by processing the two-dimensional image.
It should be understood that the first distance is the distance from the first image point to the first projection plane in the camera coordinate system, which is different from the distance from the first point to the upper bottom surface. For example, in the two-dimensional image shown in fig. 6, point U is the first point, the plane ABCD is the upper bottom surface, and AE, BF, and CG are all height edges. In the two-dimensional image, the distance from the first point to the upper bottom surface is the length of UV.
The measuring device calculates the ratio of the distance from the first point to the upper bottom surface to the length of the height edge to obtain a first ratio, and obtains the height of the object to be measured according to the first distance and the first ratio.
Let the first ratio be r1, the first distance be d1, and the height of the object to be measured be G. In one possible implementation, d1, G, and r1 satisfy the following formula:
G = α2 × d1 / r1 … formula (4)
where α2 is a positive number. Optionally, α2 = 1.
In another possible implementation, d1, G, and r1 satisfy the following formula:
G = α2 × d1 / r1 + β2 … formula (5)
where α2 is a positive number and β2 is a real number. Optionally, α2 = 1, β2 = 0.
In yet another possible implementation, d1, G, and r1 satisfy the following formula:
G = (α2 × d1 / r1 + β2)^m … formula (6)
where α2 is a positive number, and β2 and m are real numbers. Optionally, α2 = 1, β2 = 0, m = 1.
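The ratio-based case can be sketched in the same spirit, taking formula (4) with α2 = 1; the pixel lengths below are illustrative values, not from the text:

```python
def height_from_ratio(d1, dist_first_to_top_px, height_edge_px,
                      alpha2=1.0, beta2=0.0, m=1.0):
    """Formulas (4)-(6): r1 is the ratio of the 2-D distance from the first
    point to the upper bottom surface (e.g. UV in fig. 6) to the 2-D length
    of a height edge; G = (alpha2 * d1 / r1 + beta2) ** m."""
    r1 = dist_first_to_top_px / height_edge_px
    return (alpha2 * d1 / r1 + beta2) ** m

# UV spans 200 px, height edge AE spans 100 px, first distance 0.5 m:
# r1 = 2.0, so the estimated height is 0.5 / 2.0 = 0.25 m.
print(height_from_ratio(0.5, 200.0, 100.0))  # 0.25
```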
In the embodiment of the application, because the depth information of the first point in the depth map is more accurate than the depth information of the lower bottom corner point, the measuring device calculates the distance from the first point to the upper bottom (i.e. the first distance) based on the depth value of the first point and the coordinate of the first point in the two-dimensional image, and obtains the height of the object to be measured according to the first distance, so that the accuracy of the height of the object to be measured can be improved.
In the case of a large shooting angle (such as top-down shooting), the farther an object is from the imaging device, the larger the perspective projection error of the object during imaging. For example, assume that the object to be measured is small in size and is placed on the ground. When an imaging device is required to photograph the object to be measured, the object is usually photographed in a top-down manner. Because the shooting angle of top-down shooting is large, the perspective projection error generated during imaging is larger for the parts of the object to be measured that are farther from the imaging device.
As an alternative embodiment, before performing step 1, the measuring apparatus further performs the following steps:
4. and acquiring internal parameters of the two-dimensional camera, and acquiring depth values of at least three second points in the upper bottom surface from the depth map.
In the embodiments of the present application, the internal parameters of the two-dimensional camera include: the focal length of the two-dimensional camera and the coordinates of the optical center of the two-dimensional camera, where the optical center is the intersection point of the optical axis of the two-dimensional camera and the image plane.
Due to the imaging limitations of the depth imaging device, depth information of object points at the edge of a plane may not be accurate in a depth map acquired by the depth imaging device. In view of the fact that the depth information to the at least three second points needs to be used in the subsequent processing, in order to improve the accuracy of the subsequent processing, the measuring apparatus may alternatively select the at least three second points from points other than those on the edge of the surface to be measured. For example, in fig. 4, if the surface to be measured is ABCD, the at least three second points are points other than the point on AB, the point on BC, the point on CD, and the point on DA in ABCD.
Optionally, the measuring device performs corner detection processing on the two-dimensional image to obtain the positions of the corner points of the surface to be measured in the two-dimensional image. The measuring device obtains the area covered by the surface to be measured in the image according to the positions of these corner points, and the at least three second points can then be selected from this area.
In one possible implementation, the measurement device processes the two-dimensional image using a convolutional neural network, implementing the corner detection process. The convolutional neural network is obtained by training a plurality of images with marking information as training data, wherein the marking information of the images in the training data is angular points and positions of the angular points. In the process of training the convolutional neural network by using the training data, the convolutional neural network extracts the characteristic data of the image from the image and determines whether the corner points exist in the image according to the characteristic data. And under the condition that the angular points exist in the image, obtaining the positions of the angular points in the image according to the characteristic data of the image. In the training process, the result obtained in the training process of the convolutional neural network is supervised by using the marking information as the supervision information, and the parameters of the convolutional neural network are updated to complete the training of the convolutional neural network.
Thus, the trained convolutional neural network can be used for processing the two-dimensional image to obtain the position of the corner point of the object to be measured in the two-dimensional image. It should be understood that the execution subject of training the convolutional neural network may be a measurement device, or may be a training device, wherein the training device may be one of the following: computer, server.
In another possible implementation, the corner detection process may be implemented by a corner detection algorithm, wherein the corner detection algorithm may be one of the following: harris corner detection algorithm, Moravec corner detection algorithm, Shi-Tomasi corner detection algorithm and the like, and the corner detection algorithm for realizing the corner detection processing is not particularly limited in the application.
In an implementation manner of obtaining the depth values of the at least three second points, the measuring device obtains the depth values of the at least three second points by performing image registration processing on the two-dimensional image and the depth map.
In the embodiments of the present application, the image registration processing may be implemented by an algorithm capable of implementing image registration, where such algorithms include: scale-invariant feature transform (SIFT), histogram of oriented gradients (HOG), oriented FAST and rotated BRIEF (ORB), and the Sobel operator.
The measuring device can determine, by performing image registration processing on the two-dimensional image and the depth map, the pixel points in the depth map that are homonymous points of the second points, and can then take the depth values of those pixel points as the depth values of the second points. For example (example 1), the at least three second points include: second point a, second point b, and second point c. By performing image registration processing on the two-dimensional image and the depth map, the measuring device determines that pixel point A in the depth map and second point a are homonymous points, pixel point B and second point b are homonymous points, and pixel point C and second point c are homonymous points. If the depth value of pixel point A obtained from the depth map by the measuring device is d1, the depth value of pixel point B is d2, and the depth value of pixel point C is d3, then the depth value of second point a is d1, the depth value of second point b is d2, and the depth value of second point c is d3.
Optionally, in order to improve the accuracy of the depth values of the second points, the measuring device obtains the pose conversion relationship between the two-dimensional camera and the depth imaging device, where the depth imaging device is the imaging device that acquires the depth map. The measuring device converts the depth values acquired from the depth map according to the pose conversion relationship to obtain depth values of the two-dimensional image in the pixel coordinate system as the depth values of the second points. Continuing with example 1, the measuring device converts d1 according to the pose conversion relationship (i.e. multiplies d1 by the pose conversion relationship) to obtain the depth value d4 of the two-dimensional image in the pixel coordinate system as the depth value of second point a.
5. And obtaining image points of the at least three second points in the camera coordinate system according to the internal parameters, the depth values of the at least three second points and the coordinates of the at least three second points in the pixel coordinate system of the two-dimensional image.
The measuring device converts the coordinate of the second point in the two-dimensional image and the depth value of the second point according to the internal parameters of the two-dimensional imaging equipment, and can obtain the image point of the second point in the camera coordinate system of the two-dimensional camera.
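One common way to realize this conversion is pinhole back-projection; the sketch below assumes the internal parameters are focal lengths (fx, fy) and a principal point (cx, cy) in pixels, and that the depth value is the Z coordinate in the camera coordinate system — the text does not commit to a specific parameterization:

```python
def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth Z into the camera coordinate
    system of a pinhole camera:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A second point at pixel (400, 300) with depth 2.0 m, for a camera with
# fx = fy = 500 px and principal point (320, 240):
print(pixel_to_camera(400, 300, 2.0, 500.0, 500.0, 320.0, 240.0))
# (0.32, 0.24, 2.0)
```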
6. And performing plane fitting processing on the image points of the at least three second points in the camera coordinate system to obtain the first projection plane.
Optionally, the measuring device may perform plane fitting processing on the image points of the at least three second points in the camera coordinate system, so as to minimize the sum of distances from the plane obtained by fitting to the image points of the at least three second points, thereby obtaining the first projection plane. In this way, the accuracy of the first projection plane can be improved.
For example, the image points of the at least three second points include: image point a, image point b, and image point c. Suppose that the distance from image point a to the fitted plane is D1, the distance from image point b to the fitted plane is D2, and the distance from image point c to the fitted plane is D3. Then, the plane obtained by the measuring device through the plane fitting processing on the image points of the at least three second points minimizes D1 + D2 + D3.
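A standard way to perform such a fit is linear least squares on z = p·x + q·y + r over the image points (minimizing squared vertical residuals, a common stand-in for the distance sum described above); the pure-Python sketch below solves the 3×3 normal equations by Cramer's rule and assumes the fitted plane is not parallel to the Z axis of the camera coordinate system:

```python
def fit_plane(points):
    """Least-squares fit of z = p*x + q*y + r to 3-D points;
    returns (a, b, c, d) with a*x + b*y + c*z + d = 0."""
    sxx = sxy = syy = sx = sy = sxz = syz = sz = 0.0
    n = float(len(points))
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sxz += x * z; syz += y * z; sz += z
    # Normal equations A * [p, q, r]^T = b, solved by Cramer's rule.
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    b = [sxz, syz, sz]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det3(A)
    sol = []
    for i in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][i] = b[r]
        sol.append(det3(M) / D)
    p, q, r = sol
    return (p, q, -1.0, r)   # p*x + q*y - z + r = 0

# Three image points lying on the plane z = 2.0 in the camera frame:
plane = fit_plane([(0.0, 0.0, 2.0), (1.0, 0.0, 2.0), (0.0, 1.0, 2.0)])
print(plane)  # (0.0, 0.0, -1.0, 2.0)
```

With more than three (noisy) image points the same code returns the least-squares plane rather than an exact fit.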
As an alternative embodiment, before acquiring the first image point of the first point in the camera coordinate system of the two-dimensional camera, the measuring apparatus further performs the following steps:
7. and under the condition that the first line segment is positioned on the front surface of the object to be measured, determining a straight line which passes through the midpoint of the first line segment and is parallel to the reference longitudinal axis to obtain a first straight line.
In the embodiments of the present application, the lower bottom surface of the object to be measured includes a first corner point and a second corner point adjacent to the first corner point, and the first line segment is the line segment passing through the first corner point and the second corner point. The reference longitudinal axis is the vertical axis of the pixel coordinate system of the two-dimensional image. The front surface is the surface of the object to be measured closest to the imaging device.
For example, in the object to be measured in fig. 7, the plane AEFB is the front surface. Optionally, the measuring device processes the two-dimensional image using a convolutional neural network, and may determine the front surface of the object to be measured.
For example, in the two-dimensional image shown in fig. 7, when the first line segment is EF and Q is the midpoint of EF, a straight line passing through the point Q and parallel to the vertical axis of the pixel coordinate system is QR, that is, the first straight line is QR.
8. And selecting a point on the first straight line as the first point.
In the case where the object to be measured is placed on a horizontal surface (e.g., the ground, a table top), the distance from the horizontal surface to the upper bottom surface of the object to be measured is the height of the object to be measured. Since the depth information of the points in the horizontal plane other than the lower bottom surface is more accurate than the depth information of the points in the lower bottom surface in the depth map, the measuring apparatus selects one point from the points in the horizontal plane other than the lower bottom surface as a third point, and obtains the height of the object to be measured according to the distance from the third point to the upper bottom surface, the accuracy of the height of the object to be measured can be improved.
Optionally, the measuring device determines a pixel point region between the first line segment and the fourth line segment as a first candidate region. In the embodiment of the present application, the fourth line segment is a side of the two-dimensional image that is parallel to the horizontal axis of the pixel coordinate system of the two-dimensional image and has the largest vertical coordinate. For example, in the two-dimensional image shown in fig. 7, LJ is a fourth line segment.
After the first line segment and the fourth line segment are determined, the measuring device takes a pixel point area between the first line segment and the fourth line segment as a first candidate area. For example, in fig. 7, if the first segment is EF, at this time, the first candidate region is a pixel point region surrounded by the polygon MLJH, where EM and FH are both extension lines of EF. If the first line segment is FG, at this time, the first candidate area is a pixel point area surrounded by triangle KJZ, where FK and GZ are both extension lines of FG.
Alternatively, the measuring device determines a straight line passing through point E and parallel to FG; the intersection of this line with the boundary of the two-dimensional image is N. The measuring device also determines a straight line passing through point G and parallel to EF; the intersection of this line with the boundary of the two-dimensional image is P. The measuring device takes the pixel point region surrounded by the polygon EFKLN and/or the pixel point region surrounded by the polygon FHGG as the first candidate region. Determining the first candidate region in this way increases the probability that the third point lies in the plane to which the lower bottom surface belongs, which can further improve the accuracy of the height of the object to be measured.
The measuring device selects points on the first straight line from the first candidate region as third points and calculates the distance from each third point to the upper bottom surface. From the distances of the at least two third points to the upper bottom surface, at least two heights of the object to be measured are obtained, and the average value of these heights is taken as the height of the object to be measured.
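The averaging described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the function name and the unit-normal plane parameterization (n · x + d = 0) are assumptions.

```python
import numpy as np

def estimate_height(candidate_points, plane_normal, plane_offset):
    """Average the distances from candidate third points (3-D points taken from
    the plane of the lower bottom surface) to the fitted upper-bottom plane,
    written as plane_normal . x + plane_offset = 0."""
    norm = np.linalg.norm(plane_normal)
    n = np.asarray(plane_normal, dtype=float) / norm   # unit normal
    d = plane_offset / norm                            # rescale offset to match
    pts = np.asarray(candidate_points, dtype=float)
    distances = np.abs(pts @ n + d)                    # point-to-plane distances
    return float(distances.mean())                     # average of per-point heights
```

Averaging over several third points smooths out per-pixel depth noise, which is why the text uses at least two of them.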
As an alternative embodiment, on the basis of executing the above steps, the measuring apparatus further executes the following steps:
9. when the first line segment is located on the side surface of the object to be measured, a straight line passing through the midpoint of the first line segment and parallel to the horizontal axis of the pixel coordinate system is determined, and a second straight line is obtained.
In the embodiment of the present application, in the object to be measured, the surfaces other than the upper bottom surface, the lower bottom surface, and the front surface are all side surfaces. For example, in the object to be measured shown in fig. 7, the plane BFGC is a side surface. Optionally, the measuring device processes the two-dimensional image using a convolutional neural network, and may determine the side of the object to be measured.
For example, in the two-dimensional image shown in fig. 7, if the first line segment is FG and S is the midpoint of FG, the straight line passing through point S and parallel to the horizontal axis of the pixel coordinate system is ST, that is, the second straight line is ST.
10. And selecting a point on the second straight line as the first point.
The measuring device selects a point on the second straight line from the first candidate region as the first point, which increases the probability that the first point lies in the plane to which the lower bottom surface belongs, reduces the amount of data to be processed, and improves the processing speed.
As an alternative implementation, the terminal performs the following steps in the process of performing step 303:
11. And acquiring a first depth value of a second line segment from the depth map when the shooting angle is an overhead shooting angle.
In the embodiment of the present application, the two end points of the second line segment are the third corner point and the fourth corner point, respectively, and the second line segment is a height edge of the object to be measured. The first depth value is positively correlated with the depth value of the third corner point, and/or the first depth value is positively correlated with the depth value of the fourth corner point.
In one possible implementation, the depth value of the second line segment is the depth value of the midpoint of the second line segment, referred to as the reference midpoint. The terminal acquires the depth value of the reference midpoint from the depth map and takes it as the depth value of the second line segment, that is, the first depth value.
In another possible implementation, the depth value of the second line segment is the average of the depth value of the third corner point and the depth value of the fourth corner point.
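A minimal sketch of both options for the first depth value (midpoint depth, or the average of the two corner depths). The function name, the (u, v) pixel convention, and the row-major depth-map indexing are assumptions.

```python
import numpy as np

def segment_depth(depth_map, p3, p4, use_midpoint=True):
    """Depth value of the height edge (the second line segment) whose endpoints
    are the third corner point p3 and the fourth corner point p4, both given as
    (u, v) pixel coordinates.  Both variants are positively correlated with the
    depth values of the two corner points, as the text requires."""
    if use_midpoint:
        u = (p3[0] + p4[0]) // 2                 # pixel midpoint of the segment
        v = (p3[1] + p4[1]) // 2
        return float(depth_map[v, u])            # depth map indexed (row, col)
    # otherwise: average of the two endpoint depths
    return 0.5 * (float(depth_map[p3[1], p3[0]]) + float(depth_map[p4[1], p4[0]]))
```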
12. And acquiring the mapping relation between the depth value and the transmission factor.
In the embodiment of the application, the transmission factor represents the conversion relation between the size in the image and the real size. For example, the transmission factor of a pixel represents the conversion relationship between the size of the pixel and the size of the object point corresponding to the pixel. For another example, the transmission factor of the stool in the image characterizes a conversion relationship between the length of the stool in the image and the true length of the stool.
The ratio between the size in the image and the true size is called the transmittance, and the transmission factor is inversely related to the transmittance. For example, the image includes a pixel point a and a pixel point B, where the transmission factor of the pixel point a is greater than that of the pixel point B, the object point corresponding to the pixel point a is an object point a, and the object point corresponding to the pixel point B is an object point B. Because the transmission factor of the pixel point a is larger than that of the pixel point B, the transmission ratio of the pixel point a to the object point A is smaller than that of the pixel point B to the object point B. And because the size of the pixel point a is the same as that of the pixel point B, the size of the object point A is larger than that of the object point B.
In the embodiment of the application, the transmission factor is positively correlated with the depth value. For example, the image includes a pixel point a and a pixel point b, where the depth value of the pixel point a is greater than the depth value of the pixel point b. At this time, the size of the object point corresponding to the pixel point a is larger than that of the object point corresponding to the pixel point b.
13. And obtaining the transmission factor of the second line segment according to the mapping relation and the first depth value.
In the embodiment of the present application, the transmission factor of the second line segment represents the conversion relationship between the length of the second line segment in the image and the length of the physical line segment it corresponds to, that is, the real physical distance spanned by the second line segment.
14. And obtaining the height of the object to be measured according to the transmission factor of the second line segment and the length of the second line segment.
In one possible implementation, the transmission factor of the second line segment characterizes the ratio of the length of the second line segment in the image to the length of the corresponding physical line segment. The measuring device calculates the quotient of the length of the second line segment and the transmission factor of the second line segment to obtain the length of the physical line segment, that is, the height of the object to be measured.
In another possible implementation, the transmission factor of the second line segment characterizes the ratio of the length of the corresponding physical line segment to the length of the second line segment in the image. The measuring device calculates the product of the length of the second line segment and the transmission factor of the second line segment to obtain the length of the physical line segment, that is, the height of the object to be measured.
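Under a pinhole-camera assumption, a transmission factor that is positively correlated with depth falls out naturally: the real size corresponding to one pixel grows linearly with depth. The sketch below instantiates the second implementation (factor = real length / image length, so the real height is the product); the linear mapping factor = depth / focal_length is an illustrative assumption, not the patent's mapping.

```python
def transmission_factor(depth, focal_length_px):
    """One plausible depth -> factor mapping under a pinhole model: the ratio
    of real size to image size (in pixels) is depth / f, so the factor is
    positively correlated with the depth value, as the text states."""
    return depth / focal_length_px

def object_height(segment_length_px, depth, focal_length_px):
    """Real height of the object = image length of the height edge x factor."""
    return segment_length_px * transmission_factor(depth, focal_length_px)
```

With the first implementation (factor = image length / real length), the same height would instead come out as the quotient of the image length and the factor.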
As an alternative embodiment, the measuring device performs the following steps on the basis of the above steps:
15. and acquiring a coordinate of a fifth corner point in the two-dimensional image and a coordinate of a sixth corner point in the two-dimensional image, and acquiring a second depth value of the fifth corner point and a third depth value of the sixth corner point from the depth map.
In the embodiment of the present application, the fifth corner point belongs to the upper bottom surface of the object to be measured, and the sixth corner point is a corner point adjacent to the fifth corner point in the upper bottom surface.
Optionally, the measuring device may determine, by performing image matching processing on the two-dimensional image and the depth map, the pixel point in the depth map that is the homonymous point of the fifth corner point, and may then use the depth value of that pixel point as the depth value of the fifth corner point. Similarly, the measuring device can determine the pixel point in the depth map that is the homonymous point of the sixth corner point and use its depth value as the depth value of the sixth corner point.
For example, the measurement device performs image registration processing on the two-dimensional image and the depth map to determine that pixel point A in the depth map and the fifth corner point are homonymous points, and that pixel point B in the depth map and the sixth corner point are homonymous points. The measuring device obtains from the depth map the depth value d1 of pixel point A and the depth value d2 of pixel point B; the depth value of the fifth corner point is then d1 and the depth value of the sixth corner point is d2.
Optionally, in order to improve accuracy of a depth value of the fifth corner and a depth value of the sixth corner, the measurement apparatus obtains a pose transformation relationship between the RGB imaging device and the depth imaging device, where the depth imaging device is an imaging device that acquires a depth map.
16. And obtaining a second image point of the fifth corner point under the camera coordinate system of the two-dimensional camera according to the coordinate of the fifth corner point in the two-dimensional image and the second depth value.
The measuring device converts the coordinate of the fifth corner point in the two-dimensional image and the second depth value according to the internal parameters of the two-dimensional camera, and an image point of the fifth corner point in a camera coordinate system of the two-dimensional camera, namely a second image point, can be obtained.
Optionally, before step 16 is executed, the measuring device obtains the internal parameters of the two-dimensional camera, which include the focal length of the two-dimensional camera and the coordinates of its optical center in the camera coordinate system, where the optical center is the intersection of the optical axis of the two-dimensional camera with the image plane.
17. And obtaining a third image point of the sixth corner point in the camera coordinate system according to the coordinate of the sixth corner point in the two-dimensional image and the third depth value.
The measuring device converts the coordinate of the sixth corner point in the two-dimensional image and the third depth value according to the internal parameters of the two-dimensional camera, and an image point of the sixth corner point in a camera coordinate system of the two-dimensional camera, namely a third image point, can be obtained.
18. And determining the distance between the second image point and the third image point as the side length of the object to be measured.
Since the conversion relationship between the camera coordinate system and the world coordinate system includes only rotation and translation, the distance between the second image point and the third image point is equal to the distance between the fifth corner point and the sixth corner point. Therefore, the measuring device calculates the distance between the second image point and the third image point to obtain the distance between the fifth corner point and the sixth corner point.
It should be understood that the fifth corner point and the sixth corner point are used here only as examples; the embodiment of the present application is not limited to measuring the distance between these two corner points. In practical applications, the measuring device may measure the distance between any two corner points on the object to be measured. For example, in the case where the object to be measured is a rectangular parallelepiped, the measuring device may use the technical solution disclosed above to obtain the length and the width of the rectangular parallelepiped.
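Steps 15 to 18 amount to back-projecting the two corner pixels into the camera coordinate system with the camera intrinsics and taking the Euclidean distance; applied to the seventh and eighth corner points, the same computation yields the height of steps 19 to 22. A hedged sketch follows (the function names and the fx/fy/cx/cy intrinsics parameterization are assumptions):

```python
import numpy as np

def back_project(uv, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth z into the camera coordinate
    system of the two-dimensional camera using its internal parameters."""
    u, v = uv
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def corner_distance(uv_a, z_a, uv_b, z_b, fx, fy, cx, cy):
    """Distance between two corner points (e.g. the fifth and sixth corner
    points), i.e. a side length of the object to be measured."""
    pa = back_project(uv_a, z_a, fx, fy, cx, cy)
    pb = back_project(uv_b, z_b, fx, fy, cx, cy)
    return float(np.linalg.norm(pa - pb))
```

The distance is computed directly in the camera frame, which is valid because, as noted above, the camera-to-world conversion involves only rotation and translation.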
As an alternative embodiment, in the process of executing step 303, the measuring apparatus executes the following steps:
19. and under the condition that the shooting angle is a non-overhead shooting angle, acquiring the coordinate of a seventh corner point in the two-dimensional image and the coordinate of an eighth corner point in the two-dimensional image, and acquiring a fourth depth value of the seventh corner point and a fifth depth value of the eighth corner point from the depth map.
In the embodiment of the application, when the shooting angle falls outside the overhead shooting angle interval, the shooting angle is a non-overhead shooting angle.
In the embodiment of the present application, the seventh corner point belongs to the upper bottom surface of the object to be measured, the eighth corner point belongs to the lower bottom surface of the object to be measured, and a connecting line between the seventh corner point and the eighth corner point is a height edge of the object to be measured.
20. And obtaining a fourth image point of the seventh corner point in the camera coordinate system of the two-dimensional camera according to the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value.
And converting the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value according to the internal parameters of the two-dimensional camera to obtain an image point of the seventh corner point in a camera coordinate system of the two-dimensional camera, namely a fourth image point.
21. And obtaining a fifth image point of the eighth corner point under the camera coordinate system of the two-dimensional camera according to the coordinate of the eighth corner point in the two-dimensional image and the fifth depth value.
And converting the coordinates of the eighth corner point in the two-dimensional image and the fifth depth value according to the internal parameters of the two-dimensional camera to obtain an image point of the eighth corner point in a camera coordinate system of the two-dimensional camera, namely a fifth image point.
22. And determining the distance between the fourth image point and the fifth image point as the height of the object to be measured.
As an alternative embodiment, after detecting the measurement instruction for the object to be measured and before shooting the object to be measured with the two-dimensional camera to obtain the two-dimensional image, the measuring device further performs a step of acquiring the shooting angle at which the two-dimensional camera obtains the two-dimensional image.
In one possible implementation, the measuring device is equipped with an angular velocity sensor. From the data obtained from the angular velocity sensor, the measuring device can determine the shooting angle at which the two-dimensional camera obtains the two-dimensional image. Optionally, the angular velocity sensor comprises a gyroscope.
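One way the sensor-based classification might look, assuming the orientation data has already been reduced to a pitch angle; the function name and the interval bounds are illustrative assumptions, not values from the patent:

```python
def classify_shooting_angle(pitch_deg, overhead_interval=(60.0, 90.0)):
    """Classify the camera pitch (degrees below the horizontal, as derived from
    the device's angular velocity sensor) as an overhead or non-overhead
    shooting angle, using an assumed overhead shooting angle interval."""
    lo, hi = overhead_interval
    return "overhead" if lo <= pitch_deg <= hi else "non-overhead"
```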
In another possible implementation manner, the measuring device may use the two-dimensional camera to acquire an image to be confirmed before using the two-dimensional camera to capture a two-dimensional image of the object to be measured. And processing the image to be confirmed by using a shooting angle detection network to obtain the shooting angle of the two-dimensional camera. The shooting angle detection network is a neural network obtained by training a training image set carrying annotation information, wherein the annotation information comprises a shooting angle.
In yet another possible implementation manner, the measuring device obtains the shooting angle of the two-dimensional image by executing the following steps:
23. And when the number of surfaces belonging to the object to be measured in the two-dimensional image exceeds a threshold, determining that the shooting angle at which the two-dimensional camera obtains the two-dimensional image is an overhead shooting angle.
24. And when the number of surfaces belonging to the object to be measured in the two-dimensional image does not exceed the threshold, determining that the shooting angle at which the two-dimensional camera obtains the two-dimensional image is a non-overhead shooting angle. Optionally, the threshold is 2.
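Steps 23 and 24 reduce to a single comparison; a sketch with assumed names:

```python
def shooting_angle_from_faces(num_visible_faces, threshold=2):
    """If the two-dimensional image shows more faces of the object to be
    measured than the threshold, treat the shot as overhead; otherwise
    treat it as non-overhead (steps 23 and 24, threshold optionally 2)."""
    return "overhead" if num_visible_faces > threshold else "non-overhead"
```

Intuitively, a camera looking down on a cuboid typically sees the top face plus two side faces (three faces), while a level camera sees at most two.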
Based on the technical solutions provided by the embodiments of the present application, several possible application scenarios are also provided. Scene 1: Xiao Ming has arranged for a moving company to help him move, but the moving company needs to be told the sizes of the objects to be moved, so Xiao Ming needs to measure them. Since there are many things to be carried, measuring each one (such as a table, a cabinet, or a washing machine, hereinafter referred to as objects to be measured) with a ruler would be troublesome. Xiao Ming therefore taps his mobile phone to input a stool measuring instruction; upon receiving the instruction, the phone shoots the stool placed on the ground with its two-dimensional camera and TOF camera, obtaining a first RGB image containing the stool and the depth map of the first RGB image.
According to the shooting angle of the two-dimensional camera when capturing the first RGB image, the mobile phone can determine that the first RGB image was obtained at an overhead shooting angle, and can then select the corresponding method to measure the size of the stool (for example, measuring the height of the stool using the method disclosed in steps 1 to 3).
After measuring the size of the stool with the mobile phone, Xiao Ming taps the phone to input a refrigerator measuring instruction; upon receiving the instruction, the phone shoots the refrigerator with the two-dimensional camera and the TOF camera, obtaining a second RGB image containing the refrigerator and the depth map of the second RGB image.
According to the shooting angle of the two-dimensional camera when capturing the second RGB image, the mobile phone can determine that the second RGB image was obtained at a flat shooting angle (i.e., the shooting angle is outside the overhead shooting angle interval), and can then select the corresponding method to measure the size of the refrigerator (for example, measuring the height of the refrigerator using the method disclosed in steps 19 to 22).
Scene 2: as e-commerce grows rapidly, more and more people shop through e-commerce, which also presents more challenges to the logistics industry, including improving the efficiency of measuring the size of goods to be delivered.
Today's logistics distribution is increasingly standardized; before goods are transported, they are packed in cartons. Because a carton is a rectangular parallelepiped, a terminal can accurately measure its size using the technical solution disclosed in the embodiments of the present application. For example, a worker of a logistics company may use a terminal (e.g., a mobile phone or a tablet computer) equipped with a two-dimensional camera and a TOF camera to shoot a carton to be measured, obtaining an RGB image containing the carton and the depth map of the RGB image. The terminal can then process the RGB image and the depth map using the technical solution disclosed above to obtain the size of the carton. In this way, the labor cost of measuring carton sizes can be reduced and the efficiency of the measurement improved.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a measurement apparatus according to an embodiment of the present disclosure. The measurement apparatus 1 includes: a two-dimensional camera 11, a depth camera 12, an acquisition unit 13, a first processing unit 14, and an angular velocity sensor 15, wherein:
the measuring device 1 uses the two-dimensional camera 11 to shoot an object to be measured to obtain a two-dimensional image, and uses the depth camera 12 to shoot the object to be measured to obtain a depth map of the two-dimensional image;
the measuring device 1 further comprises:
an obtaining unit 13, configured to obtain a shooting angle of the two-dimensional image obtained by the two-dimensional camera;
and the first processing unit 14 is configured to obtain the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
In combination with any embodiment of the present application, the first processing unit 14 is configured to:
under the condition that the shooting angle is an overhead shooting angle, acquiring a first image point of a first point in the camera coordinate system of the two-dimensional camera and a first projection plane of the upper bottom surface of the object to be measured in the camera coordinate system, where the first point does not belong to the upper bottom surface; the first point is a point, other than the first type of points, in the plane to which the lower bottom surface of the object to be measured belongs; and the first type of points includes points in the lower bottom surface;
determining the distance from the first image point to the first projection plane to obtain a first distance;
and obtaining the height of the object to be measured according to the first distance.
With reference to any embodiment of the present application, the obtaining unit 13 is configured to:
before the first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system is obtained, obtaining internal parameters of the two-dimensional camera, and obtaining depth values of at least three second points in the upper bottom surface from the depth map;
the first processing unit 14 is configured to:
obtaining image points of the at least three second points in the camera coordinate system according to the internal parameters, the depth values of the at least three second points and the coordinates of the at least three second points in the pixel coordinate system of the two-dimensional image;
and performing plane fitting processing on the image points of the at least three second points in the camera coordinate system to obtain the first projection plane.
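The plane fitting performed by this unit can be done by least squares; a common sketch uses the SVD of the centered points (the function name and the n · x + d = 0 parameterization are assumptions, and any robust fitting the patent may use is omitted):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 non-collinear camera-space points (the
    image points of the second points in the upper bottom surface).  Returns
    (unit normal n, offset d) with the plane written as n . x + d = 0."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # the right singular vector with the smallest singular value is the normal
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    d = -float(n @ centroid)                      # plane passes through centroid
    return n, d
```

The resulting (n, d) is exactly the first projection plane against which the first image point's distance, and hence the object height, is evaluated.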
With reference to any one of the embodiments of the present application, the bottom surface includes a first corner point and a second corner point adjacent to the first corner point;
the first processing unit 14 is configured to, before the obtaining of the first image point of the first point in the camera coordinate system of the two-dimensional camera, determine a straight line that passes through a midpoint of the first line segment and is parallel to the reference longitudinal axis when the first line segment is located on the front surface of the object to be measured, and obtain a first straight line; the first line segment passes through the first corner point and the second corner point; the reference longitudinal axis is a longitudinal axis of a pixel coordinate system of the two-dimensional image;
and selecting a point on the first straight line as the first point.
In combination with any embodiment of the present application, the first processing unit 14 is configured to:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the horizontal axis of the pixel coordinate system under the condition that the first line segment is positioned on the side surface of the object to be measured to obtain a second straight line;
and selecting a point on the second straight line as the first point.
In combination with any embodiment of the present application, the first processing unit 14 is configured to:
under the condition that the shooting angle is an overhead shooting angle, acquiring a first depth value of a second line segment from the depth map; two end points of the second line segment are respectively a third corner point of the object to be measured and a fourth corner point of the object to be measured; the second line segment is a height edge of the object to be measured; the first depth value is positively correlated with the depth value of the third corner point, and/or the first depth value is positively correlated with the depth value of the fourth corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the second line segment according to the mapping relation and the first depth value;
and obtaining the height of the object to be measured according to the transmission factor of the second line segment and the length of the second line segment.
With reference to any embodiment of the present application, the obtaining unit 13 is further configured to:
acquiring a coordinate of a fifth corner point in the two-dimensional image and a coordinate of a sixth corner point in the two-dimensional image, and acquiring a second depth value of the fifth corner point and a third depth value of the sixth corner point from the depth map; the fifth corner point belongs to the upper bottom surface, and the sixth corner point is a corner point adjacent to the fifth corner point in the upper bottom surface;
the first processing unit 14 is further configured to:
obtaining a second image point of the fifth corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the fifth corner point in the two-dimensional image and the second depth value;
obtaining a third image point of the sixth corner point under the camera coordinate system according to the coordinate of the sixth corner point in the two-dimensional image and the third depth value;
and determining the distance between the second image point and the third image point as the side length of the object to be measured.
In combination with any embodiment of the present application, the first processing unit 14 is configured to:
under the condition that the shooting angle is a non-overhead shooting angle, acquiring a coordinate of a seventh corner point in the two-dimensional image and a coordinate of an eighth corner point in the two-dimensional image, and acquiring a fourth depth value of the seventh corner point and a fifth depth value of the eighth corner point from the depth map; the seventh corner point belongs to the upper bottom surface, the eighth corner point belongs to the lower bottom surface of the object to be measured, and a connecting line between the seventh corner point and the eighth corner point is a height edge of the object to be measured;
obtaining a fourth image point of the seventh corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value;
obtaining a fifth image point of the eighth corner point under the camera coordinate system according to the coordinate of the eighth corner point in the two-dimensional image and the fifth depth value;
and determining the distance between the fourth image point and the fifth image point as the height of the object to be measured.
In combination with any embodiment of the present application, the measurement apparatus 1 further includes an angular velocity sensor 15, and the obtaining unit 13 is configured to:
acquiring the shooting angle through the angular velocity sensor in the process of obtaining the two-dimensional image;
under the condition that the shooting angle is within the overhead shooting angle interval, determining that the shooting angle is an overhead shooting angle;
and under the condition that the shooting angle is outside the overhead shooting angle interval, determining that the shooting angle is a non-overhead shooting angle.
With reference to any embodiment of the present application, the obtaining unit 13 is configured to:
under the condition that the number of the surfaces of the object to be measured contained in the two-dimensional image exceeds a threshold value, determining that the shooting angle at which the two-dimensional camera obtains the two-dimensional image is an overhead shooting angle;
and under the condition that the number of the surfaces of the two-dimensional image containing the object to be measured does not exceed the threshold value, determining that the shooting angle of the two-dimensional image obtained by the two-dimensional camera is a non-overhead shooting angle.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present application may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Fig. 9 is a schematic hardware structure diagram of a measurement apparatus according to an embodiment of the present application. The measuring device 2 comprises a processor 21, a memory 22, an input device 23, an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by a connector, which includes various interfaces, transmission lines or buses, etc., and the embodiment of the present application is not limited thereto. It should be appreciated that in various embodiments of the present application, coupled refers to being interconnected in a particular manner, including being directly connected or indirectly connected through other devices, such as through various interfaces, transmission lines, buses, and the like.
The processor 21 may be one or more Graphics Processing Units (GPUs), and in the case that the processor 21 is one GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group composed of a plurality of GPUs, and the plurality of processors are coupled to each other through one or more buses. Alternatively, the processor may be other types of processors, and the like, and the embodiments of the present application are not limited.
Memory 22 may be used to store computer program instructions and various types of computer program code, including program code for implementing aspects of the present application. Optionally, the memory includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), which are used for the related instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It can be understood that, in the embodiments of the present application, the memory 22 may be used to store not only the relevant instructions but also relevant data. For example, the memory 22 may store the two-dimensional image acquired through the input device 23, or the height of the object to be measured obtained by the processor 21; the embodiments of the present application do not limit the data specifically stored in the memory.
It will be appreciated that fig. 9 shows only a simplified design of a measuring device. In practical applications, the measuring device may also include other necessary components, including but not limited to any number of input/output devices, processors and memories, and all measuring devices that can implement the embodiments of the present application are within the scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It is also clear to those skilled in the art that the descriptions of the various embodiments of the present application have different emphasis, and for convenience and brevity of description, the same or similar parts may not be repeated in different embodiments, so that the parts that are not described or not described in detail in a certain embodiment may refer to the descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted through a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital versatile disc (DVD)) or a semiconductor medium (e.g., solid state disk (SSD)).
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media that can store program codes, such as a read-only memory (ROM) or a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (13)

1. A measuring method, applied to a terminal, wherein the terminal comprises a two-dimensional camera and a depth camera, and the method comprises the following steps:
shooting an object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shooting the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image;
acquiring a shooting angle of the two-dimensional image obtained by the two-dimensional camera;
and obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
2. The method according to claim 1, wherein the obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map comprises:
under the condition that the shooting angle is a depression angle, acquiring a first image point of a first point under a camera coordinate system of the two-dimensional camera and a first projection plane of the upper bottom surface of the object to be measured under the camera coordinate system; the first point is a point on the object to be measured, wherein the depth value of the first point is smaller than the minimum depth value of the first point, and the first point does not belong to the upper bottom surface; the first point is a point, other than points of a first type, in the plane to which the lower bottom surface of the object to be measured belongs; the points of the first type comprise points in the lower bottom surface;
determining the distance from the first image point to the first projection plane to obtain a first distance;
and obtaining the height of the object to be measured according to the first distance.
3. The method according to claim 2, characterized in that before the acquiring a first projection plane of the upper bottom surface of the object to be measured in the camera coordinate system, the method further comprises:
obtaining internal parameters of the two-dimensional camera, and obtaining depth values of at least three second points in the upper bottom surface from the depth map;
obtaining image points of the at least three second points in the camera coordinate system according to the internal parameters, the depth values of the at least three second points and the coordinates of the at least three second points in the pixel coordinate system of the two-dimensional image;
and performing plane fitting processing on the image points of the at least three second points in the camera coordinate system to obtain the first projection plane.
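The back-projection and plane-fitting steps of claims 2 and 3 can be sketched as follows. This is an illustrative sketch only: the function names and the pinhole-model intrinsics (fx, fy, cx, cy) are assumptions of the sketch, not values given by the patent.

```python
import numpy as np

def backproject(u, v, z, fx, fy, cx, cy):
    # Pinhole model: map a pixel (u, v) with depth value z from the depth map
    # into the camera coordinate system of the two-dimensional camera.
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def fit_plane(points):
    # Least-squares plane through three or more image points: the normal is
    # the direction of least variance of the centred point cloud (via SVD).
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]            # unit normal of the fitted plane
    d = -n.dot(centroid)  # plane equation: n . p + d = 0
    return n, d

def point_plane_distance(p, n, d):
    # Unsigned distance from an image point to the first projection plane;
    # claim 2 uses this as the first distance from which the height follows.
    return abs(n.dot(p) + d)
```

With at least three second points of the upper bottom surface back-projected and fitted, the first distance of claim 2 is `point_plane_distance(first_image_point, n, d)`.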
4. The method of claim 3, wherein the bottom surface comprises a first corner point and a second corner point adjacent to the first corner point;
before the acquiring a first image point of the first point in the camera coordinate system of the two-dimensional camera, the method further includes:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the reference longitudinal axis under the condition that the first line segment is positioned on the front surface of the object to be measured to obtain a first straight line; the first line segment passes through the first corner point and the second corner point; the reference longitudinal axis is a longitudinal axis of a pixel coordinate system of the two-dimensional image;
and selecting a point on the first straight line as the first point.
5. The method of claim 4, further comprising:
determining a straight line which passes through the midpoint of the first line segment and is parallel to the horizontal axis of the pixel coordinate system under the condition that the first line segment is positioned on the side surface of the object to be measured to obtain a second straight line;
and selecting a point on the second straight line as the first point.
6. The method according to claim 1, wherein the obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map comprises:
under the condition that the shooting angle is a depression angle, acquiring a first depth value of a second line segment from the depth map; two end points of the second line segment are respectively a third corner point of the object to be measured and a fourth corner point of the object to be measured; the second line segment is a height edge of the object to be measured; the first depth value is positively correlated with the depth value of the third corner point, and/or the first depth value is positively correlated with the depth value of the fourth corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the second line segment according to the mapping relation and the first depth value;
and obtaining the height of the object to be measured according to the transmission factor of the second line segment and the length of the second line segment.
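The transmission-factor computation of claim 6 can be sketched as below. The mapping from depth value to transmission factor is assumed here to be supplied as a callable; the `pinhole_factor` example (depth divided by a focal length of 500 pixels) is an illustrative assumption, not a mapping given by the patent.

```python
def height_from_transmission_factor(edge_len_px, edge_depth, factor_of_depth):
    # The transmission factor converts a length measured in the image (pixels)
    # into a real-world length; per claim 6 it is positively correlated with
    # the depth value of the height edge (the second line segment).
    k = factor_of_depth(edge_depth)
    return k * edge_len_px

# Example mapping for an ideal pinhole camera with a 500-pixel focal length:
# a pixel at depth z spans roughly z / 500 units of real length.
pinhole_factor = lambda z: z / 500.0
```

For instance, `height_from_transmission_factor(100, 2.0, pinhole_factor)` gives 0.4, i.e. a 100-pixel height edge at depth 2.0 corresponds to a real height of 0.4 in the same units as the depth.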
7. The method according to any one of claims 1 to 6, further comprising:
acquiring a coordinate of a fifth corner point in the two-dimensional image and a coordinate of a sixth corner point in the two-dimensional image, and acquiring a second depth value of the fifth corner point and a third depth value of the sixth corner point from the depth map; the fifth corner point belongs to the upper bottom surface, and the sixth corner point is a corner point adjacent to the fifth corner point in the upper bottom surface;
obtaining a second image point of the fifth corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the fifth corner point in the two-dimensional image and the second depth value;
obtaining a third image point of the sixth corner point under the camera coordinate system according to the coordinate of the sixth corner point in the two-dimensional image and the third depth value;
and determining the distance between the second image point and the third image point as the side length of the object to be measured.
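The side-length measurement of claim 7 reduces to back-projecting the two adjacent top-face corner points and taking the Euclidean distance between the resulting image points. The sketch below assumes pinhole intrinsics and hypothetical names.

```python
import numpy as np

def side_length(corner_a, corner_b, depth_a, depth_b, fx, fy, cx, cy):
    # Back-project each corner's pixel coordinates with its depth value into
    # the camera coordinate system, then measure the distance between the
    # resulting second and third image points.
    def to_camera(uv, z):
        u, v = uv
        return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])
    return float(np.linalg.norm(to_camera(corner_a, depth_a)
                                - to_camera(corner_b, depth_b)))
```

Claim 8's height measurement for a non-overhead shooting angle is the same computation applied to the seventh and eighth corner points at the two ends of a height edge.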
8. The method according to claim 1, wherein the obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map comprises:
under the condition that the shooting angle is a non-overhead shooting angle, acquiring a coordinate of a seventh corner point in the two-dimensional image and a coordinate of an eighth corner point in the two-dimensional image, and acquiring a fourth depth value of the seventh corner point and a fifth depth value of the eighth corner point from the depth map; the seventh corner point belongs to the upper bottom surface, the eighth corner point belongs to the lower bottom surface of the object to be measured, and a connecting line between the seventh corner point and the eighth corner point is a height edge of the object to be measured;
obtaining a fourth image point of the seventh corner point under a camera coordinate system of the two-dimensional camera according to the coordinate of the seventh corner point in the two-dimensional image and the fourth depth value;
obtaining a fifth image point of the eighth corner point under the camera coordinate system according to the coordinate of the eighth corner point in the two-dimensional image and the fifth depth value;
and determining the distance between the fourth image point and the fifth image point as the height of the object to be measured.
9. The method according to any one of claims 1 to 8, wherein the terminal further comprises an angular velocity sensor, and the acquiring of the shooting angle of the two-dimensional image obtained by the two-dimensional camera comprises:
the terminal obtains the shooting angle through the angular velocity sensor in the process of obtaining the two-dimensional image;
under the condition that the shooting angle is within a depression angle interval, determining that the shooting angle is a depression angle;
and under the condition that the shooting angle is outside the depression angle interval, determining that the shooting angle is a non-overhead shooting angle.
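The interval test of claim 9 is a simple range check on the angle reported by the angular velocity sensor. The interval bounds below are placeholders, since the patent does not give numeric values for the depression angle interval.

```python
def is_depression_angle(angle_deg, interval=(-90.0, -30.0)):
    # Returns True when the measured shooting angle falls inside the
    # configured depression (downward shooting) angle interval, and False
    # for a non-overhead shooting angle. Bounds are illustrative only.
    lo, hi = interval
    return lo <= angle_deg <= hi
```

The result then selects which height-computation branch of claims 2, 6 or 8 is applied.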
10. The method according to any one of claims 1 to 8, wherein the acquiring of the shooting angle at which the two-dimensional camera obtained the two-dimensional image comprises:
under the condition that the number of surfaces of the object to be measured contained in the two-dimensional image exceeds a threshold, determining that the shooting angle at which the two-dimensional camera obtained the two-dimensional image is a depression angle;
and under the condition that the number of surfaces of the object to be measured contained in the two-dimensional image does not exceed the threshold, determining that the shooting angle at which the two-dimensional camera obtained the two-dimensional image is a non-overhead shooting angle.
11. A measuring device is characterized by comprising a two-dimensional camera and a depth camera;
the measuring device shoots an object to be measured by using the two-dimensional camera to obtain a two-dimensional image, and shoots the object to be measured by using the depth camera to obtain a depth map of the two-dimensional image;
the measuring device further includes:
the acquisition unit is used for acquiring the shooting angle of the two-dimensional image obtained by the two-dimensional camera;
and the first processing unit is used for obtaining the height of the object to be measured according to the shooting angle, the two-dimensional image and the depth map.
12. An electronic device, comprising: a processor and a memory for storing computer program code, the computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 10.
13. A computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 10.
CN202010901189.7A 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium Active CN112197708B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210630730.4A CN115031635A (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium
CN202010901189.7A CN112197708B (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010901189.7A CN112197708B (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210630730.4A Division CN115031635A (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN112197708A true CN112197708A (en) 2021-01-08
CN112197708B CN112197708B (en) 2022-04-22

Family

ID=74005185

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210630730.4A Withdrawn CN115031635A (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium
CN202010901189.7A Active CN112197708B (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210630730.4A Withdrawn CN115031635A (en) 2020-08-31 2020-08-31 Measuring method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (2) CN115031635A (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101294793A (en) * 2007-04-26 2008-10-29 佳能株式会社 Measurement apparatus and control method
CN102103747A (en) * 2009-12-16 2011-06-22 中国科学院电子学研究所 Method for calibrating external parameters of monitoring camera by adopting reference height
US20130182114A1 (en) * 2012-01-17 2013-07-18 Objectvideo, Inc. System and method for monitoring a retail environment using video content analysis with depth sensing
US20160253802A1 (en) * 2012-01-17 2016-09-01 Avigilon Fortress Corporation System and method for home health care monitoring
WO2013167705A1 (en) * 2012-05-08 2013-11-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Projection display having multi-channel optical system having non-circular overall aperture
US20160307326A1 (en) * 2015-04-20 2016-10-20 Yibing Michelle Wang Cmos image sensor for 2d imaging and depth measurement with ambient light rejection
CN106839975A (en) * 2015-12-03 2017-06-13 杭州海康威视数字技术股份有限公司 Volume measuring method and its system based on depth camera
CN106403828A (en) * 2016-08-30 2017-02-15 成都唐源电气股份有限公司 Monorail contact line remain height measurement method based on checkerboard calibration and monorail contact line remain height measurement system thereof
CN109035320A (en) * 2018-08-12 2018-12-18 浙江农林大学 Depth extraction method based on monocular vision
CN110006343A (en) * 2019-04-15 2019-07-12 Oppo广东移动通信有限公司 Measurement method, device and the terminal of object geometric parameter
CN110136193A (en) * 2019-05-08 2019-08-16 广东嘉腾机器人自动化有限公司 Cubold cabinet three-dimensional dimension measurement method and storage medium based on depth image
CN110672020A (en) * 2019-06-14 2020-01-10 浙江农林大学 Stand tree height measuring method based on monocular vision
CN110276774A (en) * 2019-06-26 2019-09-24 Oppo广东移动通信有限公司 Drawing practice, device, terminal and the computer readable storage medium of object
CN110874864A (en) * 2019-10-25 2020-03-10 深圳奥比中光科技有限公司 Method, device, electronic equipment and system for obtaining three-dimensional model of object
CN111192235A (en) * 2019-12-05 2020-05-22 中国地质大学(武汉) Image measuring method based on monocular vision model and perspective transformation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BJELKHAGEN: "Holographic camera for non-contact measurement of nanoscale surface heights", 《PRACTICAL HOLOGRAPHY XXXIII: DISPLAYS, MATERIALS, AND APPLICATIONS》 *
肖宇峰等: "Kinect与二维激光雷达结合的机器人障碍检测", 《电子科技大学学报》 *
蒋毅飞: "基于虚拟高度线投影的三维重建技术研究", 《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115046480A (en) * 2021-03-09 2022-09-13 华为技术有限公司 Method for measuring length, electronic equipment and mobile equipment
CN115046480B (en) * 2021-03-09 2023-11-10 华为技术有限公司 Method for measuring length, electronic equipment and mobile equipment
CN113436317A (en) * 2021-06-29 2021-09-24 西安商汤智能科技有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN113436317B (en) * 2021-06-29 2023-11-03 西安商汤智能科技有限公司 Image processing method and device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN112197708B (en) 2022-04-22
CN115031635A (en) 2022-09-09

Similar Documents

Publication Publication Date Title
US11120254B2 (en) Methods and apparatuses for determining hand three-dimensional data
CN107223269B (en) Three-dimensional scene positioning method and device
CN110006343B (en) Method and device for measuring geometric parameters of object and terminal
US7554575B2 (en) Fast imaging system calibration
CN112348863B (en) Image alignment method, image alignment device and terminal equipment
CN110276774B (en) Object drawing method, device, terminal and computer-readable storage medium
CN113724368B (en) Image acquisition system, three-dimensional reconstruction method, device, equipment and storage medium
CN112197708B (en) Measuring method and device, electronic device and storage medium
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN115439607A (en) Three-dimensional reconstruction method and device, electronic equipment and storage medium
CN108492284B (en) Method and apparatus for determining perspective shape of image
CN114037987A (en) Intelligent identification method, device, medium and equipment for scrap steel
JP7432793B1 (en) Mapping methods, devices, chips and module devices based on three-dimensional point clouds
CN112102391A (en) Measuring method and device, electronic device and storage medium
CN110807798B (en) Image recognition method, system, related device and computer readable storage medium
CN112150527B (en) Measurement method and device, electronic equipment and storage medium
CN113379826A (en) Method and device for measuring volume of logistics piece
US10417783B2 (en) Image processing apparatus, image processing method, and storage medium
CN113643386B (en) Calibration method and device, electronic equipment and computer readable storage medium
CN112102390A (en) Measuring method and device, electronic device and storage medium
CN112615993A (en) Depth information acquisition method, binocular camera module, storage medium and electronic equipment
CN112146628B (en) Measurement method and device, electronic equipment and storage medium
CN112991179B (en) Method, apparatus, device and storage medium for outputting information
CN117635875B (en) Three-dimensional reconstruction method, device and terminal
WO2021057582A1 (en) Image matching, 3d imaging and pose recognition method, device, and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant