CN112150527A - Measuring method and device, electronic device and storage medium - Google Patents


Info

Publication number
CN112150527A
CN112150527A
Authority
CN
China
Prior art keywords
point
edge
measured
corner point
image
Prior art date
Legal status
Granted
Application number
CN202010899124.3A
Other languages
Chinese (zh)
Other versions
CN112150527B (en)
Inventor
Xue Di
Zhou Yang
Current Assignee
Shenzhen TetrasAI Technology Co Ltd
Original Assignee
Shenzhen TetrasAI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen TetrasAI Technology Co Ltd
Priority to CN202010899124.3A
Publication of CN112150527A
Application granted
Publication of CN112150527B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/022 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/564 Depth or shape recovery from multiple images from contours
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a measurement method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image, the two-dimensional image being acquired by a two-dimensional imaging device; and obtaining the distance between a first corner point of the object to be measured and a second corner point of the object to be measured according to the two-dimensional image and the depth map.

Description

Measuring method and device, electronic device and storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a measurement method and apparatus, an electronic device, and a storage medium.
Background
In daily life, people often need to measure the size of an object. Conventionally, this is done with length-measuring tools such as a tape measure, a ruler, or a vernier caliper. However, such manual measurement is time-consuming and labor-intensive, and its efficiency is low. How to measure the size of an object efficiently and accurately is therefore of great practical significance.
Disclosure of Invention
The application provides a measurement method and device, an electronic device and a storage medium.
In a first aspect, a measurement method is provided, the method including:
acquiring a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image; the two-dimensional image is acquired by a two-dimensional imaging device;
and obtaining the distance between the first corner point of the object to be measured and the second corner point of the object to be measured according to the two-dimensional image and the depth map.
In combination with any embodiment of the present application, obtaining a distance between a first corner point of the object to be measured and a second corner point of the object to be measured according to the two-dimensional image and the depth map includes:
according to the two-dimensional image and the depth map, obtaining a first image point of a first corner point of the object to be measured in a camera coordinate system of the two-dimensional imaging device, and a second image point of a second corner point of the object to be measured in the camera coordinate system;
and determining the distance between the first image point and the second image point as the distance between the first corner point and the second corner point.
With reference to any one of the embodiments of the present application, the object to be measured includes a surface to be measured; the obtaining, according to the two-dimensional image and the depth map, a first image point of a first corner point of the object to be measured in a camera coordinate system of the two-dimensional imaging device and a second image point of a second corner point of the object to be measured in the camera coordinate system includes:
obtaining a projection plane of the surface to be measured under a camera coordinate system of the two-dimensional imaging device according to coordinates of at least three first points in the surface to be measured in the two-dimensional image and depth values of the at least three first points obtained from the depth map;
the first image point and the second image point are determined from the projection plane.
In combination with any embodiment of the present application, the determining the first image point and the second image point from the projection plane includes:
acquiring a first straight line passing through the first corner point and an optical center of the two-dimensional imaging device, and a second straight line passing through the second corner point and the optical center;
and determining an intersection point between the first straight line and the projection plane to obtain the first image point, and determining an intersection point between the second straight line and the projection plane to obtain the second image point.
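The line construction and intersection steps above can be sketched as follows. This is an illustrative Python/NumPy sketch, not the patented implementation: it assumes the optical center coincides with the origin of the camera coordinate system, and the intrinsic matrix, pixel coordinates, and plane are made-up example values.

```python
import numpy as np

def ray_plane_intersection(pixel, K, normal, d):
    """Intersect the straight line from the optical center (origin of the
    camera coordinate system) through a pixel with the plane n . p + d = 0."""
    u, v = pixel
    direction = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray direction through the pixel
    denom = normal @ direction
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the plane")
    t = -d / denom
    return t * direction  # the image point on the projection plane

# Assumed example intrinsics and a projection plane z = 2 (normal (0,0,1), d = -2)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
normal, d = np.array([0.0, 0.0, 1.0]), -2.0
p1 = ray_plane_intersection((100.0, 120.0), K, normal, d)  # first image point
p2 = ray_plane_intersection((400.0, 130.0), K, normal, d)  # second image point
corner_distance = np.linalg.norm(p1 - p2)                   # distance between the corner points
```

Both image points land on the assumed plane (z = 2), and their Euclidean distance is taken as the distance between the two corner points.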
In combination with any embodiment of the present application, the acquiring a first straight line passing through the first corner point and the optical center of the two-dimensional imaging device, and a second straight line passing through the second corner point and the optical center includes:
acquiring a first coordinate of the first corner point in an image coordinate system of the two-dimensional imaging device, a second coordinate of the second corner point in the image coordinate system, and a third coordinate of an optical center of the two-dimensional imaging device in the camera coordinate system;
obtaining the first straight line according to the first coordinate and the third coordinate;
and obtaining the second straight line according to the second coordinate and the third coordinate.
With reference to any embodiment of the present application, the obtaining a projection plane of the to-be-measured surface under a camera coordinate system of the two-dimensional imaging device according to coordinates of at least three first points in the to-be-measured surface in the two-dimensional image and depth values of the at least three first points obtained from the depth map includes:
acquiring internal parameters of the two-dimensional imaging device and coordinates of the at least three first points in a pixel coordinate system of the two-dimensional image;
obtaining image points of the at least three first points in the camera coordinate system according to the internal parameters, the depth values of the at least three first points and the coordinates of the at least three first points in the pixel coordinate system;
and performing plane fitting processing on the image points of the at least three first points in the camera coordinate system to obtain the projection plane.
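The back-projection and plane-fitting steps above can be sketched as follows. This is an illustrative Python/NumPy sketch under the pinhole camera model, not the patented implementation; the intrinsic matrix, pixel coordinates, and depth values are assumed example values.

```python
import numpy as np

def backproject(pixels, depths, K):
    """Back-project pixel coordinates with known depth values into the camera
    coordinate system using the pinhole model: P = z * K^-1 * [u, v, 1]^T."""
    pts_h = np.hstack([pixels, np.ones((len(pixels), 1))])  # homogeneous (u, v, 1)
    return depths[:, None] * (np.linalg.inv(K) @ pts_h.T).T

def fit_plane(points):
    """Least-squares plane fit via SVD; returns a unit normal n and offset d
    such that n . p + d = 0 for points p on the plane."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]  # direction of smallest variance is the plane normal
    return normal, -normal @ centroid

# Assumed example intrinsics and three first points of the surface to be measured
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
pixels = np.array([[100.0, 120.0], [400.0, 130.0], [250.0, 300.0]])
depths = np.array([1.2, 1.3, 1.25])
points = backproject(pixels, depths, K)       # image points in the camera coordinate system
normal, d = fit_plane(points)                 # projection plane of the surface to be measured
```

With exactly three points the fitted plane passes through all of them; with more points the SVD gives the least-squares plane.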
In combination with any embodiment of the present application, the shape of the object to be measured is a regular shape or a quasi-regular shape, and the surface to be measured is the top surface of the object to be measured; the method further includes:
acquiring coordinates of a third corner point in the pixel coordinate system, and acquiring a depth value of the third corner point from the depth map; a connecting line between the third corner point and the first corner point is a height edge of the object to be measured;
obtaining a third image point of the third corner point in the camera coordinate system according to the internal parameter, the depth value of the third corner point and the coordinate of the third corner point in the pixel coordinate system;
determining a distance between the first image point and the third image point as a height of the object to be measured.
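The height measurement above can be sketched as a direct back-projection of the third corner point followed by a distance computation. The intrinsics, pixel coordinates, and depth values below are assumed example values, not values from the application.

```python
import numpy as np

def corner_camera_point(pixel, depth, K):
    """Image point of a corner in the camera coordinate system, computed from
    its pixel coordinates, its depth value, and the camera internal parameters."""
    u, v = pixel
    return depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))

# Assumed example intrinsics; both corners at 2 m depth for illustration
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
first_point = corner_camera_point((100.0, 120.0), 2.0, K)   # first corner (on the top surface)
third_point = corner_camera_point((100.0, 300.0), 2.0, K)   # third corner (other end of the height edge)
height = np.linalg.norm(first_point - third_point)          # height of the object to be measured
```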
In combination with any embodiment of the present application, obtaining a distance between a first corner point of the object to be measured and a second corner point of the object to be measured according to the two-dimensional image and the depth map includes:
obtaining a first depth value of a first line segment from the depth map; the two end points of the first line segment are the first corner point and the second corner point, respectively; the first depth value is positively correlated with the depth value of the first corner point, and/or the first depth value is positively correlated with the depth value of the second corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the first line segment according to the mapping relation and the first depth value;
and obtaining the distance between the first corner point and the second corner point according to the transmission factor of the first line segment and the length of the first line segment.
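The transmission-factor approach above can be sketched as follows. The application does not specify the mapping; one natural pinhole-model choice, assumed here purely for illustration, is factor = depth / focal length (real-world length per pixel), which is positively correlated with depth as the text requires. The focal length, coordinates, and depths are assumed example values.

```python
import numpy as np

def transmission_factor(depth, focal_px):
    """Assumed mapping from depth value to transmission factor: under a pinhole
    model one pixel at depth z spans roughly z / f in the real world, so the
    factor is positively correlated with the depth value."""
    return depth / focal_px

def corner_distance(p_a, p_b, depth_a, depth_b, focal_px):
    """Physical distance between two corner points: pixel length of the first
    line segment times the segment's transmission factor."""
    pixel_len = np.hypot(p_a[0] - p_b[0], p_a[1] - p_b[1])
    seg_depth = 0.5 * (depth_a + depth_b)  # positively correlated with both endpoint depths
    return transmission_factor(seg_depth, focal_px) * pixel_len

# Two corners 300 px apart at about 2 m depth, 800 px focal length (assumed)
dist = corner_distance((100.0, 200.0), (400.0, 200.0), 2.0, 2.0, 800.0)
```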
In combination with any embodiment of the present application, the surface of the object to be measured includes a non-body structure, and the method further includes:
performing edge detection processing on the two-dimensional image to obtain a first edge and a second edge; the first edge and the second edge are located in the same surface of the object to be measured, and the gradient of the first edge and the gradient of the second edge exceed an edge threshold;
determining a distance between the first edge and the second edge as a dimension of the non-body structure.
With reference to any embodiment of the present application, after performing edge detection processing on the two-dimensional image to obtain a first edge and a second edge, before determining a distance between the first edge and the second edge as a size of the non-body structure, the method further includes:
respectively carrying out curvature detection processing on the first edge and the second edge to obtain a first curvature of the first edge and a second curvature of the second edge;
the determining a distance between the first edge and the second edge as a dimension of the non-body structure comprises:
determining a distance between the first edge and the second edge as a dimension of the non-body structure if both the first curvature and the second curvature exceed a curvature threshold.
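The edge-detection and curvature-detection steps above can be sketched with a gradient-based edge check and a crude polyline curvature estimate. These are stand-ins for whatever detector the application uses; the gradient threshold and the synthetic image are assumed values.

```python
import numpy as np

def gradient_magnitude(img):
    """Per-pixel gradient magnitude by central differences; pixels whose
    gradient exceeds the edge threshold are treated as edge pixels."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def mean_turning_angle(polyline):
    """Crude curvature measure of an ordered edge polyline: the mean turning
    angle between consecutive segments (0 for a perfectly straight edge)."""
    pts = np.asarray(polyline, dtype=float)
    v1, v2 = pts[1:-1] - pts[:-2], pts[2:] - pts[1:-1]
    cos = (v1 * v2).sum(axis=1) / (
        np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)).mean())

# Synthetic 8x8 image with a vertical step edge; edge threshold assumed to be 100
img = np.zeros((8, 8)); img[:, 4:] = 255.0
edge_mask = gradient_magnitude(img) > 100.0
straight_edge = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
curvature = mean_turning_angle(straight_edge)  # ~0 for this straight edge
```

The mask marks the columns around the step as edge pixels, and a straight edge yields a curvature near zero, so it would not exceed a curvature threshold.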
In combination with any embodiment of the present application, the method further comprises:
acquiring dynamic display information;
and according to the dynamic display information, performing dynamic display processing on the first corner point and the second corner point, and displaying the first corner point and the second corner point after the dynamic display processing.
In combination with any embodiment of the present application, the dynamic display information includes an interface element; the dynamic display processing is performed on the first corner point and the second corner point according to the dynamic display information, and the first corner point and the second corner point after the dynamic display processing are displayed, including:
moving the interface element from the first corner point to the second corner point.
In combination with any embodiment of the present application, the moving the interface element from the first corner point to the second corner point includes:
obtaining a movement speed of the interface element according to the distance between the first corner point and the second corner point; the movement speed is negatively correlated with the distance between the first corner point and the second corner point;
moving the interface element from the first corner point to the second corner point at the movement speed.
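One way to realize the negative correlation above is sketched below. The application does not give a formula; speed = k / distance is just one monotone choice, and the gain k is an assumed value.

```python
def movement_speed(corner_distance_px, k=100.0):
    """One possible negative correlation between the corner distance and the
    interface element's movement speed: speed = k / distance (k is an
    assumed gain, not a value from the application)."""
    return k / max(corner_distance_px, 1e-6)

def traversal_time(corner_distance_px, k=100.0):
    """Time for the interface element to move from the first corner point to
    the second corner point at the speed above (grows with distance squared)."""
    return corner_distance_px / movement_speed(corner_distance_px, k)
```

With this choice, a longer segment is traversed more slowly, so the animation time grows with the measured distance.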
In combination with any embodiment of the present application, the two-dimensional imaging device includes an RGB camera; the RGB camera belongs to a terminal; the terminal also comprises a depth camera; the acquiring a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image includes:
and under the condition that a measuring instruction for the object to be measured is detected, the terminal respectively shoots the object to be measured by using the RGB camera and the depth camera to obtain the two-dimensional image and the depth map.
In a second aspect, there is provided a measurement apparatus, the apparatus comprising:
an acquisition unit configured to acquire a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image; the two-dimensional image is acquired by a two-dimensional imaging device;
and the first processing unit is used for obtaining the distance between the first corner point of the object to be measured and the second corner point of the object to be measured according to the two-dimensional image and the depth map.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
according to the two-dimensional image and the depth map, obtaining a first image point of a first corner point of the object to be measured in a camera coordinate system of the two-dimensional imaging device, and a second image point of a second corner point of the object to be measured in the camera coordinate system;
and determining the distance between the first image point and the second image point as the distance between the first corner point and the second corner point.
With reference to any one of the embodiments of the present application, the object to be measured includes a surface to be measured; the first processing unit is configured to:
obtaining a projection plane of the surface to be measured under a camera coordinate system of the two-dimensional imaging device according to coordinates of at least three first points in the surface to be measured in the two-dimensional image and depth values of the at least three first points obtained from the depth map;
the first image point and the second image point are determined from the projection plane.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
acquiring a first straight line passing through the first corner point and an optical center of the two-dimensional imaging device, and a second straight line passing through the second corner point and the optical center;
and determining an intersection point between the first straight line and the projection plane to obtain the first image point, and determining an intersection point between the second straight line and the projection plane to obtain the second image point.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
acquiring a first coordinate of the first corner point in an image coordinate system of the two-dimensional imaging device, a second coordinate of the second corner point in the image coordinate system, and a third coordinate of an optical center of the two-dimensional imaging device in the camera coordinate system;
obtaining the first straight line according to the first coordinate and the third coordinate;
and obtaining the second straight line according to the second coordinate and the third coordinate.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
acquiring internal parameters of the two-dimensional imaging device and coordinates of the at least three first points in a pixel coordinate system of the two-dimensional image;
obtaining image points of the at least three first points in the camera coordinate system according to the internal parameters, the depth values of the at least three first points and the coordinates of the at least three first points in the pixel coordinate system;
and performing plane fitting processing on the image points of the at least three first points in the camera coordinate system to obtain the projection plane.
In combination with any embodiment of the present application, the shape of the object to be measured is a regular shape or a quasi-regular shape, and the surface to be measured is the top surface of the object to be measured;
the acquisition unit is configured to:
acquiring coordinates of a third corner point in the pixel coordinate system, and acquiring a depth value of the third corner point from the depth map; a connecting line between the third corner point and the first corner point is a height edge of the object to be measured;
the first processing unit is configured to:
obtaining a third image point of the third corner point in the camera coordinate system according to the internal parameter, the depth value of the third corner point and the coordinate of the third corner point in the pixel coordinate system;
determining a distance between the first image point and the third image point as a height of the object to be measured.
With reference to any one of the embodiments of the present application, the first processing unit is configured to:
obtaining a first depth value of a first line segment from the depth map; the two end points of the first line segment are the first corner point and the second corner point, respectively; the first depth value is positively correlated with the depth value of the first corner point, and/or the first depth value is positively correlated with the depth value of the second corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the first line segment according to the mapping relation and the first depth value;
and obtaining the distance between the first corner point and the second corner point according to the transmission factor of the first line segment and the length of the first line segment.
In combination with any one of the embodiments of the present application, the surface of the object to be measured includes a non-body structure, and the apparatus further includes:
the first detection unit is used for carrying out edge detection processing on the two-dimensional image to obtain a first edge and a second edge; the first edge and the second edge are located in the same surface of the object to be measured, and the gradient of the first edge and the gradient of the second edge exceed an edge threshold;
a second processing unit for determining a distance between the first edge and the second edge as a dimension of the non-body structure.
In combination with any embodiment of the present application, the apparatus further includes:
a second detection unit, configured to, after performing edge detection processing on the two-dimensional image to obtain a first edge and a second edge, perform curvature detection processing on the first edge and the second edge respectively to obtain a first curvature of the first edge and a second curvature of the second edge before determining a distance between the first edge and the second edge as a size of the non-body structure;
the second processing unit is configured to:
determining a distance between the first edge and the second edge as a dimension of the non-body structure if both the first curvature and the second curvature exceed a curvature threshold.
With reference to any embodiment of the present application, the obtaining unit is further configured to:
acquiring dynamic display information;
the measuring device further includes:
and the display unit is used for carrying out dynamic display processing on the first corner point and the second corner point according to the dynamic display information and displaying the first corner point and the second corner point after the dynamic display processing.
In combination with any embodiment of the present application, the dynamic display information includes an interface element; the display unit is used for:
moving the interface element from the first corner point to the second corner point.
In combination with any embodiment of the present application, the display unit is configured to:
obtaining a movement speed of the interface element according to the distance between the first corner point and the second corner point; the movement speed is negatively correlated with the distance between the first corner point and the second corner point;
moving the interface element from the first corner point to the second corner point at the movement speed.
In combination with any embodiment of the present application, the two-dimensional imaging device includes an RGB camera; the RGB camera belongs to the measuring device; the measuring device further comprises a depth camera; and when a measuring instruction for the object to be measured is detected, the measuring device respectively shoots the object to be measured by using the RGB camera and the depth camera to obtain the two-dimensional image and the depth map.
In a third aspect, a processor is provided, which is configured to perform the method according to the first aspect and any one of the possible implementations thereof.
In a fourth aspect, an electronic device is provided, comprising: a processor, transmitting means, input means, output means, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In a fifth aspect, there is provided a computer-readable storage medium having stored therein a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect and any one of its possible implementations.
A sixth aspect provides a computer program product comprising a computer program or instructions which, when run on a computer, causes the computer to perform the method of the first aspect and any of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application or in the background art, the drawings needed in the description of the embodiments or the background art are briefly introduced below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic view of same-name points provided in an embodiment of the present application;
Fig. 2 is a schematic diagram of an image pixel coordinate system provided in an embodiment of the present application;
Fig. 3 is a schematic flowchart of a measurement method provided in an embodiment of the present application;
Fig. 4 is a schematic view of an object to be measured provided in an embodiment of the present application;
Fig. 5 is a schematic view of another object to be measured provided in an embodiment of the present application;
Fig. 6 is a schematic view of another object to be measured provided in an embodiment of the present application;
Fig. 7 is a schematic view of a corner-cut triangle provided in an embodiment of the present application;
Fig. 8 is a schematic view of a corner-cut rectangle provided in an embodiment of the present application;
Fig. 9 is a schematic view of another corner-cut rectangle provided in an embodiment of the present application;
Fig. 10 is a schematic view of a regular-shaped object provided in an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a measurement apparatus provided in an embodiment of the present application;
Fig. 12 is a schematic diagram of a hardware structure of a measurement apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above drawings are used to distinguish different objects, not to describe a particular order. Furthermore, the terms "include" and "have," and any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, product, or device that comprises a list of steps or elements is not limited to the listed steps or elements, but may optionally further include steps or elements not listed, or steps or elements inherent to such process, method, product, or device.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more, "at least two" means two or three and three or more, "and/or" for describing an association relationship of associated objects, meaning that three relationships may exist, for example, "a and/or B" may mean: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" may indicate that the objects associated with each other are in an "or" relationship, meaning any combination of the items, including single item(s) or multiple items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural. The character "/" may also represent a division in a mathematical operation, e.g., a/b-a divided by b; 6/3 ═ 2. At least one of the following "or similar expressions.
Some concepts that will appear below are first defined. In the embodiment of the present application, the object point refers to a point in the real world, the physical distance refers to a distance in the real world, and the physical size refers to a size in the real world.
Object points correspond to pixel points in an image. For example, a table is photographed with a camera to obtain image A. The table includes an object point a; pixel point b in image A is obtained by imaging object point a, so object point a corresponds to pixel point b.
For convenience of description, pixel points of the same object point in different images are referred to as same-name points. As shown in fig. 1, pixel point A and pixel point C are same-name points, and pixel point B and pixel point D are same-name points.
In the embodiments of the present application, positions in an image all refer to positions in the pixel coordinate system of the image. The abscissa of the pixel coordinate system represents the column number of a pixel point, and the ordinate represents the row number of a pixel point. For example, in the image shown in fig. 2, a pixel coordinate system XOY is constructed with the upper left corner of the image as the coordinate origin O, the direction parallel to the rows of the image as the direction of the X axis, and the direction parallel to the columns of the image as the direction of the Y axis. The units of the abscissa and the ordinate are pixel points. For example, in fig. 2, pixel point A₁₁ has coordinates (1, 1), pixel point A₂₃ has coordinates (3, 2), pixel point A₄₂ has coordinates (2, 4), and pixel point A₃₄ has coordinates (4, 3).
In the embodiments of the present application, the image point of a pixel point of a two-dimensional image in the camera coordinate system is the projection point of that pixel point in the camera coordinate system: the distance from the projection point to the optical center of the two-dimensional imaging device equals the distance from the object point corresponding to the pixel point to the two-dimensional imaging device, and the projection point, the pixel point, and the optical center lie on the same straight line.
In the embodiment of the present application, the projection plane of the pixel point plane in the two-dimensional image in the camera coordinate system is a plane including the projection point of the pixel point in the pixel point plane in the camera coordinate system.
In daily life, people often need to measure the size of an object. In conventional methods, people usually use length measuring tools (such as tape measure, ruler, vernier caliper) to measure the size of an object. However, this conventional measurement method is time-consuming and labor-consuming for the measurer, and the measurement efficiency is low. Therefore, how to measure the size of the object efficiently and accurately has very important significance.
An electronic device can model the scene in which the object to be measured is located using a simultaneous localization and mapping (SLAM) method, and then obtain the size of the object to be measured. However, when the size of the object is measured using this method, the amount of data that the electronic device needs to process is large, and the processing speed is low.
Based on this, the embodiments of the present application provide a technical solution that can reduce the amount of data to be processed and increase the measurement processing speed. The execution subject of the embodiments of the present application is a measuring device. Optionally, the measuring device may be one of the following: a mobile phone, a computer, a server, a tablet computer. The embodiments of the present application will be described below with reference to the drawings. Referring to FIG. 3, FIG. 3 is a schematic flow chart of a measurement method according to an embodiment of the present disclosure.
301. A two-dimensional image containing an object to be measured and a depth map of the two-dimensional image are acquired.
In the embodiment of the present application, the two-dimensional image may be an RGB image or a YUV image, where "Y" represents brightness (i.e., a gray-scale value), "U" and "V" each represent Chroma (Chroma). The two-dimensional image contains an object to be measured, and the depth map carries depth information of pixel points in the two-dimensional image, namely the depth map carries the depth information of the object to be measured.
In the embodiments of the present application, the two-dimensional image is acquired through a two-dimensional imaging device, i.e., a device that captures two-dimensional images. For example, the two-dimensional image may be captured by an RGB camera. As another example, the two-dimensional image may be acquired by a YUV imaging device.
In one implementation of acquiring a two-dimensional image, a measuring device receives an RGB image input by a user through an input component as the two-dimensional image, wherein the input component includes: keyboard, mouse, touch screen, touch pad, audio input device, etc.
In another implementation manner of acquiring a two-dimensional image, the measuring device receives an RGB image sent by a first terminal as the two-dimensional image, where the first terminal includes a mobile phone, a computer, a tablet computer, a server, and the like.
In yet another implementation of acquiring a two-dimensional image, the measuring device acquires the two-dimensional image using a two-dimensional imaging device. For example, the measuring device is a mobile phone, and the two-dimensional imaging device is an RGB camera on the mobile phone. The mobile phone can acquire a two-dimensional image by using the RGB camera.
In one implementation of obtaining a depth map of a two-dimensional image, a measuring device receives a depth map of a two-dimensional image input by a user through an input component, wherein the input component comprises: keyboard, mouse, touch screen, touch pad, audio input device, etc.
In another implementation of obtaining the depth map of the two-dimensional image, the measuring device receives the depth map sent by a second terminal, where the second terminal includes a mobile phone, a computer, a tablet computer, a server, and the like. Optionally, the second terminal is the same as the first terminal.
In yet another implementation of obtaining a depth map of a two-dimensional image, the measurement device uses a depth imaging device to acquire the depth map. For example, the measuring device is a mobile phone, and the depth imaging device is a depth camera on the mobile phone. The mobile phone can acquire a depth map by using a depth camera. The depth camera may be any one of: structured light (structured light) cameras, time of flight (TOF) cameras, binocular stereo (binocular stereo vision) cameras.
As an alternative embodiment, the measuring device is equipped with a two-dimensional imaging device and a depth imaging device. The measuring device uses the two-dimensional imaging device and the depth imaging device to photograph the object to be measured simultaneously, obtaining a two-dimensional image and the depth map of that two-dimensional image.
302. And obtaining the distance between the first corner point of the object to be measured and the second corner point of the object to be measured according to the two-dimensional image and the depth map.
In the embodiment of the application, the first corner point and the second corner point are both corner points belonging to an object to be measured in the two-dimensional image, and the first corner point is different from the second corner point.
In one manner of obtaining the distance between the first corner point and the second corner point, the measuring device obtains, according to the two-dimensional image and the depth map, a first image point of the first corner point of the object to be measured in the camera coordinate system of the two-dimensional imaging device and a second image point of the second corner point of the object to be measured in that camera coordinate system. It then determines the distance between the first image point and the second image point, obtaining the distance between the first corner point and the second corner point.
In one implementation of obtaining the first image point and the second image point, the measuring device obtains the depth value of the first corner point and the depth value of the second corner point from the depth map. It obtains the three-dimensional coordinates of the first image point in the camera coordinate system of the two-dimensional imaging device according to the coordinates of the first corner point in the two-dimensional image, the depth value of the first corner point, and the internal parameters of the two-dimensional imaging device, where the first image point is the image point of the first corner point in that camera coordinate system. Likewise, it obtains the three-dimensional coordinates of the second image point in the camera coordinate system according to the coordinates of the second corner point in the two-dimensional image, the depth value of the second corner point, and the internal parameters of the two-dimensional imaging device, where the second image point is the image point of the second corner point in that camera coordinate system.
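This conversion follows the standard pinhole camera model. A minimal sketch of the idea, where the intrinsic parameter values and pixel coordinates are illustrative assumptions rather than values from this application:

```python
import numpy as np

def back_project(u, v, depth, fx, fy, cx, cy):
    # Pinhole back-projection: a pixel (u, v) with depth value Z maps to
    # X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy in the camera frame.
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

# First and second corner points with their depth values (assumed numbers).
p1 = back_project(320, 240, 1.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
p2 = back_project(420, 240, 1.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
distance = np.linalg.norm(p1 - p2)  # distance between the two image points
```

With these assumed intrinsics the two image points are 0.2 apart in camera-frame units, which would then be reported as the corner-to-corner distance.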
In another implementation of obtaining the first image point and the second image point, it is assumed that the first corner point belongs to a first surface and the second corner point belongs to a second surface. The measuring device determines the projection plane of the first surface in the camera coordinate system of the two-dimensional imaging device (hereinafter referred to as the first intermediate plane) and the projection plane of the second surface in that camera coordinate system (hereinafter referred to as the second intermediate plane). It determines the connecting line between the coordinates of the first corner point in the image coordinate system and the optical center of the two-dimensional imaging device, obtaining a first intermediate straight line, and the connecting line between the coordinates of the second corner point in the image coordinate system and the optical center, obtaining a second intermediate straight line. The intersection point of the first intermediate straight line and the first intermediate plane is the image point of the first corner point, i.e., the first image point; the intersection point of the second intermediate straight line and the second intermediate plane is the image point of the second corner point, i.e., the second image point.
In another way of obtaining the distance between the first corner point and the second corner point, the distance between the first corner point and the second corner point in the two-dimensional image is taken as a first intermediate distance. Before determining the physical distance, the measuring device acquires a mapping relation between depth values and transmission factors.

The mapping relation represents the mapping between the depth value of a pixel point and its transmission factor. The transmission factor represents the conversion relation between the size of a pixel point and the size of the object point corresponding to that pixel point. For example, a box is photographed using a camera to obtain an image A. The box includes an object point a, and a pixel point b in image A is obtained by imaging object point a; the transmission factor of pixel point b then represents the conversion relation between the size of pixel point b and the size of object point a.

The depth value of the first intermediate distance is positively correlated with the depth value of the first corner point, and/or positively correlated with the depth value of the second corner point. Optionally, the midpoint of the connecting line between the first corner point and the second corner point is referred to as a first midpoint, and the depth value of the first intermediate distance is the depth value of the first midpoint. The measuring device can acquire the depth value of the first midpoint from the depth map as the depth value of the first intermediate distance, and then obtain a first intermediate transmission factor according to the mapping relation and the depth value of the first intermediate distance. The measuring device converts the first intermediate distance into its corresponding physical distance according to the first intermediate transmission factor, obtaining the distance between the first corner point and the second corner point.
It should be understood that the first corner point and the second corner point may be any two different corner points on the object to be measured, that is, the distance between any two corner points on the object to be measured can be obtained through the technical solution disclosed above, and further, the size of the object to be measured can be obtained. For example, in the case where the object to be measured is a carton, the length, width, and height of the object to be measured can be obtained; in the case where the object to be measured is a microwave oven as shown in fig. 4, the length, width, and height of the microwave oven can be obtained.
In the embodiment of the application, the measuring device obtains the size of the object to be measured according to the two-dimensional image and the depth map, so that the data processing amount can be reduced, and the processing speed can be increased.
As an alternative embodiment, the measuring device obtains the first image point and the second image point by performing the following steps:
1. and obtaining a projection plane of the surface to be measured under a camera coordinate system of the two-dimensional imaging equipment according to the coordinates of at least three first points in the surface to be measured in the two-dimensional image and the depth values of the at least three first points obtained from the depth map.
In the embodiment of the present application, the coordinates in the two-dimensional image refer to coordinates in a pixel coordinate system of the two-dimensional image (which will be referred to as a reference pixel coordinate system hereinafter). The measuring device can obtain the coordinates of the pixel point in the camera coordinate system of the two-dimensional imaging device (hereinafter, the coordinates are referred to as two-dimensional coordinates) according to the coordinates of the pixel point in the reference pixel coordinate system and the internal parameters of the two-dimensional imaging device.
In the embodiment of the present application, the internal parameters of the imaging device include: the system comprises coordinates of the focal length of the imaging device in a camera coordinate system and coordinates of an optical center of the imaging device in the camera coordinate system, wherein the optical center is an intersection point of an optical axis of the imaging device and an image plane.
The imaging device that acquires the depth map is referred to as a depth imaging device. The measuring device may obtain depth values of pixel points in the two-dimensional image from the depth map. And converting the depth value of the pixel point to the depth value under the reference pixel coordinate system according to the pose conversion relation between the two-dimensional imaging equipment and the depth imaging equipment. And the measuring device can obtain the depth value of the pixel point in the camera coordinate system of the two-dimensional imaging equipment according to the internal parameters of the two-dimensional imaging equipment and the depth value of the pixel point in the reference pixel coordinate system. In this way, the measuring device can obtain the three-dimensional coordinates of any one of the first points in the camera coordinate system of the two-dimensional imaging apparatus.
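The conversion chain just described (depth-map pixel, to 3-D point, to depth value under the reference pixel coordinate system) can be sketched as follows, assuming a 3x3 intrinsic matrix per camera and a 4x4 pose of the depth camera expressed in the two-dimensional camera's frame; all numeric values are illustrative assumptions:

```python
import numpy as np

def register_depth_point(u, v, z, K_depth, K_rgb, T):
    # Back-project the depth pixel into the depth camera frame.
    p_depth = z * np.linalg.inv(K_depth) @ np.array([u, v, 1.0])
    # Move the 3-D point into the two-dimensional camera's frame using
    # the pose conversion relation T (4x4 homogeneous transform).
    p_rgb = (T @ np.append(p_depth, 1.0))[:3]
    # Project into the two-dimensional image; p_rgb[2] is the depth value
    # under the reference pixel coordinate system.
    uvw = K_rgb @ p_rgb
    return uvw[:2] / uvw[2], p_rgb[2]

# Same intrinsics for both cameras and an identity pose, for illustration.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
uv, z_rgb = register_depth_point(320, 240, 1.5, K, K, np.eye(4))
```

With an identity pose the pixel maps back to itself and the depth value 1.5 is carried over unchanged; a real pose matrix would shift both.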
After obtaining the three-dimensional coordinates of the at least three first points in the camera coordinate system, the measuring device may obtain a plane according to the three-dimensional coordinates of the at least three first points in the camera coordinate system, where the plane is a projection plane of the surface to be measured in the camera coordinate system.
Optionally, the measuring device performs corner detection processing on the two-dimensional image to obtain a position of a corner of the surface to be measured in the two-dimensional image. The measuring device obtains an area covered by the surface to be measured in the image according to the position of the corner point in the surface to be measured, and then at least three first points can be selected from the area.
In one possible implementation, the measurement device processes the two-dimensional image using a convolutional neural network, implementing the corner detection process. The convolutional neural network is obtained by training a plurality of images with marking information as training data, wherein the marking information of the images in the training data is angular points and positions of the angular points. In the process of training the convolutional neural network by using the training data, the convolutional neural network extracts the characteristic data of the image from the image and determines whether the corner points exist in the image according to the characteristic data. And under the condition that the angular points exist in the image, obtaining the positions of the angular points in the image according to the characteristic data of the image. In the training process, the result obtained in the training process of the convolutional neural network is supervised by using the marking information as the supervision information, and the parameters of the convolutional neural network are updated to complete the training of the convolutional neural network.
Thus, the trained convolutional neural network can be used for processing the two-dimensional image to obtain the position of the corner point of the object to be measured in the two-dimensional image. It should be understood that the execution subject of training the convolutional neural network may be a measurement device, or may be a training device, wherein the training device may be one of the following: computer, server.
In another possible implementation, the corner detection process may be implemented by a corner detection algorithm, wherein the corner detection algorithm may be one of the following: harris corner detection algorithm, Moravec corner detection algorithm, Shi-Tomasi corner detection algorithm and the like, and the corner detection algorithm for realizing the corner detection processing is not particularly limited in the application.
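As a rough, self-contained illustration of the Harris response mentioned above, the sketch below computes R = det(M) - k * trace(M)^2 from the structure tensor M with numpy only, on a synthetic image. It is not the application's detector, and a production system would use a library implementation; the window size, k, and threshold are assumed values:

```python
import numpy as np

def harris_response(img, k=0.04, win=2):
    # Image gradients along rows (iy) and columns (ix).
    img = img.astype(np.float64)
    iy, ix = np.gradient(img)
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    # Sum each gradient product over a (2*win+1) x (2*win+1) window.
    def box(a):
        out = np.zeros_like(a)
        h, w = a.shape
        for r in range(h):
            for c in range(w):
                out[r, c] = a[max(0, r - win):r + win + 1,
                              max(0, c - win):c + win + 1].sum()
        return out

    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

# Synthetic image: a bright square standing in for the surface to be measured.
img = np.zeros((60, 60))
img[20:40, 20:40] = 1.0
resp = harris_response(img)
corners = np.argwhere(resp > 0.5 * resp.max())  # (row, col) near the corners
```

Only the four corners of the square produce a strongly positive response; edge points have large trace but near-zero determinant, so they are suppressed.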
2. And determining the first image point and the second image point from the projection plane.
The projection plane comprises image points of all points on the surface to be measured under the camera coordinate system, the first angular point and the second angular point both belong to the surface to be measured, and the projection plane comprises an image point (namely, a first image point) corresponding to the first angular point and an image point (namely, a second image point) corresponding to the second angular point.
In one possible implementation manner, the measuring device obtains coordinates of an optical center of the two-dimensional imaging device in a camera coordinate system, and obtains a straight line passing through the first corner point and the optical center (hereinafter referred to as a first straight line) according to the coordinates of the first corner point in the camera coordinate system and the coordinates of the optical center in the camera coordinate system. The measuring device determines the intersection point of the first straight line and the projection plane as a first image point. Similarly, the measuring device obtains a straight line (hereinafter referred to as a second straight line) passing through the second corner point and the optical center according to the coordinates of the second corner point in the camera coordinate system and the coordinates of the optical center in the camera coordinate system. The measuring device takes the intersection point of the second straight line and the projection plane as a second image point.
In this step, since the points on the projection plane all carry three-dimensional coordinate information, the first image point and the second image point both carry three-dimensional coordinate information, which includes depth information. However, limited by the imaging capability of the depth imaging device, the accuracy of the depth values obtained from the depth map for points on the edges of the object to be measured is low, which makes the coordinates of the image points of those edge points in the camera coordinate system inaccurate. By determining the first image point and the second image point from the projection plane instead, the accuracy of the coordinates of the first image point and of the second image point in the camera coordinate system can be improved.
As an alternative embodiment, the measuring device performs the following steps in the process of performing step 2:
3. and acquiring a first straight line passing through the first angular point and the optical center of the two-dimensional imaging equipment and a second straight line passing through the second angular point and the optical center.
Alternatively, the measuring device may determine the first line by obtaining an equation of the first line and determine the second line by obtaining an equation of the second line.
4. The first image point is obtained by determining the intersection point between the first straight line and the projection plane, and the second image point is obtained by determining the intersection point between the second straight line and the projection plane.
The measuring device combines the equation of the first straight line and the equation of the projection plane, and the coordinate of the first image point in the camera coordinate system can be obtained. The measuring device combines the equation of the second straight line and the equation of the projection plane, and the coordinate of the second image point in the camera coordinate system can be obtained.
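Solving the simultaneous equations of a straight line and the projection plane reduces to a line-plane intersection. A small sketch, representing the projection plane by its normal n and offset d (the plane n.x + d = 0) and taking the optical center at the origin of the camera coordinate system; the numbers are assumptions for illustration:

```python
import numpy as np

def ray_plane_intersection(image_pt, optical_center, plane_n, plane_d):
    # Line through the optical center and an image point, intersected
    # with the plane plane_n . x + plane_d = 0.
    direction = image_pt - optical_center
    denom = plane_n.dot(direction)
    if abs(denom) < 1e-12:
        raise ValueError("line is parallel to the plane")
    t = -(plane_n.dot(optical_center) + plane_d) / denom
    return optical_center + t * direction

# Projection plane z = 2 (n = [0,0,1], d = -2); ray through (0.1, 0.2, 1).
p = ray_plane_intersection(np.array([0.1, 0.2, 1.0]),
                           np.array([0.0, 0.0, 0.0]),
                           np.array([0.0, 0.0, 1.0]), -2.0)
```

The intersection point p plays the role of the first (or second) image point, already carrying three-dimensional coordinates on the projection plane.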
As an alternative embodiment, the measuring device performs the following steps in the process of performing step 3:
5. and acquiring a first coordinate of the first corner point under an image coordinate system of the two-dimensional imaging equipment, a second coordinate of the second corner point under the image coordinate system and a third coordinate of the optical center of the two-dimensional imaging equipment under the camera coordinate system.
After the measuring device obtains the internal parameters of the two-dimensional imaging device, the measuring device can obtain the coordinate of the first corner point in the image coordinate system of the two-dimensional imaging device, namely the first coordinate, according to the internal parameters and the coordinate of the first corner point in the two-dimensional image. And the measuring device obtains the coordinate of the second corner point in the image coordinate system, namely a second coordinate according to the internal parameters and the coordinate of the second corner point in the two-dimensional image.
In one implementation of obtaining a third coordinate of the optical center in the camera coordinate system, the measuring device obtains the optical center coordinate input by the user through the input component as the third coordinate.
In another implementation manner of obtaining a third coordinate of the optical center in the camera coordinate system, the measuring device receives the optical center coordinate sent by the second terminal as the third coordinate, where the second terminal includes a mobile phone, a computer, a tablet computer, a server, and the like. Optionally, the second terminal is the same as the first terminal.
6. And obtaining the first straight line according to the first coordinate and the third coordinate.
The measuring device can obtain an equation of a straight line of the first corner point and the optical center, namely an equation of the first straight line, according to the first coordinate and the third coordinate.
7. And obtaining the second straight line according to the second coordinate and the third coordinate.
The measuring device can obtain an equation of a straight line of the second corner point and the optical center, namely an equation of a second straight line, according to the second coordinate and the third coordinate.
As an alternative embodiment, the measuring device performs the following steps in the process of performing step 1:
8. and acquiring internal parameters of the two-dimensional imaging equipment and coordinates of the at least three first points in a pixel coordinate system of the two-dimensional image.
In the embodiment of the present application, the internal parameters of the two-dimensional imaging device include: the system comprises a coordinate of a focal length of the two-dimensional imaging device under a camera coordinate system and a coordinate of an optical center of the two-dimensional imaging device under the camera coordinate system, wherein the optical center is an intersection point of an optical axis of the two-dimensional imaging device and an image plane.
As described above, due to the imaging limitations of the depth imaging device, the depth information of object points on the edges of a plane may be inaccurate in the depth map acquired by the depth imaging device. Since the depth information of the at least three first points is used in subsequent processing, the measuring device may select the at least three first points from points that do not lie on the edges of the surface to be measured, so as to improve the accuracy of the subsequent processing. For example, in FIG. 5, if the surface to be measured is ABCD, the at least three first points are points in ABCD other than the points on AB, BC, CD, and DA.
Optionally, the measuring device performs corner detection processing on the two-dimensional image to obtain a position of a corner of the surface to be measured in the two-dimensional image. The measuring device obtains an area covered by the surface to be measured in the image according to the position of the corner point in the surface to be measured, and then at least three first points can be selected from the area.
9. And obtaining image points of the at least three first points in the camera coordinate system according to the internal parameters, the depth values of the at least three first points and the coordinates of the at least three first points in the pixel coordinate system.
The measuring device converts the coordinate of the first point in the two-dimensional image and the depth value of the first point according to the internal parameters of the two-dimensional imaging equipment, and an image point of the first point in a camera coordinate system of the two-dimensional imaging equipment can be obtained.
10. And performing plane fitting processing on the image points of the at least three first points in the camera coordinate system to obtain the projection plane.
Optionally, the measuring device may perform plane fitting processing on the image points of the at least three first points in the camera coordinate system, so as to minimize the sum of distances from the plane obtained by fitting to the image points of the at least three first points, thereby obtaining the projection plane. In this way, the accuracy of the projection plane can be improved.
For example, the image points of the at least three first points include image point a, image point b, and image point c. Suppose that the distance from image point a to the fitted plane is D1, the distance from image point b to the fitted plane is D2, and the distance from image point c to the fitted plane is D3. The measuring device then fits a plane to the image points of the at least three first points such that D1 + D2 + D3 is minimized.
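A common way to realize such a fit is a least-squares plane via SVD. Note that this minimizes the sum of squared point-to-plane distances, a standard stand-in for the distance-sum criterion described above; the sample points below are made up for illustration:

```python
import numpy as np

def fit_plane(points):
    # Least-squares plane n . x + d = 0 through 3-D points: the normal is
    # the right singular vector for the smallest singular value of the
    # centred point matrix.
    pts = np.asarray(points, dtype=np.float64)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    d = -n.dot(centroid)
    return n, d

pts = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0), (1.0, 1.0, 1.0)]
n, d = fit_plane(pts)
```

Here the plane through the four assumed points is z = 1, so the recovered normal is (0, 0, +/-1) with a matching offset.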
As an alternative embodiment, the shape of the object to be measured is a regular shape or a quasi-regular shape, and the surface to be measured is the upper bottom surface or the lower bottom surface of the object to be measured.
In an embodiment of the present application, the regular shape includes at least one of: rectangle, rhombus, parallelogram, pentagon. For example, the rectangular parallelepiped is a regular shape. For another example, in the object to be measured shown in fig. 6, both the upper bottom surface and the lower bottom surface are pentagonal, and the object to be measured has a regular shape.
In the embodiments of the present application, the quasi-regular shape includes a regular shape in which at least one face is a corner-cut polygon, and a regular shape in which at least one face has a convex portion and/or a concave portion. For example, FIG. 7 shows a corner-cut triangle; FIG. 8 shows a corner-cut rectangle; FIG. 9 shows a corner-cut rectangle. The shape of the object shown in FIG. 10 is a quasi-regular shape. As another example, the microwave oven shown in FIG. 4 has a quasi-regular shape.
In the embodiment of the present application, a vector that is parallel to the gravity direction and has the geometric center of the object to be measured as a starting point is referred to as a reference vector. The bottom surface to which the positive direction of the reference vector points is referred to as a lower bottom surface, and the bottom surface to which the negative direction of the reference vector points is referred to as an upper bottom surface. The surface of the object to be measured other than the upper and lower bottom surfaces is referred to as a side surface. For example, in the object to be measured shown in fig. 5, the plane ABCD is an upper bottom surface, the plane EFG is a lower bottom surface, and the plane ABFE and the plane BCGF are side surfaces.
Optionally, the measuring device determines the gravity direction in the two-dimensional image according to gyroscope data of the two-dimensional imaging device when acquiring the two-dimensional image, and further determines the upper bottom surface of the object to be measured and the lower bottom surface of the object to be measured.
The measuring device further executes the following steps on the basis of executing the steps:
11. and acquiring the coordinate of the third corner in the pixel coordinate system, and acquiring the depth value of the third corner from the depth map.
In the embodiments of the present application, the third corner point belongs to a side surface, that is, the third corner point is a corner point of a side surface. The connecting line between the third corner point and the first corner point is a height edge of the object to be measured, where a height edge is an edge whose length is the height of the object to be measured. For example, in the object to be measured shown in FIG. 4, the lengths of the three edges AE, BF, and CG can all represent the height of the object to be measured, i.e., AE, BF, and CG are all height edges.
12. And obtaining a third image point of the third corner point in the camera coordinate system according to the internal parameter, the depth value of the third corner point and the coordinate of the third corner point in the pixel coordinate system.
The measuring device converts the coordinate of the third corner in the pixel coordinate system of the two-dimensional image and the depth value of the third corner according to the internal parameters, and an image point of the third corner in the camera coordinate system, namely a third image point, can be obtained.
13. And determining the distance between the first image point and the third image point as the height of the object to be measured.
Optionally, the measuring device performs corner detection processing on the two-dimensional image, obtaining not only the coordinates of the corner points of the object to be measured in the two-dimensional image but also the confidence of each corner point on the lower bottom surface of the object to be measured. For example (example 1), the measuring device performs corner detection processing on FIG. 4 and determines the corner points on the lower bottom surface of the object to be measured: it obtains the position of point E in the image, the position of point F in the image, and the position of point G in the image, together with the confidence of each of these positions.
The measuring device selects the k corner points with the highest confidence on the lower bottom surface of the object to be measured as height corner points, and calculates the average length of the height edges on which the height corner points lie as the height of the object to be measured. Continuing example 1, assume that k = 2 and that the confidence of the position of point E in the image is greater than that of point F, which in turn is greater than that of point G. Points E and F are then the height corner points, and the measuring device calculates the average of the lengths of AE and BF as the height of the object to be measured.
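The selection just described can be sketched in a few lines; the corner names, confidence values, and height-edge lengths below are hypothetical:

```python
# Each record: (corner name, confidence, length of its height edge in metres).
corner_records = [("E", 0.95, 0.402), ("F", 0.90, 0.398), ("G", 0.60, 0.350)]
k = 2

# Keep the k corner points with the highest confidence as height corner points.
height_corners = sorted(corner_records, key=lambda c: c[1], reverse=True)[:k]

# Average the lengths of their height edges to estimate the object's height.
height = sum(c[2] for c in height_corners) / k
```

With these assumed numbers, points E and F are selected and the height is the mean of AE and BF, i.e. 0.400 m.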
As an alternative embodiment, the measuring apparatus performs the following steps in the process of performing step 302:
14. and acquiring a first depth value of the first line segment from the depth map.
In the embodiment of the present application, the two end points of the first line segment are the first corner point and the second corner point, respectively. The depth value of the first line segment is positively correlated with the depth value of the first corner point, and/or positively correlated with the depth value of the second corner point.
In one possible implementation, the depth value of the first line segment is a depth value of a midpoint of the first line segment. For example, the midpoint of the first line segment is the reference midpoint. The measuring device obtains a depth value of the reference midpoint from the depth map, and takes the depth value of the reference midpoint as a depth value of the first line segment, i.e., a first depth value.
In another possible implementation manner, the depth value of the first line segment is the average of the depth value of the first corner point and the depth value of the second corner point.
15. And acquiring the mapping relation between the depth value and the transmission factor.
In the embodiment of the application, the transmission factor represents the conversion relation between the size in the image and the real size. For example, the transmission factor of a pixel represents the conversion relationship between the size of the pixel and the size of the object point corresponding to the pixel. For another example, the transmission factor of the stool in the image characterizes a conversion relationship between the length of the stool in the image and the true length of the stool.
The ratio between the size in the image and the true size is called the transmission ratio, and the transmission factor is negatively correlated with the transmission ratio. For example, the image includes a pixel point a and a pixel point B, where the transmission factor of pixel point a is greater than that of pixel point B, the object point corresponding to pixel point a is object point A, and the object point corresponding to pixel point B is object point B. Because the transmission factor of pixel point a is greater than that of pixel point B, the transmission ratio of pixel point a to object point A is smaller than the transmission ratio of pixel point B to object point B. And because pixel point a and pixel point B have the same size, object point A is larger than object point B.
In the embodiment of the application, the transmission factor is positively correlated with the depth value. For example, the image includes a pixel point a and a pixel point b, where the depth value of the pixel point a is greater than the depth value of the pixel point b. At this time, the size of the object point corresponding to the pixel point a is larger than that of the object point corresponding to the pixel point b.
16. And obtaining the transmission factor of the first line segment according to the mapping relation and the first depth value.
In the embodiment of the present application, the transmission factor of the first line segment represents a conversion relationship between the length of the first line segment and the length of the second line segment, where the second line segment is a physical line segment corresponding to the first line segment, that is, the length of the second line segment is a physical distance of the length of the first line segment.
17. And obtaining the distance between the first corner point and the second corner point according to the transmission factor of the first line segment and the length of the first line segment.
In one possible implementation, the transmission factor of the first line segment characterizes a ratio of a length of the first line segment to a length of the second line segment. The measuring device calculates the quotient of the length of the first line segment and the transmission factor of the first line segment to obtain the length of the second line segment, namely the distance between the first corner point and the second corner point.
In another possible implementation, the transmission factor of the first line segment characterizes a ratio of the length of the second line segment to the length of the first line segment. The measuring device calculates the product of the length of the first line segment and the transmission factor of the first line segment to obtain the length of the second line segment, namely the distance between the first corner point and the second corner point.
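Steps 14-17 can be sketched as follows, using the second implementation above (the transmission factor is the ratio of real length to image length, so it is positively correlated with depth and the image length is multiplied by it). The calibration table standing in for the mapping relation is hypothetical.

```python
def transmission_factor(depth, table):
    """Step 15/16: look up the transmission factor for a depth value by
    linear interpolation over a sorted (depth, factor) calibration
    table. The factor here is real length per image length."""
    for (da, fa), (db, fb) in zip(table, table[1:]):
        if da <= depth <= db:
            t = (depth - da) / (db - da)
            return fa + t * (fb - fa)
    raise ValueError("depth outside calibration range")

def segment_length(pixel_length, depth, table):
    """Step 17: real-world length of the second line segment from the
    pixel length of the first line segment and its first depth value
    (e.g. the depth of the reference midpoint)."""
    return pixel_length * transmission_factor(depth, table)

# Hypothetical mapping: at 1 m one pixel spans 2 mm, at 3 m it spans
# 6 mm -- the factor is positively correlated with the depth value.
table = [(1.0, 0.002), (2.0, 0.004), (3.0, 0.006)]
length = segment_length(250, 1.5, table)  # 250 px at 1.5 m depth
```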
As an alternative embodiment, the surface of the object to be measured includes a non-body structure, wherein the non-body structure includes at least one of: a concave structure, a convex structure. For example, in the microwave oven shown in fig. 4, the handle is a convex structure on the surface. The measuring device further performs the following steps:
18. and carrying out edge detection processing on the two-dimensional image to obtain a first edge and a second edge.
In the embodiment of the present application, the edge detection processing is used to detect the positions of edges in an image and the gradient magnitudes of those edges. Optionally, the edge detection processing may be implemented by one of the following methods: the Canny edge detection algorithm, the Sobel operator, the Roberts edge detector, or the Laplacian of Gaussian (LoG) edge detector.
By performing edge detection processing on the two-dimensional image, the measuring device can obtain an edge detection result of the two-dimensional image, where the edge detection result includes the locations of edges in the two-dimensional image and the gradient magnitudes of those edges.
In an embodiment of the application, the first edge and the second edge are located in the same surface of the object to be measured, and the gradient of the first edge and the gradient of the second edge both exceed the edge threshold. For example, the first edge and the second edge are both located on the upper bottom surface of the object to be measured; for another example, the first edge and the second edge are both located on the front side of the object to be measured.
In the embodiment of the present application, an edge whose gradient exceeds the edge threshold is regarded as an edge of the non-body structure. For example, the interface between the handle and the front of the microwave oven shown in fig. 4 is an edge of the handle.
19. Determining a distance between the first edge and the second edge as a dimension of the non-body structure.
The measuring device obtains a first edge and a second edge by performing edge detection processing on the two-dimensional image, and regards the first edge and the second edge as two edges of the non-body structure. In this way, the measuring device can obtain the size of the non-body structure by determining the distance between the first edge and the second edge.
For example, the measuring device performs edge detection processing on the two-dimensional image shown in fig. 4 to obtain the edge of the handle on the front surface of the microwave oven, and further obtain the length of the handle.
It should be understood that the first edge and the second edge above are only examples; in practical applications, the number of edges located in the same surface that the measuring device obtains by performing edge detection processing on the two-dimensional image may exceed two. For example, suppose the object to be measured has a rectangular groove in its upper bottom surface. By performing edge detection processing on the two-dimensional image containing the object to be measured, the measuring device determines that there are four edges in the upper bottom surface whose gradients exceed the gradient threshold, and determines the directions of the four edges. Specifically, among the four edges, the direction of the first edge is the same as the direction of the second edge, and the direction of the third edge is the same as the direction of the fourth edge. The measuring device can then obtain the length and the width of the rectangular groove by determining the distance between the first edge and the second edge and the distance between the third edge and the fourth edge.
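A toy illustration of steps 18-19 on a single scanline, using a simple gradient threshold in place of a full 2-D edge detector such as Canny or Sobel. The intensity values and the threshold are made up.

```python
def boundary_edges(row, edge_threshold):
    """Positions in a 1-D intensity profile where the gradient
    magnitude exceeds the edge threshold -- a stand-in for detecting
    the edges of a non-body structure in the two-dimensional image."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > edge_threshold]

# Hypothetical scanline across the front of the object: the two sharp
# steps are where the handle meets the surrounding surface.
row = [10, 10, 10, 80, 80, 80, 80, 80, 10, 10]
edges = boundary_edges(row, edge_threshold=50)

# Step 19: the distance between the first and second edge is taken as
# the size of the non-body structure (here, in pixels).
size = edges[1] - edges[0]
```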
As an alternative embodiment, before performing step 19, the measuring apparatus further performs the following steps:
20. the first edge and the second edge are respectively subjected to curvature detection processing to obtain a first curvature of the first edge and a second curvature of the second edge.
In the embodiment of the present application, the curvature detection process is used to calculate the curvature of an edge (which will be referred to as a boundary edge hereinafter) where the gradient exceeds a gradient threshold. The measuring apparatus obtains the curvature of the first edge (i.e., the first curvature) and the curvature of the second edge (i.e., the second curvature) by performing curvature detection processing on the first edge and the second edge, respectively.
Because the edges of a non-body structure all have relatively large curvature, the measuring device determines a boundary edge to be an edge of the non-body structure only when the curvature of that boundary edge exceeds the curvature threshold, which improves the accuracy of identifying the edges of the non-body structure.
For example, assuming that two slits and a handle are present in the upper bottom surface of a storage box, the measuring device may determine, by performing edge detection processing on a two-dimensional image containing the storage box, that four boundary edges are present in the upper bottom surface: a first edge, a second edge, a third edge, and a fourth edge. The measuring device further performs curvature detection processing on the four boundary edges and determines that the curvature of the first edge and the curvature of the second edge both exceed the curvature threshold, while the curvature of the third edge and the curvature of the fourth edge do not. The measuring device then determines the distance between the first edge and the second edge as the length of the handle. In this example, the measuring device avoids mistaking the distance between the slits for a dimension of the handle.
Thus, as an alternative embodiment, the measuring device performs step 19 in the event that it is determined that both the first curvature and the second curvature exceed the curvature threshold.
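The curvature detection of step 20 can be sketched with a standard discrete-curvature estimate: the reciprocal of the radius of the circle through three consecutive edge points. This particular estimator is an illustrative choice, not one named in the application; the sample points are hypothetical.

```python
import math

def curvature(p, q, r):
    """Discrete curvature at q from three consecutive edge points:
    4 * triangle area / product of the three side lengths, which equals
    1 / (radius of the circumscribed circle). Collinear points give 0."""
    a = math.dist(p, q)
    b = math.dist(q, r)
    c = math.dist(p, r)
    # Twice the signed triangle area via the cross product, made positive.
    area = abs((q[0] - p[0]) * (r[1] - p[1])
               - (r[0] - p[0]) * (q[1] - p[1])) / 2
    if a * b * c == 0:
        return 0.0
    return 4 * area / (a * b * c)

# Three collinear points (a slit-like straight edge): curvature 0.
straight = curvature((0, 0), (1, 0), (2, 0))
# Three points on a circle of radius 2 (a rounded handle edge): 1/2.
curved = curvature((2, 0), (0, 2), (-2, 0))
```

A boundary edge whose sampled curvatures exceed the curvature threshold would then be kept as an edge of the non-body structure, as in step 19.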
As an alternative embodiment, the measuring device performs the following steps on the basis of performing the above steps:
21. and acquiring dynamic display information.
In an embodiment of the present application, the dynamic display information includes display information of a special effect type, where the special effect type includes at least one of the following: a scroll bar, a progress bar, and bold display of an edge of the object to be measured.
For example, the dynamic display information may specify that a height edge of the object to be measured is to be displayed in bold. For another example, the dynamic display information may specify a progress bar that displays the progress of measuring a side length. For another example, when the length of a certain edge of the object to be measured is being measured, a scroll bar is displayed between the two corner points of that edge.
22. And according to the dynamic display information, performing dynamic display processing on the first corner point and the second corner point, and displaying the first corner point and the second corner point after the dynamic display processing.
In a possible implementation manner, after the measuring device measures the distance between the first corner point and the second corner point, the measuring device performs dynamic display processing on the first corner point and the second corner point according to the dynamic display information, and displays the first corner point and the second corner point after the dynamic display processing.
For example, assume that the connecting line between the first corner point and the second corner point is a line segment AB, and the dynamic display information includes displaying a scroll bar. After measuring the length of AB, the measuring device displays a scroll bar between the first corner point and the second corner point, i.e., displays the scroll bar on AB, to indicate that it has measured the length of AB. Optionally, the measuring device may determine the moving speed of the scroll bar according to the length of AB.
In another possible implementation manner, before measuring the distance between the first corner point and the second corner point, the measuring device performs dynamic display processing on the first corner point and the second corner point according to the dynamic display information, and displays the first corner point and the second corner point after the dynamic display processing.
For example, assume that the connecting line between the first corner point and the second corner point is a line segment AB, and the dynamic display information includes bold display of the edges of the object to be measured. Before measuring the length of AB, the measuring device displays the line segment AB in bold, to indicate that it is about to measure the length of AB.
In a possible implementation manner, the measuring device performs dynamic display processing on the first corner point and the second corner point according to the dynamic display information in a process of measuring a distance between the first corner point and the second corner point, and displays the first corner point and the second corner point after the dynamic display processing.
For example, assume that the connecting line between the first corner point and the second corner point is a line segment AB, and the dynamic display information includes displaying a progress bar. The measuring device displays a progress bar between the first corner point and the second corner point while measuring the length of AB, i.e., displays the progress bar on AB, to indicate the progress of measuring the length of AB.
As an alternative embodiment, the dynamic display information includes interface elements, and the measuring device performs the following steps in the process of performing step 22:
23. and moving the interface element from the first corner point to the second corner point.
In the embodiment of the present application, the interface element may be any graphic.
As an alternative embodiment, the measuring device performs the following steps in the process of performing step 23:
24. And obtaining the moving speed of the scroll bar according to the distance between the first corner point and the second corner point.
In the embodiment of the present application, the moving speed of the scroll bar is negatively correlated with the distance between the first corner point and the second corner point. The measuring device determines the moving speed of the scroll bar according to this distance, so that the user can get a more intuitive feel for the real distance between the first corner point and the second corner point from the speed of the scroll bar.
25. And moving the scroll bar from the first corner point to the second corner point at the moving speed.
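A minimal sketch of the speed rule in steps 24-25, assuming a simple reciprocal relation; any monotonically decreasing function of the distance would satisfy the negative correlation, and the tuning constant k is hypothetical.

```python
def scroll_speed(distance, k=0.5):
    """Moving speed of the scroll bar, negatively correlated with the
    distance between the first corner point and the second corner
    point: the longer the edge, the slower the scroll bar moves."""
    return k / distance

# A short edge scrolls fast and a long edge scrolls slowly, so the
# speed itself conveys the real distance between the two corner points.
speed_short = scroll_speed(0.5)  # 0.5 m edge
speed_long = scroll_speed(2.0)   # 2.0 m edge
```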
In daily life, people often need to measure the size of objects (such as cartons, tables and cabinets). It is time and labor consuming for people to measure the size of an object using a ruler. With the development of science and technology, the hardware configuration of the terminal is higher and higher, and the terminal can measure the size of an object by using the technical scheme disclosed by the embodiment of the application.
Specifically, the terminal is loaded with an RGB camera and a depth camera. Under the condition that a measuring instruction for an object to be measured is received, the terminal shoots the object to be measured by using the RGB camera to obtain a two-dimensional image, and the terminal shoots the object to be measured by using the depth camera to obtain a depth map. The measurement instruction includes at least one of: voice, text, click operation, touch operation. The terminal processes the acquired RGB image and the depth map by using the technical scheme, and the size of the object to be measured can be obtained.
Based on the technical scheme provided by the embodiment of the application, the embodiment of the application also provides several possible application scenarios.
Scene 1: the xiao ming is related to the moving company helping the moving company, but the moving company needs to inform the size of the object needing to be moved in a small format, so the xiao ming needs to measure the size of the object needing to be moved. Since there are many things to be carried, it is troublesome to measure the size of the things (such as a table, a cabinet, a washing machine, hereinafter, referred to as an object to be measured) with a ruler. Therefore, the xiaoming inputs an instruction for measuring an object to be measured into the mobile phone by clicking the mobile phone, and when the mobile phone receives the instruction, the mobile phone with the RGB camera and the TOF camera is used for shooting the object to be measured, so as to obtain an RGB image and a depth map of the RGB image containing the object to be measured. The mobile phone can further process the RGB image and the depth map by using the technical scheme disclosed above to obtain the size of the object to be measured. Therefore, the user does not need to use a ruler to measure the size of the object to be measured, and only needs to use a mobile phone to shoot the object to be measured, so that the size of the object to be measured can be obtained.
Scene 2: as e-commerce grows rapidly, more and more people shop through e-commerce, which also presents more challenges to the logistics industry, including improving the efficiency of measuring the size of goods to be delivered.
Today's logistics distribution is becoming increasingly standardized: before goods are transported, they are packed into cartons. Because the shape of a carton is a cuboid, a terminal using the technical scheme disclosed in the embodiments of the present application can accurately measure the size of the carton. For example, a worker of a logistics company may use a terminal (e.g., a mobile phone or a tablet computer) loaded with an RGB camera and a TOF camera to shoot a carton to be measured, obtaining an RGB image containing the carton to be measured and the depth map of the RGB image. The terminal can further process the RGB image and the depth map by using the technical scheme disclosed above to obtain the size of the carton to be measured. In this way, the labor cost required to measure carton sizes can be reduced and the efficiency of measuring them improved.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Referring to fig. 11, fig. 11 is a schematic structural diagram of a measurement apparatus according to an embodiment of the present application, where the apparatus 1 includes: an acquisition unit 11, a first processing unit 12, a first detection unit 13, a second processing unit 14, a second detection unit 15, a display unit 16, an RGB camera 17, and a depth camera 18, wherein:
an acquisition unit 11 configured to acquire a two-dimensional image including an object to be measured and a depth map of the two-dimensional image; the two-dimensional image is acquired through two-dimensional imaging equipment;
a first processing unit 12, configured to obtain, according to the two-dimensional image and the depth map, a distance between a first corner point of the object to be measured and a second corner point of the object to be measured.
In combination with any embodiment of the present application, the first processing unit 12 is configured to:
according to the two-dimensional image and the depth map, obtaining a first image point of a first angular point of the object to be measured under a camera coordinate system of the two-dimensional imaging equipment, and a second image point of a second angular point of the object to be measured under the camera coordinate system;
and determining the distance between the first image point and the second image point as the distance between the first corner point and the second corner point.
With reference to any one of the embodiments of the present application, the object to be measured includes a surface to be measured; the first processing unit 12 is configured to:
obtaining a projection plane of the surface to be measured under a camera coordinate system of the two-dimensional imaging device according to coordinates of at least three first points in the surface to be measured in the two-dimensional image and depth values of the at least three first points obtained from the depth map;
the first image point and the second image point are determined from the projection plane.
In combination with any embodiment of the present application, the first processing unit 12 is configured to:
acquiring a first straight line passing through the first angular point and an optical center of the two-dimensional imaging equipment, and a second straight line passing through the second angular point and the optical center;
and determining an intersection point between the first straight line and the projection plane to obtain the first image point, and determining an intersection point between the second straight line and the projection plane to obtain the second image point.
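The intersection computation above can be sketched as follows, taking the optical center as the origin of the camera coordinate system so that each straight line is a ray from the origin through a corner point. The ray direction and the projection plane are hypothetical values.

```python
def ray_plane_intersection(direction, normal, offset):
    """Intersection of the straight line through the optical center
    (the origin) with direction `direction` and the projection plane
    normal . X = offset, giving the image point of a corner point."""
    denom = sum(n * v for n, v in zip(normal, direction))
    if denom == 0:
        raise ValueError("line is parallel to the projection plane")
    t = offset / denom
    return tuple(t * v for v in direction)

# Hypothetical line through the first corner point, intersected with
# the projection plane z = 2 to obtain the first image point.
first_image_point = ray_plane_intersection(
    (0.1, -0.2, 1.0),   # direction from optical center through corner
    (0.0, 0.0, 1.0),    # plane normal
    2.0)                # plane offset
```

The second image point is obtained in the same way from the second straight line.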
In combination with any embodiment of the present application, the first processing unit 12 is configured to:
acquiring a first coordinate of the first corner point under an image coordinate system of the two-dimensional imaging equipment, a second coordinate of the second corner point under the image coordinate system and a third coordinate of an optical center of the two-dimensional imaging equipment under the camera coordinate system;
obtaining the first straight line according to the first coordinate and the third coordinate;
and obtaining the second straight line according to the second coordinate and the third coordinate.
In combination with any embodiment of the present application, the first processing unit 12 is configured to:
acquiring internal parameters of the two-dimensional imaging equipment and coordinates of the at least three first points in a pixel coordinate system of the two-dimensional image;
obtaining image points of the at least three first points in the camera coordinate system according to the internal parameters, the depth values of the at least three first points and the coordinates of the at least three first points in the pixel coordinate system;
and performing plane fitting processing on the image points of the at least three first points in the camera coordinate system to obtain the projection plane.
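The plane fitting above can be sketched as follows. For brevity the sketch takes exactly three camera-space points and the exact plane through them; with more than three first points, a least-squares fit would be used instead. The coordinates are hypothetical.

```python
def fit_plane(p1, p2, p3):
    """Plane through three non-collinear camera-space points, returned
    as (normal, offset) with normal . X = offset -- a minimal stand-in
    for the plane fitting of the image points of the first points."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    # Normal via the cross product of two in-plane vectors.
    normal = (u[1] * v[2] - u[2] * v[1],
              u[2] * v[0] - u[0] * v[2],
              u[0] * v[1] - u[1] * v[0])
    offset = sum(n * c for n, c in zip(normal, p1))
    return normal, offset

# Image points of three first points in the surface to be measured,
# all lying in the plane z = 2 (hypothetical values).
normal, offset = fit_plane((0, 0, 2), (1, 0, 2), (0, 1, 2))
```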
In combination with any embodiment of the present application, the shape of the object to be measured is a regular shape or a quasi-regular shape, and the surface to be measured is the upper bottom surface of the object to be measured;
the obtaining unit 11 is configured to:
acquiring coordinates of the third corner point in the pixel coordinate system, and acquiring a depth value of the third corner point from the depth map; a connecting line between the third corner point and the first corner point is a height edge of the object to be measured;
the first processing unit 12 is configured to:
obtaining a third image point of the third corner point in the camera coordinate system according to the internal parameter, the depth value of the third corner point and the coordinate of the third corner point in the pixel coordinate system;
determining a distance between the first image point and the third image point as a height of the object to be measured.
In combination with any embodiment of the present application, the first processing unit 12 is configured to:
obtaining a first depth value of a first line segment from the depth map; two end points of the first line segment are the first corner point and the second corner point, respectively; the first depth value is positively correlated with the depth value of the first corner point, and/or the first depth value is positively correlated with the depth value of the second corner point;
acquiring a mapping relation between the depth value and the transmission factor; the transmission factor characterizes a conversion relationship between a dimension in the image and a true dimension; the transmission factor is positively correlated with the depth value;
obtaining a transmission factor of the first line segment according to the mapping relation and the first depth value;
and obtaining the distance between the first corner point and the second corner point according to the transmission factor of the first line segment and the length of the first line segment.
In combination with any one of the embodiments of the present application, the surface of the object to be measured includes a non-body structure, and the apparatus 1 further includes:
a first detection unit 13, configured to perform edge detection processing on the two-dimensional image to obtain a first edge and a second edge; the first edge and the second edge are located in the same surface of the object to be measured, and the gradient of the first edge and the gradient of the second edge exceed an edge threshold;
a second processing unit 14 for determining a distance between the first edge and the second edge as a dimension of the non-body structure.
In combination with any of the embodiments of the present application, the apparatus 1 further includes:
a second detecting unit 15, configured to, after the performing the edge detection processing on the two-dimensional image to obtain a first edge and a second edge, perform curvature detection processing on the first edge and the second edge respectively to obtain a first curvature of the first edge and a second curvature of the second edge before determining a distance between the first edge and the second edge as a size of the non-body structure;
the second processing unit 14 is configured to:
determining a distance between the first edge and the second edge as a dimension of the non-body structure if both the first curvature and the second curvature exceed a curvature threshold.
With reference to any embodiment of the present application, the obtaining unit 11 is further configured to:
acquiring dynamic display information;
the measuring device 1 further comprises:
and a display unit 16, configured to perform dynamic display processing on the first corner point and the second corner point according to the dynamic display information, and display the first corner point and the second corner point after the dynamic display processing.
In combination with any embodiment of the present application, the dynamic display information includes an interface element; the display unit 16 is configured to:
moving the interface element from the first corner point to the second corner point.
In combination with any embodiment of the present application, the display unit 16 is configured to:
obtaining the moving speed of the scroll bar according to the distance between the first corner point and the second corner point; the moving speed is negatively correlated with the distance between the first corner point and the second corner point;
moving the interface element from the first corner point to the second corner point at the movement speed.
In combination with any embodiment of the present application, the two-dimensional imaging device includes an RGB camera 17; the RGB camera 17 belongs to the measuring device 1; the measuring device 1 further comprises a depth camera 18; in the case where the measurement instruction for the object to be measured is detected, the measurement apparatus 1 captures the object to be measured using the RGB camera 17 and the depth camera 18, respectively, to obtain the two-dimensional image and the depth map.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present application may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Fig. 12 is a schematic hardware structure diagram of a measurement apparatus according to an embodiment of the present application. The measuring device 2 comprises a processor 21, a memory 22, an input device 23, an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by a connector, which includes various interfaces, transmission lines or buses, etc., and the embodiment of the present application is not limited thereto. It should be appreciated that in various embodiments of the present application, coupled refers to being interconnected in a particular manner, including being directly connected or indirectly connected through other devices, such as through various interfaces, transmission lines, buses, and the like.
The processor 21 may be one or more graphics processing units (GPUs); in the case where the processor 21 is one GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group composed of a plurality of GPUs coupled to each other through one or more buses. Alternatively, the processor may be another type of processor; the embodiments of the present application are not limited in this respect.
The memory 22 may be used to store computer program instructions and various types of computer program code, including program code for executing aspects of the present application. Optionally, the memory includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and is used for storing associated instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It is understood that, in the embodiment of the present application, the memory 22 may be used to store not only the relevant instructions, but also relevant data, for example, the memory 22 may be used to store the two-dimensional image acquired through the input device 23, or the memory 22 may also be used to store the distance between the first corner point and the second corner point obtained through the processor 21, and the like, and the embodiment of the present application is not limited to the data specifically stored in the memory.
It will be appreciated that fig. 12 shows only a simplified design of a measuring device. In practical applications, the measuring devices may also respectively include other necessary components, including but not limited to any number of input/output devices, processors, memories, etc., and all measuring devices that can implement the embodiments of the present application are within the scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses, and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not repeated here. Those skilled in the art can also clearly understand that the descriptions of the embodiments of the present application each have their own emphasis; for convenience and brevity of description, the same or similar parts may not be repeated in different embodiments, and the parts not described or not described in detail in a certain embodiment may refer to the descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of the units is only one logical function division, and there may be other division manners in practical implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may be wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., Digital Versatile Disc (DVD)), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (12)

1. A method of measurement, the method comprising:
acquiring a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image; the two-dimensional image is acquired through two-dimensional imaging equipment;
and obtaining the distance between the first corner point of the object to be measured and the second corner point of the object to be measured according to the two-dimensional image and the depth map.
2. The method according to claim 1, wherein obtaining a distance between a first corner point of the object to be measured and a second corner point of the object to be measured from the two-dimensional image and the depth map comprises:
according to the two-dimensional image and the depth map, obtaining a first image point of a first angular point of the object to be measured under a camera coordinate system of the two-dimensional imaging equipment, and a second image point of a second angular point of the object to be measured under the camera coordinate system;
and determining the distance between the first image point and the second image point as the distance between the first corner point and the second corner point.
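The computation recited in claims 1 and 2 can be sketched as follows. This is an illustrative pinhole-camera implementation, not the patent's own code: the intrinsic parameters fx, fy, cx, cy are assumed to come from calibration of the two-dimensional imaging device, and the function names are hypothetical.

```python
import numpy as np

def unproject(pixel, depth, fx, fy, cx, cy):
    """Map a pixel (u, v) with depth z to a 3-D point in the camera
    coordinate system using the pinhole model:
    X = (u - cx) * z / fx, Y = (v - cy) * z / fy, Z = z."""
    u, v = pixel
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def corner_distance(corner_a, corner_b, depth_map, fx, fy, cx, cy):
    """Distance between two corner points of the object to be measured,
    computed from their image coordinates and the depth map: unproject
    both corners to first/second image points, then take the Euclidean
    distance between those points."""
    pa = unproject(corner_a, depth_map[corner_a[1], corner_a[0]], fx, fy, cx, cy)
    pb = unproject(corner_b, depth_map[corner_b[1], corner_b[0]], fx, fy, cx, cy)
    return float(np.linalg.norm(pa - pb))
```

For example, with fx = fy = 100 and a flat depth map at 2.0, two corners 100 pixels apart horizontally unproject to camera-frame points 2.0 units apart.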
3. The method according to claim 2, characterized in that the object to be measured comprises a surface to be measured;
the obtaining, according to the two-dimensional image and the depth map, a first image point of a first corner point of the object to be measured in a camera coordinate system of the two-dimensional imaging device and a second image point of a second corner point of the object to be measured in the camera coordinate system includes:
obtaining a projection plane of the surface to be measured under a camera coordinate system of the two-dimensional imaging device according to coordinates of at least three first points in the surface to be measured in the two-dimensional image and depth values of the at least three first points obtained from the depth map;
and determining the first image point and the second image point according to the projection plane.
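A minimal sketch of the plane step in claim 3, under the assumption that the at least three first points have already been unprojected to camera coordinates. The least-squares SVD fit and the ray-plane intersection are standard techniques chosen for illustration; the patent does not specify this exact procedure.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 camera-frame points.
    Returns (normal, d) with normal . p + d = 0; the normal is the
    singular vector of least variance of the centered points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

def intersect_ray_with_plane(pixel, fx, fy, cx, cy, normal, d):
    """Determine the image point of a corner from the projection plane:
    cast the pixel's viewing ray and intersect it with the plane, which
    works even where the depth value at the corner itself is noisy."""
    u, v = pixel
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # direction at z = 1
    t = -d / normal.dot(ray)  # solve normal . (t * ray) + d = 0 for t
    return t * ray
```

With four coplanar points at z = 2, the fit recovers the plane z = 2, and a pixel's ray lands on that plane.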
4. A method according to any one of claims 1 to 3, wherein a surface of the object to be measured contains a non-body structure, and the method further comprises:
performing edge detection processing on the two-dimensional image to obtain a first edge and a second edge; the first edge and the second edge are located in the same surface of the object to be measured, and the gradient of the first edge and the gradient of the second edge exceed an edge threshold;
determining a distance between the first edge and the second edge as a dimension of the non-body structure.
5. The method of claim 4, wherein after performing the edge detection processing on the two-dimensional image to obtain the first edge and the second edge, and before determining the distance between the first edge and the second edge as the size of the non-body structure, the method further comprises:
respectively carrying out curvature detection processing on the first edge and the second edge to obtain a first curvature of the first edge and a second curvature of the second edge;
the determining a distance between the first edge and the second edge as a dimension of the non-body structure comprises:
determining a distance between the first edge and the second edge as a dimension of the non-body structure if both the first curvature and the second curvature exceed a curvature threshold.
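The edge and curvature tests of claims 4 and 5 can be illustrated with the sketch below. The central-difference gradient stands in for whatever edge detector (e.g. Sobel or Canny) the embodiments use, and discrete Menger curvature is one common way to realize the recited curvature detection; both are illustrative choices, not taken from the patent.

```python
import numpy as np

def gradient_magnitude(image):
    """Per-pixel gradient magnitude of a grayscale image; pixels whose
    magnitude exceeds the edge threshold are edge candidates."""
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gx, gy)

def menger_curvature(p1, p2, p3):
    """Curvature of the circle through three consecutive edge points:
    4 * triangle_area / (|p1p2| * |p2p3| * |p1p3|)."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    v1, v2 = p2 - p1, p3 - p1
    area2 = abs(v1[0] * v2[1] - v1[1] * v2[0])  # twice the triangle area
    sides = (np.linalg.norm(p2 - p1) * np.linalg.norm(p3 - p2)
             * np.linalg.norm(p3 - p1))
    return 0.0 if sides == 0 else 2.0 * area2 / sides

def edge_curvature(edge_points):
    """Mean discrete curvature along an ordered edge polyline; compared
    against the curvature threshold of claim 5."""
    ks = [menger_curvature(a, b, c)
          for a, b, c in zip(edge_points, edge_points[1:], edge_points[2:])]
    return float(np.mean(ks)) if ks else 0.0
```

A straight edge has curvature 0, while points sampled from a circle of radius 2 give curvature 1/2.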
6. The method according to any one of claims 1 to 5, further comprising:
acquiring dynamic display information;
and according to the dynamic display information, performing dynamic display processing on the first corner point and the second corner point, and displaying the first corner point and the second corner point after the dynamic display processing.
7. The method of claim 6, wherein the dynamic display information comprises an interface element; the dynamic display processing is performed on the first corner point and the second corner point according to the dynamic display information, and the first corner point and the second corner point after the dynamic display processing are displayed, including:
moving the interface element from the first corner point to the second corner point.
8. The method of claim 7, wherein said moving the interface element from the first corner point to the second corner point comprises:
obtaining a moving speed of the interface element according to the distance between the first corner point and the second corner point; the moving speed is in negative correlation with the distance between the first corner point and the second corner point;
moving the interface element from the first corner point to the second corner point at the moving speed.
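As a sketch of the animation in claim 8, the following assumes a simple reciprocal mapping from distance to speed to realize the recited negative correlation, and samples the element's path at a fixed frame rate. The constants k, v_min, v_max and fps are illustrative tuning values that do not appear in the patent.

```python
import math

def moving_speed(distance, k=100.0, v_min=5.0, v_max=400.0):
    """Interface-element speed (pixels/s) that decreases as the measured
    corner-to-corner distance grows, clamped to a sensible range."""
    return max(v_min, min(v_max, k / max(distance, 1e-6)))

def animate_positions(start, end, speed, fps=30):
    """Sample the element's positions as it moves from the first corner
    point to the second corner point at the given speed."""
    sx, sy = start
    ex, ey = end
    dist = math.hypot(ex - sx, ey - sy)
    steps = max(1, int(dist / speed * fps))
    return [(sx + (ex - sx) * i / steps, sy + (ey - sy) * i / steps)
            for i in range(steps + 1)]
```

Speed falls monotonically with distance, and the sampled path starts at the first corner point and ends at the second.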
9. The method of any of claims 1 to 8, wherein the two-dimensional imaging device comprises an RGB camera; the RGB camera belongs to a terminal; the terminal also comprises a depth camera; the acquiring a two-dimensional image containing an object to be measured and a depth map of the two-dimensional image includes:
in a case that a measurement instruction for the object to be measured is detected, the terminal captures images of the object to be measured with the RGB camera and the depth camera respectively, to obtain the two-dimensional image and the depth map.
10. A measuring device, characterized in that the device comprises:
an acquisition unit configured to acquire a two-dimensional image including an object to be measured and a depth map of the two-dimensional image; the two-dimensional image is acquired through two-dimensional imaging equipment;
and the first processing unit is used for obtaining the distance between the first corner point of the object to be measured and the second corner point of the object to be measured according to the two-dimensional image and the depth map.
11. An electronic device, comprising: a processor and a memory for storing computer program code, the computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 9.
12. A computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 9.
CN202010899124.3A 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium Active CN112150527B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010899124.3A CN112150527B (en) 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112150527A true CN112150527A (en) 2020-12-29
CN112150527B CN112150527B (en) 2024-05-17

Family

ID=73890296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010899124.3A Active CN112150527B (en) 2020-08-31 2020-08-31 Measurement method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112150527B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113188465A (en) * 2021-04-21 2021-07-30 中铁第四勘察设计院集团有限公司 Drilling hole depth identification method and device based on video learning

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013179712A1 (en) * 2012-05-30 2013-12-05 日立コンシューマエレクトロニクス株式会社 Information processing device, information processing method, and program
US20140204120A1 (en) * 2013-01-23 2014-07-24 Fujitsu Limited Image processing device and image processing method
US20140327736A1 (en) * 2013-05-01 2014-11-06 Legend3D, Inc. External depth map transformation method for conversion of two-dimensional images to stereoscopic images
CN106125994A (en) * 2016-06-17 2016-11-16 深圳迪乐普数码科技有限公司 Coordinate matching method and use control method and the terminal of this coordinate matching method
US20170150123A1 (en) * 2015-11-24 2017-05-25 Nokia Technologies Oy High-Speed Depth Sensing With A Hybrid Camera Setup
JP2017151094A (en) * 2016-02-08 2017-08-31 ゼネラル・エレクトリック・カンパニイ Method and device for automatically identifying point of interest in depth measurement of viewed object
WO2017167239A1 (en) * 2016-03-31 2017-10-05 纳恩博(北京)科技有限公司 Mobile control method, mobile electronic apparatus and mobile control system, and storage medium
WO2019136315A2 (en) * 2018-01-05 2019-07-11 Aquifi, Inc. Systems and methods for volumetric sizing
CN110006343A (en) * 2019-04-15 2019-07-12 Oppo广东移动通信有限公司 Measurement method, device and the terminal of object geometric parameter
CN110060240A (en) * 2019-04-09 2019-07-26 南京链和科技有限公司 A kind of tyre contour outline measurement method based on camera shooting
CN110188495A (en) * 2019-06-04 2019-08-30 中住(北京)数据科技有限公司 A method of the two-dimentional floor plan based on deep learning generates three-dimensional floor plan
CN110276774A (en) * 2019-06-26 2019-09-24 Oppo广东移动通信有限公司 Drawing practice, device, terminal and the computer readable storage medium of object
US20200082554A1 (en) * 2018-09-06 2020-03-12 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for processing three-dimensional data, device and storage medium
CN110889890A (en) * 2019-11-29 2020-03-17 深圳市商汤科技有限公司 Image processing method and device, processor, electronic device and storage medium
CN110893617A (en) * 2018-09-13 2020-03-20 深圳市优必选科技有限公司 Obstacle detection method and device and storage device
CN111160178A (en) * 2019-12-19 2020-05-15 深圳市商汤科技有限公司 Image processing method and device, processor, electronic device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ADAMS et al.: "Multitemporal accuracy and precision assessment of unmanned aerial system photogrammetry for slope-scale snow depth maps in Alpine terrain", Pure and Applied Geophysics, vol. 175, 14 December 2017 (2017-12-14), pages 3303-3324, XP036597618, DOI: 10.1007/s00024-017-1748-y *
HE Lixin: "Research on Depth Measurement Methods for Monocular Vision Images", China Doctoral Dissertations Full-text Database, Information Science and Technology, no. 11, 15 November 2018 (2018-11-15), pages 138-10 *

Also Published As

Publication number Publication date
CN112150527B (en) 2024-05-17

Similar Documents

Publication Publication Date Title
US11842438B2 (en) Method and terminal device for determining occluded area of virtual object
US11120254B2 (en) Methods and apparatuses for determining hand three-dimensional data
CN107223269B (en) Three-dimensional scene positioning method and device
CN110006343B (en) Method and device for measuring geometric parameters of object and terminal
CN104380338B (en) Information processor and information processing method
US7554575B2 (en) Fast imaging system calibration
US9208547B2 (en) Stereo correspondence smoothness tool
CN110276774B (en) Object drawing method, device, terminal and computer-readable storage medium
CN104634242A (en) Point adding system and method of probe
CN112686877A (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
CN115439607A (en) Three-dimensional reconstruction method and device, electronic equipment and storage medium
JP6817742B2 (en) Information processing device and its control method
CN112197708B (en) Measuring method and device, electronic device and storage medium
CN112657176A (en) Binocular projection man-machine interaction method combined with portrait behavior information
CN112102391A (en) Measuring method and device, electronic device and storage medium
CN111354029A (en) Gesture depth determination method, device, equipment and storage medium
CN112634366B (en) Method for generating position information, related device and computer program product
CN112150527B (en) Measurement method and device, electronic equipment and storage medium
JP6579659B2 (en) Light source estimation apparatus and program
CN113379826A (en) Method and device for measuring volume of logistics piece
US10417783B2 (en) Image processing apparatus, image processing method, and storage medium
CN112102390A (en) Measuring method and device, electronic device and storage medium
US10796435B2 (en) Image processing method and image processing apparatus
CN112146628B (en) Measurement method and device, electronic equipment and storage medium
CN111368675A (en) Method, device and equipment for processing gesture depth information and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant