WO2020211626A1 - Method, device and terminal for measuring geometric parameters of an object - Google Patents

Method, device and terminal for measuring geometric parameters of an object

Info

Publication number
WO2020211626A1
WO2020211626A1 PCT/CN2020/082081 CN2020082081W WO2020211626A1 WO 2020211626 A1 WO2020211626 A1 WO 2020211626A1 CN 2020082081 W CN2020082081 W CN 2020082081W WO 2020211626 A1 WO2020211626 A1 WO 2020211626A1
Authority
WO
WIPO (PCT)
Prior art keywords
measurement
coordinates
measured
terminal
point
Prior art date
Application number
PCT/CN2020/082081
Other languages
English (en)
French (fr)
Inventor
DENG Jian (邓健)
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority to EP20790606.6A (published as EP3943881A4)
Publication of WO2020211626A1
Priority to US17/478,611 (published as US20220003537A1)

Classifications

    • G06T7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Optical measuring arrangements for measuring two or more coordinates
    • G01B11/02: Optical measuring arrangements for measuring length, width or thickness
    • G01B11/03: Optical measurement of length, width or thickness by measuring coordinates of points
    • G01B11/26: Optical measuring arrangements for measuring angles or tapers; for testing the alignment of axes
    • G01B11/28: Optical measuring arrangements for measuring areas
    • G01B11/285: Optical measurement of areas using photoelectric detection means
    • G01B5/0035: Measuring of dimensions of trees (mechanical techniques)
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10004: Still image; photographic image
    • G06T2207/10012: Stereo images
    • G06T2207/10028: Range image; depth image; 3D point clouds

Definitions

  • This application belongs to the field of measurement technology, and in particular relates to a method, device and terminal for measuring geometric parameters of an object.
  • Augmented reality (AR) technology combines, and enables interaction between, a virtual world rendered on a screen and a real-world scene.
  • Measurement data from the virtual environment can be superimposed on the image of the real environment captured by the camera, making measurement convenient for users.
  • The embodiments of the present application provide a method, device, and terminal for measuring geometric parameters of an object, which can solve the technical problem of large measurement errors in current AR measurement technology.
  • A first aspect of the embodiments of the present application provides a method for measuring geometric parameters of an object, including:
  • Receiving a measurement point selection instruction, determining the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction according to the pose data, the second depth image, and the two-dimensional image, and determining the geometric parameters of the object to be measured according to the coordinates; the measurement point is a measurement point associated with the object to be measured;
  • The measurement mark of the measurement point and the geometric parameters of the object to be measured are displayed in the two-dimensional image shown on the display interface.
  • A second aspect of the embodiments of the present application provides a device for measuring geometric parameters of an object, including:
  • An establishment unit, configured to obtain a first depth image of a real environment captured by the camera component, and to establish a three-dimensional coordinate system based on the real environment according to the first depth image;
  • A first display unit, configured to obtain the pose data of the terminal, obtain the second depth image and the two-dimensional image of the object to be measured, and display the two-dimensional image on the display interface of the terminal;
  • A determining unit, configured to receive a measurement point selection instruction, determine the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction according to the pose data, the second depth image, and the two-dimensional image, and determine the geometric parameters of the object to be measured according to the coordinates; the measurement point is a measurement point associated with the object to be measured;
  • A second display unit, configured to display the measurement mark of the measurement point and the geometric parameters of the object to be measured.
  • A third aspect of the embodiments of the present application provides a terminal, including a camera component, a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the steps of the above method when executing the computer program.
  • A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the foregoing method.
  • FIG. 1 is a schematic diagram of the implementation process of a method for measuring geometric parameters of an object provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of the measurement process of geometric parameters of an object provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a specific implementation process of step 103 of a method for measuring geometric parameters of an object provided by an embodiment of the present application;
  • FIG. 4 is a first schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 5 is a second schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 6 is a third schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a method for measuring geometric parameters of an object in the first automatic measurement mode according to an embodiment of the present application
  • FIG. 8 is a schematic diagram of geometric parameter measurement of an irregularly shaped object provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an object geometric parameter measuring device provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • When a monocular camera is used for AR measurement, it is generally necessary to guide the user to move the terminal.
  • The monocular camera takes multiple frames of continuous photos and performs plane recognition according to predefined rules, taking the recognized plane as the reference for AR measurement.
  • This measurement method not only initializes slowly but, because it relies on plane recognition or on richly textured scenes, also struggles when a vertical plane lacks rich texture and its color is close to that of the horizontal plane connected to it. In that case it is difficult to identify the vertical plane, which may even be mistaken for a horizontal plane far from the terminal, resulting in large measurement errors.
  • In the embodiments of the present application, a three-dimensional coordinate system based on the real environment is established by acquiring the first depth image of the real environment captured by the camera component, which completes the initialization before measurement. After initialization, the pose data of the terminal is acquired in real time, together with the second depth image and the two-dimensional image of the object to be measured. When the user triggers a measurement point selection instruction, the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction are determined according to the pose data, the second depth image, and the two-dimensional image; the geometric parameters of the object to be measured are then determined according to those coordinates, and the measurement mark of the measurement point and the geometric parameters are displayed in the two-dimensional image shown on the display interface. This measurement method does not rely on plane recognition: coordinates are determined from the pose data of the terminal, the second depth image, and the two-dimensional image, which avoids the large errors of plane-recognition-based measurement.
  • Figure 1 shows a schematic diagram of the implementation process of an object geometric parameter measurement method provided by an embodiment of the present application.
  • The measurement method is applied to a terminal and can be executed by an object geometric parameter measurement device configured on the terminal.
  • The terminal may be a smart terminal such as a smartphone, tablet computer, or learning machine.
  • The terminal may be equipped with a camera component, and the method for measuring the geometric parameters of the object includes steps 101 to 104.
  • Step 101 Obtain a first depth image of a real environment captured by a camera component, and establish a three-dimensional coordinate system based on the real environment according to the first depth image.
  • To establish a three-dimensional coordinate system based on the real environment and complete the initialization of the measurement, the first depth image of the real environment captured by the camera component must be acquired first.
  • The camera component may include a depth camera and an RGB camera, where the depth camera is used to collect a depth image and the RGB camera is used to collect a two-dimensional color image (two-dimensional image).
  • The gray value of each pixel of the depth image can be used to characterize the distance of a point in the scene from the camera.
  • The resolution of the depth camera may be equal to the resolution of the RGB camera, so that each pixel of the two-dimensional image can obtain accurate depth information.
  • The depth camera may be a time-of-flight (TOF) camera.
  • The camera component may also be a 3D camera that can output a depth image and a two-dimensional color image simultaneously.
  • When using the terminal to measure the geometric parameters of an object, the user can first open the camera application to start the camera component and acquire the first depth image of the real environment. When the first depth image returned by the depth camera is received, the point in the real environment corresponding to any effective depth data in the first depth image is taken as the coordinate origin, and the three-dimensional coordinate system based on the real environment is established and used as the reference for coordinate calculation when measuring geometric parameters of an object.
  • Effective depth data refers to depth data whose depth value is within a preset depth value range.
  • Once the three-dimensional coordinate system based on the real environment is established, the initialization before measurement is complete, and measurement of the geometric parameters of the object can begin. Since the initialization process requires neither plane recognition nor moving the terminal to collect multiple frames of photos, the present application initializes quickly when measuring geometric parameters of an object, achieving a "one-second start" effect.
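The notion of effective depth data and origin selection during initialization can be sketched as follows. This is an illustrative toy, not the patent's implementation: the array shape, the preset depth range, and the choice of the first valid pixel are all assumptions.

```python
import numpy as np

# Hypothetical sketch: filter "effective depth data" (depth values within a
# preset range) from the first depth image, and pick one valid point to
# serve as the origin of the real-environment coordinate system.

DEPTH_MIN_M = 0.3   # assumed lower bound of the valid depth range (metres)
DEPTH_MAX_M = 5.0   # assumed upper bound of the valid depth range (metres)

def find_origin_pixel(depth_image: np.ndarray) -> tuple:
    """Return (row, col) of the first pixel whose depth is effective."""
    valid = (depth_image >= DEPTH_MIN_M) & (depth_image <= DEPTH_MAX_M)
    rows, cols = np.nonzero(valid)
    if rows.size == 0:
        raise ValueError("no effective depth data in the first depth image")
    return int(rows[0]), int(cols[0])

depth = np.array([[0.0, 0.1], [2.0, 9.9]])  # toy 2x2 depth map (metres)
print(find_origin_pixel(depth))  # the only effective pixel is (1, 0)
```

Any pixel whose depth passes the range check could serve as the origin; the sketch simply takes the first one found.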
  • Step 102 Obtain the pose data of the terminal, and obtain the second depth image and the two-dimensional image of the object to be measured, and display the two-dimensional image on the display interface of the terminal.
  • Because the user may move the terminal while measuring the geometric parameters of the object, it is necessary to obtain the pose data of the terminal to determine the pose of the terminal at the current moment relative to the moment when initialization was completed, and thereby determine the coordinates in the three-dimensional coordinate system corresponding to the second depth image of the object to be measured and the two-dimensional image captured by the camera component.
  • Acquiring the pose data of the terminal includes: starting from the moment the three-dimensional coordinate system is established, using an inertial measurement unit (IMU) to acquire the six-degree-of-freedom (6-DOF) pose data of the terminal in real time.
  • An object has six degrees of freedom in space: translation along the three orthogonal coordinate axes x, y, and z, and rotation around those three axes. Therefore, to completely determine the position of an object, all six degrees of freedom must be known.
  • An inertial measurement unit is a device that measures the three-axis angular velocity and acceleration of an object.
  • Typically, an IMU contains three single-axis accelerometers and three single-axis gyroscopes.
  • The accelerometers detect acceleration along the carrier's three independent axes, while the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system; together they measure the angular velocity and acceleration of the object in three-dimensional space, from which the pose of the object can be calculated. Therefore, the pose data of the terminal in this application can be acquired with an IMU.
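How angular velocity and acceleration integrate into a 6-DOF pose can be shown with a toy dead-reckoning loop. This is our simplification, not the patent's method: real AR systems fuse IMU data with visual tracking, and gravity compensation and drift correction are omitted here.

```python
import numpy as np

# Illustrative-only Euler integration of IMU samples into a 6-DOF pose
# (position + orientation). All names and conventions are ours; samples are
# assumed to already be expressed in the world frame, gravity removed.

def integrate_imu(samples, dt, position, velocity, orientation):
    """samples: iterable of (gyro_xyz, accel_xyz) tuples."""
    position = np.array(position, dtype=float)
    velocity = np.array(velocity, dtype=float)
    orientation = np.array(orientation, dtype=float)  # Euler angles (rad)
    for gyro, accel in samples:
        orientation += np.asarray(gyro, dtype=float) * dt  # rate -> angle
        velocity += np.asarray(accel, dtype=float) * dt    # accel -> velocity
        position += velocity * dt                          # velocity -> position
    return position, velocity, orientation

# A terminal moving at a constant 1 m/s along x, with no rotation:
samples = [((0, 0, 0), (0, 0, 0))] * 10
pos, vel, rot = integrate_imu(samples, 0.1, (0, 0, 0), (1, 0, 0), (0, 0, 0))
print(pos)  # ~[1. 0. 0.] after 1 second
```

The point is only that orientation comes from integrating angular velocity and position from integrating acceleration twice, which is why six independent measurements suffice for a 6-DOF pose.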
  • Step 103 Receive a measurement point selection instruction, determine the coordinates of the measurement point selected by the measurement point selection instruction in the three-dimensional coordinate system according to the pose data, the second depth image, and the two-dimensional image, and The geometric parameters of the object to be measured are determined according to the coordinates; the measurement point is a measurement point associated with the object to be measured.
  • The user can trigger the measurement point selection instruction on the display interface according to the two-dimensional image displayed there, and the terminal then determines the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction according to the pose data, the second depth image, and the two-dimensional image, and determines the geometric parameters of the object to be measured according to those coordinates.
  • As shown in FIG. 2, the user can trigger the measurement point selection instruction by tapping (21) or sliding to draw a line (22) on the display interface 20 of the terminal, thereby selecting measurement points associated with the object to be measured.
  • As shown in FIG. 3, determining the coordinates in the three-dimensional coordinate system of the measurement point selected by the measurement point selection instruction may include steps 301 to 302.
  • Step 301 Determine the pixel coordinates of the measurement point on the two-dimensional image according to the position of the measurement point on the two-dimensional image and the corresponding depth value.
  • A pixel coordinate is composed of the two-dimensional position of a pixel on the two-dimensional image combined with that pixel's depth value.
  • For example, the pixel in the lower-left corner of the two-dimensional image can be taken as the origin of the two-dimensional coordinates to establish the pixel coordinate system; the two-dimensional coordinates of each pixel on the two-dimensional image are then determined and combined with the depth value of each pixel to obtain the pixel coordinates of every pixel in the two-dimensional image.
  • Step 302 Map the pixel coordinates to coordinates in the three-dimensional coordinate system according to the parameter information of the camera component and the pose data.
  • Mapping the pixel coordinates to coordinates in the three-dimensional coordinate system according to the parameter information of the camera component and the pose data includes: determining, according to the parameter information of the camera component and the pose data of the terminal, a mapping matrix between the pixel coordinates of the two-dimensional image on the terminal display interface and the coordinates in the three-dimensional coordinate system, and mapping the pixel coordinates to coordinates in the three-dimensional coordinate system according to that mapping matrix.
  • The parameter information of the camera component includes internal and external parameters, where the internal parameters include the equivalent focal lengths f_x, f_y along the u-axis and v-axis, and the principal point coordinates (u_0, v_0) of the image plane.
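As a hedged illustration of how the intrinsics f_x, f_y, u_0, v_0 and the pose data might map a pixel coordinate into the three-dimensional coordinate system, consider the standard pinhole back-projection below. The rotation-plus-translation pose representation is our assumption, not the patent's mapping matrix.

```python
import numpy as np

# Sketch: back-project a pixel (u, v) with known depth into the camera
# frame via the pinhole model, then transform into the world frame using
# the terminal's pose (rotation R, translation t). Illustrative only.

def pixel_to_world(u, v, depth, fx, fy, u0, v0, R, t):
    # Pinhole back-projection into the camera frame ...
    cam = np.array([(u - u0) * depth / fx,
                    (v - v0) * depth / fy,
                    depth])
    # ... then into the world frame with the terminal's pose.
    return R @ cam + t

R = np.eye(3)    # terminal has not rotated since initialization
t = np.zeros(3)  # and has not moved
p = pixel_to_world(320, 240, 2.0, fx=500.0, fy=500.0, u0=320.0, v0=240.0,
                   R=R, t=t)
print(p)  # [0. 0. 2.] -- a point 2 m straight ahead of the camera
```

A pixel at the principal point maps onto the optical axis, which is why the example lands at (0, 0, depth) before the pose transform.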
  • Step 104 Display the measurement mark of the measurement point and the geometric parameter of the object to be measured in the two-dimensional image displayed on the display interface.
  • The geometric parameters may include distance parameters, length parameters, area parameters, and angle parameters.
  • The measurement point mark may be a solid black dot ("●"); for example, the geometric parameters may include a distance parameter of 2.5 m, length parameters of 3 m and 1.2 m, an area parameter, and an angle parameter.
  • The measurement point mark may also take another form, such as a hollow circle or a solid black triangle; this is only an illustrative description and is not meant to limit the scope of protection of the present application.
  • After receiving the measurement point selection instruction, determining the coordinates in the three-dimensional coordinate system of the selected measurement point according to the pose data, the second depth image, and the two-dimensional image, and determining the geometric parameters of the object to be measured according to the coordinates, the method may further include: saving the coordinates of the measurement point and the geometric parameters determined from them; and, when the display area of the display interface contains a saved measurement point, displaying the measurement mark of the saved measurement point and the geometric parameters corresponding to its coordinates.
  • After each measurement point is measured, the coordinates of the measurement point and the geometric parameters determined from those coordinates are saved, so that when the terminal moves to another location to measure further points, the data of the saved measurement points do not disappear.
  • When the display area of the display interface of the terminal contains a saved measurement point, the measurement mark of that point continues to be displayed, together with the geometric parameters corresponding to its coordinates. The terminal therefore does not lose historical measurement data when it moves, which makes mobile measurement convenient for the user.
  • Because the terminal saves coordinate data, the data of a saved measurement point does not change when the object in the real environment changes; the distance data, however, can change as the location of the terminal changes.
  • Before receiving the measurement point selection instruction in step 103, the method may further include: acquiring the current measurement mode of the terminal; the measurement mode includes one or more of a distance measurement mode, a length measurement mode, an area measurement mode, and an angle measurement mode.
  • The above measurement modes are independent of each other: in the distance measurement mode only distance is measured, in the length measurement mode only length, in the area measurement mode only area, and in the angle measurement mode only angle.
  • One of the measurement modes can be selected by sliding in order to measure the corresponding geometric parameter.
  • Determining the geometric parameters of the object to be measured according to the coordinates may include: if the current measurement mode of the terminal is the distance measurement mode, determining, according to the coordinates, the distance parameter between the object to be measured and the terminal; if the current measurement mode is the length measurement mode, determining the length parameter of the object to be measured according to the coordinates; if the current measurement mode is the area measurement mode, determining the area parameter of the object to be measured according to the coordinates; and if the current measurement mode is the angle measurement mode, determining the angle parameter of the object to be measured according to the coordinates.
  • If the current measurement mode of the terminal is the distance measurement mode, a single measurement point selection instruction is received; the coordinates in the three-dimensional coordinate system of the measurement point selected by that instruction are determined according to the pose data, the second depth image, and the two-dimensional image, and the distance parameter between the object to be measured and the terminal is determined according to the coordinates.
  • If the current measurement mode of the terminal is the length measurement mode, one or more line segment drawing instructions associated with the object to be measured are received; the endpoint coordinates of the line segment drawn by each instruction are determined according to the pose data, the second depth image, and the two-dimensional image, and the length parameter of the object to be measured is determined according to the endpoint coordinates.
  • During drawing, the endpoint coordinates of each line segment are determined in real time according to the pose data, the second depth image, and the two-dimensional image; the length of each line segment is determined according to the endpoint coordinates, and the length parameter of the object to be measured is updated in real time as the line segments are drawn.
  • If the current measurement mode of the terminal is the area measurement mode, multiple line segment drawing instructions associated with the object to be measured are received; the endpoint coordinates of the line segment drawn by each instruction are determined according to the pose data, the second depth image, and the two-dimensional image, and the area parameter of the closed figure enclosed by the multiple line segments is determined according to the endpoint coordinates.
  • The perimeter of the closed figure enclosed by the multiple line segments can also be calculated automatically.
  • If the current measurement mode of the terminal is the angle measurement mode, two consecutive line segment drawing instructions associated with the object to be measured are received; the endpoint coordinates of the line segments drawn by the two instructions are determined according to the pose data, the second depth image, and the two-dimensional image, and the angle between the two drawn line segments is determined according to the endpoint coordinates.
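Once endpoint coordinates are known in the three-dimensional coordinate system, the four kinds of geometric parameters reduce to elementary vector computations. The sketch below is our own illustration (the function names and the Newell-method area formula are not taken from the patent):

```python
import numpy as np

# Illustrative computations of the four geometric parameters from 3-D
# measurement point coordinates: distance to the terminal, segment length,
# polygon area/perimeter, and the angle between two segments.

def distance_to_terminal(point, terminal=np.zeros(3)):
    return float(np.linalg.norm(np.asarray(point, float) - terminal))

def segment_length(a, b):
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def polygon_area(points):
    """Area of a planar polygon in 3-D via Newell's method."""
    pts = [np.asarray(p, float) for p in points]
    normal = sum(np.cross(p, q) for p, q in zip(pts, pts[1:] + pts[:1]))
    return float(0.5 * np.linalg.norm(normal))

def polygon_perimeter(points):
    pts = list(points)
    return sum(segment_length(p, q) for p, q in zip(pts, pts[1:] + pts[:1]))

def angle_deg(vertex, a, b):
    """Angle at `vertex` between segments vertex->a and vertex->b."""
    u = np.asarray(a, float) - np.asarray(vertex, float)
    w = np.asarray(b, float) - np.asarray(vertex, float)
    cos = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(polygon_area(square))                           # 1.0
print(polygon_perimeter(square))                      # 4.0
print(angle_deg((0, 0, 0), (1, 0, 0), (0, 1, 0)))     # 90.0
```

Newell's method is used for the area because the measurement points live in 3-D: it yields the correct area for a planar polygon regardless of its orientation in space.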
  • The measurement mode may also include a first automatic measurement mode.
  • The user can trigger the first automatic measurement mode by tapping the first automatic measurement mode selection control 61 shown in FIG. 6.
  • FIG. 7 is a schematic flowchart of the method for measuring geometric parameters of an object in the first automatic measurement mode provided by an embodiment of this application; the method may include steps 701 to 703.
  • Step 701 If the current measurement mode of the terminal is the first automatic measurement mode, identify the object to be measured contained in the two-dimensional image displayed on the display interface, and determine whether the shape of the object to be measured is a regular shape.
  • Step 702 If the shape of the object to be measured is a regular shape, each endpoint of the regular shape and the point of the shape closest to the terminal are determined as measurement points; the coordinates of the measurement points in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the distance parameter, the length parameter, and the area parameter of the object to be measured are determined according to the coordinates.
  • Step 703 If the shape of the object to be measured is an irregular shape, the farthest points in the four directions of up, down, left, and right of the irregularly shaped object are determined as the measurement points; the coordinates of the measurement points in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the maximum height and maximum width of the object to be measured are determined according to the coordinates.
  • A regular shape refers to a shape such as a quadrilateral, triangle, circle, or pentagon.
  • For an irregularly shaped object, the user may only want to know its height and width in order to roughly determine the space it requires. Therefore, to make the measurement of geometric parameters more convenient, the terminal can identify whether the object to be measured has a regular shape and perform the corresponding automatic measurement.
  • For a regular shape, each endpoint of the shape and the point of the shape closest to the terminal may be determined as measurement points; the coordinates of the measurement points in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the distance, length, and area parameters of the object to be measured are determined according to the coordinates.
  • For an irregular shape, the farthest points in the four directions of up, down, left, and right of the object can be determined as the measurement points; their coordinates in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the maximum height and maximum width of the object to be measured are determined according to the coordinates.
  • For example, as shown in FIG. 8, the farthest points A, B, C, and D in the four directions of up, down, left, and right of the irregularly shaped object can be determined as the measurement points; the vertical height M between measurement points A and B is determined as the maximum height of the object to be measured, and the horizontal width N between measurement points C and D is determined as the maximum width of the object to be measured.
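The extreme-point selection described above can be sketched in a few lines of Python. The segmentation mask, constant depth value, and camera intrinsics below are invented for illustration; a real implementation would take them from the object-recognition result, the second depth image, and the camera assembly's calibration.

```python
def backproject(u, v, z, fx, fy, u0, v0):
    """Pinhole back-projection of pixel (u, v) with depth z into 3D camera coordinates."""
    return ((u - u0) * z / fx, (v - v0) * z / fy, z)

def extreme_points(mask):
    """Topmost, bottommost, leftmost, and rightmost pixels (row, col) of a binary mask."""
    pixels = [(r, c) for r, row in enumerate(mask) for c, m in enumerate(row) if m]
    top = min(pixels, key=lambda p: p[0])
    bottom = max(pixels, key=lambda p: p[0])
    left = min(pixels, key=lambda p: p[1])
    right = max(pixels, key=lambda p: p[1])
    return top, bottom, left, right

# Hypothetical 5x5 mask of an irregular object, constant depth of 2 m,
# and intrinsics fx = fy = 500 with the principal point at the mask centre.
mask = [[0, 0, 1, 0, 0],
        [0, 1, 1, 1, 0],
        [1, 1, 1, 1, 1],
        [0, 1, 1, 0, 0],
        [0, 0, 1, 0, 0]]
fx = fy = 500.0
u0 = v0 = 2.0
z = 2.0

(tr, tc), (br, bc), (lr, lc), (rr, rc) = extreme_points(mask)
A = backproject(tc, tr, z, fx, fy, u0, v0)  # topmost point
B = backproject(bc, br, z, fx, fy, u0, v0)  # bottommost point
C = backproject(lc, lr, z, fx, fy, u0, v0)  # leftmost point
D = backproject(rc, rr, z, fx, fy, u0, v0)  # rightmost point

max_height = abs(A[1] - B[1])  # vertical extent M between A and B
max_width = abs(C[0] - D[0])   # horizontal extent N between C and D
```

The same back-projection applies per-pixel depth when the depth image is not constant; the sketch only fixes z to keep the arithmetic transparent.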
  • The measurement modes may further include an automatic measurement mode based on the distance measurement mode and an automatic measurement mode based on the length measurement mode.
  • Compared with the first automatic measurement mode shown in FIG. 7, the measurement modes in this embodiment offer higher flexibility.
  • That is, in the automatic measurement mode based on the distance measurement mode, only automatic measurement of distance is performed, and in the automatic measurement mode based on the length measurement mode, only automatic measurement of length is performed.
  • For example, the user can click the automatic measurement mode selection control 62 shown in FIG. 6, which, combined with the terminal's current distance measurement mode, triggers the automatic measurement mode based on the distance measurement mode.
  • If the current measurement mode of the terminal is the automatic measurement mode based on the distance measurement mode, the object to be measured contained in the two-dimensional image displayed on the display interface is identified, the point of the object closest to the terminal is determined as the measurement point, the coordinates of the measurement point in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the distance parameter of the object to be measured is determined according to the coordinates.
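Determining the point of the identified object closest to the terminal amounts to scanning the second depth image for the smallest valid depth value inside the object's mask. A minimal sketch with invented depth and mask values:

```python
def closest_point(depth, mask):
    """Return (row, col, depth) of the valid masked pixel nearest the camera, or None."""
    best = None
    for r, row in enumerate(depth):
        for c, z in enumerate(row):
            # Skip pixels outside the object and invalid (non-positive) depth readings.
            if mask[r][c] and z > 0 and (best is None or z < best[2]):
                best = (r, c, z)
    return best

# Hypothetical 2x2 depth map (metres) and object mask.
depth = [[2.0, 1.5],
         [1.8, 3.0]]
mask = [[1, 1],
        [0, 1]]
nearest = closest_point(depth, mask)  # pixel at row 0, col 1, 1.5 m away
```

The returned depth (or the norm of the back-projected 3D point) then serves as the distance parameter.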
  • If the current measurement mode of the terminal is the automatic measurement mode based on the length measurement mode, the object to be measured contained in the two-dimensional image displayed on the display interface is identified, and whether the shape of the object to be measured is regular is judged. If the shape is regular, each endpoint of the regular shape is determined as a measurement point, the coordinates of the measurement points in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the length parameter between each pair of adjacent measurement points of the object to be measured is determined according to the coordinates. If the shape is irregular, the farthest points in the four directions of up, down, left, and right of the irregular shape are determined as the measurement points, their coordinates in the three-dimensional coordinate system are determined according to the pose data, the second depth image, and the two-dimensional image, and the maximum height and maximum width of the object to be measured are determined according to the coordinates.
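For a regular shape, the length parameter between each pair of adjacent measurement points reduces to the Euclidean distance between consecutive endpoint coordinates in the three-dimensional coordinate system. A sketch with invented rectangle coordinates (`math.dist` requires Python 3.8+):

```python
import math

def edge_lengths(vertices):
    """Lengths between each pair of adjacent vertices of a closed polygon in 3D."""
    n = len(vertices)
    return [math.dist(vertices[i], vertices[(i + 1) % n]) for i in range(n)]

# Hypothetical endpoints of a 3 m x 1.2 m rectangular object at a depth of 2 m.
rect = [(0.0, 0.0, 2.0), (3.0, 0.0, 2.0), (3.0, 1.2, 2.0), (0.0, 1.2, 2.0)]
lengths = edge_lengths(rect)
```

The modulo index closes the polygon, so the last edge back to the first endpoint is included automatically.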
  • As shown in FIG. 6, the last measurement instruction can also be canceled by clicking the cancel control 63, for example, canceling the last measurement point selection instruction or area measurement instruction; in addition, the measurement data can be saved by clicking the photo control 64, that is, the measurement data is saved in the form of a photo carrying the measurement data.
  • FIG. 9 shows a schematic structural diagram of an object geometric parameter measuring device 900 provided by an embodiment of the present application. The object geometric parameter measuring device is configured in a terminal and includes an establishing unit 901, a first display unit 902, a determining unit 903, and a second display unit 904.
  • the establishment unit 901 is configured to obtain a first depth image of a real environment captured by the camera component, and establish a three-dimensional coordinate system based on the real environment according to the first depth image;
  • the first display unit 902 is configured to obtain the pose data of the terminal, and obtain the second depth image and the two-dimensional image of the object to be measured, and display the two-dimensional image on the display interface of the terminal;
  • The determining unit 903 is configured to receive a measurement point selection instruction, determine, according to the pose data, the second depth image, and the two-dimensional image, the coordinates in the three-dimensional coordinate system of the measurement point selected by the measurement point selection instruction, and determine the geometric parameters of the object to be measured according to the coordinates; the measurement point is a measurement point associated with the object to be measured.
  • the second display unit 904 is configured to display the measurement mark of the measurement point and the geometric parameters of the object to be measured.
  • the present application provides a terminal for realizing the method for measuring the geometric parameters of an object.
  • The terminal may be a smart phone, a tablet computer, a personal computer (PC), a learning machine, or the like, and may include a processor 11, a memory 12, one or more input devices 13 (only one is shown in FIG. 10), one or more output devices 14 (only one is shown in FIG. 10), and a camera assembly 15.
  • the processor 11, the memory 12, the input device 13, the output device 14 and the camera assembly 15 are connected by a bus 16.
  • The processor 11 may be a central processing unit; the processor may also be another general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
  • the input device 13 may include a virtual keyboard, a touch panel, a fingerprint sensor (used to collect user fingerprint information and fingerprint orientation information), a microphone, etc.
  • the output device 14 may include a display, a speaker, and the like.
  • the memory 12 may include a read-only memory and a random access memory, and provides instructions and data to the processor 11. A part or all of the memory 12 may also include a non-volatile random access memory. For example, the memory 12 may also store device type information.
  • The memory 12 stores a computer program that can be run on the processor 11; for example, the computer program is a program of a method for measuring geometric parameters of an object.
  • When the processor 11 executes the computer program, the steps in the embodiments of the method for measuring geometric parameters of an object are implemented, for example, step 101 to step 103 shown in FIG. 1.
  • Alternatively, when the processor 11 executes the computer program, the functions of the modules/units in the foregoing device embodiments are implemented, for example, the functions of units 901 to 904 shown in FIG. 9.
  • the foregoing computer program may be divided into one or more modules/units, and the foregoing one or more modules/units are stored in the foregoing memory 12 and executed by the foregoing processor 11 to complete the application.
  • the aforementioned one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the aforementioned computer program in the aforementioned terminal for measuring the geometric parameters of the object.
  • For example, the above computer program can be divided into an establishing unit, a first display unit, a determining unit, and a second display unit, whose functions are as follows: the establishing unit is used to obtain the first depth image of the real environment captured by the camera assembly and establish a three-dimensional coordinate system based on the real environment according to the first depth image; the first display unit is used to obtain the pose data of the terminal, obtain the second depth image and the two-dimensional image of the object to be measured, and display the two-dimensional image on the display interface of the terminal; the determining unit is configured to receive a measurement point selection instruction, determine, according to the pose data, the second depth image, and the two-dimensional image, the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction, and determine the geometric parameters of the object to be measured according to the coordinates, the measurement point being a measurement point associated with the object to be measured; the second display unit is used to display the measurement mark of the measurement point and the geometric parameters of the object to be measured.
  • the disclosed apparatus/terminal and method may be implemented in other ways.
  • the above-described device/terminal embodiments are only illustrative.
  • The division of the above-mentioned modules or units is only a logical function division; in actual implementation there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the above integrated modules/units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, this application may implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program.
  • The above-mentioned computer program may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the foregoing method embodiments.
  • the above-mentioned computer program includes computer program code, and the above-mentioned computer program code may be in the form of source code, object code, executable file, or some intermediate forms.
  • The above-mentioned computer-readable medium may include any entity or device capable of carrying the above-mentioned computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunications signal, a software distribution medium, and the like.
  • The content contained in the above-mentioned computer-readable media may be appropriately increased or decreased in accordance with the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electric carrier signals and telecommunications signals.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Botany (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for measuring geometric parameters of an object, comprising: acquiring a first depth image of a real environment captured by a camera assembly, and establishing a three-dimensional coordinate system based on the real environment according to the first depth image (101); acquiring pose data of a terminal, acquiring a second depth image and a two-dimensional image of an object to be measured, and displaying the two-dimensional image on a display interface of the terminal (102); receiving a measurement point selection instruction, determining, according to the pose data, the second depth image, and the two-dimensional image, the coordinates in the three-dimensional coordinate system of the measurement point selected by the instruction, and determining geometric parameters of the object to be measured according to the coordinates (103); and displaying, in the two-dimensional image shown on the display interface, the measurement mark of the measurement point and the geometric parameters of the object to be measured (104). The method improves the measurement accuracy of AR measurement technology. An apparatus and a terminal for measuring geometric parameters of an object are also disclosed.

Description

物体几何参数的测量方法、装置和终端
本申请要求于2019年4月15日提交中国专利局、申请号为201910302960.6、申请名称为“物体几何参数的测量方法、装置和终端”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请属于测量技术领域,尤其涉及一种物体几何参数的测量方法、装置和终端。
背景技术
随着图像处理技术的发展,目前,已经可以实现利用摄像头捕获的图像对现实世界中的物体进行位置和姿态的识别。其中,增强现实技术(Augmented reality,AR)已经为这种物体识别提供了应用实例。AR技术是一种可以实现屏幕上的虚拟世界能够与现实世界场景进行结合与交互的技术。
例如,利用增强现实技术进行物体测量的AR测量技术中,可以让虚拟环境中的测量数据叠加到摄像头捕获的现实环境的图像中,为用户提供测量便利。
发明内容
本申请实施例提供一种物体几何参数的测量方法、装置和终端,可以解决目前的AR测量技术存在较大测量误差的技术问题。
本申请实施例第一方面提供一种物体几何参数的测量方法,包括:
获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;
在所述显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数。
本申请实施例第二方面提供一种物体几何参数的测量装置,包括:
建立单元,用于获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
第一显示单元,用于获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
确定单元,用于接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何 参数;所述测量点为与待测物体关联的测量点;
第二显示单元,用于显示所述测量点的测量标记以及所述待测物体的几何参数。
本申请实施例第三方面提供一种终端,包括摄像组件、存储器、处理器以及存储在存储器中并可在处理器上运行的计算机程序,处理器执行计算机程序时实现上述方法的步骤。
本申请实施例第四方面提供一种计算机可读存储介质,计算机可读存储介质存储有计算机程序,计算机程序被处理器执行时实现上述方法的步骤。
附图说明
为了更清楚地说明本申请实施例的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,应当理解,以下附图仅示出了本申请的某些实施例,因此不应被看作是对范围的限定,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他相关的附图。
图1是本申请实施例提供的一种物体几何参数的测量方法的实现流程示意图;
图2是本申请实施例提供的物体几何参数的测量过程示意图;
图3是本申请实施例提供的一种物体几何参数的测量方法步骤103的具体实现流程示意图;
图4是本申请实施例提供的显示界面的第一示意图;
图5是本申请实施例提供的显示界面的第二示意图;
图6是本申请实施例提供的显示界面的第三示意图;
图7是本申请实施例提供的第一自动测量模式下物体几何参数的测量方法流程示意图;
图8是本申请实施例提供的不规则形状的物体的几何参数测量示意图;
图9是本申请实施例提供的物体几何参数的测量装置的结构示意图;
图10是本申请实施例提供的终端的结构示意图。
具体实施方式
为了使本申请的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本申请进行进一步详细说明。应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。同时,在本申请的描述中,术语“第一”、“第二”等仅用于区分描述,而不能理解为指示或暗示相对重要性。
目前,利用单目摄像头进行AR测量时,一般需要引导用户对终端进行移动,由所述单目摄像头拍摄多帧连续的照片,并根据预先定义的规则进行平面识别,以识别到的平面为基准进行AR测量。这种测量方式不仅存在初始化速度慢的问题,还由于其需要依赖平面测量或者依赖 丰富的纹理场景进行测量,造成其对于缺乏丰富纹理的垂直平面,以及垂直平面的颜色接近与其连接的水平面的颜色时,将很难识别出该垂直平面,甚至有可能将该垂直平面误认为是一个距离终端非常远的水平平面,进而出现较大的测量误差。
为了解决这一技术问题,在本申请实施例中,通过获取摄像组件拍摄的现实环境的第一深度图像建立基于所述现实环境的三维坐标系,完成测量前的初始化;并在初始化完成之后,实时获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像;同时,接收用户触发的测量点选中指令,并根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,从而根据所述坐标确定所述待测物体的几何参数,实现所述待测物体几何参数的测量,并在所述显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数;该测量方式不需要依赖于平面识别,而是根据终端的位姿数据,以及待测物体的第二深度图像和二维图像确定与所述待测物体关联的测量点在所述三维坐标系下的坐标,并根据该坐标确定所述待测物体的几何参数,本申请的测量方式可以实现对缺乏丰富纹理的垂直平面的有效识别,并且对于垂直平面的颜色接近与其连接的水平面的颜色的情况也能够进行准确判断,不会出现将该垂直平面误认为是一个距离终端非常远的水平平面的情况,提高了AR测量技术的测量精度。
如图1示出了本申请实施例提供的一种物体几何参数的测量方法的实现流程示意图,该测量方法应用于终端,可以由终端上配置的物体几何参数的测量装置执行,适用于需提高物体测量精度的情形。所述终端可以包括智能手机、平板电脑、学习机等智能终端。该终端可以安装有摄像组件,所述物体几何参数的测量方法包括步骤101至步骤104。
步骤101,获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系。
在进行物体几何参数的测量时,先要获取摄像组件拍摄的现实环境的第一深度图像,以建立基于所述现实环境的三维坐标系,完成测量的初始化。
所述摄像组件可以包括深度摄像头和RGB摄像头,其中,所述深度摄像头用于采集深度图像,所述RGB摄像头用于采集二维彩色图像(二维图像)。
所述深度图像的每个像素点的灰度值可用于表征场景中某一点距离摄像头的远近。
在本申请的一些实施方式中,所述深度摄像头的分辨率可以等于所述RGB摄像头的分辨率,以使所述二维图像上的每个像素均可以获取到准确的深度信息。
在本申请的一些实施方式中,所述深度摄像头可以为TOF摄像头。
在本申请的一些实施方式中,所述摄像组件还可以为可以同时输出 深度图像和二维彩色图像的3D摄像头。
在利用终端进行物体几何参数的测量时,可以先打开相机应用,开启所述摄像组件,进行现实环境的第一深度图像的获取,并在接收到深度摄像头返回的第一深度图像时,以第一深度图像中的任意一个有效深度数据对应的现实环境中的点为坐标原点,建立所述基于所述现实环境的三维坐标系,并作为进行物体几何参数测量时坐标计算的参考依据。其中,所述有效深度数据是指深度值在预设深度值范围内的深度数据。
例如,如图2所示,以沙发上的任意一个点为坐标原点,建立所述基于所述现实环境的三维坐标系。
需要说明的是,在本申请实施例中,所述三维坐标系建立完成后,表示摄像组件已完成测量前的初始化,可以开始进行物体几何参数的测量。由于该初始化过程可以不需要进行平面识别,并且也可以不需要进行终端的移动以及采集多帧照片,因此,本申请实现物体几何参数的测量时,具有初始化速度快的特点,可以达到“秒开”的效果。
步骤102,获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像。
由于用户在使用终端进行物体几何参数的测量时,有可能会移动所述终端,因此,需要进行终端位姿数据的获取,以确定终端当前时刻相对于初始化完成时的时刻的姿态变化,并依此确定摄像组件拍摄的待测物体的第二深度图像和二维图像中各个像素点对应的在所述三维坐标系下的坐标。
可选的,所述获取终端的位姿数据包括:在所述三维坐标系建立完成时开始,利用惯性测量单元IMU实时获取所述终端的六自由度位姿数据。
由于物体在空间具有六个自由度,即沿x、y、z三个直角坐标轴方向的移动自由度和绕这三个坐标轴的转动自由度。因此,要完全确定物体的位置,就必须清楚这六个自由度。
惯性测量单元(Inertial measurement unit,IMU)是测量物体三轴角速度以及加速度的装置。一般的,一个IMU包含了三个单轴的加速度计和三个单轴的陀螺仪,加速度计检测物体在载体坐标系统独立三轴的加速度,而陀螺仪检测载体相对于导航坐标系的角速度,测量物体在三维空间中的角速度和加速度,并以此解算出物体的姿态。因此,本申请获取终端的位姿数据时,可以利用惯性测量单元IMU进行获取。
步骤103,接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点。
在获取了终端的位姿数据以及待测物体的第二深度图像和二维图像之后,用户可以根据终端的显示界面显示的所述二维图像在所述显示界面中触发所述测量点选中指令,以由所述终端根据所述位姿数据、所 述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数。
例如,如图2所示,用户可以通过在所述终端的显示界面20上进行点选21或滑动绘制线条22触发测量点选中指令,以确定要待测物体,已经与所述待测物体关联的测量点。
可选的,如图3所示,所述根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标可以包括:步骤301至步骤302。
步骤301,根据所述测量点在所述二维图像上的位置以及对应的深度值,确定所述测量点在所述二维图像上的像素坐标。
本发明实施例中,所述像素坐标是指二维图像上各个像素点之间的相对位置关系,并结合各个像素点的深度值组成的坐标。
例如,可以以二维图像的左下角的像素点为二维坐标原点,建立像素坐标系,并确定所述二维图像上各个像素点的二维坐标,再将所述二维坐标与二维图像各个像素点的深度值进行结合,得到所述二维图像中各个像素点的像素坐标。
步骤302,根据所述摄像组件的参数信息以及所述位姿数据将所述像素坐标映射到所述三维坐标系下的坐标。
所述根据所述摄像组件的参数信息以及所述位姿数据将所述像素坐标映射到所述三维坐标系下的坐标包括:根据所述摄像组件的参数信息和所述终端的位姿数据确定终端显示界面上的二维图像的像素坐标与所述三维坐标系下的坐标的映射矩阵,并根据所述映射矩阵将所述像素坐标映射到所述三维坐标系下的坐标。
其中,所述摄像组件的参数信息包括所述摄像组件的内参和外参,所述内参包括在u轴和v轴方向上的等效焦距f x,f y,以及像平面的实际中心点坐标u0,v0。
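The pixel-to-world mapping of step 302 can be sketched as a pinhole back-projection into camera coordinates using the equivalent focal lengths f_x, f_y and the image-plane centre (u0, v0), followed by a rigid camera-to-world transform derived from the terminal pose. The intrinsics, rotation R, and translation t below are illustrative assumptions, not calibration values from the application:

```python
def pixel_to_world(u, v, depth, fx, fy, u0, v0, R, t):
    """Map a pixel (u, v) with a depth value to the real-environment 3D coordinate system."""
    # Back-project into the camera frame with the pinhole model.
    pc = ((u - u0) * depth / fx, (v - v0) * depth / fy, depth)
    # Apply the camera-to-world rotation R (3x3) and translation t (length 3),
    # obtained from the terminal's six-degree-of-freedom pose.
    return tuple(sum(R[i][j] * pc[j] for j in range(3)) + t[i] for i in range(3))

# Hypothetical example: identity pose, principal point at (320, 240).
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
world_pt = pixel_to_world(320, 240, 1.5, 500.0, 500.0, 320.0, 240.0,
                          identity, (0.0, 0.0, 0.0))
```

A pixel at the principal point maps straight down the optical axis, so with the identity pose it lands at (0, 0, depth) in the world frame.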
步骤104,在所述显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数。
所述几何参数可以包括距离参数、长度参数、面积参数和角度参数。
例如,如图4所示,所述测量点标记可以为实心黑点“·”,所述几何参数包括距离参数2.5m、长度参数3m、1.2m、面积参数和角度参数。
在本发明的其他实施方式中,所述测量点标记还可以为其他形式的标记。例如,可以为空心圆点、实心黑三角形等等,此处仅仅是距离说明,不表示为对本申请保护范围的限制。
可选的,在本申请的一些实施方式中,所述接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数之后,还可以包括:保存所述测量点的坐 标以及根据所述测量点的坐标确定的所述几何参数;当所述显示界面的显示区域内包含已保存的测量点时,显示所述已保存的测量点的测量标记,并显示与所述已保存的测量点的坐标对应的几何参数。
也就是说,每次对测量点进行测量之后,需要对该测量点的坐标以及根据所述测量点的坐标确定的所述几何参数进行保存,使得终端移动到其他位置进行测量点的测量时,已保存的测量点的数据不会消失。当终端的显示界面的显示区域中包含已保存的测量点时,则继续显示所述已保存的测量点的测量标记,并显示与所述已保存的测量点的坐标对应的几何参数。使得终端不会因为位置的移动而丢失历史测量数据,方便用户实现移动测量。
需要说明的是,由于终端保存的是坐标数据,因此,已保存的测量点的数据并不会因为现实环境中的物体的变化而变化,而已保存的测量点的数据中,距离数据可以根据终端位置的变化而变化。
例如,如图5所示,用户完成对沙发的长度数据和距离数据的测量之后,将沙发移动到其他位置,并将终端向靠近沙发原来位置的方向移动0.5m,可以发现,终端的显示界面的显示区域中仍会显示沙发的长度数据,不会因为沙发的移动而消失。并且,由于长度数据是两个坐标之间的距离值,因此也不会因为终端的位置变化而变化;但是,由于几何参数中的距离参数是测量点与终端之间的距离值,因此,距离参数可以因为终端的位置的变化而变化。
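Because the saved measurement points are world-frame coordinates, a length between two saved points is unaffected by terminal movement, while the distance parameter is recomputed from the terminal's current position. A small illustration with invented coordinates:

```python
import math

# Two saved measurement points, in the real-environment coordinate system.
p1 = (0.0, 0.0, 3.0)
p2 = (3.0, 0.0, 3.0)

length = math.dist(p1, p2)  # length parameter: independent of the terminal

cam_before = (0.0, 0.0, 0.0)  # terminal position at the first measurement
cam_after = (0.0, 0.0, 0.5)   # terminal moved 0.5 m toward the object

dist_before = math.dist(p1, cam_before)  # distance parameter before moving
dist_after = math.dist(p1, cam_after)    # distance parameter after moving
```

Re-evaluating `length` after the move returns the same value; only the distance parameter changes with the terminal position.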
可选的,在上述描述的实施方式中,上述步骤103,接收测量点选中指令之前还可以包括:获取终端当前的测量模式;所述测量模式包括:距离测量模式、长度测量模式、面积测量模式和角度测量模式中的一种或多种测量模式。
也就是说,上述各个测量模式相互分离,在距离测量模式下,只进行距离测量,在长度测量模式下只进行长度测量,在面积测量模式下,只进行面积测量,在角度测量模式下,只进行角度测量。
例如,如图6所示,可以通过滑动选择其中一种测量模式进行几何参数的测量。
相应的,所述根据所述坐标确定所述待测物体的几何参数可以包括:若所述终端当前的测量模式为所述距离测量模式,则根据所述坐标确定所述待测物体与所述终端之间的距离参数;若所述终端当前的测量模式为长度测量模式,则根据所述坐标确定所述待测物体的长度参数;若所述终端当前的测量模式为面积测量模式,则根据所述坐标确定所述待测物体的面积参数;若所述终端当前的测量模式为角度测量模式,则根据所述坐标确定所述待测物体的角度参数。
其中,若所述终端当前的测量模式为距离测量模式,则接收单一测量点选中指令,并根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述单一测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体与所述终端之间的距离参 数。
若所述终端当前的测量模式为长度测量模式,则接收与待测物体关联的一条线段绘制指令或多条线段绘制指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定每条线段绘制指令绘制的线段的端点坐标,并根据所述端点坐标确定待测物体的长度参数。
在本申请的一些实施方式中,在进行线段的绘制时,实时根据所述位姿数据、所述第二深度图像以及所述二维图像确定每条线段绘制指令绘制的线段的端点坐标,并根据所述端点坐标确定每条线段的长度,同时,根据线段的绘制实时更新所述待测物体的长度参数。
若所述终端当前的测量模式为面积测量模式,则接收与待测物体关联的多条线段绘制指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定每条线段绘制指令绘制的线段的端点坐标,并根据所述端点坐标确定由所述多条线段围成的封闭图形的面积参数。
在本申请的一些实施方式中,在所述面积测量模式下,还可以自动计算由所述多条线段围成的封闭图形的周长。
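In the area measurement mode, once the endpoint coordinates of the enclosing segments are known, the area of the closed figure and its perimeter follow from standard vector geometry. A sketch with invented coordinates; the cross-product fan sum assumes the vertices are coplanar and ordered along the boundary:

```python
import math

def polygon_area(vertices):
    """Area of a planar closed polygon in 3D, via a fan of cross products from vertices[0]."""
    sx = sy = sz = 0.0
    ox, oy, oz = vertices[0]
    for i in range(1, len(vertices) - 1):
        ux, uy, uz = (vertices[i][0] - ox, vertices[i][1] - oy, vertices[i][2] - oz)
        wx, wy, wz = (vertices[i + 1][0] - ox, vertices[i + 1][1] - oy, vertices[i + 1][2] - oz)
        sx += uy * wz - uz * wy
        sy += uz * wx - ux * wz
        sz += ux * wy - uy * wx
    return 0.5 * math.sqrt(sx * sx + sy * sy + sz * sz)

def perimeter(vertices):
    """Total length of the closed boundary."""
    n = len(vertices)
    return sum(math.dist(vertices[i], vertices[(i + 1) % n]) for i in range(n))

# Hypothetical 3 m x 1.2 m rectangle at a constant depth of 2 m.
rect = [(0.0, 0.0, 2.0), (3.0, 0.0, 2.0), (3.0, 1.2, 2.0), (0.0, 1.2, 2.0)]
area = polygon_area(rect)  # approximately 3.6 square metres
per = perimeter(rect)      # approximately 8.4 metres
```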
若所述终端当前的测量模式为角度测量模式,则接收与待测物体关联的连续两条线段绘制指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述连续两条线段绘制指令绘制的线段的端点坐标,并根据所述端点坐标确定由所述连续两条线段绘制指令绘制的夹角的角度。
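In the angle measurement mode, the angle between the two consecutively drawn segments can be computed from their shared endpoint and the two far endpoints with a dot product. A sketch with invented coordinates:

```python
import math

def angle_between(origin, p, q):
    """Angle in degrees at `origin` between segments origin->p and origin->q."""
    u = tuple(p[i] - origin[i] for i in range(3))
    v = tuple(q[i] - origin[i] for i in range(3))
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # Clamp to guard against rounding just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

# Hypothetical segments meeting at the origin along the x and y axes.
deg = angle_between((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

Perpendicular segments give a 90-degree result; collinear segments give 0 or 180 degrees.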
在上述描述的实施方式中,在进行物体几何参数的测量过程中,均需要接收相应的测量点选中指令,为了使得物体几何参数的测量更加便利,在本申请的一些实施方式中,所述测量模式还可以包括:第一自动测量模式。
例如,用户可以通过点击如图6所示的第一自动测量模式选择控件61触发所述第一自动测量模式。
如图7所示,为本申请实施例提供的第一自动测量模式下物体几何参数的测量方法流程示意图,所述第一自动测量模式下物体几何参数的测量方法可以包括:步骤701至步骤703。
步骤701,若所述终端当前的测量模式为所述第一自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状。
步骤702,若所述待测物体的外形为规则形状,则将所述规则形状的各个端点以及所述规则形状中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数、长度参数和面积参数。
步骤703,若所述待测物体的外形为不规则形状,则将所述不规则形状的物体中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所 述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
在本申请实施例中,所述规则形状是指四边形、三角形、圆形、五边形等形状。
对于家居用品的选购过程中,或者一些其他的应用场景,用户有可能只想知道某个物体的高度和宽度,以便大致确定其需要的摆放空间大小。因此,为了使得物体几何参数的测量更加便利,可以通过识别待测物体是否为规则形状的方式进行相应的自动测量。
例如,对于门窗、冰箱和书柜等规则形状的物体,则可以将所述规则形状的各个端点以及所述规则形状中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数、长度参数和面积参数。
对于吹风机、盆栽等不规则形状的物体,则可以将所述不规则形状的物体中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
例如,如图8所示,可以将所述不规则形状的物体中上下左右四个方向上的最远点A、B、C、D确定为所述测量点,并将测量点A、B之间的竖直高度M确定为所述待测物体的最大高度,将测量点C、D之间的水平宽度N确定为所述待测物体的最大宽度。
可选的,在本申请的一些实施方式中,所述测量模式还可以包括:基于距离测量模式的自动测量模式和基于长度测量模式的自动测量模式。本实施例中的测量模式相比于图7所示的第一自动测量模式,具有更高的灵活性。
即,在基于距离测量模式的自动测量模式下只进行距离的自动测量,而基于长度测量模式的自动测量模式只进行长度的自动测量。
例如,用户可以通过点击如图6所示的自动测量模式选择控件62并结合终端当前的距离测量模式即可实现基于距离测量模式的自动测量模式的触发。
可选的,若所述终端当前的测量模式为所述基于距离测量模式的自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并将所述待测物体中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数。
若所述终端当前的测量模式为所述基于长度测量模式的自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状;若所述待测物体的外形为规 则形状,则将所述规则形状的各个端点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的各个相邻测量点间的长度参数;若所述待测物体的外形为不规则形状,则将所述不规则形状中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
需要说明的是,在上述描述的各个实施方式中,如图6所示,还可以通过点击撤销控件63撤销上一测量指令,例如,撤销上一测量点选中指令或面积测量指令等等;并且,还可以通过点击拍照控件64,实现测量数据的保存,即以携带测量数据的照片方式进行测量数据的保存。
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
图9示出了本申请实施例提供的一种物体几何参数的测量装置900的结构示意图,所述物体几何参数的测量装置配置于终端,包括建立单元901、第一显示单元902、确定单元903和第二显示单元904。
建立单元901,用于获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
第一显示单元902,用于获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
确定单元903,用于接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;
第二显示单元904,用于显示所述测量点的测量标记以及所述待测物体的几何参数。
需要说明的是,为描述的方便和简洁,上述描述的物体几何参数的测量装置900的具体工作过程,可以参考上述图1至图5中描述的方法的对应过程,在此不再赘述。
如图10所示,本申请提供一种用于实现上述物体几何参数的测量方法的终端,该终端可以为智能手机、平板电脑、个人电脑(PC)、学习机等终端,可以包括:处理器11、存储器12、一个或多个输入设备13(图10中仅示出一个)、一个或多个输出设备14(图10中仅示出一个)和摄像组件15。处理器11、存储器12、输入设备13、输出设备14和摄像组件15通过总线16连接。
应当理解,在本申请实施例中,所称处理器11可以是中央处理单元,该处理器还可以是其他通用处理器、数字信号处理器、专用集成电 路、现场可编程门阵列或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
输入设备13可以包括虚拟键盘、触控板、指纹采传感器(用于采集用户的指纹信息和指纹的方向信息)、麦克风等,输出设备14可以包括显示器、扬声器等。
存储器12可以包括只读存储器和随机存取存储器,并向处理器11提供指令和数据。存储器12的一部分或全部还可以包括非易失性随机存取存储器。例如,存储器12还可以存储设备类型的信息。
上述存储器12存储有计算机程序,上述计算机程序可在上述处理器11上运行,例如,上述计算机程序为物体几何参数的测量方法的程序。上述处理器11执行上述计算机程序时实现上述物体几何参数的测量方法实施例中的步骤,例如图1所示的步骤101至步骤103。或者,上述处理器11执行上述计算机程序时实现上述各装置实施例中各模块/单元的功能,例如图9所示单元901至904的功能。
上述计算机程序可以被分割成一个或多个模块/单元,上述一个或者多个模块/单元被存储在上述存储器12中,并由上述处理器11执行,以完成本申请。上述一个或多个模块/单元可以是能够完成特定功能的一系列计算机程序指令段,该指令段用于描述上述计算机程序在上述进行物体几何参数的测量的终端中的执行过程。例如,上述计算机程序可以被分割成建立单元、第一显示单元、确定单元和第二显示单元,各单元具体功能如下:建立单元,用于获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;第一显示单元,用于获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;确定单元,用于接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;第二显示单元,用于显示所述测量点的测量标记以及所述待测物体的几何参数。
所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,仅以上述各功能单元、模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能单元、模块完成,即将上述装置的内部结构划分成不同的功能单元或模块,以完成以上描述的全部或者部分功能。实施例中的各功能单元、模块可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中,上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。另外,各功能单元、模块的具体名称也只是为了便于相互区分,并不用于限制本申请的保护范围。上述系统中单元、模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不 再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的实施例中,应该理解到,所揭露的装置/终端和方法,可以通过其它的方式实现。例如,以上所描述的装置/终端实施例仅仅是示意性的,例如,上述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通讯连接可以是通过一些接口,装置或单元的间接耦合或通讯连接,可以是电性,机械或其它的形式。
上述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
上述集成的模块/单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实现上述实施例方法中的全部或部分流程,也可以通过计算机程序来指令相关的硬件来完成,上述的计算机程序可存储于一计算机可读存储介质中,该计算机程序在被处理器执行时,可实现上述各个方法实施例的步骤。其中,上述计算机程序包括计算机程序代码,上述计算机程序代码可以为源代码形式、对象代码形式、可执行文件或某些中间形式等。上述计算机可读介质可以包括:能够携带上述计算机程序代码的任何实体或装置、记录介质、U盘、移动硬盘、磁碟、光盘、计算机存储器、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、电载波信号、电信信号以及软件分发介质等。需要说明的是,上述计算机可读介质包含的内容可以根据司法管辖区内立法和专利实践的要求进行适当的增减,例如在某些司法管辖区,根据立法和专利实践,计算机可读介质不包括电载波信号和电信信号。
以上上述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围,均应包含在本申请的保护范围之内。

Claims (20)

  1. 一种物体几何参数的测量方法,其特征在于,包括:
    获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
    获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
    接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;
    在所述显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数。
  2. 如权利要求1所述的测量方法,其特征在于,所述接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数之后,包括:
    保存所述测量点的坐标以及根据所述测量点的坐标确定的所述几何参数;当所述显示界面的显示区域内包含已保存的测量点时,显示所述已保存的测量点的测量标记,并显示与所述已保存的测量点的坐标对应的几何参数。
  3. 如权利要求1所述的测量方法,其特征在于,所述根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,包括:
    根据所述测量点在所述二维图像上的位置以及对应的深度值,确定所述测量点在所述二维图像上的像素坐标;
    根据所述摄像组件的参数信息和所述位姿数据将所述像素坐标映射到所述三维坐标系下的坐标。
  4. 如权利要求1-3任意一项所述的测量方法,其特征在于,在所述接收测量点选中指令之前,包括:
    获取终端当前的测量模式;所述测量模式包括:距离测量模式、长度测量模式、面积测量模式和角度测量模式中的一种或多种测量模式;
    所述根据所述坐标确定所述待测物体的几何参数包括:
    若所述终端当前的测量模式为所述距离测量模式,则根据所述坐标确定所述待测物体与所述终端之间的距离参数;
    若所述终端当前的测量模式为长度测量模式,则根据所述坐标确定所述待测物体的长度参数;
    若所述终端当前的测量模式为面积测量模式,则根据所述坐标确定所述待测物体的面积参数;
    若所述终端当前的测量模式为角度测量模式,则根据所述坐标确定所述待测物体的角度参数。
  5. 如权利要求4所述的测量方法,其特征在于,所述测量模式还包括:第一自动测量模式;
    若所述终端当前的测量模式为所述第一自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状;
    若所述待测物体的外形为规则形状,则将所述规则形状的各个端点以及所述规则形状中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数、长度参数和面积参数;
    若所述待测物体的外形为不规则形状,则将所述不规则形状中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
  6. 如权利要求4所述的测量方法,其特征在于,所述测量模式还包括:基于距离测量模式的自动测量模式和基于长度测量模式的自动测量模式;
    若所述终端当前的测量模式为所述基于距离测量模式的自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并将所述待测物体中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数;
    若所述终端当前的测量模式为所述基于长度测量模式的自动测量模式,则
    识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状;
    若所述待测物体的外形为规则形状,则将所述规则形状的各个端点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的各个相邻测量点间的长度参数;
    若所述待测物体的外形为不规则形状,则将所述不规则形状中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
  7. 如权利要求1所述的测量方法,其特征在于,所述获取终端的位姿数据包括:
    在所述三维坐标系建立完成时开始,利用惯性测量单元IMU实时获 取所述终端的六自由度位姿数据。
  8. 如权利要求1所述的测量方法,其特征在于,所述根据所述第一深度图像建立基于所述现实环境的三维坐标系,包括:
    以第一深度图像中的任意一个有效深度数据对应的现实环境中的点为坐标原点,建立所述基于所述现实环境的三维坐标系。
  9. 如权利要求1所述的测量方法,其特征在于,在所述在显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数之后,还包括:
    接收用户点击拍照控件触发的拍照指令,得到携带测量数据的照片;所述测量数据包括所述测量点的测量标记以及所述待测物体的几何参数。
  10. 一种物体几何参数的测量装置,其特征在于,包括:
    建立单元,用于获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
    第一显示单元,用于获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
    确定单元,用于接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;
    第二显示单元,用于显示所述测量点的测量标记以及所述待测物体的几何参数。
  11. 一种终端,包括摄像组件、存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现以下步骤:
    获取摄像组件拍摄的现实环境的第一深度图像,根据所述第一深度图像建立基于所述现实环境的三维坐标系;
    获取终端的位姿数据,并获取待测物体的第二深度图像和二维图像,在终端的显示界面显示所述二维图像;
    接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数;所述测量点为与待测物体关联的测量点;
    在所述显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数。
  12. 如权利要求11所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    在所述接收测量点选中指令,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点选中指令选中的测量点在所述三维 坐标系下的坐标,并根据所述坐标确定所述待测物体的几何参数之后,保存所述测量点的坐标以及根据所述测量点的坐标确定的所述几何参数;当所述显示界面的显示区域内包含已保存的测量点时,显示所述已保存的测量点的测量标记,并显示与所述已保存的测量点的坐标对应的几何参数。
  13. 如权利要求11所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    根据所述测量点在所述二维图像上的位置以及对应的深度值,确定所述测量点在所述二维图像上的像素坐标;
    根据所述摄像组件的参数信息和所述位姿数据将所述像素坐标映射到所述三维坐标系下的坐标。
  14. 如权利要求11-13任意一项所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    在所述接收测量点选中指令之前,获取终端当前的测量模式;所述测量模式包括:距离测量模式、长度测量模式、面积测量模式和角度测量模式中的一种或多种测量模式;
    所述根据所述坐标确定所述待测物体的几何参数包括:
    若所述终端当前的测量模式为所述距离测量模式,则根据所述坐标确定所述待测物体与所述终端之间的距离参数;
    若所述终端当前的测量模式为长度测量模式,则根据所述坐标确定所述待测物体的长度参数;
    若所述终端当前的测量模式为面积测量模式,则根据所述坐标确定所述待测物体的面积参数;
    若所述终端当前的测量模式为角度测量模式,则根据所述坐标确定所述待测物体的角度参数。
  15. 如权利要求14所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    若所述终端当前的测量模式为所述第一自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状;
    若所述待测物体的外形为规则形状,则将所述规则形状的各个端点以及所述规则形状中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数、长度参数和面积参数;
    若所述待测物体的外形为不规则形状,则将所述不规则形状中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
  16. 如权利要求14所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    若所述终端当前的测量模式为所述基于距离测量模式的自动测量模式,则识别所述显示界面显示的所述二维图像中包含的待测物体,并将所述待测物体中距离所述终端最近的点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的距离参数;
    若所述终端当前的测量模式为所述基于长度测量模式的自动测量模式,则
    识别所述显示界面显示的所述二维图像中包含的待测物体,并判断所述待测物体的外形是否为规则形状;
    若所述待测物体的外形为规则形状,则将所述规则形状的各个端点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的各个相邻测量点间的长度参数;
    若所述待测物体的外形为不规则形状,则将所述不规则形状中上下左右四个方向上的最远点确定为所述测量点,根据所述位姿数据、所述第二深度图像以及所述二维图像确定所述测量点在所述三维坐标系下的坐标,并根据所述坐标确定所述待测物体的最大高度和最大宽度。
  17. 如权利要求11所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    在所述三维坐标系建立完成时开始,利用惯性测量单元IMU实时获取所述终端的六自由度位姿数据。
  18. 如权利要求11所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    以第一深度图像中的任意一个有效深度数据对应的现实环境中的点为坐标原点,建立所述基于所述现实环境的三维坐标系。
  19. 如权利要求11所述的终端,其特征在于,所述处理器执行所述计算机程序时还实现以下步骤:
    在所述在显示界面显示的所述二维图像中显示所述测量点的测量标记以及所述待测物体的几何参数之后,接收用户点击拍照控件触发的拍照指令,得到携带测量数据的照片;所述测量数据包括所述测量点的测量标记以及所述待测物体的几何参数。
  20. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至9任意一项所述方法的步骤。
PCT/CN2020/082081 2019-04-15 2020-03-30 物体几何参数的测量方法、装置和终端 WO2020211626A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20790606.6A EP3943881A4 (en) 2019-04-15 2020-03-30 METHOD AND APPARATUS FOR MEASURING A GEOMETRIC PARAMETER OF AN OBJECT, AND ASSOCIATED TERMINAL
US17/478,611 US20220003537A1 (en) 2019-04-15 2021-09-17 Method and apparatus for measuring geometric parameter of object, and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910302960.6 2019-04-15
CN201910302960.6A CN110006343B (zh) 2019-04-15 2019-04-15 物体几何参数的测量方法、装置和终端

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/478,611 Continuation US20220003537A1 (en) 2019-04-15 2021-09-17 Method and apparatus for measuring geometric parameter of object, and terminal

Publications (1)

Publication Number Publication Date
WO2020211626A1 true WO2020211626A1 (zh) 2020-10-22

Family

ID=67172121

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/082081 WO2020211626A1 (zh) 2019-04-15 2020-03-30 物体几何参数的测量方法、装置和终端

Country Status (4)

Country Link
US (1) US20220003537A1 (zh)
EP (1) EP3943881A4 (zh)
CN (2) CN112797897B (zh)
WO (1) WO2020211626A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112707340A (zh) * 2020-12-10 2021-04-27 安徽有光图像科技有限公司 基于视觉识别的设备控制信号生成方法、装置及叉车
CN113689478A (zh) * 2021-09-03 2021-11-23 凌云光技术股份有限公司 量测设备的对齐方法、装置及系统

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112797897B (zh) * 2019-04-15 2022-12-06 Oppo广东移动通信有限公司 Method and apparatus for measuring geometric parameter of object, and terminal
WO2021068799A1 (en) * 2019-10-07 2021-04-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Occlusion and collision detection for augmented reality applications
CN110736432A (zh) * 2019-10-23 2020-01-31 Oppo广东移动通信有限公司 Dimension annotation method and apparatus, electronic device, and storage medium
CN110930448B (zh) * 2019-11-01 2023-11-24 北京化工大学 Parameter measurement method and apparatus based on hand images
CN111540061B (zh) * 2020-04-09 2023-04-07 北京完美知识科技有限公司 Painting hanging method and apparatus, computer device, and computer-readable storage medium
CN111595304A (zh) * 2020-04-30 2020-08-28 宁波市交建工程监理咨询有限公司 Supervision measurement data recording method and system, and storage medium therefor
CN112037236A (zh) * 2020-08-28 2020-12-04 深圳开立生物医疗科技股份有限公司 Ultrasonic three-dimensional image measurement method, system, device, and computer medium
CN112033284B (zh) * 2020-08-28 2022-05-17 北京睿呈时代信息科技有限公司 Memory, and interactive measurement method, system and device based on surveillance video
CN112150527B (zh) * 2020-08-31 2024-05-17 深圳市慧鲤科技有限公司 Measurement method and apparatus, electronic device, and storage medium
CN112102390A (zh) * 2020-08-31 2020-12-18 北京市商汤科技开发有限公司 Measurement method and apparatus, electronic device, and storage medium
CN115031635A (zh) * 2020-08-31 2022-09-09 深圳市慧鲤科技有限公司 Measurement method and apparatus, electronic device, and storage medium
CN112683169A (zh) * 2020-12-17 2021-04-20 深圳依时货拉拉科技有限公司 Object size measurement method, apparatus, device, and storage medium
CN115046480B (zh) * 2021-03-09 2023-11-10 华为技术有限公司 Length measurement method, electronic device, and mobile device
CN114663345B (zh) * 2022-01-13 2023-09-01 北京众禾三石科技有限责任公司 Fixed-point measurement method and apparatus, electronic device, and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160109220A1 (en) * 2014-10-21 2016-04-21 Hand Held Products, Inc. Handheld dimensioning system with feedback
CN106352797A (zh) * 2015-07-13 2017-01-25 宇龙计算机通信科技(深圳)有限公司 Method and terminal for measuring object length using dual cameras
KR20180106480A (ko) * 2017-03-20 2018-10-01 전자부품연구원 Apparatus and method for generating a high-resolution 3D depth image from 2D images
CN108613625A (zh) * 2018-05-03 2018-10-02 艾律有限责任公司 Measurement device using augmented reality and measurement method thereof
CN108844457A (zh) * 2017-05-18 2018-11-20 金钱猫科技股份有限公司 Accurate image measurement method and system
CN108844456A (zh) * 2017-05-18 2018-11-20 金钱猫科技股份有限公司 Fast image measurement method and system
CN108898676A (zh) * 2018-06-19 2018-11-27 青岛理工大学 Method and system for collision and occlusion detection between virtual and real objects
CN109559371A (zh) * 2017-09-27 2019-04-02 虹软科技股份有限公司 Method and apparatus for three-dimensional reconstruction
CN109615703A (zh) * 2018-09-28 2019-04-12 阿里巴巴集团控股有限公司 Augmented reality image display method, apparatus, and device
CN110006343A (zh) * 2019-04-15 2019-07-12 Oppo广东移动通信有限公司 Method and apparatus for measuring geometric parameter of object, and terminal

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6912293B1 (en) * 1998-06-26 2005-06-28 Carl P. Korobkin Photogrammetry engine for model construction
US8615376B2 (en) * 2010-05-21 2013-12-24 Sure-Shot Medical Device Inc. Method and apparatus for dimensional measurement
US20130308013A1 (en) * 2012-05-18 2013-11-21 Honeywell International Inc. d/b/a Honeywell Scanning and Mobility Untouched 3d measurement with range imaging
US9336440B2 (en) * 2013-11-25 2016-05-10 Qualcomm Incorporated Power efficient use of a depth sensor on a mobile device
CN104952058B (zh) * 2014-03-28 2019-01-15 联想(北京)有限公司 Information processing method and electronic device
KR20170066336A (ko) * 2014-08-13 2017-06-14 씨 쓰리 리미티드 Log scan system
EP3270098B2 (en) * 2015-03-10 2022-12-07 Hoya Lens Thailand Ltd. Measurement system for eyeglasses-wearing parameter, measurement program, measurement method therefor, and manufacturing method for eyeglasses lens
CN105180802B (zh) * 2015-05-04 2017-08-15 广东欧珀移动通信有限公司 Object size information recognition method and apparatus
US9792687B2 (en) * 2015-08-31 2017-10-17 Intel Corporation Point-to-point distance measurements in 3D camera images
US10096131B2 (en) * 2015-09-25 2018-10-09 Logical Turn Services Inc. Dimensional acquisition of packages
US10769806B2 (en) * 2015-09-25 2020-09-08 Logical Turn Services, Inc. Dimensional acquisition of packages
CN106813568B (zh) * 2015-11-27 2019-10-29 菜鸟智能物流控股有限公司 Object measurement method and apparatus
CN106839975B (zh) * 2015-12-03 2019-08-30 杭州海康威视数字技术股份有限公司 Depth-camera-based volume measurement method and system
JP6465789B2 (ja) * 2015-12-25 2019-02-06 Kddi株式会社 Program, apparatus, and method for calculating intrinsic parameters of a depth camera
US10587858B2 (en) * 2016-03-14 2020-03-10 Symbol Technologies, Llc Device and method of dimensioning using digital images and depth data
CN105976406B (zh) * 2016-04-26 2019-04-23 上海时元互联网科技有限公司 Measurement system, measurement apparatus, and foot shape measurement method and system
CN106247951B (zh) * 2016-08-29 2019-04-02 上海交通大学 Object measurement method based on depth images
US10089750B2 (en) * 2017-02-02 2018-10-02 Intel Corporation Method and system of automatic object dimension measurement by using image processing
CN108965690B (zh) * 2017-05-17 2021-02-26 欧姆龙株式会社 Image processing system, image processing apparatus, and computer-readable storage medium
US11321864B1 (en) * 2017-10-31 2022-05-03 Edge 3 Technologies User guided mode for measurement purposes
CN107894588B (zh) * 2017-11-13 2020-11-13 北京小米移动软件有限公司 Mobile terminal, distance measurement method, and size measurement method and apparatus
CN108304119B (zh) * 2018-01-19 2022-10-28 腾讯科技(深圳)有限公司 Object measurement method, smart terminal, and computer-readable storage medium
CN108537834B (zh) * 2018-03-19 2020-05-01 杭州艾芯智能科技有限公司 Depth-image-based volume measurement method and system, and depth camera
CN108627092A (zh) * 2018-04-17 2018-10-09 南京阿凡达机器人科技有限公司 Package volume measurement method and system, storage medium, and mobile terminal
CN108682031B (zh) * 2018-05-21 2021-08-17 深圳市酷开网络科技股份有限公司 Measurement method based on augmented reality, smart terminal, and storage medium
CN109186461A (zh) * 2018-07-27 2019-01-11 南京阿凡达机器人科技有限公司 Box size measurement method and measurement device
CN109218702B (zh) * 2018-09-05 2019-12-31 天目爱视(北京)科技有限公司 Camera-rotation 3D measurement and information acquisition apparatus
US20200090361A1 (en) * 2018-09-17 2020-03-19 Samin E&S Co.,Ltd. Apparatus and method for measuring dimension based on 3d point cloud data
CN109509182B (zh) * 2018-10-29 2021-03-26 首都航天机械有限公司 Image-processing-based method and system for measuring geometric dimensions of typical products
US20200304375A1 (en) * 2019-03-19 2020-09-24 Microsoft Technology Licensing, Llc Generation of digital twins of physical environments
US10838515B1 (en) * 2019-03-27 2020-11-17 Facebook, Inc. Tracking using controller cameras


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3943881A4


Also Published As

Publication number Publication date
EP3943881A4 (en) 2022-06-01
CN112797897A (zh) 2021-05-14
US20220003537A1 (en) 2022-01-06
CN110006343B (zh) 2021-02-12
CN112797897B (zh) 2022-12-06
CN110006343A (zh) 2019-07-12
EP3943881A1 (en) 2022-01-26

Similar Documents

Publication Publication Date Title
WO2020211626A1 (zh) Method and apparatus for measuring geometric parameter of object, and terminal
WO2020207191A1 (zh) Method and apparatus for determining occluded region of virtual object, and terminal device
CN110276317B (zh) Object size detection method, object size detection apparatus, and mobile terminal
WO2019205850A1 (zh) Pose determination method and apparatus, smart device, and storage medium
CN110276774B (zh) Object drawing method and apparatus, terminal, and computer-readable storage medium
JP2022533309A (ja) Image-based localization
US11308655B2 (en) Image synthesis method and apparatus
CN107507239B (zh) Image segmentation method and mobile terminal
CN107818290B (zh) Heuristic finger detection method based on depth maps
US20230113647A1 (en) Object measurement method, virtual object processing method, and electronic device
WO2022022141A1 (zh) Image display method and apparatus, computer device, and storage medium
CN104081307A (zh) Image processing apparatus, image processing method, and program
JP6965891B2 (ja) Information processing apparatus, information processing method, and recording medium
WO2021004412A1 (zh) Handheld input device and method and apparatus for controlling display position of its indicator icon
CN110310325B (zh) Virtual measurement method, electronic device, and computer-readable storage medium
CN111354029A (zh) Gesture depth determination method, apparatus, device, and storage medium
CN112365530A (zh) Augmented reality processing method and apparatus, storage medium, and electronic device
CN110134234B (zh) Method and apparatus for three-dimensional object localization
CN109934168B (zh) Face image mapping method and apparatus
EP3676801B1 (en) Electronic devices, methods, and computer program products for controlling 3d modeling operations based on pose metrics
US20200167005A1 (en) Recognition device and recognition method
CN111368675A (zh) Method, apparatus, device, and storage medium for processing gesture depth information
TWI779332B (zh) Augmented reality system and method for anchoring and displaying virtual objects
WO2022193180A1 (zh) Video frame processing method and apparatus
CN110941974B (zh) Virtual object control method and apparatus

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20790606

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020790606

Country of ref document: EP

Effective date: 20211021