WO2019169941A1 - Distance measurement method and apparatus - Google Patents

Distance measurement method and apparatus

Info

Publication number
WO2019169941A1
WO2019169941A1 (PCT/CN2018/125716 · CN2018125716W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
point
scene
scene point
target
Prior art date
Application number
PCT/CN2018/125716
Other languages
English (en)
French (fr)
Inventor
张旭
马超群
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2019169941A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/97 Determining parameters from multiple pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Software Systems (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A distance measurement method and apparatus, the method including: acquiring a target image and a parallax image or depth image corresponding to the target image, and performing point cloud conversion on the parallax image or depth image to obtain a point cloud image; then, according to the point cloud image and a first image point and a second image point in the target image, obtaining three-dimensional coordinates of a first scene point corresponding to the first image point and three-dimensional coordinates of a second scene point corresponding to the second image point, and obtaining the spatial distance between the first scene point and the second scene point. In this way, the distance measurement method in the embodiments of the present application requires no professional measuring tool and can complete the distance measurement simply and conveniently without increasing the hardware cost of the terminal device.

Description

Distance measurement method and apparatus
This application claims priority to Chinese Patent Application No. 201810179367.2, filed with the Intellectual Property Office of the People's Republic of China on March 5, 2018 and entitled "Distance measurement method and apparatus", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present invention relates to the field of terminal technologies, and in particular, to a distance measurement method and apparatus.
BACKGROUND
At present, the distance between two points in space is mainly measured in the following ways. One is manual measurement with a traditional measuring tool such as a ruler or a tape measure; because the ruler or tape measure has to be carried around, carrying it is rather inconvenient, and when the distance to be measured is large (for example, 10 meters), this way is also inconvenient to operate. The other is to use a measuring tool based on infrared light, laser, or ultrasound, which calculates the distance from the time difference of the reflected signal; this way likewise requires carrying a measuring tool, and such measuring tools are usually sophisticated electronic instruments with a high measurement cost. It can be seen that, in the prior art, measuring the distance between two points usually requires a professional measuring tool, and when the user does not carry such a professional measuring tool, the measurement may be impossible to complete.
In summary, a distance measurement method is urgently needed for measuring the distance between two points in space without a professional measuring tool.
SUMMARY
An embodiment of the present application provides a distance measurement method for measuring the distance between two points in space without a measuring tool.
According to a first aspect, an embodiment of the present application provides a distance measurement method, the method including:
acquiring a target image and a parallax image or a depth image corresponding to the target image;
performing point cloud conversion on the parallax image or the depth image to obtain a point cloud image;
obtaining, according to the point cloud image and a first image point and a second image point in the target image, three-dimensional coordinates of a first scene point corresponding to the first image point and three-dimensional coordinates of a second scene point corresponding to the second image point; and
obtaining a spatial distance between the first scene point and the second scene point according to the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
In this way, the distance measurement method in the embodiments of the present application requires no professional measuring tool and can complete the distance measurement simply and conveniently without increasing the hardware cost of the terminal device. Compared with the prior-art solution, the method does not require the target image to have a preset definition, so its scope of application is wide, and it calculates the spatial distance from three-dimensional coordinates, so its measurement accuracy is high.
In a possible design, the target image is obtained according to a first image and/or a second image, where the first image is obtained by a first camera apparatus photographing a target scene from a first position, and the second image is obtained by a second camera apparatus photographing the target scene from a second position.
In a possible design, acquiring the parallax image corresponding to the target image includes:
determining the parallax image corresponding to the target image according to a disparity value, in the first image and the second image, of a scene point corresponding to each image point in the target image.
In a possible design, acquiring the depth image corresponding to the target image includes:
determining a depth value of the scene point corresponding to each image point in the target image; and
determining the depth image corresponding to the target image according to the depth value of the scene point corresponding to each image point in the target image.
In a possible design, a focal length of the first camera apparatus is the same as a focal length of the second camera apparatus; and
determining the depth value of the scene point corresponding to each image point in the target image includes:
determining the depth value of the scene point corresponding to each image point in the target image according to the disparity value, in the first image and the second image, of the scene point corresponding to each image point in the target image, a distance between the first position and the second position, and the focal length.
In a possible design, obtaining the spatial distance between the first scene point and the second scene point according to the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point includes:
obtaining an initial spatial distance between the first scene point and the second scene point from the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point by means of the Euclidean distance formula; and
using the initial spatial distance as the spatial distance between the first scene point and the second scene point; or obtaining the spatial distance between the first scene point and the second scene point according to the initial spatial distance and a preset distance compensation value.
In this way, taking a preset distance compensation value into account can effectively improve the accuracy of the result.
According to a second aspect, an embodiment of the present application provides a distance measurement apparatus, the apparatus including:
an acquiring unit, configured to acquire a target image and a parallax image or a depth image corresponding to the target image;
a conversion unit, configured to perform point cloud conversion on the parallax image or the depth image to obtain a point cloud image; and
a processing unit, configured to obtain, according to the point cloud image and a first image point and a second image point in the target image, three-dimensional coordinates of a first scene point corresponding to the first image point and three-dimensional coordinates of a second scene point corresponding to the second image point, and to obtain a spatial distance between the first scene point and the second scene point according to the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
In a possible design, the target image is obtained according to a first image and/or a second image, where the first image is obtained by a first camera apparatus photographing a target scene from a first position, and the second image is obtained by a second camera apparatus photographing the target scene from a second position.
In a possible design, the acquiring unit is specifically configured to:
determine the parallax image corresponding to the target image according to a disparity value, in the first image and the second image, of a scene point corresponding to each image point in the target image.
In a possible design, the acquiring unit is specifically configured to:
determine a depth value of the scene point corresponding to each image point in the target image; and
determine the depth image corresponding to the target image according to the depth value of the scene point corresponding to each image point in the target image.
In a possible design, a focal length of the first camera apparatus is the same as a focal length of the second camera apparatus; and
the acquiring unit is specifically configured to:
determine the depth value of the scene point corresponding to each image point in the target image according to the disparity value, in the first image and the second image, of the scene point corresponding to each image point in the target image, a distance between the first position and the second position, and the focal length.
In a possible design, the processing unit is specifically configured to:
obtain an initial spatial distance between the first scene point and the second scene point from the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point by means of the Euclidean distance formula; and
use the initial spatial distance as the spatial distance between the first scene point and the second scene point; or obtain the spatial distance between the first scene point and the second scene point according to the initial spatial distance and a preset distance compensation value.
In a possible design, the distance measurement apparatus is a semiconductor chip, and the semiconductor chip is disposed in a terminal device;
where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device; or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
In a possible design, the distance measurement apparatus is a terminal device;
where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device; or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
Yet another embodiment of the present application provides a distance measurement apparatus, the apparatus including:
a memory, configured to store a software program; and
a processor, configured to read and execute the software program in the memory to implement the distance measurement method in any one of the foregoing designs.
Yet another embodiment of the present application provides a computer storage medium storing a software program that, when read and executed by one or more processors, implements the distance measurement method in any one of the foregoing designs.
Yet another aspect of the present application provides a computer program product containing instructions that, when run on a computer, cause the computer to perform the methods described in the foregoing aspects.
Yet another aspect of the present application provides a computer program that, when run on a computer, causes the computer to perform the methods described in the foregoing aspects.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic structural diagram of a distance measurement apparatus according to an embodiment of the present application;
FIG. 2 is a schematic flowchart corresponding to a distance measurement method according to an embodiment of the present application;
FIG. 3a is a schematic diagram of a target image captured by a terminal device;
FIG. 3b is a parallax image corresponding to the target image;
FIG. 3c is a point cloud image converted from the parallax image;
FIG. 3d is a schematic diagram of displaying a spatial distance;
FIG. 4 is a schematic diagram of the principle of acquiring a depth image with two camera apparatuses;
FIG. 5 is a schematic diagram of the overall execution flow of the distance measurement method in an embodiment of the present application;
FIG. 6a is a schematic diagram of the interface displayed when a terminal device is in a standby state;
FIG. 6b is a schematic diagram of the interface when the terminal device enters a to-be-photographed state;
FIG. 6c is a schematic diagram of the terminal device displaying a target image;
FIG. 6d is a schematic diagram of the spatial distance between the scene points corresponding to two image points selected by a user;
FIG. 6e is a schematic diagram of an updated spatial distance;
FIG. 7 is a schematic structural diagram of another distance measurement apparatus according to an embodiment of the present application.
DETAILED DESCRIPTION
The present application is described in detail below with reference to the accompanying drawings; the specific operation methods in the method embodiments may also be applied to the apparatus embodiments.
In the prior art, measuring the distance between different objects usually requires a professional measuring tool, which causes inconvenience to the user. Considering that terminal devices such as mobile phones have almost become necessities that people carry with them, if such terminal devices are used to measure distances, the user no longer needs to carry a professional measuring tool.
At present, one solution for measuring distance with a mobile phone is as follows: a correspondence between the zoom ratio of photographs taken by the phone and the camera parameters is established in advance. When the user wants to know the actual geographical distance of a target, for example the actual distance between two mountain peaks, the user can photograph the target with the phone, that is, photograph the two peaks, obtain the image in which the target is sharpest, and obtain the scale ratio of the target in that image. The phone then obtains two to-be-measured target points that the user inputs on this image and calculates the image distance between them in the image; from this image distance and the scale ratio of the target in the image, the actual geographical distance between the two to-be-measured target points, that is, the actual geographical distance between the two peaks, can be calculated.
In the above solution, the terminal device can calculate the distance between two points from the scale ratio of the target in the image. However, on the one hand, the applicability of the solution is limited; for example, regions of low definition may be impossible to measure. On the other hand, estimating the distance from a scale ratio makes the measurement accuracy low.
On this basis, an embodiment of the present application provides a distance measurement method for measuring the distance between different objects without a professional measuring tool.
Specifically, the method includes: acquiring a target image and a parallax image or depth image corresponding to the target image, and performing point cloud conversion on the parallax image or depth image to obtain a point cloud image; and then, according to the point cloud image and a first image point and a second image point in the target image, obtaining the three-dimensional coordinates of a first scene point corresponding to the first image point and the three-dimensional coordinates of a second scene point corresponding to the second image point, and obtaining the spatial distance between the first scene point and the second scene point. In this way, the distance measurement method in the embodiments of the present application requires no professional measuring tool and can complete the distance measurement simply and conveniently without increasing the hardware cost of the terminal device; compared with the prior-art solution, the method does not require the target image to have a preset definition, so its scope of application is wide, and it calculates the spatial distance from three-dimensional coordinates, so its measurement accuracy is high.
The distance measurement method in the embodiments of the present application may be performed by a distance measurement apparatus. The apparatus may be a semiconductor chip disposed in a terminal device; alternatively, the apparatus may itself be a terminal device. As shown in FIG. 1, a distance measurement apparatus 100 provided by an embodiment of the present application includes at least one processor 11, a communication bus 12, a memory 13, and at least one communication interface 14.
The processor 11 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
The communication bus 12 may include a path for transferring information between the components described above. The communication interface 14 uses any transceiver-like device for communicating with other devices or communication networks.
The memory 13 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, or may be an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by the apparatus, but is not limited thereto. The memory may exist independently and be connected to the processor through the bus, or may be integrated with the processor.
The memory 13 is configured to store the application program code for executing the solution of the present application, and execution is controlled by the processor 11; that is, the processor 11 is configured to execute the application program code stored in the memory 13 to implement the distance measurement method in the embodiments of the present application.
In specific implementation, as an embodiment, the processor 11 may include one or more CPUs, for example CPU0 and CPU1 in FIG. 1.
In specific implementation, as an embodiment, the apparatus 100 may include multiple processors, for example the processor 11 and the processor 15 in FIG. 1. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor here may refer to one or more circuits and/or processing cores for processing data (for example, computer program instructions).
Further, when the distance measurement apparatus is a semiconductor chip disposed in a terminal device, the terminal device may further include a first camera apparatus and a second camera apparatus, where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device, or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
When the distance measurement apparatus is a terminal device, the apparatus may further include a first camera apparatus and a second camera apparatus (not illustrated in FIG. 1), where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device, or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
It should be noted that the terminal device described above may be a mobile phone, a tablet computer (pad), or the like.
FIG. 2 is a schematic flowchart corresponding to a distance measurement method according to an embodiment of the present application; the method may be performed by the distance measurement apparatus 100 illustrated in FIG. 1. As shown in FIG. 2, the method includes:
Step 201: Acquire a target image and a parallax image or depth image corresponding to the target image.
Here, the target image is obtained according to a first image and/or a second image, where the first image is obtained by a first camera apparatus photographing a target scene from a first position, and the second image is obtained by a second camera apparatus photographing the target scene from a second position.
In a possible implementation, the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device. Further, the optical center of the first camera apparatus and the optical center of the second camera apparatus are laterally spaced apart by a set distance, and the focal lengths of the first camera apparatus and the second camera apparatus are the same. In one example, the first camera apparatus is a color camera apparatus and the second camera apparatus is an auxiliary camera apparatus, or vice versa; in another example, the first camera apparatus is a color camera apparatus and the second camera apparatus is a black-and-white camera apparatus, or vice versa.
The first camera apparatus in the embodiments of the present application may specifically be a camera, and the second camera apparatus may also specifically be a camera. In one example, after the user triggers the distance measurement function of the terminal device, the terminal device photographs the target scene from the first position with the first camera apparatus to obtain a first image, photographs the target scene from the second position with the second camera apparatus to obtain a second image, and fuses the first image and the second image to obtain the target image, which can improve the photographing effect. In other embodiments, the terminal device may also directly use the first image or the second image as the target image; this is not specifically limited. FIG. 3a is a schematic diagram of the target image.
It should be noted that the first image and the second image are captured by the first camera apparatus and the second camera apparatus, which the terminal device starts simultaneously after the user triggers its distance measurement function; the first position may be understood as the optical center of the first camera apparatus, the second position may be the optical center of the second camera apparatus, and the distance between the first position and the second position is the set distance.
Further, acquiring the parallax image corresponding to the target image may be: determining the parallax image corresponding to the target image according to the disparity value, in the first image and the second image, of the scene point corresponding to each image point in the target image. Acquiring the depth image corresponding to the target image includes: determining the depth value of the scene point corresponding to each image point in the target image, and determining the depth image corresponding to the target image according to those depth values. Specifically, the depth value of the scene point corresponding to each image point in the target image may be determined according to the disparity value, in the first image and the second image, of that scene point, the distance between the first position and the second position, and the focal length.
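The patent does not fix a particular stereo matching algorithm for computing the per-pixel disparity values. As a minimal illustrative sketch only, assuming a rectified image pair and OpenCV's standard semi-global block matcher (the file names and all parameter values below are assumptions chosen for illustration, not the patent's implementation):

```python
import cv2
import numpy as np

# Rectified first (left) and second (right) images; the file names
# are hypothetical placeholders.
left = cv2.imread("first_image.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("second_image.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; numDisparities must be a multiple of 16.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)

# compute() returns fixed-point disparities scaled by 16, so divide
# to obtain the per-pixel disparity values of the parallax image.
disparity = matcher.compute(left, right).astype(np.float32) / 16.0
```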
In the embodiments of the present application, the parallax image corresponding to the target image is an image that has the same size as the target image and whose element values are the disparity values of the scene points corresponding to the image points in the target image. The depth image corresponding to the target image is an image that has the same size as the target image and whose element values are the depth values of the scene points corresponding to the image points in the target image.
The depth image is taken as an example for description below.
FIG. 4 is a schematic diagram of the principle of acquiring a depth image with two camera apparatuses. As shown in FIG. 4, OL is the optical center of the first camera apparatus (which may also be called the left camera apparatus), OR is the optical center of the second camera apparatus (which may also be called the right camera apparatus), the set distance between the two optical centers is b, and the line segment between the two optical centers is the baseline of the first camera apparatus and the second camera apparatus. The line segment of length L on the left represents the imaging plane of the first camera apparatus, the line segment of length L on the right represents the imaging plane of the second camera apparatus, and the imaging plane of the first camera apparatus and the imaging plane of the second camera apparatus lie in the same plane. The shortest distance from the optical center of the first camera apparatus to its imaging plane is the focal length of the first camera apparatus, and the shortest distance from the optical center of the second camera apparatus to its imaging plane is the focal length of the second camera apparatus; as shown in FIG. 4, the focal lengths of the first camera apparatus and the second camera apparatus are both f. If P is a scene point in the target scene, its image point on the imaging plane of the first camera apparatus (that is, the image point in the first image) is PL, its image point on the imaging plane of the second camera apparatus (that is, the image point in the second image) is PR, and the distances of PL and PR from the left edges of their respective image planes are XL and XR, respectively. The disparity value of the scene point P (that is, the disparity value of P in the first image and the second image) is the difference between the abscissas, in the image plane coordinate system, of the image points PL and PR corresponding to P, that is, XR − XL or XL − XR. The depth value of the scene point P is the distance Z from P to the baseline of the first camera apparatus and the second camera apparatus.
According to the principle of similar triangles, the following proportional relationship holds:

$$\frac{b-(X_L-X_R)}{Z-f}=\frac{b}{Z}$$

which can also be written as:

$$Z\cdot\big(b-(X_L-X_R)\big)=b\cdot(Z-f)$$

so that:

$$Z\cdot(X_L-X_R)=b\cdot f$$

from which it can be derived that:

$$Z=\frac{b\cdot f}{X_L-X_R}$$
From the above, the depth value of the scene point P can be calculated from the disparity value of P, the set distance between the optical center of the first camera apparatus and the optical center of the second camera apparatus, and the focal length. Thus, by calculating the depth values of the scene points corresponding to the image points in the target image, the depth image corresponding to the target image is obtained. Likewise, by calculating the disparity values of the scene points corresponding to the image points in the target image, the parallax image corresponding to the target image is obtained; FIG. 3b shows the parallax image corresponding to the target image in FIG. 3a.
It should be noted that FIG. 4 takes as an example the case where the two camera apparatuses of the terminal device are arranged side by side; therefore, the disparity value of the scene point P is the difference between the abscissas, in the image plane coordinate system, of the image points PL and PR corresponding to P. In other possible embodiments, the two camera apparatuses of the terminal device may be arranged in other positional relationships, for example one above the other; in that case, the disparity value of P can still be calculated from the coordinates, in the image plane coordinate system, of the image points PL and PR corresponding to P, and the specific calculation is not detailed here.
Further, in the (ideal) schematic shown in FIG. 4, the principal point of the first camera apparatus corresponds to the center of the imaging plane (L) of the first camera apparatus, and the principal point of the second camera apparatus corresponds to the center of the imaging plane (L) of the second camera apparatus; therefore, the depth value of the scene point P is the Z calculated by the last formula above.
In reality, however, the principal point of a camera apparatus may not correspond to the center of the imaging plane; in that case, the depth value of the scene point P can be obtained by the following formula:

$$Z=\frac{f\cdot b}{d-(c_x-c'_x)}$$

where d is the disparity value of the scene point P, c′_x is the principal point abscissa of the second camera apparatus, and c_x is the principal point abscissa of the first camera apparatus.
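As a minimal sketch of the depth calculation just described (an illustration under the stated stereo model, not code from the patent; the numeric values in the closing comment are hypothetical), the depth image can be derived element-wise from the parallax image, the set distance b, and the focal length f:

```python
import numpy as np

def depth_from_disparity(disparity, f, b, cx=0.0, cx_prime=0.0):
    """Z = f * b / (d - (cx - cx')); with coincident principal points
    this reduces to the ideal case Z = f * b / d. Here f and the
    disparities are in pixels and b is in the chosen length unit."""
    d = disparity - (cx - cx_prime)
    depth = np.full_like(disparity, np.inf, dtype=np.float32)
    valid = d > 0  # non-positive disparities carry no depth information
    depth[valid] = f * b / d[valid]
    return depth

# Hypothetical numeric check: f = 700 px, b = 0.012 m, d = 42 px
# gives Z = 700 * 0.012 / 42 = 0.2 m.
```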
Step 202: Perform point cloud conversion on the parallax image or the depth image to obtain a point cloud image.
Here, taking point cloud conversion of the parallax image as an example, a mapping matrix Q is defined according to the principle of similar triangles to establish the relationship between the disparity value of a scene point and its real three-dimensional coordinates, where the expression of Q is as follows:

$$Q=\begin{bmatrix}1 & 0 & 0 & -c_x\\ 0 & 1 & 0 & -c_y\\ 0 & 0 & 0 & f\\ 0 & 0 & \frac{1}{b} & -\frac{c_x-c'_x}{b}\end{bmatrix}$$

where c_x and c_y are the principal point coordinates of the first camera apparatus, and c′_x is the principal point abscissa of the second camera apparatus.
Then, for the coordinates (x, y), in the image plane coordinate system, of any image point in the target image and the disparity value d of the scene point corresponding to that image point, the following conversion yields the three-dimensional coordinates of the corresponding scene point in the reference coordinate system of the first camera apparatus, achieving three-dimensional reconstruction and producing the point cloud image:

$$Q\begin{bmatrix}x\\ y\\ d\\ 1\end{bmatrix}=\begin{bmatrix}X\\ Y\\ Z\\ W\end{bmatrix}$$

where (X/W, Y/W, Z/W) are the three-dimensional coordinates of the scene point corresponding to the image point (x, y).
It should be noted that if what was obtained in step 201 is the parallax image corresponding to the target image, the above point cloud conversion is applied to the parallax image; FIG. 3c shows the point cloud image converted from the parallax image in FIG. 3b. If what was obtained in step 201 is the depth image corresponding to the target image, point cloud conversion can be performed on the depth image to obtain the point cloud image; the specific conversion process is the same as the above flow for the parallax image, the difference being that the mapping matrix used for point cloud conversion of a depth image may differ from the mapping matrix used for point cloud conversion of a parallax image.
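A minimal sketch of the point cloud conversion of a parallax image, using the mapping matrix Q given above (this mirrors what OpenCV's reprojectImageTo3D does; the matrix layout follows the reconstruction above and is an assumption, not necessarily the patent's exact matrix):

```python
import numpy as np

def point_cloud_from_disparity(disparity, f, b, cx, cy, cx_prime):
    """Map every image point (x, y) with disparity d to the 3D point
    (X/W, Y/W, Z/W) in the first camera apparatus's reference frame."""
    Q = np.array([
        [1.0, 0.0, 0.0, -cx],
        [0.0, 1.0, 0.0, -cy],
        [0.0, 0.0, 0.0, f],
        [0.0, 0.0, 1.0 / b, -(cx - cx_prime) / b],
    ])
    h, w = disparity.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Homogeneous vectors [x, y, d, 1], one per image point.
    pix = np.stack([xs, ys, disparity, np.ones_like(disparity)], axis=-1)
    XYZW = pix @ Q.T                       # apply Q to every image point
    return XYZW[..., :3] / XYZW[..., 3:]   # divide by W -> (h, w, 3)
```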
Step 203: Obtain, according to the point cloud image and the first image point and the second image point in the target image, the three-dimensional coordinates of the first scene point corresponding to the first image point and the three-dimensional coordinates of the second scene point corresponding to the second image point.
Step 204: Obtain the spatial distance between the first scene point and the second scene point according to the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
In a possible implementation, the terminal device may further include a display screen, through which the user can view the target image. If the display screen is a touch screen, the user, after viewing the target image, can select the first image point and the second image point by touching with a finger. Correspondingly, after the terminal device detects the user's touch operation and determines that the user has selected the first image point and the second image point (see FIG. 3d), since the three-dimensional coordinates of the scene point corresponding to each image point have already been obtained through the point cloud conversion in step 202, the three-dimensional coordinates of the first scene point corresponding to the first image point and the three-dimensional coordinates of the second scene point corresponding to the second image point can be obtained respectively, and the initial spatial distance between the first scene point and the second scene point can then be obtained by the Euclidean distance formula.
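For reference, with the first scene point at coordinates (X₁, Y₁, Z₁) and the second scene point at (X₂, Y₂, Z₂), the Euclidean distance formula referred to above is the standard:

$$D=\sqrt{(X_1-X_2)^2+(Y_1-Y_2)^2+(Z_1-Z_2)^2}$$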
In one example, the initial spatial distance may be used directly as the spatial distance between the first scene point and the second scene point.
In another example, to improve the accuracy of the result, the terminal device may further acquire a preset distance compensation value, and then obtain the spatial distance between the first scene point and the second scene point according to the initial spatial distance and the preset distance compensation value.
The preset distance compensation value can be obtained in a variety of ways. One possible implementation is that, after the user triggers the calibration function of the terminal device, the user calibrates two image points in a preset image such that the spatial distance between the scene points corresponding to the two image points is a preset distance, for example 10 cm; in specific implementation, the user can take an image containing a ruler in the calibration mode of the terminal device, which makes it easy to calibrate the preset distance. If the terminal device calculates the distance between the scene points corresponding to the two image points to be 9 cm, then, since the error of the terminal device's calculation result is 1 cm, the preset distance compensation value can be set to 1 cm, and in the subsequent distance measurement process the calculated distance and the preset distance compensation value are added to obtain the final result. To further improve accuracy, the user can perform multiple calibrations, for example calibrating pairs of points at arbitrary distances of 10 cm, 20 cm, 40 cm, 80 cm, 160 cm, and 200 cm, and the terminal device then derives the preset distance compensation value from the error values obtained in the multiple calibrations; for example, the average of the error values obtained in the multiple calibrations can be used as the preset distance compensation value.
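As a minimal sketch combining the two steps above (the pixel coordinates, point cloud, and all calibration values are hypothetical, chosen only to illustrate the compensation idea):

```python
import numpy as np

def spatial_distance(cloud, p1, p2, compensation=0.0):
    """Euclidean distance between the scene points corresponding to two
    selected image points p1 = (x1, y1) and p2 = (x2, y2), plus the
    preset distance compensation value."""
    a = cloud[p1[1], p1[0]]   # 3D coordinates of the first scene point
    c = cloud[p2[1], p2[0]]   # 3D coordinates of the second scene point
    return float(np.linalg.norm(a - c)) + compensation

# Hypothetical calibration: average the errors against several known
# reference distances and use the mean as the compensation value.
measured = np.array([9.0, 19.2, 39.1])    # cm, computed by the device
reference = np.array([10.0, 20.0, 40.0])  # cm, calibrated with a ruler
compensation = float(np.mean(reference - measured))
```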
In the embodiments of the present application, after obtaining the spatial distance between the first scene point and the second scene point, the terminal device may mark it on the target image and display it to the user; as shown in FIG. 3d, the spatial distance obtained by the terminal device is 9.72641 cm. If the user determines that the accuracy of this result is not high, the user may trigger the auxiliary tuning function of the terminal device; correspondingly, the terminal device may upload the target image and the information of the first image point and the second image point calibrated by the user to a server. The server may include a preset distance measurement model; the server can then obtain a spatial distance from the target image and the information of the first image point and the second image point through the preset distance measurement model and return it to the terminal device, which displays it to the user. The distance measurement model in the server may be obtained through artificial intelligence training, which is not specifically limited here.
The overall execution flow of the distance measurement method in the embodiments of the present application is described below with reference to FIG. 5. It should be noted that FIG. 5 is described only with the distance measurement apparatus being a terminal device as an example. As shown in FIG. 5, the flow includes:
Step 501: After determining that the user has triggered the photographing function of the terminal device, the terminal device starts the first camera apparatus and the second camera apparatus and enters the to-be-photographed state.
In specific implementation, taking the terminal device being a mobile phone with a touch screen as an example, FIG. 6a is a schematic diagram of the interface displayed when the terminal device is in a standby state; the user can trigger the photographing function of the terminal device by a touch operation. FIG. 6b is a schematic diagram of the interface when the terminal device enters the to-be-photographed state.
Step 502: After determining that the user has turned on the shooting function switch, the terminal device determines whether the distance measurement function is enabled; if yes, step 503 is performed, and if no, step 508 is performed.
Here, after the terminal device enters the to-be-photographed state, a shooting function switch and a distance measurement function switch can be displayed on the to-be-photographed interface, as shown in FIG. 6b, so that the user can turn on the shooting function switch and the distance measurement function switch by touch operations.
It should be noted that, in the embodiments of the present application, the terminal device may also determine whether the distance measurement function is enabled before the user turns on the shooting function switch; this is not specifically limited.
Step 503: The terminal device photographs the target scene with the first camera apparatus and the second camera apparatus to obtain a target image and a parallax image or depth image corresponding to the target image.
Here, the first camera apparatus and the second camera apparatus may both be rear-facing camera apparatuses of the terminal device.
After obtaining the target image, the terminal device may display it to the user, as shown in FIG. 6c, while the parallax image or depth image corresponding to the target image is only the result of intermediate processing steps performed by the terminal device during distance measurement and need not be displayed.
Step 504: The terminal device performs point cloud conversion on the parallax image or depth image to obtain the three-dimensional coordinates of the scene points corresponding to the image points in the target image.
Here, after obtaining the three-dimensional coordinates of the scene points corresponding to the image points in the target image, the terminal device may store them first, so as to calculate the spatial distance later once the user selects image points.
Step 505: The terminal device measures the spatial distance between the scene points corresponding to two image points selected by the user.
Here, the user can select two image points on the target image by a touch operation, as shown in FIG. 6d; there can be many specific ways of selection, which is not limited in the embodiments of the present application.
Specifically, the three-dimensional coordinates of the scene points corresponding to the two image points can be obtained, the initial spatial distance can be obtained by the Euclidean distance formula, and the spatial distance can then be obtained according to the initial spatial distance and the preset distance compensation value.
In the embodiments of the present application, after the terminal device enters the to-be-photographed state, a calibration function switch can be displayed on the to-be-photographed interface, as shown in FIG. 6b; the user can then turn on the calibration function switch by a touch operation, and correspondingly the terminal device can enter the calibration mode to obtain the preset distance compensation value. See the description above for details, which are not repeated here.
Step 506: The terminal device marks the measured spatial distance in the target image and displays it to the user.
Here, FIG. 6d illustrates the spatial distance between the scene points corresponding to the two image points selected by the user; its unit may be preset, for example cm.
Step 507: After determining that the user has turned on the auxiliary tuning function, the terminal device sends the target image and the information of the two image points selected by the user to the server, receives the spatial distance returned by the server, and then updates the spatial distance marked in the target image according to the spatial distance returned by the server.
Here, after the terminal device enters the to-be-photographed state, an auxiliary tuning function switch can be displayed on the to-be-photographed interface, as shown in FIG. 6b, so that the user can turn on the auxiliary tuning function switch by a touch operation.
FIG. 6e illustrates the updated spatial distance.
Step 508: The terminal device photographs the target scene with the first camera apparatus and the second camera apparatus to obtain a target image. Here, since the user has not triggered the distance measurement function, the normal shooting flow can be executed to obtain the target image.
It should be noted that: (1) the above step numbering is only one possible example of the execution flow, and the order of the steps is not limited in specific implementation; (2) the positions and implementations of the multiple function switches illustrated in FIG. 6b are only an example, and in other embodiments the function switches may be located elsewhere or implemented in other ways, which is not specifically limited; (3) for a given target image, the terminal device can calculate the spatial distance of the scene points of two user-selected image points multiple times, that is, after the user selects image points a1 and a2 and the terminal device calculates the spatial distance of the corresponding scene points, the user can continue to select image points b1 and b2 and the terminal device calculates the spatial distance of the corresponding scene points, the specific number of times being unlimited.
It can be seen from the above flow that, in the embodiments of the present application, the distance measurement function is implemented on the basis of the existing photographing function of the terminal, so that distance measurement can be completed simply and conveniently without a measuring tool and without increasing the hardware cost of the terminal device.
It should be noted that a scene point in the embodiments of the present application refers to the position corresponding to a certain three-dimensional coordinate in the target scene. For example, assuming that the two peaks illustrated in FIG. 6d are, from left to right, the summit of a first mountain and the summit of a second mountain, the scene point corresponding to the first image point is the summit position of the first mountain in three-dimensional space, and the scene point corresponding to the second image point is the summit position of the second mountain in three-dimensional space.
针对于上述方法流程,本申请实施例还提供一种测距装置,该测距装置的具体实现可参见上述方法流程。
基于相同发明构思,图7为本申请实施例提供的另一种测距装置的结构示意图,该测距装置可以为半导体芯片(可设置在终端设备内)或者终端设备,该测距装置可用于执行上述图2所示意的方法流程,如图7所示,该测距装置700包括:
an obtaining unit 701, configured to obtain a target image and a disparity image or depth image corresponding to the target image;

a conversion unit 702, configured to perform point cloud conversion on the disparity image or depth image to obtain a point cloud image; and

a processing unit 703, configured to: obtain, based on the point cloud image and a first image point and a second image point in the target image, the three-dimensional coordinates of a first scene point corresponding to the first image point and the three-dimensional coordinates of a second scene point corresponding to the second image point; and obtain the spatial distance between the first scene point and the second scene point based on the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
In a possible design, the target image is obtained from a first image and/or a second image, the first image is obtained by a first camera apparatus shooting a target scene from a first position, and the second image is obtained by a second camera apparatus shooting the target scene from a second position.
In a possible design, the obtaining unit 701 is specifically configured to:

determine the disparity image corresponding to the target image based on the disparity value, between the first image and the second image, of the scene point corresponding to each image point in the target image.
In a possible design, the obtaining unit 701 is specifically configured to:

determine the depth value of the scene point corresponding to each image point in the target image; and

determine the depth image corresponding to the target image based on the depth value of the scene point corresponding to each image point in the target image.
In a possible design, the focal length of the first camera apparatus is the same as the focal length of the second camera apparatus; and

the obtaining unit 701 is specifically configured to:

determine the depth value of the scene point corresponding to each image point in the target image based on the disparity value, between the first image and the second image, of the scene point corresponding to each image point in the target image, the distance between the first position and the second position, and the focal length.
In a possible design, the processing unit 703 is specifically configured to:

obtain an initial spatial distance between the first scene point and the second scene point from the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point by using the Euclidean distance formula; and

use the initial spatial distance as the spatial distance between the first scene point and the second scene point; or obtain the spatial distance between the first scene point and the second scene point based on the initial spatial distance and a preset distance compensation value.
In a possible design, the distance measurement apparatus 700 is a semiconductor chip, and the semiconductor chip is disposed in a terminal device, where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device, or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.

In a possible design, the distance measurement apparatus 700 is a terminal device, where the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device, or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
It should be noted that the division into units in the embodiments of this application is schematic and is only a logical function division; there may be other division manners in actual implementation. The functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, all or some of the embodiments may be implemented in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present invention are all or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, by using a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
A person skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the embodiments of the present invention may take the form of a hardware-only embodiment, a software-only embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) that contain computer-usable program code.
The embodiments of the present invention are described with reference to the flowcharts and/or block diagrams of the method, the device (system), and the computer program product according to the embodiments of the present invention. It should be understood that computer program instructions can implement each process and/or block in the flowcharts and/or block diagrams, and a combination of processes and/or blocks in the flowcharts and/or block diagrams. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data processing device produce an apparatus for implementing the functions specified in one or more processes of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture that includes an instruction apparatus, where the instruction apparatus implements the functions specified in one or more processes of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or the other programmable device to produce computer-implemented processing, and the instructions executed on the computer or the other programmable device provide steps for implementing the functions specified in one or more processes of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, a person skilled in the art can make various modifications and variations to the embodiments of the present invention without departing from the spirit and scope of this application. This application is intended to cover these modifications and variations provided that they fall within the scope of the claims of this application and their equivalent technologies.

Claims (16)

  1. A distance measurement method, wherein the method comprises:
    obtaining a target image and a disparity image or depth image corresponding to the target image;
    performing point cloud conversion on the disparity image or depth image to obtain a point cloud image;
    obtaining, based on the point cloud image and a first image point and a second image point in the target image, three-dimensional coordinates of a first scene point corresponding to the first image point and three-dimensional coordinates of a second scene point corresponding to the second image point; and
    obtaining a spatial distance between the first scene point and the second scene point based on the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
  2. The method according to claim 1, wherein the target image is obtained from a first image and/or a second image, the first image is obtained by a first camera apparatus shooting a target scene from a first position, and the second image is obtained by a second camera apparatus shooting the target scene from a second position.
  3. The method according to claim 2, wherein obtaining the disparity image corresponding to the target image comprises:
    determining the disparity image corresponding to the target image based on a disparity value, between the first image and the second image, of a scene point corresponding to each image point in the target image.
  4. The method according to claim 2, wherein obtaining the depth image corresponding to the target image comprises:
    determining a depth value of a scene point corresponding to each image point in the target image; and
    determining the depth image corresponding to the target image based on the depth value of the scene point corresponding to each image point in the target image.
  5. The method according to claim 4, wherein a focal length of the first camera apparatus is the same as a focal length of the second camera apparatus; and
    determining the depth value of the scene point corresponding to each image point in the target image comprises:
    determining the depth value of the scene point corresponding to each image point in the target image based on the disparity value, between the first image and the second image, of the scene point corresponding to each image point in the target image, a distance between the first position and the second position, and the focal length.
  6. The method according to any one of claims 1 to 5, wherein obtaining the spatial distance between the first scene point and the second scene point based on the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point comprises:
    obtaining an initial spatial distance between the first scene point and the second scene point from the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point by using the Euclidean distance formula; and
    using the initial spatial distance as the spatial distance between the first scene point and the second scene point; or obtaining the spatial distance between the first scene point and the second scene point based on the initial spatial distance and a preset distance compensation value.
  7. A distance measurement apparatus, wherein the distance measurement apparatus comprises:
    an obtaining unit, configured to obtain a target image and a disparity image or depth image corresponding to the target image;
    a conversion unit, configured to perform point cloud conversion on the disparity image or depth image to obtain a point cloud image; and
    a processing unit, configured to: obtain, based on the point cloud image and a first image point and a second image point in the target image, three-dimensional coordinates of a first scene point corresponding to the first image point and three-dimensional coordinates of a second scene point corresponding to the second image point; and obtain a spatial distance between the first scene point and the second scene point based on the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point.
  8. The apparatus according to claim 7, wherein the target image is obtained from a first image and/or a second image, the first image is obtained by a first camera apparatus shooting a target scene from a first position, and the second image is obtained by a second camera apparatus shooting the target scene from a second position.
  9. The apparatus according to claim 8, wherein the obtaining unit is specifically configured to:
    determine the disparity image corresponding to the target image based on a disparity value, between the first image and the second image, of a scene point corresponding to each image point in the target image.
  10. The apparatus according to claim 8, wherein the obtaining unit is specifically configured to:
    determine a depth value of a scene point corresponding to each image point in the target image; and
    determine the depth image corresponding to the target image based on the depth value of the scene point corresponding to each image point in the target image.
  11. The apparatus according to claim 10, wherein a focal length of the first camera apparatus is the same as a focal length of the second camera apparatus; and
    the obtaining unit is specifically configured to:
    determine the depth value of the scene point corresponding to each image point in the target image based on the disparity value, between the first image and the second image, of the scene point corresponding to each image point in the target image, a distance between the first position and the second position, and the focal length.
  12. The apparatus according to any one of claims 7 to 11, wherein the processing unit is specifically configured to:
    obtain an initial spatial distance between the first scene point and the second scene point from the three-dimensional coordinates of the first scene point and the three-dimensional coordinates of the second scene point by using the Euclidean distance formula; and
    use the initial spatial distance as the spatial distance between the first scene point and the second scene point; or obtain the spatial distance between the first scene point and the second scene point based on the initial spatial distance and a preset distance compensation value.
  13. The apparatus according to any one of claims 7 to 12, wherein the distance measurement apparatus is a semiconductor chip, and the semiconductor chip is disposed in a terminal device,
    wherein the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device; or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
  14. The apparatus according to any one of claims 7 to 12, wherein the distance measurement apparatus is a terminal device,
    wherein the first camera apparatus and the second camera apparatus are both rear-facing camera apparatuses of the terminal device; or the first camera apparatus and the second camera apparatus are both front-facing camera apparatuses of the terminal device.
  15. A distance measurement apparatus, wherein the apparatus comprises:
    a memory, configured to store a software program; and
    a processor, configured to read the software program in the memory and perform the distance measurement method according to any one of claims 1 to 6.
  16. A computer storage medium, wherein the storage medium stores a software program, and when the software program is read and executed by one or more processors, the distance measurement method according to any one of claims 1 to 6 is implemented.
PCT/CN2018/125716 2018-03-05 2018-12-29 Distance measurement method and apparatus WO2019169941A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810179367.2 2018-03-05
CN201810179367.2A CN110232707B (zh) 2018-03-05 2018-03-05 Distance measurement method and apparatus

Publications (1)

Publication Number Publication Date
WO2019169941A1 true WO2019169941A1 (zh) 2019-09-12

Family

ID=67846804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/125716 WO2019169941A1 (zh) 2018-03-05 2018-12-29 Distance measurement method and apparatus

Country Status (2)

Country Link
CN (2) CN110232707B (zh)
WO (1) WO2019169941A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112308962A (zh) * 2020-11-05 2021-02-02 山东产研信息与人工智能融合研究院有限公司 Real-scene model construction method and apparatus using physical targets as the minimum unit
CN113050113A (zh) * 2021-03-10 2021-06-29 广州南方卫星导航仪器有限公司 Laser point positioning method and apparatus
CN113327318A (zh) * 2021-05-18 2021-08-31 禾多科技(北京)有限公司 Image display method and apparatus, electronic device, and computer-readable medium
CN113379826A (zh) * 2020-03-10 2021-09-10 顺丰科技有限公司 Volume measurement method and apparatus for logistics items

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111990930B (zh) * 2020-08-28 2022-05-20 北京石头创新科技有限公司 Distance measurement method and apparatus, robot, and storage medium
CN113376643A (zh) * 2021-05-10 2021-09-10 广州文远知行科技有限公司 Distance detection method and apparatus, and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234512A1 (en) * 2002-06-20 2003-12-25 Holub David G. Trailer hitch video alignment system
CN1847781A (zh) * 2006-02-14 2006-10-18 中国科学院上海技术物理研究所 Dynamic measurement position correction method for a photoelectric width-measuring instrument
CN105222717A (zh) * 2015-08-28 2016-01-06 宇龙计算机通信科技(深圳)有限公司 Method and apparatus for measuring the length of a target object

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10775165B2 (en) * 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
CN105913444B (zh) * 2016-05-03 2019-07-19 华南农业大学 Livestock body-shape contour reconstruction and body-condition scoring method based on soft laser ranging
CN106384106A (zh) * 2016-10-24 2017-02-08 杭州非白三维科技有限公司 Anti-spoofing face recognition system based on three-dimensional scanning
CN106569225B (zh) * 2016-10-31 2023-11-24 浙江大学 Real-time obstacle avoidance method for unmanned vehicles based on ranging sensors
CN106780619B (zh) * 2016-11-25 2020-03-13 青岛大学 Human body dimension measurement method based on a Kinect depth camera
CN106651926A (zh) * 2016-12-28 2017-05-10 华东师范大学 Depth point cloud three-dimensional reconstruction method based on region registration
CN106971403B (zh) * 2017-04-27 2020-04-03 武汉数文科技有限公司 Point cloud image processing method and apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234512A1 (en) * 2002-06-20 2003-12-25 Holub David G. Trailer hitch video alignment system
CN1847781A (zh) * 2006-02-14 2006-10-18 中国科学院上海技术物理研究所 Dynamic measurement position correction method for a photoelectric width-measuring instrument
CN105222717A (zh) * 2015-08-28 2016-01-06 宇龙计算机通信科技(深圳)有限公司 Method and apparatus for measuring the length of a target object

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113379826A (zh) * 2020-03-10 2021-09-10 顺丰科技有限公司 Volume measurement method and apparatus for logistics items
CN112308962A (zh) * 2020-11-05 2021-02-02 山东产研信息与人工智能融合研究院有限公司 Real-scene model construction method and apparatus using physical targets as the minimum unit
CN112308962B (zh) * 2020-11-05 2023-10-17 山东产研信息与人工智能融合研究院有限公司 Real-scene model construction method and apparatus using physical targets as the minimum unit
CN113050113A (zh) * 2021-03-10 2021-06-29 广州南方卫星导航仪器有限公司 Laser point positioning method and apparatus
CN113050113B (zh) * 2021-03-10 2023-08-01 广州南方卫星导航仪器有限公司 Laser point positioning method and apparatus
CN113327318A (zh) * 2021-05-18 2021-08-31 禾多科技(北京)有限公司 Image display method and apparatus, electronic device, and computer-readable medium
CN113327318B (zh) * 2021-05-18 2022-07-29 禾多科技(北京)有限公司 Image display method and apparatus, electronic device, and computer-readable medium

Also Published As

Publication number Publication date
CN110232707B (zh) 2021-08-31
CN113781534A (zh) 2021-12-10
CN110232707A (zh) 2019-09-13

Similar Documents

Publication Publication Date Title
WO2019169941A1 (zh) Distance measurement method and apparatus
US9886774B2 (en) Photogrammetric methods and devices related thereto
CN107230225B (zh) Three-dimensional reconstruction method and apparatus
US10157474B2 (en) 3D recording device, method for producing a 3D image, and method for setting up a 3D recording device
JP6124184B2 (ja) Obtaining distances between different points on an imaged subject
JP6338021B2 (ja) Image processing apparatus, image processing method, and image processing program
WO2016127328A1 (zh) Method and apparatus for measuring the dimensions of an object
US10277889B2 (en) Method and system for depth estimation based upon object magnification
CN102196166B (zh) Imaging apparatus and display method
WO2021136386A1 (zh) Data processing method, terminal, and server
JP2004506389A (ja) Method and apparatus for external camera calibration via a graphical user interface
KR102129206B1 (ko) Three-dimensional coordinate calculation method and apparatus using photographic images
CN110260801A (zh) Method and apparatus for measuring the volume of material
JP2017090420A (ja) Three-dimensional information restoration apparatus and three-dimensional information restoration method
TW201312080A (zh) Non-contact method for measuring the length of a physical object
JP5996233B2 (ja) Image capturing apparatus
CN105783881B (zh) Aerial triangulation method and apparatus
JP2003006618A (ja) Method and apparatus for generating a three-dimensional model, and computer program
WO2015087315A1 (en) Methods and systems for remotely guiding a camera for self-taken photographs
CN109612439B (zh) Method for estimating the intersection angle and baseline length of stereo images based on a rational function model
JP2018032144A (ja) Image processing apparatus, image processing method, and program
TWI637353B (zh) Measurement apparatus and measurement method
WO2023199583A1 (ja) Viewer control method and information processing apparatus
TWI516744B (zh) Distance estimation system, distance estimation method, and computer-readable medium
JP2004354234A (ja) Camera calibration method for photogrammetry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18909144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18909144

Country of ref document: EP

Kind code of ref document: A1