CN112835062A - Underwater distance measuring method, device, equipment and storage medium - Google Patents

Underwater distance measuring method, device, equipment and storage medium

Info

Publication number
CN112835062A
CN112835062A
Authority
CN
China
Prior art keywords
laser
preset
coordinate system
imaging
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110018274.3A
Other languages
Chinese (zh)
Inventor
张洵
戚伟
习志平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chasing-Innovation Technology Co ltd
Original Assignee
Shenzhen Chasing-Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chasing-Innovation Technology Co ltd
Priority to CN202110018274.3A
Publication of CN112835062A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 — Systems determining position data of a target
    • G01S17/46 — Indirect determination of position data

Abstract

The embodiment of the invention discloses an underwater ranging method, device, equipment and storage medium. The method comprises the following steps: acquiring a laser reflection image of an object to be measured; determining the imaging coordinates of a laser imaging point on the laser reflection image; determining the imaging three-dimensional coordinates of the imaging coordinates in a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system is formed by taking the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the transverse axis of the camera module coordinate system as the x-axis, and a line parallel to the longitudinal axis of the camera module coordinate system as the y-axis; determining the reflected-ray path of the object to be measured based on the imaging three-dimensional coordinates; and determining the distance of the object to be measured based on the reflected-ray path and a preset laser-ray path. The embodiment of the invention realizes distance measurement of long-distance objects, with simple test equipment, strong robustness, strong underwater anti-interference capability, a simple and fast calculation method, and high measurement precision.

Description

Underwater distance measuring method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of distance measurement, in particular to an underwater distance measurement method, device, equipment and storage medium.
Background
With the development of science and technology, people continue to explore the sea. When various operations are carried out undersea, the distances of relevant objects need to be measured in order to determine their positions.
The existing method for measuring the distance of underwater objects is acoustic (sonar) ranging, but this method is usually suitable only for objects of large size, and the professional equipment required is expensive, which increases the cost of ranging; it also suffers from problems such as echo interference. There are also methods for ranging with mobile phone equipment, which adopt 3D structured light and TOF (Time of Flight) ranging; these are suited to short-range measurement, and their underwater ranging performance is not ideal.
Disclosure of Invention
In view of this, embodiments of the present invention provide an underwater distance measurement method, apparatus, device and storage medium, so as to provide an underwater long-distance object distance measurement method and improve robustness of underwater object distance measurement.
In a first aspect, an embodiment of the present invention provides an underwater distance measurement method, including:
acquiring a laser reflection image of an object to be detected;
determining imaging coordinates of a laser imaging point on the laser reflection image;
determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system is formed by taking the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the transverse axis of the camera module coordinate system as the x-axis, and a line parallel to the longitudinal axis of the camera module coordinate system as the y-axis;
determining the path of a reflected light ray of the object to be detected based on the imaging three-dimensional coordinates;
and determining the distance of the object to be measured based on the reflected light path and a preset laser light path.
Further, before obtaining the laser reflection image of the object to be measured, the method further includes:
fixedly mounting a laser emitter and a camera module to form a preset three-dimensional coordinate system;
and calibrating the camera module to determine a preset laser ray path of the laser ray emitted by the laser emitter under the preset three-dimensional coordinate system.
Further, calibrating the camera module to determine a preset laser ray path of the laser ray emitted by the laser emitter under the preset three-dimensional coordinate system includes:
emitting laser light at a first preset distance from a sample through the laser emitter, and acquiring a first laser image of the sample based on the first preset distance through the camera module;
emitting laser light at a second preset distance from the sample through the laser emitter, and acquiring a second laser image of the sample based on the second preset distance through the camera module;
determining a first image coordinate of a first laser imaging point in the first laser image and a second image coordinate of a second laser imaging point in the second laser image;
determining a first position coordinate of a first laser point corresponding to the first laser imaging point under the preset three-dimensional coordinate system according to the first image coordinate and determining a second position coordinate of a second laser point corresponding to the second laser imaging point under the preset three-dimensional coordinate system according to the second image coordinate;
and determining a preset laser ray path according to the first position coordinate and the second position coordinate.
Further, determining a first position coordinate of a first laser point corresponding to the first laser imaging point in the preset three-dimensional coordinate system according to the first image coordinate and determining a second position coordinate of a second laser point corresponding to the second laser imaging point in the preset three-dimensional coordinate system according to the second image coordinate includes:
acquiring intrinsic parameters of the camera module;
converting the first image coordinate into a first position coordinate under the preset three-dimensional coordinate system according to the intrinsic parameters and the first preset distance;
and converting the second image coordinate into a second position coordinate under the preset three-dimensional coordinate system according to the intrinsic parameters and the second preset distance.
Further, determining the imaging three-dimensional coordinates of the imaging coordinates in the preset three-dimensional coordinate system includes:
and converting the imaging coordinate into an imaging three-dimensional coordinate under a preset three-dimensional coordinate system according to the inherent parameters.
Further, determining the path of the reflected light of the object to be measured based on the imaged three-dimensional coordinates comprises:
and determining the path of the reflected light of the object to be detected according to the imaging three-dimensional coordinate and the coordinate of the focus.
Further, determining the distance of the object to be measured based on the reflected light path and a preset laser light path includes:
and determining intersection point coordinates of the reflected light ray path and a preset laser light ray path, wherein the z-axis value of the intersection point coordinates is the distance of the object to be measured.
In a second aspect, an embodiment of the present invention provides an underwater distance measuring device, including:
the image acquisition module is used for acquiring a laser reflection image of the object to be detected;
the imaging coordinate determination module is used for determining the imaging coordinate of a laser imaging point on the laser reflection image;
the coordinate conversion module is used for determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system is formed by taking the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the transverse axis of the camera module coordinate system as the x-axis, and a line parallel to the longitudinal axis of the camera module coordinate system as the y-axis;
the reflected light path determining module is used for determining the reflected light path of the object to be detected based on the imaging three-dimensional coordinate;
and the distance determining module is used for determining the distance of the object to be measured based on the reflected light ray path and a preset laser light ray path.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the underwater ranging method provided by any embodiment of the present invention.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the underwater ranging method provided in any embodiment of the present invention.
The underwater distance measurement method provided by the embodiment of the invention can realize distance measurement of a long-distance object, and has the advantages of simple test equipment, strong robustness, strong underwater anti-interference capability, simple and quick calculation mode and high measurement precision.
Drawings
Fig. 1 is a schematic flow chart of an underwater distance measuring method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of an underwater distance measuring method according to a second embodiment of the present invention;
fig. 3 is a schematic flow chart of a camera module calibration method according to a second embodiment of the present invention;
fig. 4 is a schematic diagram illustrating an underwater ranging method according to a second embodiment of the present invention;
fig. 5 is a schematic diagram illustrating a principle of a camera module calibration method according to a second embodiment of the present invention;
fig. 6 is a schematic structural diagram of an underwater distance measuring device according to a third embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
Furthermore, the terms "first," "second," and the like may be used herein to describe various orientations, actions, steps, elements, or the like, but the orientations, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step or element from another direction, action, step or element. The terms "first", "second", etc. are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "plurality", "batch" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Example one
Fig. 1 is a schematic flow chart of an underwater distance measuring method according to an embodiment of the present invention. As shown in fig. 1, an underwater distance measuring method provided by an embodiment of the present invention includes:
and S110, acquiring a laser reflection image of the object to be detected.
Specifically, the distance measuring equipment in this embodiment includes a camera module and a laser emitter. The laser emitter emits laser light toward the object to be measured, and the camera module captures images of the object to be measured. To obtain the laser reflection image, the laser emitter emits a laser ray at the object to be measured, the ray forms a laser spot on the object's surface, and the camera module then captures an image of the object; the resulting image, which contains the laser spot, is the laser reflection image.
And S120, determining the imaging coordinate of the laser imaging point on the laser reflection image.
Specifically, the imaging coordinate of the laser imaging point on the laser reflection image refers to the pixel position of the imaging point of the laser point in the camera module, and is represented by the pixel coordinate, which is a two-dimensional pixel coordinate of the laser imaging point in the image coordinate system of the camera module.
S130, determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system is formed by taking the focus of the camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the transverse axis of the camera module coordinate system as the x-axis, and a line parallel to the longitudinal axis of the camera module coordinate system as the y-axis.
In this embodiment, as shown in fig. 4, the laser emitter 1 and the camera module 2 are fixedly installed. The camera module 2 mainly includes an image sensor 3 and a lens 4, and the image sensor 3 can be a CCD or CMOS sensor. The preset three-dimensional coordinate system is a coordinate system shared by the laser emitter 1 and the camera module 2, and its origin is the focus O of the camera module 2. The line passing through the optical center of the image sensor 3 and the focus O is the main optical axis of the camera module, which serves as the z-axis of the preset three-dimensional coordinate system. The image sensor 3 has a two-dimensional coordinate system, i.e. the coordinate system of the camera module 2; the preset three-dimensional coordinate system Oxyz is completed by taking the line through the focus O parallel to the horizontal axis of the camera module 2 coordinate system as the x-axis, and the line through the focus O parallel to the vertical axis of that coordinate system as the y-axis.
The imaging coordinate of the laser imaging point, i.e. the pixel coordinate of the laser point's imaging point on the camera module 2, is converted into the three-dimensional coordinate of that imaging point in the preset three-dimensional coordinate system, i.e. the imaging three-dimensional coordinate. Illustratively, as shown in fig. 4, the laser emitter 1 emits laser light to the object 5 to be measured, forming a laser point (also called a laser reflection point) at a surface point A of the object 5. The laser light reflected at the laser reflection point passes through the focus O; the camera module 2 captures a laser reflection image of the object 5, and the laser imaging point B corresponding to the laser point A is determined. For the distance measuring device composed of the laser emitter 1 and the camera module 2, the optical center of the image sensor 3, the pixel width, and the focal length of the lens 4 are known (i.e. the focus is known). Let the optical center coordinate of the image sensor 3 be (Xc, Yc) (in pixels), the pixel width be P (in micrometers; the physical width represented by a single pixel), the focal length be F (in millimeters), and the imaging coordinate of the laser imaging point B of the laser point A be (X0, Y0) (in pixels). The three-dimensional coordinate of the laser imaging point B in the preset three-dimensional coordinate system, (XB, YB, ZB) (in micrometers), is then:
XB = (X0 - Xc) * P
YB = (Y0 - Yc) * P
ZB = F * 10^3
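The conversion above can be sketched in Python (a minimal sketch; the function name and argument order are illustrative, not from the patent):

```python
def imaging_point_3d(x0, y0, xc, yc, pixel_width_um, focal_length_mm):
    """Imaging three-dimensional coordinate (XB, YB, ZB), in micrometers,
    of the laser imaging point with pixel coordinate (X0, Y0), given the
    optical center (Xc, Yc), pixel width P and focal length F."""
    xb = (x0 - xc) * pixel_width_um       # XB = (X0 - Xc) * P
    yb = (y0 - yc) * pixel_width_um       # YB = (Y0 - Yc) * P
    zb = focal_length_mm * 1000.0         # ZB = F * 10^3 (millimeters -> micrometers)
    return (xb, yb, zb)

# Values from the worked example below: optical center (960, 540), P = 3.1 um, F = 3.8 mm
print(imaging_point_3d(1248, 386, 960, 540, 3.1, 3.8))
```

For these inputs the result is (892.8, -477.4, 3800.0) up to floating-point noise, which the patent's worked example rounds to (893, -477, 3800).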
and S140, determining the path of the reflected light of the object to be detected based on the imaging three-dimensional coordinates.
Specifically, the reflected light path is a path of the laser reflected light formed by the object to be measured reflecting the laser light, and the reflected light path passes through the focal point O, and the laser point a and the corresponding laser imaging point B are both points on the reflected light path. Therefore, the connecting line of the laser imaging point B and the focal point O is the reflected light path, and the expression of the reflected light path can be obtained according to the imaging three-dimensional coordinate of the laser imaging point B and the three-dimensional coordinate of the focal point O (the focal point O is the origin of the preset three-dimensional coordinate system, and the three-dimensional coordinate thereof is (0,0, 0)).
S150, determining the distance of the object to be measured based on the reflected light path and a preset laser light path.
Specifically, the preset laser ray path is the path of the laser ray emitted by the laser emitter 1. For the fixedly installed laser emitter 1 and camera module 2, the path of the laser ray in the preset three-dimensional coordinate system is fixed; it can be obtained through calibration of the camera module 2 and stored in advance as a calculation parameter. As shown in fig. 4, the laser point A is a point on the object 5 to be measured, so the distance H of the object 5 can be determined by obtaining the coordinates of the laser point A. Meanwhile, the laser point A lies on both the preset laser ray path and the reflected light ray path; therefore, by determining the intersection point of these two paths, the three-dimensional coordinate of the laser point A can be determined. In this embodiment, the distance H of the object 5 to be measured is the distance from the object to the origin of the preset three-dimensional coordinate system, and the z-coordinate value in the three-dimensional coordinate of the laser point A is the distance H of the object 5 to be measured.
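The intersection of the reflected light ray path with the preset laser ray path can be sketched as follows. This is an illustrative implementation, not the patent's code: it assumes the reflected ray is given by the imaging point B (the ray passes through the origin O) and the laser ray by two known points, and it computes the intersection as the closest point between the two lines, so a tiny numerical skew between the rays is tolerated:

```python
def object_distance(b, p1, p2):
    """z value (= distance H) of the intersection, in the preset 3D
    coordinate system, of:
      - the reflected light ray through the origin O and imaging point B,
      - the preset laser ray through two known points P1 and P2.
    Solved as the closest point between the two lines."""
    d = [p2[i] - p1[i] for i in range(3)]

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    bb, bd, dd = dot(b, b), dot(b, d), dot(d, d)
    bp, dp = dot(b, p1), dot(d, p1)
    den = bb * dd - bd * bd            # zero only if the rays are parallel
    t = (dd * bp - bd * dp) / den      # parameter along the reflected ray
    s = (bd * bp - bb * dp) / den      # parameter along the laser ray
    z1 = t * b[2]                      # z of closest point on reflected ray
    z2 = p1[2] + s * d[2]              # z of closest point on laser ray
    return (z1 + z2) / 2.0

# Synthetic check: an object point A on the laser ray at z = 550000 um (55 cm)
A = (200000.0, -80000.0, 550000.0)
b = tuple(c * 3800.0 / A[2] for c in A)   # imaging point on the ray O-A, scaled to z = F*10^3
v = (100000.0, 50000.0, 500000.0)         # arbitrary direction for the synthetic laser ray
p1 = tuple(A[i] + v[i] for i in range(3))
p2 = tuple(A[i] - v[i] for i in range(3))
print(object_distance(b, p1, p2))
```

The synthetic numbers are chosen only so that the true z is known in advance; the function recovers it from the two ray descriptions alone.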
The underwater distance measurement method provided by the embodiment of the invention can realize distance measurement of a long-distance object, and has the advantages of simple test equipment, strong robustness, strong underwater anti-interference capability, simple and quick calculation mode and high measurement precision.
Example two
Fig. 2 is a schematic flow chart of an underwater distance measuring method according to a second embodiment of the present invention, which further details the above-described embodiment. As shown in fig. 2, an underwater distance measuring method provided by the second embodiment of the present invention includes:
S210, fixedly mounting the laser emitter and the camera module to form a preset three-dimensional coordinate system.
Specifically, as shown in fig. 4, the laser emitter 1 and the camera module 2 are fixedly installed. The camera module 2 mainly includes an image sensor 3 and a lens 4, and the image sensor 3 can be a CCD or CMOS sensor. The preset three-dimensional coordinate system is a coordinate system shared by the laser emitter 1 and the camera module 2, and its origin is the focus O of the camera module 2. The line passing through the optical center of the image sensor 3 and the focus O is the main optical axis of the camera module, which serves as the z-axis of the preset three-dimensional coordinate system. The image sensor 3 has a two-dimensional coordinate system, i.e. the coordinate system of the camera module 2; the preset three-dimensional coordinate system Oxyz is completed by taking the line through the focus O parallel to the horizontal axis of the camera module 2 coordinate system as the x-axis, and the line through the focus O parallel to the vertical axis of that coordinate system as the y-axis.
S220, calibrating the camera module to determine a preset laser ray path of the laser ray emitted by the laser emitter under the preset three-dimensional coordinate system.
Specifically, calibration obtains the relevant calculation parameters of the testing device, such as the preset laser ray path. During calibration, the laser emitter emits laser rays at a sample while the distance between the sample and the origin of the preset three-dimensional coordinate system is varied; several groups of laser reflection images at different distances are collected through the camera module, and the actual expression of the laser ray path in the preset three-dimensional coordinate system is determined from the pixel coordinates of the laser points in these images together with the corresponding actual distances.
Further, as shown in fig. 3, the calibration process comprises the following steps:
s221, emitting laser light at a first preset distance from the sample through the laser emitter, and obtaining a first laser image of the sample based on the first preset distance through the camera module.
Specifically, as shown in fig. 5, the sample 6 is first placed at the first preset distance D1, the laser emitter 1 emits a laser ray to the sample 6, forming a first laser point A1 (i.e., a first laser reflection point A1) on the sample 6, and the camera module 2 then obtains a first laser image of the sample 6, which includes a first laser imaging point B1 corresponding to the first laser point A1.
S222, emitting laser light at a second preset distance from the sample through the laser emitter, and obtaining a second laser image of the sample based on the second preset distance through the camera module.
Specifically, as shown in fig. 5, after the first laser image is obtained, the sample 6 is moved to the second preset distance D2, the laser emitter 1 emits a laser ray to the sample 6, forming a second laser point A2 (i.e., a second laser reflection point A2) on the sample 6, and the camera module 2 then obtains a second laser image of the sample 6, which includes a second laser imaging point B2 corresponding to the second laser point A2.
S223, determining a first image coordinate of a first laser imaging point in the first laser image and a second image coordinate of a second laser imaging point in the second laser image.
Specifically, the first image coordinate of the first laser imaging point is the pixel coordinate, in the first laser image, of the first laser imaging point B1 corresponding to the first laser point A1, denoted (X1, Y1). The second image coordinate of the second laser imaging point is the pixel coordinate, in the second laser image, of the second laser imaging point B2 corresponding to the second laser point A2, denoted (X2, Y2).
S224, determining a first position coordinate of a first laser point corresponding to the first laser imaging point in the preset three-dimensional coordinate system according to the first image coordinate, and determining a second position coordinate of a second laser point corresponding to the second laser imaging point in the preset three-dimensional coordinate system according to the second image coordinate.
Specifically, after the image coordinates of the laser imaging points are determined, the position coordinates of the laser points in the preset three-dimensional coordinate system are determined from the image coordinates, to facilitate subsequent calculation.
Further, the step of calculating the position coordinates of the laser points comprises: acquiring the intrinsic parameters of the camera module; converting the first image coordinate into the first position coordinate under the preset three-dimensional coordinate system according to the intrinsic parameters and the first preset distance; and converting the second image coordinate into the second position coordinate under the preset three-dimensional coordinate system according to the intrinsic parameters and the second preset distance.
Specifically, the intrinsic parameters of the camera module include the focal length F, the pixel width P, and the optical center coordinate (Xc, Yc). The focal length F is the distance from the center of the lens 4 to the focus O; the pixel width P is the actual physical width represented by a single pixel; and the optical center coordinate (Xc, Yc) is the pixel coordinate of the optical center of the image sensor 3 in its own image coordinate system.
In the preset three-dimensional coordinate system, the triangle formed by the focus O, the optical center of the image sensor 3, and the laser imaging point is similar to the triangle formed by the focus O, the projection point of the main optical axis (z-axis) on the sample 6 (represented as (0, 0, actual object distance) in the preset three-dimensional coordinate system), and the laser reflection point. Let the first image coordinate (X1, Y1) of the first laser imaging point B1 correspond to the first position coordinate (x1, y1, z1) of the first laser point A1, and the second image coordinate (X2, Y2) of the second laser imaging point B2 correspond to the second position coordinate (x2, y2, z2) of the second laser point A2. From the properties of similar triangles:
x1 = (X1 - Xc) * P * D1 / (F * 10^3)
y1 = (Y1 - Yc) * P * D1 / (F * 10^3)
z1 = D1
x2 = (X2 - Xc) * P * D2 / (F * 10^3)
y2 = (Y2 - Yc) * P * D2 / (F * 10^3)
z2 = D2
Illustratively, the focal length F is 3.8 mm, the pixel width P is 3.1 micrometers, and the optical center coordinate is (960, 540). The first preset distance D1 is 85 cm with first image coordinate (1288, 409); the second preset distance D2 is 30 cm with second image coordinate (1300, 332). According to the above equations, the first position coordinate (227442, -90838, 850000) (in micrometers) and the second position coordinate (83210, -10010, 300000) (in micrometers) are obtained.
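The similar-triangle conversion can be sketched in Python; the call below reproduces the first position coordinate of the worked example (function and parameter names are illustrative, not from the patent):

```python
def laser_point_position(img_xy, center_xy, pixel_width_um, focal_length_mm, distance_um):
    """Convert a calibration image coordinate (pixels) into the laser
    point's position coordinate (micrometers) in the preset 3D coordinate
    system, using the similar-triangle relation at a known sample distance."""
    scale = distance_um / (focal_length_mm * 1000.0)     # D / (F * 10^3)
    x = (img_xy[0] - center_xy[0]) * pixel_width_um * scale
    y = (img_xy[1] - center_xy[1]) * pixel_width_um * scale
    return (x, y, distance_um)

# First calibration shot of the worked example: D1 = 85 cm = 850000 um
print(laser_point_position((1288, 409), (960, 540), 3.1, 3.8, 850000.0))
```

Rounded to integers, this yields the first position coordinate (227442, -90838, 850000) stated above.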
And S225, determining a preset laser ray path according to the first position coordinate and the second position coordinate.
Specifically, the first laser point A1 at the first position coordinate (x1, y1, z1) and the second laser point A2 at the second position coordinate (x2, y2, z2) are both points on the laser ray emitted by the laser emitter 1. The preset laser ray path can therefore be obtained from the first position coordinate (x1, y1, z1) and the second position coordinate (x2, y2, z2), and stored as a calculation parameter.
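One way to store the preset laser ray path as a calculation parameter is a point-plus-direction parametric form, P(s) = A1 + s * (A2 - A1) (an illustrative representation; the patent does not prescribe one):

```python
def laser_ray_path(a1, a2):
    """Parametric form of the preset laser ray from the two calibration
    laser points A1 and A2 in the preset 3D coordinate system: returns a
    point on the ray and a direction vector, P(s) = A1 + s * (A2 - A1)."""
    direction = tuple(a2[i] - a1[i] for i in range(3))
    if all(c == 0 for c in direction):
        raise ValueError("calibration points must be distinct")
    return tuple(a1), direction
```

Any point on the stored ray is then recovered by choosing a value of the parameter s.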
And S230, acquiring a laser reflection image of the object to be detected.
Specifically, when measuring the distance of the object to be measured, as shown in fig. 4, the laser emitter 1 emits laser light to the object 5 to be measured, and a laser point A is formed on the object 5. A laser reflection image is then obtained through the camera module 2; it includes the laser imaging point B of the laser point A on the camera module 2.
And S240, determining the imaging coordinate of the laser imaging point on the laser reflection image.
Specifically, the imaging coordinate of the laser imaging point on the laser reflection image refers to the pixel position of the imaging point of the laser point in the camera module, and is represented by the pixel coordinate, which is a two-dimensional pixel coordinate of the laser imaging point in the image coordinate system of the camera module. Illustratively, the imaging coordinate of the laser imaging point is (1248, 386).
And S250, converting the imaging coordinate into an imaging three-dimensional coordinate under the preset three-dimensional coordinate system according to the intrinsic parameters.
Specifically, the imaging three-dimensional coordinate (XB, YB, ZB) is the coordinate of the laser imaging point B in the preset three-dimensional coordinate system, calculated as:
XB = (X0 - Xc) * P
YB = (Y0 - Yc) * P
ZB = F * 10^3
Illustratively, the imaging three-dimensional coordinate (893, -477, 3800) is obtained from the imaging coordinate (1248, 386).
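The conversion formulas above can be sketched as a short Python helper (an illustrative implementation using the intrinsic parameters from the example; the function name and defaults are assumptions):

```python
def image_to_3d(u, v, f_mm=3.8, pixel_um=3.1, cx=960, cy=540):
    """Convert an image (pixel) coordinate of the laser imaging point
    into its 3D coordinate on the sensor plane, in microns."""
    xb = (u - cx) * pixel_um      # XB = (X0 - Xc) * P
    yb = (v - cy) * pixel_um      # YB = (Y0 - Yc) * P
    zb = f_mm * 1000.0            # ZB = F * 10^3 (mm -> microns)
    return xb, yb, zb

print(image_to_3d(1248, 386))     # ≈ (893, -477, 3800)
```

Rounding the result reproduces the example's imaging three-dimensional coordinate.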
And S260, determining the path of the reflected light of the object to be detected according to the imaging three-dimensional coordinate and the coordinate of the focus.
Specifically, the laser imaging point B and the focal point are both points on the laser reflected light, so the reflected light path can be obtained from the imaging three-dimensional coordinate together with the focal point coordinate (i.e., the origin coordinate).
S270, determining intersection point coordinates of the reflected light ray path and a preset laser light ray path, wherein the z-axis value of the intersection point coordinates is the distance of the object to be measured.
Specifically, the laser point A lies on both the preset laser ray path and the reflected light ray path, so obtaining the intersection point coordinate of the two paths determines the three-dimensional coordinate of the laser point A, and the z-axis value of the intersection point coordinate is the distance H of the object 5 to be measured. For example, the reflected light ray path determined from the imaging three-dimensional coordinate (893, -477, 3800) yields a distance H of 55 cm for the object 5 to be measured.
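The intersection step can be sketched as the closest-point computation between two 3D lines. With noisy measurements the two rays are generally skew, so one common choice is the midpoint of their closest-approach segment; this generalization and the function names are assumptions for illustration, not the patent's stated method:

```python
def _dot(a, b):
    """Dot product of two 3-vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def ray_meeting_z(o1, d1, o2, d2):
    """z value where two 3D lines (origin, direction) meet; for skew
    lines, the midpoint of the closest-approach segment is used."""
    w0 = tuple(a - b for a, b in zip(o1, o2))
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel lines
    s = (b * e - c * d) / denom      # parameter on line 1
    t = (a * e - b * d) / denom      # parameter on line 2
    p1 = tuple(o + s * di for o, di in zip(o1, d1))
    p2 = tuple(o + t * di for o, di in zip(o2, d2))
    return (p1[2] + p2[2]) / 2.0

# Synthetic check: two lines crossing at (5, 0, 10), so z = 10.
z = ray_meeting_z((0, 0, 0), (1, 0, 2), (10, 0, 0), (-1, 0, 2))
```

In the units of the example, dividing the returned z (in microns) by 10,000 gives the distance in centimeters (e.g., 550000 microns corresponds to 55 cm).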
The underwater distance measurement method provided by the embodiment of the invention can realize distance measurement of a long-distance object, and has the advantages of simple test equipment, strong robustness, strong underwater anti-interference capability, simple and quick calculation mode and high measurement precision.
Example Three
Fig. 6 is a schematic structural diagram of an underwater distance measuring device according to a third embodiment of the present invention. The underwater ranging device provided by this embodiment can implement the underwater ranging method provided by any embodiment of the invention, and has the corresponding functional structure and beneficial effects; for content not described in detail in this embodiment, refer to the description of any method embodiment of the invention.
As shown in fig. 6, an underwater ranging device provided by a third embodiment of the present invention includes: an image acquisition module 310, an imaging coordinate determination module 320, a coordinate conversion module 330, a reflected light path determination module 340, and a distance determination module 350, wherein:
the image acquisition module 310 is configured to acquire a laser reflection image of an object to be detected;
the imaging coordinate determination module 320 is configured to determine imaging coordinates of a laser imaging point on the laser reflection image;
the coordinate conversion module 330 is configured to determine an imaging three-dimensional coordinate of the imaging coordinate in a preset three-dimensional coordinate system, where the preset three-dimensional coordinate system takes the focus of the camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the horizontal axis of the camera module coordinate system as the x-axis, and a line parallel to the vertical axis of the camera module coordinate system as the y-axis;
the reflected light path determining module 340 is configured to determine a reflected light path of the object to be detected based on the imaged three-dimensional coordinates;
the distance determining module 350 is configured to determine the distance of the object to be measured based on the reflected light path and a preset laser light path.
Further, the device also comprises:
the preset three-dimensional coordinate system forming module is used for fixedly mounting the laser transmitter and the camera module to form a preset three-dimensional coordinate system;
and the correction module is used for correcting the camera module so as to determine a preset laser ray path of the laser ray emitted by the laser emitter under the preset three-dimensional coordinate system.
Further, the correction module comprises:
the first laser image acquisition unit is used for transmitting laser light at a first preset distance from a sample through the laser transmitter and acquiring a first laser image of the sample based on the first preset distance through the camera module;
the second laser image acquisition unit is used for transmitting laser light at a second preset distance from the sample through the laser transmitter and acquiring a second laser image of the sample based on the second preset distance through the camera module;
the image coordinate determination unit is used for determining a first image coordinate of a first laser imaging point in the first laser image and a second image coordinate of a second laser imaging point in the second laser image;
the position coordinate determining unit is used for determining a first position coordinate of a first laser point corresponding to the first laser imaging point in the preset three-dimensional coordinate system according to the first image coordinate and determining a second position coordinate of a second laser point corresponding to the second laser imaging point in the preset three-dimensional coordinate system according to the second image coordinate;
and the laser ray path determining unit is used for determining a preset laser ray path according to the first position coordinate and the second position coordinate.
Further, the position coordinate determination unit is specifically configured to:
acquiring intrinsic parameters of the camera module;
converting the first image coordinate into a first position coordinate under the preset three-dimensional coordinate system according to the inherent parameter and the first preset distance;
and converting the second image coordinate into a second position coordinate under the preset three-dimensional coordinate system according to the inherent parameter and the second preset distance.
Further, the coordinate transformation module 330 is specifically configured to:
and converting the imaging coordinate into an imaging three-dimensional coordinate under a preset three-dimensional coordinate system according to the inherent parameters.
Further, the reflected light path determining module 340 is specifically configured to:
and determining the path of the reflected light of the object to be detected according to the imaging three-dimensional coordinate and the coordinate of the focus.
Further, the distance determining module 350 is specifically configured to:
and determining intersection point coordinates of the reflected light ray path and a preset laser light ray path, wherein the z-axis value of the intersection point coordinates is the distance of the object to be measured.
The underwater ranging device provided by the third embodiment of the invention can realize ranging of long-distance objects through the image acquisition module, the imaging coordinate determination module, the coordinate conversion module, the reflection ray path determination module and the distance determination module, and has the advantages of simple test equipment, strong robustness, strong underwater anti-interference capability, simple and rapid calculation mode and high measurement precision.
Example Four
Fig. 7 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention, showing a block diagram of an exemplary electronic device 412 (hereinafter, device 412) suitable for implementing embodiments of the present invention. The device 412 shown in Fig. 7 is only an example and should not impose any limitation on the functionality or scope of use of embodiments of the present invention.
As shown in fig. 7, the device 412 is in the form of a general purpose electronic device. The components of device 412 may include, but are not limited to: one or more processors 416, a storage device 428, and a bus 418 that couples the various device components including the storage device 428 and the processors 416.
Bus 418 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Device 412 typically includes a variety of computer readable media. Such media can be any available media that is accessible by device 412, and include both volatile and nonvolatile media, and removable and non-removable media.
Storage 428 may include computer readable media in the form of volatile memory, such as Random Access Memory (RAM) 430 and/or cache memory 432. The device 412 may further include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, storage device 434 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in Fig. 7, and commonly referred to as a "hard drive"). Although not shown in Fig. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk such as a Compact Disc Read-Only Memory (CD-ROM), Digital Video Disc Read-Only Memory (DVD-ROM), or other optical media may be provided. In these cases, each drive may be connected to bus 418 by one or more data media interfaces. Storage 428 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 440 having a set (at least one) of program modules 442 may be stored, for instance, in storage 428; such program modules 442 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 442 generally perform the functions and/or methodologies of the described embodiments of the invention.
The device 412 may also communicate with one or more external devices 414 (e.g., a keyboard, a pointing device, a display 424, etc.), with one or more devices that enable a user to interact with the device 412, and/or with any devices (e.g., network card, modem, etc.) that enable the device 412 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 422. Further, the device 412 may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 420. As shown in Fig. 7, the network adapter 420 communicates with the other modules of the device 412 via bus 418. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the device 412, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
The processor 416, by executing programs stored in the storage device 428, performs various functional applications and data processing, such as implementing a method for underwater ranging provided by any embodiment of the present invention, which may include:
acquiring a laser reflection image of an object to be detected;
determining imaging coordinates of a laser imaging point on the laser reflection image;
determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system takes the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the horizontal axis of the camera module coordinate system as the x-axis, and a line parallel to the vertical axis of the camera module coordinate system as the y-axis;
determining the path of a reflected light ray of the object to be detected based on the imaging three-dimensional coordinates;
and determining the distance of the object to be measured based on the reflected light path and a preset laser light path.
Example Five
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements an underwater ranging method according to any embodiment of the present invention, where the method may include:
acquiring a laser reflection image of an object to be detected;
determining imaging coordinates of a laser imaging point on the laser reflection image;
determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system takes the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the horizontal axis of the camera module coordinate system as the x-axis, and a line parallel to the vertical axis of the camera module coordinate system as the y-axis;
determining the path of a reflected light ray of the object to be detected based on the imaging three-dimensional coordinates;
and determining the distance of the object to be measured based on the reflected light path and a preset laser light path.
Computer storage media for embodiments of the invention may employ any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An underwater ranging method, comprising:
acquiring a laser reflection image of an object to be detected;
determining imaging coordinates of a laser imaging point on the laser reflection image;
determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system takes the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the horizontal axis of the camera module coordinate system as the x-axis, and a line parallel to the vertical axis of the camera module coordinate system as the y-axis;
determining the path of a reflected light ray of the object to be detected based on the imaging three-dimensional coordinates;
and determining the distance of the object to be measured based on the reflected light path and a preset laser light path.
2. The method of claim 1, wherein prior to acquiring the laser reflection image of the object under test, further comprising:
fixedly mounting a laser transmitter and a camera module to form a preset three-dimensional coordinate system;
and correcting the camera module to determine a preset laser ray path of the laser ray emitted by the laser emitter under the preset three-dimensional coordinate system.
3. The method of claim 2, wherein calibrating the camera module to determine a predetermined laser ray path of the laser ray emitted by the laser emitter in the predetermined three-dimensional coordinate system comprises:
emitting laser light at a first preset distance from a sample through the laser emitter, and acquiring a first laser image of the sample based on the first preset distance through the camera module;
emitting laser light at a second preset distance from the sample through the laser emitter, and acquiring a second laser image of the sample based on the second preset distance through the camera module;
determining a first image coordinate of a first laser imaging point in the first laser image and a second image coordinate of a second laser imaging point in the second laser image;
determining a first position coordinate of a first laser point corresponding to the first laser imaging point under the preset three-dimensional coordinate system according to the first image coordinate and determining a second position coordinate of a second laser point corresponding to the second laser imaging point under the preset three-dimensional coordinate system according to the second image coordinate;
and determining a preset laser ray path according to the first position coordinate and the second position coordinate.
4. The method of claim 3, wherein determining first position coordinates of a first laser spot in the predetermined three-dimensional coordinate system corresponding to the first laser imaging point from the first image coordinates and second position coordinates of a second laser spot in the predetermined three-dimensional coordinate system corresponding to the second laser imaging point from the second image coordinates comprises:
acquiring intrinsic parameters of the camera module;
converting the first image coordinate into a first position coordinate under the preset three-dimensional coordinate system according to the inherent parameter and the first preset distance;
and converting the second image coordinate into a second position coordinate under the preset three-dimensional coordinate system according to the inherent parameter and the second preset distance.
5. The method of claim 4, wherein determining the imaged three-dimensional coordinates of the imaged coordinates in the predetermined three-dimensional coordinate system comprises:
and converting the imaging coordinate into an imaging three-dimensional coordinate under a preset three-dimensional coordinate system according to the inherent parameters.
6. The method of claim 4, wherein determining the reflected light path of the object under test based on the imaged three-dimensional coordinates comprises:
and determining the path of the reflected light of the object to be detected according to the imaging three-dimensional coordinate and the coordinate of the focus.
7. The method of claim 1, wherein determining the distance of the object to be measured based on the reflected ray path and a predetermined laser ray path comprises:
and determining intersection point coordinates of the reflected light ray path and a preset laser light ray path, wherein the z-axis value of the intersection point coordinates is the distance of the object to be measured.
8. An underwater distance measuring device, comprising:
the image acquisition module is used for acquiring a laser reflection image of the object to be detected;
the imaging coordinate determination module is used for determining the imaging coordinate of a laser imaging point on the laser reflection image;
the coordinate conversion module is used for determining an imaging three-dimensional coordinate of the imaging coordinate under a preset three-dimensional coordinate system, wherein the preset three-dimensional coordinate system takes the focus of a camera module as the origin, the main optical axis of the camera module as the z-axis, a line parallel to the horizontal axis of the camera module coordinate system as the x-axis, and a line parallel to the vertical axis of the camera module coordinate system as the y-axis;
the reflected light path determining module is used for determining the reflected light path of the object to be detected based on the imaging three-dimensional coordinate;
and the distance determining module is used for determining the distance of the object to be measured based on the reflected light ray path and a preset laser light ray path.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the underwater ranging method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the underwater ranging method according to any one of claims 1 to 7.
CN202110018274.3A 2021-01-07 2021-01-07 Underwater distance measuring method, device, equipment and storage medium Pending CN112835062A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110018274.3A CN112835062A (en) 2021-01-07 2021-01-07 Underwater distance measuring method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110018274.3A CN112835062A (en) 2021-01-07 2021-01-07 Underwater distance measuring method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112835062A 2021-05-25

Family

ID=75927767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110018274.3A Pending CN112835062A (en) 2021-01-07 2021-01-07 Underwater distance measuring method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112835062A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101629806A (en) * 2009-06-22 2010-01-20 哈尔滨工程大学 Nonlinear CCD 3D locating device combined with laser transmitter and locating method thereof
CN103712572A (en) * 2013-12-18 2014-04-09 同济大学 Structural light source-and-camera-combined object contour three-dimensional coordinate measuring device
CN108286958A (en) * 2018-02-06 2018-07-17 北京优尔博特创新科技有限公司 A kind of distance measuring method and range-measurement system
CN109727290A (en) * 2018-12-26 2019-05-07 南京理工大学 Zoom camera dynamic calibrating method based on monocular vision triangle telemetry
CN111006610A (en) * 2019-12-13 2020-04-14 中国科学院光电技术研究所 Underwater three-dimensional measurement data correction method based on structured light three-dimensional measurement
CN111336948A (en) * 2020-03-02 2020-06-26 武汉理工大学 Non-calibration handheld profile detection method and device based on imaging plane conversion


Similar Documents

Publication Publication Date Title
CN109489620B (en) Monocular vision distance measuring method
Kang et al. Two-view underwater structure and motion for cameras under flat refractive interfaces
WO2022227844A1 (en) Laser radar correction apparatus and method
KR102254627B1 (en) High throughput and low cost height triangulation system and method
CN205333856U (en) Low -cost laser rangefinder based on ordinary camera chip
CN114152935B (en) Method, device and equipment for evaluating radar external parameter calibration precision
CN114460588B (en) High-precision imaging method based on active acoustic imager
CN115984371A (en) Scanning head posture detection method, device, equipment and medium
JP5599849B2 (en) Lens inspection apparatus and method
CN112215903A (en) Method and device for detecting river flow velocity based on ultrasonic wave and optical flow method
CN112014829B (en) Performance index testing method and device of laser radar scanner
AU2016321728B2 (en) An apparatus and a method for encoding an image captured by an optical acquisition system
EP3400414B1 (en) Depth map generation in structured light system
CN116152357B (en) Parameter calibration system and method for infinity focusing camera
CN112835062A (en) Underwater distance measuring method, device, equipment and storage medium
CN114556431A (en) 3D reconstruction of augmented reality
CN109212546B (en) Method and device for calculating depth direction measurement error of binocular camera
CN115164776B (en) Three-dimensional measurement method and device for fusion of structured light decoding and deep learning
CN116091700A (en) Three-dimensional reconstruction method, three-dimensional reconstruction device, terminal equipment and computer readable medium
CN111637837B (en) Method and system for measuring size and distance of object by monocular camera
CN108776338A (en) Signal source space method for sensing, device and active sensor-based system
CN112630750A (en) Sensor calibration method and sensor calibration device
CN113763457A (en) Method and device for calibrating drop terrain, electronic equipment and storage medium
KR102402432B1 (en) Apparatus and method for generating data representing a pixel beam
CN116883516B (en) Camera parameter calibration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination