CN106558038B - Water-sky line detection method and device - Google Patents

Water-sky line detection method and device

Info

Publication number
CN106558038B
CN106558038B CN201510595935.3A
Authority
CN
China
Prior art keywords
image
coordinate
camera
pixel
default
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510595935.3A
Other languages
Chinese (zh)
Other versions
CN106558038A (en)
Inventor
胡庭波
吴涛
安向京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN201510595935.3A
Publication of CN106558038A
Application granted
Publication of CN106558038B
Legal status: Active
Anticipated expiration

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the invention disclose a water-sky line detection method and device. Based on images of a target water area captured at the same moment by a first camera and a second camera, the spatial position of the water plane of the target water area in a preset three-dimensional coordinate system is obtained; from that spatial position, the projection line of the water plane in the image is computed, and this projection line is determined as the water-sky line of the target water area. The disclosed method and device can detect the water-sky line accurately even when the surface colour and texture of the water area change or the illumination varies. Moreover, the position of the water-sky line in the image can still be detected correctly when no real water-sky line appears in the captured image or when the water-sky line is occluded, which effectively improves the adaptability of the detection method and device.

Description

Water-sky line detection method and device
Technical field
The present invention relates to the technical field of image processing, and in particular to a water-sky line detection method and device.
Background technique
The water-sky line is the boundary between water and sky in an image of a water area captured by a camera or video camera, and it is of great significance for safe marine operation. For example, with the water-sky line in a water-surface image, obstacles on the water surface can be detected within the image; furthermore, from the position of the water-sky line and the position of an obstacle in the image, the three-dimensional coordinates of the obstacle in a spatial coordinate system can be calculated, so that the vessel can be warned and the obstacle avoided in advance.
At present, existing water-sky line detection methods are mainly based on monocular images. The typical procedure is: using the pixel features of each pixel in a water-surface image acquired by a video camera, edge extraction is performed on the image to extract the edge lines of objects in it; the extracted edge lines are then fitted to straight lines with a fitting algorithm such as least squares, yielding at least one fitted line; finally, one of the fitted lines is determined to be the water-sky line.
In practical applications, however, this approach fails in several common situations. When the vessel is not sailing in narrow waters and the camera's field of view is open, when the distant water-sky line is occluded by obstacles, or when the vessel is surrounded by land in the distance, the water-surface image acquired by the camera may contain no real water-sky line at all, and the existing detection method cannot detect the water-sky line accurately and effectively. In addition, when fog lies on the water surface, the contrast of the image at the water-sky boundary may be too low and detection fails; and when the illumination or the colour of the water surface changes markedly, the existing method likewise cannot detect the water-sky line accurately.
Summary of the invention
Embodiments of the present invention provide a water-sky line detection method and device, so as to solve the problem that existing water-sky line detection methods cannot detect the water-sky line correctly and accurately.
To solve the above technical problem, embodiments of the invention disclose the following technical solutions:
A water-sky line detection method, the method comprising:
acquiring the images respectively captured of a target water area at the same moment by a first camera and a second camera spaced apart by a preset distance, wherein the image captured by the first camera is a first image and the image captured by the second camera is a second image;
performing stereo matching on the pixels in the first image and the second image to obtain the disparity value of each pixel in the first image;
determining, from the image coordinates of each pixel in the first image and the disparity value of each pixel in the first image, the three-dimensional coordinates of each pixel in the first image in a preset three-dimensional coordinate system;
determining, using the three-dimensional coordinates of multiple pixels in the first image in the preset three-dimensional coordinate system, the spatial position of the water plane of the target water area in the preset three-dimensional coordinate system;
obtaining, from the spatial position of the water plane in the preset three-dimensional coordinate system, the projection line of the water plane in the first image, and determining this projection line as the water-sky line of the target water area in the first image.
Optionally, performing stereo matching on the pixels in the first image and the second image to obtain the disparity value of each pixel in the first image comprises:
for each pixel in the first image, determining the row in which the pixel lies, and searching the same row of the second image for the pixel with the greatest similarity to it, as the corresponding pixel in the second image;
calculating the difference between the column number of the pixel in the first image and the column number of its corresponding pixel in the second image, as the disparity value of the pixel.
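The row-wise search and column-difference step can be sketched as follows. This is a minimal illustration on grayscale arrays that uses a windowed sum-of-absolute-differences cost as a stand-in for the similarity measure; function and parameter names are assumptions, not the patent's.

```python
import numpy as np

def disparity_row_search(left, right, window=3):
    """For each pixel in `left`, search the same row of `right` for the most
    similar pixel (by windowed SAD) and record the column difference as the
    disparity value.  Brute-force sketch; border pixels are skipped."""
    h, w = left.shape
    r = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
            best_cost, best_x = np.inf, x
            for x2 in range(r, w - r):                 # same row of the second image
                cand = right[y - r:y + r + 1, x2 - r:x2 + r + 1].astype(np.float64)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_x = cost, x2
            disp[y, x] = x - best_x                    # column difference = disparity
    return disp
```

In practice a real implementation would restrict the search to a plausible disparity range instead of the whole row, but the structure is the same.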
Optionally, the method further comprises:
before the images captured by the first camera and the second camera are acquired for the first time, calibrating the first camera and the second camera;
obtaining from the calibration results the image coordinates of the projection of the first camera's optical centre in the first image, and the preset distance between the first camera and the second camera;
establishing, using the image coordinates of the projection of the first camera's optical centre in the first image and the preset distance between the first camera and the second camera, the optical-centre coordinate system of the first camera, which serves as the preset three-dimensional coordinate system.
Optionally, determining the three-dimensional coordinates of each pixel in the first image in the preset three-dimensional coordinate system from its image coordinates and its disparity value comprises:
for each pixel in the first image, calculating its three-dimensional coordinates in the preset three-dimensional coordinate system using the following equation:
where (u1, v1) are the image coordinates of the pixel in the first image, (u0, v0) are the image coordinates of the projection of the first camera's optical centre in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the disparity value of the pixel relative to its corresponding pixel in the second image.
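The equation image itself did not survive extraction. Under the usual pinhole stereo model, and consistent with the symbols defined above, the standard relation would be (axis conventions are an assumption):

```latex
X = \frac{b\,(u_1 - u_0)}{d}, \qquad
Y = \frac{b\,(v_1 - v_0)}{d}, \qquad
Z = \frac{b\,f}{d}
```

That is, depth is inversely proportional to disparity, and the lateral coordinates scale the pixel offset from the principal point by the same factor b/d.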
Optionally, determining the spatial position of the water plane of the target water area in the preset three-dimensional coordinate system using the three-dimensional coordinates of multiple pixels in the first image comprises:
selecting multiple pixels in the first image as sampling points, all of the sampling points forming a sample set;
monitoring the number of reference values obtained so far;
comparing the number of reference values with a preset reference-value count;
when the number of reference values is less than the preset count, randomly selecting a preset number of sampling points from the sample set, and calculating, from their three-dimensional coordinates in the preset three-dimensional coordinate system, the spatial position of the target plane containing those sampling points;
calculating the distance between each sampling point in the sample set and the target plane, taking the median of all these distances as one reference value, and increasing the number of reference values by one as the count to be monitored next time;
when the number of reference values equals the preset count, selecting the smallest reference value among them, and taking the spatial position of the target plane corresponding to that smallest reference value as the spatial position of the water plane in the preset three-dimensional coordinate system.
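The sampling loop described above amounts to a least-median-of-squares plane fit. A compact sketch under assumed names:

```python
import numpy as np

def fit_water_plane(points, n_trials=50, sample_size=3, seed=0):
    """Least-median-of-squares plane fit mirroring the loop above: repeatedly
    pick `sample_size` points, fit a candidate plane through them, score it by
    the median point-to-plane distance over ALL sampling points (the
    'reference value'), and keep the candidate with the smallest median."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=np.float64)
    best_plane, best_median = None, np.inf
    for _ in range(n_trials):                      # n_trials = preset reference-value count
        idx = rng.choice(len(pts), size=sample_size, replace=False)
        p0, p1, p2 = pts[idx]
        normal = np.cross(p1 - p0, p2 - p0)        # plane through the 3 sampled points
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                           # degenerate (collinear) sample
            continue
        normal /= norm
        dists = np.abs((pts - p0) @ normal)        # distance of every sample to the plane
        med = np.median(dists)                     # one 'reference value'
        if med < best_median:
            best_median, best_plane = med, (normal, p0)
    return best_plane, best_median
```

Scoring a candidate by the median rather than the mean distance lets up to half of the sampling points be outliers (waves, boats, sky pixels) without pulling the fitted plane off the water surface.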
Optionally, the method further comprises:
finding the target sampling points in the sample set, a target sampling point being one whose distance from the spatial position of the water plane in the preset three-dimensional coordinate system is less than a preset threshold;
removing from the sample set the sampling points other than the target sampling points, and re-obtaining the spatial position of the water plane in the preset three-dimensional coordinate system using the sample set that now contains only the target sampling points.
Optionally, obtaining the projection line of the water plane in the first image from the spatial position of the water plane in the preset three-dimensional coordinate system comprises:
obtaining the spatial position of the first camera's optical centre in the preset three-dimensional coordinate system;
calculating, from the spatial positions of the optical centre and of the water plane in the preset three-dimensional coordinate system, the spatial position of the optical-centre plane, i.e. the plane that contains the first camera's optical centre and is parallel to the water plane;
calculating, from the spatial position of the optical-centre plane and the image coordinates of the projection of the optical centre in the first image, the projection line of the optical-centre plane in the first image, and determining this projection line as the projection line of the water plane in the first image.
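A sketch of how the optical-centre plane projects to an image line: with the optical centre at the origin of the preset coordinate system, a pixel lies on the projection line exactly when its viewing ray is orthogonal to the water-plane normal. The conventions used here (focal length in pixels, (u, v) axis directions) are assumptions.

```python
import numpy as np

def horizon_line(normal, f, u0, v0):
    """Image line a*u + b*v + c = 0 onto which the plane through the optical
    centre with unit normal `normal` projects.  A pixel (u, v) has viewing ray
    ((u - u0)/f, (v - v0)/f, 1); requiring normal . ray = 0 and multiplying
    through by f gives the line coefficients."""
    nx, ny, nz = normal
    a, b = nx, ny
    c = -nx * u0 - ny * v0 + nz * f
    return a, b, c
```

For a level camera whose water-plane normal lies along the image's vertical axis, the line reduces to the row (or column, depending on axis convention) through the principal point, as one would expect for a horizon at eye level.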
Optionally, the method further comprises:
performing noise reduction on the first image and the second image after they are acquired.
A water-sky line detection device, the device comprising:
an image acquisition unit, configured to acquire the images respectively captured of a target water area at the same moment by a first camera and a second camera spaced apart by a preset distance, wherein the image captured by the first camera is a first image and the image captured by the second camera is a second image;
a disparity acquisition unit, configured to perform stereo matching on the pixels in the first image and the second image to obtain the disparity value of each pixel in the first image;
a pixel-coordinate determination unit, configured to determine, from the image coordinates and the disparity value of each pixel in the first image, the three-dimensional coordinates of each pixel in the first image in a preset three-dimensional coordinate system;
a water-plane determination unit, configured to determine, using the three-dimensional coordinates of multiple pixels in the first image in the preset three-dimensional coordinate system, the spatial position of the water plane of the target water area in the preset three-dimensional coordinate system;
a water-sky line determination unit, configured to obtain, from the spatial position of the water plane in the preset three-dimensional coordinate system, the projection line of the water plane in the first image, and to determine this projection line as the water-sky line of the target water area in the first image.
Optionally, the disparity acquisition unit comprises:
a corresponding-pixel search unit, configured to determine, for each pixel in the first image, the row in which the pixel lies, and to search the same row of the second image for the pixel with the greatest similarity to it, as the corresponding pixel in the second image;
a disparity calculation unit, configured to calculate the difference between the column number of the pixel in the first image and the column number of its corresponding pixel in the second image, as the disparity value of the pixel.
Optionally, the device further comprises:
a camera calibration unit, configured to calibrate the first camera and the second camera before the images captured by them are acquired for the first time;
an acquisition unit, configured to obtain from the calibration results the image coordinates of the projection of the first camera's optical centre in the first image, and the preset distance between the first camera and the second camera;
a preset-coordinate-system establishing unit, configured to establish, using the image coordinates of the projection of the first camera's optical centre in the first image and the preset distance between the first camera and the second camera, the optical-centre coordinate system of the first camera as the preset three-dimensional coordinate system.
Optionally, the pixel-coordinate determination unit comprises:
a pixel-coordinate calculation unit, configured to calculate, for each pixel in the first image, its three-dimensional coordinates in the preset three-dimensional coordinate system using the following equation:
where (u1, v1) are the image coordinates of the pixel in the first image, (u0, v0) are the image coordinates of the projection of the first camera's optical centre in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the disparity value of the pixel relative to its corresponding pixel in the second image.
Optionally, the water-plane determination unit comprises:
a sample-set forming unit, configured to select multiple pixels in the first image as sampling points, all of the sampling points forming a sample set;
a reference-value counting unit, configured to monitor the number of reference values obtained so far;
a reference-value comparison unit, configured to compare the number of reference values with a preset reference-value count;
a target-plane calculation unit, configured to, when the number of reference values is less than the preset count, randomly select a preset number of sampling points from the sample set and calculate, from their three-dimensional coordinates in the preset three-dimensional coordinate system, the spatial position of the target plane containing those sampling points;
a reference-value acquisition unit, configured to calculate the distance between each sampling point in the sample set and the target plane, take the median of all these distances as one reference value, and increase the number of reference values by one as the count to be monitored next time;
a water-plane determination subunit, configured to, when the number of reference values equals the preset count, select the smallest reference value among them and take the spatial position of the target plane corresponding to that smallest reference value as the spatial position of the water plane in the preset three-dimensional coordinate system.
Optionally, the device further comprises:
a target-sampling-point search unit, configured to find the target sampling points in the sample set, a target sampling point being one whose distance from the spatial position of the water plane in the preset three-dimensional coordinate system is less than a preset threshold;
a water-plane re-acquisition unit, configured to remove from the sample set the sampling points other than the target sampling points, and to re-obtain the spatial position of the water plane in the preset three-dimensional coordinate system using the sample set that now contains only the target sampling points.
Optionally, the water-sky line determination unit comprises:
an optical-centre position acquisition unit, configured to obtain the spatial position of the first camera's optical centre in the preset three-dimensional coordinate system;
an optical-centre plane calculation unit, configured to calculate, from the spatial positions of the optical centre and of the water plane in the preset three-dimensional coordinate system, the spatial position of the optical-centre plane, i.e. the plane that contains the first camera's optical centre and is parallel to the water plane;
a projection-line acquisition unit, configured to calculate, from the spatial position of the optical-centre plane and the image coordinates of the projection of the optical centre in the first image, the projection line of the optical-centre plane in the first image, and to determine this projection line as the projection line of the water plane in the first image.
Optionally, the device further comprises:
a noise-reduction unit, configured to perform noise reduction on the first image and the second image after they are acquired.
As can be seen from the above technical solutions, the water-sky line detection method and device provided by the embodiments of the invention obtain, from images of a target water area captured at the same moment by a first camera and a second camera, the spatial position of the water plane of the target water area in a preset three-dimensional coordinate system, then use that spatial position to obtain the projection line of the water plane in the image, and determine this projection line as the water-sky line of the target water area.
Since the embodiments of the disclosure operate on binocular images and do not rely on the pixel features of individual pixels, the method does not fail to detect the water-sky line accurately and effectively when the surface colour and texture of the target water area change or when the illumination changes. Moreover, the embodiments only require that the water plane of the target water area be visible in the image: the position of the water-sky line in the image is obtained from the spatial position of the water plane in the preset three-dimensional coordinate system, so the water-sky line can be located correctly even when no real water-sky line appears in the captured image or when it is occluded, which effectively improves the adaptability of the detection method and device.
In addition, in one embodiment of the disclosure, the sampling points too far from the water plane are removed from the sample set, and the spatial position of the water plane in the preset three-dimensional coordinate system is recomputed from the remaining target sampling points, which enhances the robustness of the detection method and thus the accuracy of the finally determined water-sky line.
Brief description of the drawings
To explain the technical solutions of the embodiments of the invention or of the prior art more clearly, the drawings required in the description of the embodiments or of the prior art are briefly introduced below. Obviously, a person of ordinary skill in the art could obtain further drawings from these drawings without creative effort.
Fig. 1 is a flow diagram of a water-sky line detection method according to an embodiment of the invention;
Fig. 2 is a flow diagram of step S102 of Fig. 1 according to an embodiment of the invention;
Fig. 3 is a flow diagram of another water-sky line detection method according to an embodiment of the invention;
Fig. 4 is a flow diagram of step S103 of Fig. 1 according to an embodiment of the invention;
Fig. 5 is a flow diagram of step S104 of Fig. 1 according to an embodiment of the invention;
Fig. 6 is a flow diagram of step S105 of Fig. 1 according to an embodiment of the invention;
Fig. 7 is a schematic diagram of determining the projection line of the water plane in the first image according to an embodiment of the invention;
Fig. 8 is a schematic diagram of a water-sky line detection result according to an embodiment of the invention;
Fig. 9 is a schematic diagram of another water-sky line detection result according to an embodiment of the invention;
Fig. 10 is a structural diagram of a water-sky line detection device according to an embodiment of the invention.
Detailed description of embodiments
To enable those skilled in the art to better understand the technical solutions of the present invention, the technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from the embodiments of the invention without creative effort fall within the scope of protection of the invention.
Fig. 1 is a flow diagram of a water-sky line detection method provided by an embodiment of the disclosure. The method presupposes that two cameras, referred to as the first camera and the second camera, capture images of the target water area at the same time. As shown in Fig. 1, the method comprises the following steps.
In step S101, the images respectively captured of the target water area at the same moment by the first camera and the second camera, which are spaced apart by a preset distance, are acquired; the image captured by the first camera is the first image and the image captured by the second camera is the second image.
The first camera and the second camera are set to capture images of the target water area simultaneously. The target water area is a relatively small region of the water plane that lies entirely within the cameras' field of view. Each camera captures one image of the target water area every preset time interval; taking the first camera as an example, after capturing one image it waits, for instance, five seconds before capturing the next.
The first camera and the second camera are separated by a preset distance, which can be chosen according to the optimal mounting positions of the two cameras; it is normally set between 0.2 m and 0.8 m. The fields of view of the two cameras overlap, so both cameras photograph the same patch of water, and the first image captured by the first camera and the second image captured by the second camera contain many of the same objects, for example the same ship, or the same house on the bank.
Acquiring the images captured by the two cameras at the same moment may proceed as follows: the images captured by the first camera and the second camera at the current moment are acquired; five seconds later, the two cameras photograph the target water area simultaneously again, and the new pair of images is acquired in turn. All subsequent steps are carried out on each pair of images acquired in this way, so that the water-sky line is detected continuously from each first image and second image.
In one embodiment of the disclosure, after the first image and the second image are obtained, noise reduction is applied to both. Existing denoising techniques can be used, for example median filtering or Gaussian smoothing.
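As one concrete possibility for this noise-reduction step, a 3x3 median filter (one of the two techniques mentioned) can be sketched in a few lines; this naive loop version is for illustration only:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter over a single-channel image.  Each interior pixel is
    replaced by the median of its 3x3 neighbourhood, which suppresses
    salt-and-pepper (impulse) noise; border pixels are left unchanged."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(img[y - 1:y + 2, x - 1:x + 2])
    return out
```

A production implementation would use a library routine, but the effect is the same: an isolated bright or dark pixel is replaced by a value typical of its neighbourhood.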
In step S102, stereo matching is performed on the pixels in the first image and the second image, and the disparity value of each pixel in the first image is obtained.
Using an existing stereo matching algorithm, all pixels in the first image are matched against the pixels in the second image, which yields the disparity value of each pixel in either image. In the embodiments of the disclosure the first image is taken as the reference, so this step produces the disparity value of each pixel in the first image.
In one embodiment of the disclosure, as shown in Fig. 2, stereo matching the first image against the second image to obtain the disparity value of each pixel in the first image may comprise the following steps.
In step S201, for each pixel in the first image, the row in which the pixel lies is determined, and the pixel with the greatest similarity to it is searched for in the same row of the second image, as the corresponding pixel in the second image.
Every pixel in the first image has image coordinates in the first image, and likewise every pixel in the second image has image coordinates in the second image. If a pixel in the first image and a pixel in the second image have the same row coordinate, the two pixels lie in the same row.
Taking one pixel in the first image as an example, the row in which it lies is determined from its row coordinate, and the pixel with the greatest similarity to it is searched for in the same row of the second image, as its corresponding pixel in the second image. In this manner, for each pixel in the first image, the most similar pixel is found in the second image as its corresponding pixel.
In a specific embodiment of the disclosure, the similarity between pixels can be obtained as follows:
The similarity between pixels is evaluated with the MNCC measure (Moravec normalised cross-correlation). The similarity between a pixel in the first image and another pixel in the second image is computed according to the following formula:
where CMNCC(p, d) is the similarity between the pixels; d is the difference between the column coordinate of pixel A in the first image and the column coordinate of pixel B in the second image, i.e. the disparity of pixel A relative to pixel B; Wp is the neighbourhood of pixel A in the first image, for example the region of 7x7 pixels centred on A; Il(u, v) and Ir(u, v) are the grey values of pixel (u, v) in the first image and in the second image, respectively; and the two mean terms are the average grey value over the neighbourhood of A in the first image and over the neighbourhood of B in the second image, the neighbourhood of B being defined in the same way as that of A.
In the present embodiment, the coordinates of a pixel are written (u, v), where u is the row index (the abscissa) and v the column index (the ordinate). Using the above formula, starting from the pixel with coordinates (1, 1) in the second image, the similarity between each pixel of row 1 of the second image and pixel (1, 1) of the first image is computed in turn. Among the pixels of row 1 of the second image, the one with the greatest similarity to pixel (1, 1) of the first image is selected as the corresponding pixel of (1, 1). The corresponding pixels of the remaining pixels of the first image are then found in the second image in the same manner.
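The MNCC formula image did not survive extraction; the sketch below uses the standard MNCC form (an assumption), in which the cross-correlation of mean-centred windows is normalised by the sum of their variances rather than by the geometric mean used in plain NCC:

```python
import numpy as np

def mncc(patch_l, patch_r):
    """Moravec-normalised cross-correlation between two equal-size windows:
        C = 2 * sum((Il-ml)*(Ir-mr)) / (sum((Il-ml)^2) + sum((Ir-mr)^2))
    where ml, mr are the window means.  C lies in [-1, 1]; identical windows
    score 1, windows with opposite deviations score -1."""
    dl = patch_l - patch_l.mean()
    dr = patch_r - patch_r.mean()
    denom = (dl * dl).sum() + (dr * dr).sum()
    if denom == 0.0:            # both windows constant: treat as identical
        return 1.0
    return 2.0 * (dl * dr).sum() / denom
```

Mean-centring makes the score insensitive to brightness offsets between the two cameras, which is why a measure of this kind is preferred over raw window differences for stereo matching.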
In step S202, the difference between the column number of the pixel in the first image and the column number of its corresponding pixel in the second image is computed as the disparity value of the pixel.
The disparity value of a pixel is the difference between its column number in the first image and the column number of its corresponding pixel in the second image. For example, if a pixel has column coordinate 15 in the first image and its corresponding pixel has column coordinate 10 in the second image, the disparity value of the pixel is 5. The disparity value of every pixel in the first image is obtained in this manner.
In step S103, the three-dimensional coordinate of each pixel of the first image in a default three-dimensional coordinate system is determined from the image coordinate of each pixel in the first image and the parallax value of each pixel in the first image.
After the parallax value of each pixel of the first image has been obtained according to step S102, the three-dimensional coordinate of each pixel of the first image in the default three-dimensional coordinate system is determined from its image coordinate and parallax value in the following manner.
In one embodiment of the present disclosure, as shown in Fig. 3, the default three-dimensional coordinate system can be obtained by the following steps:
In step S301, before the images shot by the first camera and the second camera are obtained for the first time, the first camera and the second camera are calibrated.
The first camera and the second camera are calibrated using an existing camera calibration technique. The calibrated quantities may include the internal parameters of each of the two cameras, the external parameters, and the distance between the two cameras. The internal parameters of a camera include its focal length, its distortion parameters, and the image coordinate onto which the optical center of the camera projects in the captured image; the external parameters include the rotation matrix and the translation vector between the image coordinate system of the camera and the space coordinate system.
In step S302, the image coordinate onto which the optical center of the first camera projects in the first image, and the pre-determined distance between the first camera and the second camera, are obtained from the calibration result.
After step S301 is completed, the calibration parameters of the two cameras are available, from which the image coordinate onto which the optical center of the first camera projects in the first image, and the pre-determined distance between the first camera and the second camera, can be obtained.
In step S303, the optical center coordinate system of the first camera is established using the image coordinate onto which the optical center of the first camera projects in the first image and the pre-determined distance between the first camera and the second camera, and is taken as the default three-dimensional coordinate system.
In the embodiments of the present disclosure, the default three-dimensional coordinate system can be defined as the optical center coordinate system of the first camera. As shown in Fig. 7, this coordinate system takes the optical center of the first camera as its origin, the direction perpendicular to the shooting direction of the first camera and parallel to the horizontal plane as the X axis, the direction perpendicular to the horizontal plane as the Y axis, and the shooting direction of the first camera as the Z axis. Any point in this optical center coordinate system of the first camera satisfies the following formulas:

x = b(u1 − u0)/d, y = b(v1 − v0)/d, z = b·f/d

where (u1, v1) is the image coordinate of the pixel in the first image, (u0, v0) is the image coordinate onto which the optical center of the first camera projects in the first image, b is the pre-determined distance between the first camera and the second camera, f is the default focal length of the first camera, and d is the parallax value of the pixel relative to its corresponding pixel point in the second image.
Therefore, from the above formulas, the three-dimensional coordinate of each pixel of the first image in the default three-dimensional coordinate system can be determined from the image coordinate of each pixel in the first image and the parallax value of each pixel in the first image.
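The determination of a pixel's three-dimensional coordinate from its image coordinate and parallax value can be sketched as standard binocular triangulation. This is a hedged illustration: the focal length is assumed to be expressed in pixel units and the function name is ours, not the patent's.

```python
def pixel_to_3d(u1, v1, d, u0, v0, b, f):
    """Back-project a pixel of the first image into the optical center
    coordinate system of the first camera from its parallax value d.
    (u0, v0): projection of the optical center in the first image;
    b: baseline between the two cameras; f: focal length in pixels."""
    z = b * f / d          # depth along the optical axis (Z)
    x = (u1 - u0) * z / f  # equals b * (u1 - u0) / d
    y = (v1 - v0) * z / f  # equals b * (v1 - v0) / d
    return x, y, z
```

For instance, with a 0.5 m baseline, an 800-pixel focal length and a parallax of 10 pixels, a point imaged at the principal point lies 40 m along the optical axis.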
In step S104, the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system is determined using the three-dimensional coordinates of multiple pixels of the first image in the default three-dimensional coordinate system.
The equation of a plane can be expressed by the general plane equation, such as the following formula:
ax + by + cz + d = 0
where (x, y, z, 1)′ is the homogeneous coordinate of a point and a, b, c, d are the coefficients of the plane equation.
If a point P = (x, y, z, 1)′ belongs to a plane φ = (a, b, c, d)′, it necessarily satisfies the plane equation P′φ = 0. If n three-dimensional points all belong to this plane, the system of equations they satisfy is:
Letting A = (P1, P2, …, Pn)′, the above system of equations can be written as Aφ = 0.
A plane-fitting result can be obtained from this system of equations; many existing methods can be used to solve it, including the least square method, the RANSAC algorithm (RANdom SAmple Consensus), and the least median of squares method.
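The homogeneous system Aφ = 0 can be solved, for example, by a singular value decomposition, which is one of the many existing methods mentioned; this sketch assumes NumPy and is not the patent's prescribed solver.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through n 3-D points: stack each point as a
    homogeneous row (x, y, z, 1) into A, then take the right singular
    vector of the smallest singular value as phi = (a, b, c, d),
    the minimizer of |A @ phi| under |phi| = 1."""
    A = np.hstack([np.asarray(points, dtype=float),
                   np.ones((len(points), 1))])
    _, _, vt = np.linalg.svd(A)
    phi = vt[-1]
    # normalise so that (a, b, c) is a unit normal vector
    return phi / np.linalg.norm(phi[:3])
```

For points lying exactly on the plane z = 2, the recovered coefficients are proportional to (0, 0, 1, −2).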
The distance between a point P = (x, y, z, 1)′ and a plane φ = (a, b, c, d)′ in space can be calculated by the following formula:

d(P, φ) = |ax + by + cz + d| / √(a² + b² + c²)

The first idea of plane fitting is to minimize the following quantity:

Σi d²(Pi, φ)
That is, to minimize the sum of the distances between multiple points in space and the plane. Since the main content of the first image is the target water area, a large proportion of the pixels of the first image represent the horizontal plane; hence, if there is a plane that minimizes the sum of the distances between it and the positions of the objects represented by the pixels of the first image, that plane is very likely to be the horizontal plane.
Therefore, based on this idea, and as shown in Fig. 4, the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system is determined using the following steps.
In step S401, multiple pixels are chosen in the first image as sample points, and all the sample points form a sample set.
For example, one pixel is selected in every region of 5*5 pixels of the first image, or one pixel is chosen every 5 pixels. Whichever selection scheme is used, the main purpose is to pick out of the first image a subset of pixels that still reflects the objects contained in the first image, so that the subsequent calculation is carried out on these selected pixels rather than on all pixels of the first image, which would seriously slow down the computation.
In one embodiment of the present disclosure, the pixels of the first image are also screened before the sample points are chosen. The similarity between each pixel of the first image and its corresponding pixel point in the second image is obtained, and the pixels of the first image whose similarity with their corresponding pixel points falls below a preset threshold are filtered out; these are the pixels of the first image that are poorly stereo-matched with the second image. The sample points are then chosen among the remaining pixels of the first image.
The pixels selected in the above manner are taken as sample points, and all the sample points form a sample set.
In step S402, the number of reference values is monitored.
A reference value reflects the distances between all the sample points of the sample set and a fitted plane, and makes it possible to judge whether the plane-fitting result is suitable, that is, whether the fitted plane can represent the horizontal plane. Reference values are described in detail in the following steps.
In step S403, the number of reference values is compared with a preset reference value number.
The preset reference value number can be a preset numerical value, or can be calculated by a preset calculation formula; the formula is described in detail in the following steps.
When the number of reference values is less than the preset reference value number, step S404 is executed; when the number of reference values is equal to the preset reference value number, step S406 is executed.
When the number of reference values is less than the preset reference value number, in step S404 a preset number of sample points is randomly chosen in the sample set, and the spatial position in the default three-dimensional coordinate system of the target plane containing the chosen sample points is calculated using their three-dimensional coordinates in the default three-dimensional coordinate system.
The preset sample point number is smaller than the total number of sample points in the sample set; in the embodiments of the present disclosure it is set to 3, that is, 3 sample points are randomly chosen in the sample set. The three-dimensional coordinates of the 3 chosen sample points in the default three-dimensional coordinate system are obtained according to step S103 above, and, following the plane-fitting method described above, the spatial position in the default three-dimensional coordinate system of the target plane containing these 3 sample points, that is, the fitted plane equation ax + by + cz + d = 0, is calculated using the least square method.
In step S405, the distance between each sample point of the sample set and the target plane is calculated, the median of all these distances is taken as a reference value, and the number of reference values is increased by 1 to give the number monitored next time.
The three-dimensional coordinate of each sample point of the sample set in the default three-dimensional coordinate system is obtained, and the distance between each sample point and the fitted target plane is calculated with the formula given above. After the distances from all sample points of the sample set to the target plane have been obtained, the median Mj of these distances is calculated as follows:

Mj = medi d²(Pi, φj)

where d(Pi, φj) is the distance from sample point Pi to the fitted plane φj; the median is taken over the squared distances, as in the classical least median of squares method.
The calculated median is taken as a reference value, and the number of reference values is increased by 1. For example, if 10 plane fits have been obtained by the preceding method, each yielding one median as a reference value, the number of reference values is 10; after the next plane fit a new reference value is obtained, the number of reference values becomes 11, and 11 is the number of reference values monitored next time. This continues until the number of reference values equals the preset reference value number.
When the number of reference values is equal to the preset reference value number, in step S406 the smallest of the preset number of reference values is chosen, and the spatial position in the default three-dimensional coordinate system of the target plane corresponding to this smallest reference value is taken as the spatial position of the horizontal plane in the default three-dimensional coordinate system.
The preset reference value number can be calculated by the following formula:
P = 1 − [1 − (1 − ε)^p]^m
where ε is the proportion of incorrectly matched sample points in the sample set, that is, the proportion of sample points whose corresponding pixel point in the second image is incorrectly matched; ε is a pre-set numerical value. p is the preset sample point number, which is 3 in the embodiments of the present disclosure, and m is the preset reference value number. The P calculated by this formula is the probability of obtaining a correct solution after sampling randomly m times and obtaining m reference values. P is generally set to 0.9, meaning that after m samplings of the sample set there is a 90% probability of obtaining a target plane that represents the horizontal plane.
Among the preset number of reference values obtained in the above steps, the smallest one is selected, and the spatial position in the default three-dimensional coordinate system of the target plane corresponding to this smallest reference value is taken as the spatial position of the horizontal plane in the default three-dimensional coordinate system; that is, the plane equation in the default three-dimensional coordinate system of the target plane used when this smallest reference value was calculated is taken as the plane equation of the horizontal plane in the default three-dimensional coordinate system.
When the number of reference values is equal to the preset reference value number, the spatial position of the horizontal plane in the default three-dimensional coordinate system is calculated from the preset number of reference values. Once the spatial position of the horizontal plane of the target water area has been determined, no further reference values need to be obtained and the subsequent sky-line acquisition steps are carried out; the number of reference values therefore never exceeds the preset reference value number.
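Steps S402 through S406 amount to a least-median-of-squares search, which can be sketched compactly. The trial count m is derived from the formula P = 1 − [1 − (1 − ε)^p]^m given in the text; the plane-from-three-points construction via a cross product, the default parameter values, and the fixed random seed are our assumptions.

```python
import numpy as np

def lmeds_plane(points, m=None, eps=0.3, p=3, P=0.9, rng=None):
    """Repeat m random 3-point plane fits; for each, keep the median of
    the squared point-to-plane distances as the reference value, and
    return the plane with the smallest reference value."""
    rng = np.random.default_rng(0) if rng is None else rng
    pts = np.asarray(points, dtype=float)
    if m is None:
        # preset reference value number from P = 1 - [1 - (1-eps)^p]^m
        m = int(np.ceil(np.log(1 - P) / np.log(1 - (1 - eps) ** p)))
    best_med, best_phi = np.inf, None
    for _ in range(m):
        idx = rng.choice(len(pts), size=p, replace=False)
        q0, q1, q2 = pts[idx]
        normal = np.cross(q1 - q0, q2 - q0)
        nn = np.linalg.norm(normal)
        if nn < 1e-12:
            continue  # degenerate (collinear) sample, skip this trial
        normal /= nn
        d = -normal @ q0
        med = np.median((pts @ normal + d) ** 2)  # reference value M_j
        if med < best_med:
            best_med, best_phi = med, np.append(normal, d)
    return best_phi, best_med
```

On a point cloud dominated by a single plane plus a few mismatched outliers, the smallest median picks out the dominant plane.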
The coverage of the first camera overlaps that of the second camera, but typically only partly rather than completely. There may therefore be different objects in the first image and the second image; for example, a house on the bank may be contained in the first image but not in the second image. Consequently, the first image is likely to contain pixels that are dissimilar to every pixel of the second image, yet each pixel of the first image is still assigned a corresponding pixel point in the second image, so some of these correspondences are very likely to be wrong.
Therefore, to enhance the correctness of the spatial position of the horizontal plane in the default three-dimensional coordinate system, in one embodiment of the present disclosure the obviously mismatched sample points are first removed from the sample set by the random sampling method, and the least square method is then used to fit a plane to the remaining sample points.
After the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system has been obtained, as shown in Fig. 5, the spatial position of the horizontal plane is further corrected by the following steps.
In step S501, the target sample points of the sample set are searched for; the distance between a target sample point and the spatial position of the horizontal plane in the default three-dimensional coordinate system is less than a preset threshold.
The target sample points are the sample points of the sample set whose correspondence with their corresponding pixel points in the second image is correct; they can be obtained in the following manner.
First, the threshold is calculated with the formula σ = 1.4826 · (1 + 5/(n − p)) · √MJ, where n is the total number of sample points in the sample set, MJ is the smallest reference value, that is, the median obtained when the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system was determined in the preceding steps, and p is the preset sample point number.
Then, based on the threshold, the target sample points of the sample set are determined according to the following formula:

Wi = 1 if ri ≤ (2.5σ)², and Wi = 0 otherwise,

where

ri = d²(Pi, φ)

Pi is the coordinate (x, y, z, 1)′ of a sample point in the default three-dimensional coordinate system, φ = (a, b, c, d)′ is the spatial position in the default three-dimensional coordinate system of the horizontal plane of the target water area obtained in the preceding steps, that is, the plane equation of the horizontal plane, and d² denotes the square of the distance from the sample point Pi to the horizontal plane. A sample point whose Wi calculated by this formula is 1 is determined to be a target sample point.
In step S502, the sample points of the sample set other than the target sample points are removed, and the spatial position of the horizontal plane in the default three-dimensional coordinate system is reacquired with the least square method using the sample set that now contains only the target sample points.
That is, the sample points other than the target sample points are removed from the sample set, and the spatial position in the default three-dimensional coordinate system of the horizontal plane of the target water area is reacquired from the remaining target sample points of the sample set according to steps S401 to S405 above.
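The correction of steps S501 and S502 can be sketched as follows. The robust scale 1.4826·(1 + 5/(n − p))·√MJ and the 2.5σ inlier cutoff follow the classical least-median-of-squares literature and are assumptions where the patent's formula images are unavailable.

```python
import numpy as np

def refine_plane(points, phi, med_sq, p=3):
    """Flag as target sample points (inliers) the points whose squared
    distance to the LMedS plane phi = (a, b, c, d) is at most (2.5*sigma)^2,
    with sigma the robust scale from the smallest reference value med_sq,
    then refit a least-squares plane through the inliers only."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    sigma = 1.4826 * (1 + 5.0 / (n - p)) * np.sqrt(med_sq)
    r2 = (pts @ phi[:3] + phi[3]) ** 2          # squared residuals r_i
    inliers = pts[r2 <= (2.5 * sigma) ** 2]     # W_i = 1 points
    # homogeneous least-squares refit (A @ phi = 0) via SVD
    A = np.hstack([inliers, np.ones((len(inliers), 1))])
    _, _, vt = np.linalg.svd(A)
    phi_new = vt[-1]
    return phi_new / np.linalg.norm(phi_new[:3]), inliers
```

Points far from the plane, such as mismatched stereo correspondences, fall outside the cutoff and no longer bias the final least-squares fit.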
In step S105, the projection line of the horizontal plane in the first image is obtained according to the spatial position of the horizontal plane in the default three-dimensional coordinate system, and the projection line is determined to be the sky-line of the target water area in the first image.
After the spatial position of the horizontal plane in the default three-dimensional coordinate system has been obtained, the projection line of the horizontal plane in the first image is determined; this projection line is the boundary between the horizontal plane and the sky in the image, and it is determined to be the sky-line of the target water area in the first image.
The method of obtaining the projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the default three-dimensional coordinate system is shown in Fig. 6, and may include the following steps.
In step S601, the spatial position of the optical center of the first camera in the default three-dimensional coordinate system is obtained. In the embodiments of the present disclosure the default three-dimensional coordinate system is the optical center coordinate system of the first camera, whose origin is the optical center of the first camera.
In step S602, the spatial position in the default three-dimensional coordinate system of the optical center plane, the plane containing the optical center of the first camera and parallel to the horizontal plane, is calculated using the spatial position of the optical center of the first camera in the default three-dimensional coordinate system, that is, in its own optical center coordinate system, and the spatial position of the horizontal plane in the default three-dimensional coordinate system.
As shown in Fig. 7, O is the optical center of the camera and the coordinate system OXYZ is the optical center coordinate system of the first camera, which takes the optical center of the first camera as its origin; φ is the horizontal plane, φf is the focal plane, and the optical center plane OAB is the plane passing through the optical center O and parallel to the plane φ. The sky-line of the plane φ is the projection of the points at infinity of φ onto the focal plane, and planes that are parallel to one another project onto the focal plane in the same straight line. The projection lines of the optical center plane OAB and of the plane φ in the image therefore coincide, and this common projection line on the focal plane is the projection line of the plane φ in the first image, which is determined to be the sky-line of the target water area in the first image.
Assuming the equation of the plane φ is ax + by + cz + d = 0, the equation of the optical center plane OAB passing through the optical center O is ax + by + cz = 0.
In step S603, the projection line of the optical center plane in the first image is calculated using the spatial position of the optical center plane in the default three-dimensional coordinate system and the image coordinate onto which the optical center of the first camera projects in the first image, and the projection line of the optical center plane in the first image is determined to be the projection line of the horizontal plane in the first image.
The equation of the focal plane φf is z = f, where f is the focal length of the first camera; therefore the equation of the straight line AB, the projection line of the optical center plane OAB on the focal plane, is obtained as:
ax + by + cf = 0
The conversion between the optical center coordinate system of the first camera and the image coordinate system of the first camera on the focal plane z = f is x = (u − u0)·dp and y = (v − v0)·dp, where (u0, v0) is the projection of the optical center of the first camera in the first image. The projection equation of the straight line AB in the image coordinate system of the first camera is thereby obtained as:
a(u − u0) + b(v − v0) + cF = 0
where F = f/dp is the focal length of the first camera expressed in pixels, and dp is the physical size of one pixel of the first image in the optical center coordinate system of the first camera.
The projection line of the optical center plane OAB in the first image is determined to be the projection line of the horizontal plane in the first image, which yields the equation of the sky-line in the image coordinate system of the first camera. Sky-lines detected in first images in the manner of the above embodiments of the disclosure are shown in Fig. 8 and Fig. 9.
Since the manner of detecting the sky-line in the first image applies equally to detecting the sky-line in the second image, it is not repeated in the embodiments of the present disclosure.
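The whole of step S603 reduces to a few lines: given the horizontal plane's equation and the calibrated principal point, the sky-line in image coordinates is a(u − u0) + b(v − v0) + cF = 0. This sketch assumes F is the focal length expressed in pixel units.

```python
def horizon_line(plane, u0, v0, F):
    """Image-plane equation of the sky-line for horizontal plane
    a*x + b*y + c*z + d = 0: the parallel plane through the optical
    centre is a*x + b*y + c*z = 0, whose trace on the image is
    a*(u - u0) + b*(v - v0) + c*F = 0.  Returns the coefficients
    (A, B, C) of the line A*u + B*v + C = 0."""
    a, b, c, d = plane  # d drops out: the line depends only on the normal
    return a, b, -a * u0 - b * v0 + c * F
```

As a sanity check, for a level camera the horizontal plane has normal (0, 1, 0) in the optical center coordinate system, and the resulting line is v = v0, the row of the principal point.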
Fig. 10 is a structural schematic diagram of a sky-line detection device provided by an embodiment of the present disclosure; as shown in Fig. 10, the device includes:
an image acquisition unit 11, for respectively obtaining the images shot of the target water area at the same moment by a first camera and a second camera separated by a pre-determined distance, wherein the image shot by the first camera is the first image;
a parallax value acquiring unit 12, for performing stereo matching on the pixels in the first image and the second image to obtain the parallax value of each pixel in the first image;
a pixel coordinate determination unit 13, for determining, from the image coordinate of each pixel in the first image and the parallax value of each pixel in the first image, the three-dimensional coordinate of each pixel of the first image in the default three-dimensional coordinate system;
a horizontal plane determination unit 14, for determining the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system using the three-dimensional coordinates of multiple pixels of the first image in the default three-dimensional coordinate system;
a sky-line determination unit 15, for obtaining the projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the default three-dimensional coordinate system, and determining the projection line to be the sky-line of the target water area in the first image.
In one embodiment of the present disclosure, the parallax value acquiring unit 12 of the previous embodiment comprises:
a corresponding pixel point searching unit, for determining, for each pixel of the first image, the row where the pixel lies, and searching the same row of the second image for the pixel with the greatest similarity to it, as the corresponding pixel point of the pixel in the second image;
a parallax value computing unit, for calculating the difference between the column index of the pixel in the first image and the column index of its corresponding pixel point in the second image, as the parallax value of the pixel.
In another embodiment of the disclosure, the device of the previous embodiment further comprises:
a camera calibration unit, for calibrating the first camera and the second camera before the images shot by the first camera and the second camera are obtained for the first time;
an acquiring unit, for obtaining from the calibration result the image coordinate onto which the optical center of the first camera projects in the first image, and the pre-determined distance between the first camera and the second camera;
a default three-dimensional coordinate system establishing unit, for establishing the optical center coordinate system of the first camera, taken as the default three-dimensional coordinate system, using the image coordinate onto which the optical center of the first camera projects in the first image and the pre-determined distance between the first camera and the second camera.
In another embodiment of the disclosure, the pixel coordinate determination unit 13 of the previous embodiment comprises:
a pixel coordinate calculating unit, for calculating, for each pixel of the first image, the three-dimensional coordinate of the pixel in the default three-dimensional coordinate system using the following formulas:

x = b(u1 − u0)/d, y = b(v1 − v0)/d, z = b·f/d

where (u1, v1) is the image coordinate of the pixel in the first image, (u0, v0) is the image coordinate onto which the optical center of the first camera projects in the first image, b is the pre-determined distance between the first camera and the second camera, f is the default focal length of the first camera, and d is the parallax value of the pixel relative to its corresponding pixel point in the second image.
In another embodiment of the disclosure, the horizontal plane determination unit 14 of the previous embodiment comprises:
a sample set composing unit, for choosing multiple pixels in the first image as sample points, all the sample points composing a sample set;
a reference value number monitoring unit, for monitoring the number of reference values;
a reference value number comparing unit, for comparing the number of reference values with the preset reference value number;
a target plane computing unit, for randomly choosing, when the number of reference values is less than the preset reference value number, the preset number of sample points in the sample set, and calculating, using the three-dimensional coordinates of the chosen sample points in the default three-dimensional coordinate system, the spatial position in the default three-dimensional coordinate system of the target plane containing them;
a reference value acquiring unit, for calculating the distance between each sample point of the sample set and the target plane, taking the median of all the distances as a reference value, and increasing the number of reference values by 1 to give the number monitored next time;
a horizontal plane determination subunit, for choosing, when the number of reference values is equal to the preset reference value number, the smallest of the preset number of reference values, and taking the spatial position in the default three-dimensional coordinate system of the target plane corresponding to the smallest reference value as the spatial position of the horizontal plane in the default three-dimensional coordinate system.
In another embodiment of the disclosure, the device of the previous embodiment further comprises:
a target sample point searching unit, for searching for the target sample points of the sample set, the distance between a target sample point and the spatial position of the horizontal plane in the default three-dimensional coordinate system being less than a preset threshold;
a horizontal plane reacquiring unit, for removing the sample points of the sample set other than the target sample points, and reacquiring the spatial position of the horizontal plane in the default three-dimensional coordinate system using the sample set that contains only the target sample points.
In another embodiment of the disclosure, the sky-line determination unit 15 of the previous embodiment comprises:
an optical center position acquiring unit, for obtaining the spatial position of the optical center of the first camera in the default three-dimensional coordinate system;
an optical center plane computing unit, for calculating, using the spatial position of the optical center of the first camera in the default three-dimensional coordinate system and the spatial position of the horizontal plane in the default three-dimensional coordinate system, the spatial position in the default three-dimensional coordinate system of the optical center plane containing the optical center of the first camera and parallel to the horizontal plane;
a projection line acquiring unit, for calculating the projection line of the optical center plane in the first image using the spatial position of the optical center plane in the default three-dimensional coordinate system and the image coordinate onto which the optical center of the first camera projects in the first image, and determining the projection line of the optical center plane in the first image to be the projection line of the horizontal plane in the first image.
In another embodiment of the disclosure, the device of the previous embodiment further comprises:
a noise reduction unit, for performing noise reduction processing on the first image and the second image after the first image and the second image are obtained.
It should be noted that, in this document, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include" and "comprise", or any other variant thereof, are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element qualified by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes it.
The above are only specific embodiments of the invention, enabling those skilled in the art to understand or realize the invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principle defined herein can be realized in other embodiments without departing from the spirit or scope of the invention. Therefore, the invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. A sky-line detection method, characterized in that the method comprises:
respectively obtaining the images shot of a target water area at the same moment by a first camera and a second camera separated by a pre-determined distance, wherein the image shot by the first camera is a first image, the image shot by the second camera is a second image, and the coverage of the first camera overlaps the coverage of the second camera;
performing stereo matching on the pixels in the first image and the second image to obtain the parallax value of each pixel in the first image;
determining, according to the image coordinate of each pixel in the first image and the parallax value of each pixel in the first image, the three-dimensional coordinate of each pixel of the first image in a default three-dimensional coordinate system;
determining, using the three-dimensional coordinates of multiple pixels of the first image in the default three-dimensional coordinate system, the spatial position of the horizontal plane of the target water area in the default three-dimensional coordinate system;
obtaining, according to the spatial position of the horizontal plane in the default three-dimensional coordinate system, the projection line of the horizontal plane in the first image, and determining the projection line to be the sky-line of the target water area in the first image.
2. the method according to claim 1, wherein described to the first image and in second image Pixel carries out Stereo matching, obtains the parallax value of each pixel in the first image, comprising:
For each of the first image pixel, the row where the pixel is determined, in second image Mutually lookup and the maximum pixel of pixel similitude in colleague, as opposite with the pixel in second image The corresponding pixel points answered;
Calculate columns of the pixel in the first image and column of the corresponding pixel points in second image The difference of number, the parallax value as the pixel.
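The row-by-row search recited in claim 2 is, in effect, standard block matching along rectified epipolar lines. A minimal Python/NumPy sketch of the idea — the 5x5 window and the sum-of-absolute-differences similarity measure are illustrative assumptions, since the claim fixes neither a window size nor a similarity metric:

```python
import numpy as np

def disparity_map(left, right, max_disp=32, half_win=2):
    """Brute-force block matching: for each pixel of the left (first) image,
    search the same row of the right (second) image and keep the column
    offset whose patch is most similar (minimum sum of absolute differences).
    The kept offset is the column difference named in claim 2."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half_win, h - half_win):
        for x in range(half_win, w - half_win):
            patch = left[y - half_win:y + half_win + 1,
                         x - half_win:x + half_win + 1].astype(np.int32)
            best_d, best_cost = 0, np.inf
            # a point at column x in the left image appears at x - d on the right
            for d in range(0, min(max_disp, x - half_win) + 1):
                cand = right[y - half_win:y + half_win + 1,
                             x - d - half_win:x - d + half_win + 1].astype(np.int32)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Because the search stays within one row, this presupposes rectified images (epipolar lines horizontal), which camera calibration as in claim 3 provides.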
3. the method according to claim 1, wherein the method also includes:
Before obtaining the image that the first camera and the second camera are shot for the first time, to the first camera and The second camera is demarcated;
Obtain the image coordinate that the optical center of the first camera projects in the first image respectively according to calibration result, with And the pre-determined distance between the first camera and the second camera;
The image coordinate projected in the first image using the optical center of the first camera, and, the first camera with The pre-determined distance between the second camera establishes the optical center coordinate system of the first camera, and as described default three Tie up coordinate system.
4. The method according to claim 3, wherein determining, according to the image coordinates of each pixel in the first image and the disparity value of each pixel in the first image, the three-dimensional coordinates of each pixel of the first image in the preset three-dimensional coordinate system comprises:
for each pixel in the first image, calculating the three-dimensional coordinates (x, y, z) of the pixel in the preset three-dimensional coordinate system using the following equations:
x = (u1 - u0) * b / d
y = (v1 - v0) * b / d
z = f * b / d
wherein (u1, v1) are the image coordinates of the pixel in the first image, (u0, v0) are the image coordinates of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the disparity value of the pixel relative to its corresponding pixel in the second image.
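In the optical-center coordinate system of claim 3 (taking x to the right, y downward, and z along the optical axis — an assumed but conventional choice), the per-pixel conversion of claim 4 is standard binocular triangulation, which can be sketched as:

```python
def pixel_to_3d(u1, v1, u0, v0, b, f, d):
    """Standard stereo triangulation in the first camera's optical-center
    frame: depth z = f*b/d, while x and y scale the pixel's offset from the
    principal point (u0, v0) by the same ratio b/d. Here b is the preset
    baseline, f the focal length in pixels, d the disparity (claim 4)."""
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    x = (u1 - u0) * b / d
    y = (v1 - v0) * b / d
    z = f * b / d
    return x, y, z
```

Depth is inversely proportional to disparity: halving d doubles z, which is why distant water pixels (small d) are the noisiest inputs to the plane fit of claim 5 and why a robust estimator is used there.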
5. The method according to claim 1, wherein determining, using the three-dimensional coordinates of multiple pixels of the first image in the preset three-dimensional coordinate system, the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system comprises:
selecting multiple pixels in the first image as sample points, all the sample points forming a sample set;
monitoring the number of reference values;
comparing the number of reference values with a preset number of reference values;
when the number of reference values is less than the preset number of reference values, randomly selecting a preset number of sample points from the sample set, and calculating, using the three-dimensional coordinates of the selected sample points in the preset three-dimensional coordinate system, the spatial position in the preset three-dimensional coordinate system of the target plane containing those sample points;
calculating the distance between each sample point in the sample set and the target plane, calculating the median of all the distances, taking the median as one reference value, and incrementing the number of reference values by 1, as the number of reference values monitored next time;
when the number of reference values equals the preset number of reference values, selecting the reference value with the smallest value among the reference values, and taking the spatial position in the preset three-dimensional coordinate system of the target plane corresponding to that smallest reference value as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
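The loop of claim 5 — random minimal samples, a "reference value" per candidate plane equal to the median point-to-plane distance, and the smallest reference value winning — is a least-median-of-squares (LMedS) robust plane fit. A sketch assuming 3-point samples and a fixed trial count (both stand in for the claim's preset quantities):

```python
import numpy as np

def fit_water_plane(points, n_trials=100, sample_size=3, seed=0):
    """Least-median-of-squares plane fit (claim 5): each trial fits a plane
    to `sample_size` random points; the trial's reference value is the
    median distance of ALL points to that plane; the plane with the
    smallest reference value wins. Returns (unit normal n, offset c) for
    the plane n . p = c."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best = None  # (reference value, normal, offset)
    for _ in range(n_trials):  # until the preset number of reference values
        sample = pts[rng.choice(len(pts), sample_size, replace=False)]
        # plane through the 3 sampled points
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:              # degenerate (collinear) sample, skip
            continue
        n = n / norm
        c = n @ sample[0]
        ref = np.median(np.abs(pts @ n - c))  # median distance = reference value
        if best is None or ref < best[0]:
            best = (ref, n, c)
    return best[1], best[2]
```

Scoring by the median rather than the mean lets the fit tolerate a large fraction of non-water points (sky, vessels, glints) without the estimated plane being pulled off the true water surface.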
6. The method according to claim 5, wherein the method further comprises:
searching the sample set for target sample points, wherein the distance between a target sample point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is less than a preset threshold;
removing the sample points other than the target sample points from the sample set, and reacquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system using the sample set containing only the target sample points.
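Claim 6 is a refinement pass: keep only the sample points within a preset threshold of the plane found per claim 5, then re-estimate the plane from those inliers alone. One common way to implement the re-estimation — an assumption, since the claim only says the position is "reacquired" from the reduced set — is a total-least-squares fit via SVD:

```python
import numpy as np

def refit_on_inliers(points, n, c, threshold):
    """Keep the target sample points whose distance to the plane n . p = c
    is below `threshold` (claim 6), then refit a plane to them by total
    least squares: the refined normal is the right singular vector of the
    centered inliers with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    inliers = pts[np.abs(pts @ n - c) < threshold]
    centroid = inliers.mean(axis=0)
    _, _, vt = np.linalg.svd(inliers - centroid)
    n_new = vt[-1]                     # direction of least variance
    return n_new, n_new @ centroid, inliers
```

The two-stage scheme (robust LMedS selection, then least-squares refit) trades the outlier resistance of the first pass for the statistical efficiency of the second.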
7. The method according to claim 3, wherein obtaining, according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, the projection line of the horizontal plane in the first image comprises:
obtaining the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system;
calculating, using the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system and the spatial position of the horizontal plane in the preset three-dimensional coordinate system, the spatial position in the preset three-dimensional coordinate system of the optical center plane that contains the optical center of the first camera and is parallel to the horizontal plane;
calculating, using the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinates of the projection of the optical center of the first camera in the first image, the projection line of the optical center plane in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
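The projection in claim 7 has a closed form under the pinhole model u = u0 + f*x/z, v = v0 + f*y/z: a plane through the optical center with unit normal (nx, ny, nz) satisfies nx*x + ny*y + nz*z = 0, and dividing by z gives the image line nx*(u - u0) + ny*(v - v0) + f*nz = 0, i.e. the water-sky line. A sketch solving that line for v at each image column:

```python
def horizon_line(n, u0, v0, f, width):
    """Project the optical-center plane with normal n (parallel to the
    fitted water plane) into the image (claim 7). Substituting the pinhole
    projection into n . p = 0 gives the image line
        nx*(u - u0) + ny*(v - v0) + f*nz = 0,
    solved here for v at each column u."""
    nx, ny, nz = n
    if abs(ny) < 1e-12:
        raise ValueError("plane normal has no vertical image component")
    return [(u, v0 - (nx * (u - u0) + f * nz) / ny) for u in range(width)]
```

For a level camera (normal (0, 1, 0)) the line degenerates to the horizontal row v = v0 through the principal point, as expected; camera pitch and roll tilt and shift the line accordingly.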
8. The method according to claim 1, wherein the method further comprises:
after the first image and the second image are acquired, performing noise reduction on the first image and the second image.
9. A water-sky line detection device, characterized in that the device comprises:
an image acquisition unit, configured to acquire the images captured at the same moment of a target water area by a first camera and a second camera separated by a preset distance, wherein the image captured by the first camera is a first image, the image captured by the second camera is a second image, and the shooting range of the first camera overlaps with the shooting range of the second camera;
a disparity value acquisition unit, configured to perform stereo matching on the pixels of the first image and the second image to obtain the disparity value of each pixel in the first image;
a pixel coordinate determination unit, configured to determine, according to the image coordinates of each pixel in the first image and the disparity value of each pixel in the first image, the three-dimensional coordinates of each pixel of the first image in a preset three-dimensional coordinate system;
a horizontal plane determination unit, configured to determine, using the three-dimensional coordinates of multiple pixels of the first image in the preset three-dimensional coordinate system, the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system;
a water-sky line determination unit, configured to obtain, according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, the projection line of the horizontal plane in the first image, and to determine the projection line as the water-sky line of the target water area in the first image.
10. The device according to claim 9, wherein the disparity value acquisition unit comprises:
a corresponding pixel search unit, configured to determine, for each pixel in the first image, the row in which the pixel lies, and to search the same row of the second image for the pixel with the greatest similarity to that pixel, as the corresponding pixel of that pixel in the second image;
a disparity value calculation unit, configured to calculate the difference between the column number of the pixel in the first image and the column number of the corresponding pixel in the second image, as the disparity value of the pixel.
11. The device according to claim 10, wherein the device further comprises:
a camera calibration unit, configured to calibrate the first camera and the second camera before the images captured by the first camera and the second camera are acquired for the first time;
an acquisition unit, configured to obtain, according to the calibration result, the image coordinates of the projection of the optical center of the first camera in the first image, and the preset distance between the first camera and the second camera;
a preset three-dimensional coordinate system establishment unit, configured to establish the optical-center coordinate system of the first camera using the image coordinates of the projection of the optical center of the first camera in the first image and the preset distance between the first camera and the second camera, and to take it as the preset three-dimensional coordinate system.
12. The device according to claim 11, wherein the pixel coordinate determination unit comprises:
a pixel coordinate calculation unit, configured to calculate, for each pixel in the first image, the three-dimensional coordinates (x, y, z) of the pixel in the preset three-dimensional coordinate system using the following equations:
x = (u1 - u0) * b / d
y = (v1 - v0) * b / d
z = f * b / d
wherein (u1, v1) are the image coordinates of the pixel in the first image, (u0, v0) are the image coordinates of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the disparity value of the pixel relative to its corresponding pixel in the second image.
13. The device according to claim 9, wherein the horizontal plane determination unit comprises:
a sample set composition unit, configured to select multiple pixels in the first image as sample points, all the sample points forming a sample set;
a reference value quantity monitoring unit, configured to monitor the number of reference values;
a reference value quantity comparison unit, configured to compare the number of reference values with a preset number of reference values;
a target plane calculation unit, configured to, when the number of reference values is less than the preset number of reference values, randomly select a preset number of sample points from the sample set, and calculate, using the three-dimensional coordinates of the selected sample points in the preset three-dimensional coordinate system, the spatial position in the preset three-dimensional coordinate system of the target plane containing those sample points;
a reference value acquisition unit, configured to calculate the distance between each sample point in the sample set and the target plane, calculate the median of all the distances, take the median as one reference value, and increment the number of reference values by 1, as the number of reference values monitored next time;
a horizontal plane determination subunit, configured to, when the number of reference values equals the preset number of reference values, select the reference value with the smallest value among the reference values, and take the spatial position in the preset three-dimensional coordinate system of the target plane corresponding to that smallest reference value as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
14. The device according to claim 13, wherein the device further comprises:
a target sample point search unit, configured to search the sample set for target sample points, wherein the distance between a target sample point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is less than a preset threshold;
a horizontal plane reacquisition unit, configured to remove the sample points other than the target sample points from the sample set, and to reacquire the spatial position of the horizontal plane in the preset three-dimensional coordinate system using the sample set containing only the target sample points.
15. The device according to claim 11, wherein the water-sky line determination unit comprises:
an optical center position acquisition unit, configured to obtain the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system;
an optical center plane calculation unit, configured to calculate, using the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system and the spatial position of the horizontal plane in the preset three-dimensional coordinate system, the spatial position in the preset three-dimensional coordinate system of the optical center plane that contains the optical center of the first camera and is parallel to the horizontal plane;
a projection line acquisition unit, configured to calculate, using the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinates of the projection of the optical center of the first camera in the first image, the projection line of the optical center plane in the first image, and to determine the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
16. The device according to claim 9, wherein the device further comprises:
a noise reduction unit, configured to perform noise reduction on the first image and the second image after the first image and the second image are acquired.
CN201510595935.3A 2015-09-18 2015-09-18 Water-sky line detection method and device Active CN106558038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510595935.3A CN106558038B (en) 2015-09-18 2015-09-18 Water-sky line detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510595935.3A CN106558038B (en) 2015-09-18 2015-09-18 Water-sky line detection method and device

Publications (2)

Publication Number Publication Date
CN106558038A CN106558038A (en) 2017-04-05
CN106558038B true CN106558038B (en) 2019-07-02

Family

ID=58414233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510595935.3A Active CN106558038B (en) Water-sky line detection method and device

Country Status (1)

Country Link
CN (1) CN106558038B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109936704A (en) * 2017-12-18 2019-06-25 Jiang Pengfei Image data transparency effect processing method and device
CN109961455B (en) * 2017-12-22 2022-03-04 杭州萤石软件有限公司 Target detection method and device
CN112017238A (en) * 2019-05-30 2020-12-01 北京初速度科技有限公司 Method and device for determining spatial position information of linear object
CN112639881A (en) * 2020-01-21 2021-04-09 深圳市大疆创新科技有限公司 Distance measuring method, movable platform, device and storage medium
WO2021174539A1 (en) * 2020-03-06 2021-09-10 深圳市大疆创新科技有限公司 Object detection method, mobile platform, device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604383A (en) * 2009-07-24 2009-12-16 Harbin Institute of Technology Method for detecting targets at sea based on infrared images
CN104778695A (en) * 2015-04-10 2015-07-15 哈尔滨工程大学 Water sky line detection method based on gradient saliency

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100886611B1 (en) * 2007-08-14 2009-03-05 한국전자통신연구원 Method and apparatus for detecting line segment by incremental pixel extension in an image

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604383A (en) * 2009-07-24 2009-12-16 Harbin Institute of Technology Method for detecting targets at sea based on infrared images
CN104778695A (en) * 2015-04-10 2015-07-15 哈尔滨工程大学 Water sky line detection method based on gradient saliency

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Rapid Water-Sky-Line Detecting Algorithm in Marine Celestial Navigation; Chonghui Li et al.; Proceedings of the 3rd China Satellite Navigation Conference; 2012-12-31; full text
A water-sky line detection method based on wavelet multi-scale analysis; Pei Lili et al.; Journal of Shenyang University of Technology; 2003-04-30; Vol. 25, No. 2; full text
Research on surface target detection and tracking for unmanned surface vehicles based on optical vision; Zeng Wenjing; China Doctoral Dissertations Full-text Database, Engineering Science & Technology II; 2014-04-15; No. 4; full text
Research on digital image processing algorithms based on water-sky line detection; Zhao Ningxia et al.; Journal of East China Shipbuilding Institute (Natural Science Edition); 2005-05-31; Vol. 19, No. 1; full text
An electronic image stabilization algorithm for shipborne camera systems; Zhao Hongying et al.; Optical Technique; 2003-05-30; Vol. 29, No. 5; full text

Also Published As

Publication number Publication date
CN106558038A (en) 2017-04-05

Similar Documents

Publication Publication Date Title
CN106558038B (en) Water-sky line detection method and device
CN105678742B (en) Underwater camera calibration method
CN105894499B (en) Rapid detection method for three-dimensional information of space objects based on binocular vision
CN107635129B (en) Three-dimensional trinocular camera device and depth fusion method
Chaudhury et al. Auto-rectification of user photos
KR20170056474A (en) Method, device and storage medium for calculating building height
CN107564062A (en) Pose abnormality detection method and device
CN106530358A (en) Method for calibrating a PTZ camera using only two scene images
CN106767810A (en) Indoor positioning method and system based on WIFI and visual information from a mobile terminal
CN105608706B (en) Polarization vision sensor structure design and geometric calibration method
CN107092905B (en) Method for positioning the instrument to be identified by a power inspection robot
CN107833250A (en) Semantic space map construction method and device
US20130208975A1 (en) Stereo Matching Device and Method for Determining Concave Block and Convex Block
CN103955888A (en) High-definition video image mosaic method and device based on SIFT
CN109141432B (en) Indoor positioning navigation method based on image space and panoramic assistance
CN104102069A (en) Focusing method and device of imaging system, and imaging system
CN109712188A (en) Target tracking method and device
CN112541932A (en) Multi-source image registration method based on different focal length transformation parameters of a dual-optical camera
CN103927785A (en) Feature point matching method for close-range stereoscopic images
CN116957987A (en) Multi-view epipolar rectification method, device, computer equipment and storage medium
CN109753930A (en) Face detection method and face detection system
Zeng et al. Orb-slam2 with 6dof motion
CN110800020A (en) Image information acquisition method, image processing equipment and computer storage medium
CN113884017B (en) Non-contact deformation detection method and system for insulators based on trinocular vision
CN108269278A (en) Scene modeling method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant