CN104867158A - Monocular vision-based indoor water surface ship precise positioning system and method - Google Patents


Info

Publication number
CN104867158A
Authority
CN
China
Prior art keywords
image
ship
identification
monocular vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510298563.8A
Other languages
Chinese (zh)
Other versions
CN104867158B (en)
Inventor
初秀民
柳晨光
谢朔
王乐
王桂冲
欧阳雪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN201510298563.8A priority Critical patent/CN104867158B/en
Publication of CN104867158A publication Critical patent/CN104867158A/en
Application granted granted Critical
Publication of CN104867158B publication Critical patent/CN104867158B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/207Analysis of motion for motion estimation over a hierarchy of resolutions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/36Videogrammetry, i.e. electronic processing of video signals from a single source or from different sources to give parallax or range information
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • G06T2207/10021Stereoscopic video; Stereoscopic image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a monocular vision-based system for precisely positioning a ship on an indoor water surface. The system comprises a monocular vision unit, a recognition and positioning unit connected to the monocular vision unit, and a ship-end marker unit mounted on the ship. The monocular vision unit acquires and outputs images of the ship as it sails on the indoor water surface; the recognition and positioning unit collects the images, processes them, recognizes and positions the target ship, and calculates its sailing information; the ship-end marker unit comprises two marker lamps mounted at the bow and stern. The two marker lamps are spherical and differ in color, and the midpoint of the line connecting them represents the centroid of the target ship. With the invention, a ship in an indoor environment can be positioned accurately, its sailing state is obtained algorithmically, and the result is output in real time.

Description

Monocular vision-based indoor water surface ship precise positioning system and method
Technical field
The invention belongs to the technical field of computer vision and also relates to the perception of ship motion states; it is specifically a monocular vision-based indoor water surface ship precise positioning system and method.
Background art
Moving-target positioning based on computer vision is a research direction that attracts much attention in intelligent waterborne traffic safety management: ships are detected, recognized, positioned and tracked in collected video sequences. Monocular-vision real-time positioning is widely used because its structure is simple and it requires few calibration steps. Its positioning process is as follows: a single camera first captures a continuous image sequence of the moving target; a preset positioning algorithm then detects, tracks and positions the target in the image sequence and reconstructs the trajectory of the moving target; finally, the positioning result of the target at each moment is output, displayed and stored. Besides being simple, easy to use and responsive in real time, monocular vision produces relatively little image distortion, which can moreover be corrected in post-processing. Current vision-based positioning algorithms are mostly applied to mobile robots. Because a ship moves on the water surface, it is disturbed by factors such as reflection, glare and occlusion, and the bow heading is important information during ship motion, so an indoor monocular-vision ship positioning system has its own particularities and complexities.
Summary of the invention
The technical problem to be solved by the present invention is to provide a monocular vision-based indoor water surface ship precise positioning system and method that can accurately position a ship in an indoor environment, obtain its sailing state algorithmically and output the result in real time.
The technical solution adopted by the present invention to solve the above technical problem is a monocular vision-based indoor water surface ship precise positioning system, characterized in that it comprises a monocular vision unit arranged on the bank, a recognition and positioning unit connected to the monocular vision unit, and a ship-end marker unit installed on the ship; wherein
the monocular vision unit acquires and outputs images of the ship sailing on the indoor water surface;
the recognition and positioning unit collects images, processes them, recognizes and positions the target ship, and calculates its sailing information;
the ship-end marker unit comprises two marker lamps mounted at the bow and stern; the two marker lamps are spherical and differ in color, so that they identify the bow and stern of the target ship, and the midpoint of the line connecting them represents the centroid of the target ship.
In the above system, the colors of the two marker lamps are each one of red, green and blue.
The positioning method implemented with the above monocular vision-based indoor water surface ship precise positioning system is characterized by comprising the following steps:
S1, correct the single-frame image obtained by the calibrated monocular camera and separate it into the single-channel images of the R, G and B channels;
S2, set thresholds on the gray value of each single-channel image;
S3, take the gray value of the single-channel image whose color matches a marker lamp as a reference, subtract the gray values of the other two single-channel images from it to obtain channel difference values, and set thresholds on the channel difference values;
S4, create a single-channel gray image of the same size as the acquired image, binarize it according to the thresholds set in S2 and S3, and obtain a binary image;
S5, extract the connected components of the binary image and reject non-marker-lamp targets with a connected-component area threshold;
S6, compute the Hu moment values of the contour sequence remaining after the non-marker-lamp targets have been rejected, match them against the marker-lamp image, identify the two marker lamps, and represent the position of each recognized marker lamp by the center of the minimum enclosing circle of the matched contour;
S7, use the pinhole imaging principle to convert the image coordinates of the two marker lamps into coordinates in the world coordinate system;
S8, use the marker-lamp coordinates in the world coordinate system to calculate the bow heading angle at the current moment;
S9, during real-time positioning, apply S1-S8 to each acquired frame, and calculate the speed and course angle of the target ship from the two marker-lamp positions recognized in two adjacent frames.
In the above method, in S6, if the matching result is not unique or there is no matching contour, recognition stops, the frame is discarded, and the method returns to S1 to process the next frame.
In the above method, the method further comprises S10: displaying and storing the calculated ship sailing state information in real time, the sailing state information comprising the ship position, speed, course angle and bow heading angle.
The beneficial effects of the present invention are as follows. The system positions the ship with a fixed monocular camera, which suits the two-dimensional, planar positioning of surface ships and achieves good positioning accuracy at low cost. The recognition method based on color thresholds and shape features, combined with the ship-end marker unit, exploits the advantages of passive recognition and effectively excludes interference from water-surface ripples and occlusions. The threshold-based target recognition algorithm is simple and practical, reduces the amount of computation and improves the real-time performance of positioning. The positioning scheme computes the ship position from the geometry of pinhole imaging; the parameters of the derived closed-form formulas are easy to measure, the computation is fast and convenient, and no error accumulation occurs.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system architecture of one embodiment of the invention.
Fig. 2 is the control flow chart of one embodiment of the invention.
Fig. 3 is a schematic diagram of the coordinate conversion.
Fig. 4 is a schematic diagram of the bow heading angle calculation.
Fig. 5 is a schematic diagram of the speed and course angle calculation.
In the figures: 1 - target ship, 2 - monocular camera, 3 - bow marker lamp, 4 - stern marker lamp.
Detailed description of the embodiments
The present invention is further described below with reference to a specific example and the accompanying drawings.
The invention provides a monocular vision-based indoor water surface ship precise positioning system. As shown in Fig. 1, it comprises a monocular vision unit arranged on the bank, a recognition and positioning unit connected to the monocular vision unit, and a ship-end marker unit installed on the ship. The monocular vision unit acquires and outputs images of the ship sailing on the indoor water surface (in this embodiment, the monocular vision unit comprises a monocular camera and a network cable of category 5e or above for transmitting the image data). The recognition and positioning unit collects images, processes them, recognizes and positions the target ship, and calculates its sailing information (in this embodiment, the recognition and positioning unit is a host computer loaded with recognition and positioning software written on the C++ platform). The ship-end marker unit comprises two marker lamps mounted at the bow and stern (in this embodiment, bow marker lamp 3 and stern marker lamp 4). The two marker lamps are spherical, so that the shape recognized by the monocular camera remains the same whatever the attitude of the target ship; their colors differ, so that they identify the bow and stern of the target ship; and the midpoint of the line connecting them represents the centroid of the target ship.
Preferably, to ease image processing, the colors of the two marker lamps are each one of red, green and blue. In this embodiment, bow marker lamp 3 is blue and stern marker lamp 4 is green.
The positioning method implemented with the above system, shown in Fig. 2, comprises the following steps.
S1, correct the single-frame image obtained by the calibrated monocular camera and separate it into the single-channel images of the R, G and B channels. In this embodiment, the monocular camera is calibrated with a chessboard calibration board, the single-frame image obtained by the camera is corrected with the OpenCV vision library, and the library function used to separate the color channels is cvSplit().
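For illustration, here is a minimal sketch of S1 using the modern OpenCV C++ API; cv::undistort and cv::split stand in for the legacy cvSplit() call named above, and cameraMatrix and distCoeffs are assumed to come from a prior chessboard calibration with cv::calibrateCamera.

```cpp
// Sketch of step S1: undistort a raw frame with intrinsics obtained from a
// chessboard calibration, then split it into its B, G, R single-channel images.
// cv::split is the C++ counterpart of the legacy cvSplit() named in the text;
// cameraMatrix and distCoeffs are assumed to come from cv::calibrateCamera.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Mat> correctAndSplit(const cv::Mat& rawFrame,
                                     const cv::Mat& cameraMatrix,
                                     const cv::Mat& distCoeffs)
{
    cv::Mat corrected;
    cv::undistort(rawFrame, corrected, cameraMatrix, distCoeffs);  // lens-distortion correction

    std::vector<cv::Mat> channels;   // OpenCV stores color frames in B, G, R order
    cv::split(corrected, channels);
    return channels;
}
```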
S2, set thresholds on the gray value of each single-channel image:
THA(R_min) < R < THA(R_max)
THA(G_min) < G < THA(G_max)      (1)
THA(B_min) < B < THA(B_max)
where R, G and B are the chromatic coordinates denoting the gray values of the three RGB color channels, each ranging from 0 to 255. Taking the R channel as an example, THA(R_min) denotes the lower threshold of the channel and THA(R_max) its upper threshold. The thresholds are empirical values that can be obtained by experiment. Since the marker lamps recognized below are green and blue, typical empirical values are:
THA(R_min) = 0,  THA(G_min) = 50,  THA(B_min) = 50
THA(R_max) = 50, THA(G_max) = 255, THA(B_max) = 255      (2)
S3, take the gray value of the single-channel image whose color matches a marker lamp as a reference, subtract the gray values of the other two single-channel images from it to obtain channel difference values, and set thresholds on the channel difference values.
In this embodiment, for the green stern marker lamp, thresholds are set on the differences between the green channel G and the gray values of the other two channels:
G − R > THA((G−R)_min)
G − B > THA((G−B)_min)      (3)
and for the blue bow marker lamp:
B − R > THA((B−R)_min)
B − G > THA((B−G)_min)      (4)
where THA((G−R)_min) denotes the lower threshold on the difference between the green and red channel gray values, THA((G−B)_min) the lower threshold on the difference between the green and blue channel gray values, THA((B−R)_min) the lower threshold on the difference between the blue and red channel gray values, and THA((B−G)_min) the lower threshold on the difference between the blue and green channel gray values.
This step further distinguishes the green and blue marker lamps in the image, and the difference-threshold method removes the influence of the illumination conditions on recognition. The empirical values adopted are:
THA((G−R)_min) = 30, THA((B−R)_min) = 30
THA((G−B)_min) = 30, THA((B−G)_min) = 30      (5)
S4, create a single-channel gray image of the same size as the acquired image, binarize it according to the thresholds set in S2 and S3, and obtain a binary image.
Specifically, a single-channel gray image of the same size as the acquired image is created; after the color-threshold processing of S2 and S3, the image regions that satisfy the color requirements are obtained, the single-channel gray value of the points in these regions is set to 255 (white) and that of all other regions to 0 (black), which yields the binary image.
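For illustration, a minimal C++ sketch of S2-S4 for the green stern lamp, assuming the modern cv:: API and the empirical thresholds of formulas (2) and (5); the blue bow lamp would be handled analogously with formula (4).

```cpp
// Sketch of steps S2-S4 for the green stern lamp: per-channel thresholds from
// formula (2), channel-difference thresholds from formulas (3) and (5), combined
// into the binary image of step S4 (255 inside accepted regions, 0 elsewhere).
#include <opencv2/opencv.hpp>

cv::Mat binarizeGreenLamp(const cv::Mat& B, const cv::Mat& G, const cv::Mat& R)
{
    // S2: gray-value thresholds of formula (2).
    cv::Mat mR = (R < 50), mG = (G > 50), mB = (B > 50);
    cv::Mat colorMask = mR & mG & mB;

    // S3: difference thresholds of formulas (3) and (5), computed on signed copies
    // so that negative differences are not clipped to zero.
    cv::Mat Rs, Gs, Bs;
    R.convertTo(Rs, CV_16S);
    G.convertTo(Gs, CV_16S);
    B.convertTo(Bs, CV_16S);
    cv::Mat dGR = ((Gs - Rs) > 30), dGB = ((Gs - Bs) > 30);
    cv::Mat diffMask = dGR & dGB;

    // S4: combine the two tests into the binary image.
    cv::Mat binary;
    cv::bitwise_and(colorMask, diffMask, binary);
    return binary;
}
```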
S5, extract the connected components of the binary image and reject non-marker-lamp targets with a connected-component area threshold.
In this embodiment, the OpenCV library functions are used to find the connected components of the binary image obtained in S4, and some of the non-marker-lamp targets are rejected with a connected-component area threshold:
THA(S_min) < S_Area < THA(S_max)      (6)
The range of the connected-component area of a marker lamp in the image is obtained by coordinate conversion. Here S_Area denotes the area of a connected component, and THA(S_min) and THA(S_max) denote the lower and upper area thresholds, respectively. These thresholds are obtained empirically and correspond to the area occupied by the target ship 1 in the image. Assuming the image resolution is 1024*776, the empirical values are:
THA(S_min) = 10
THA(S_max) = 1000      (7)
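A sketch of S5 follows, treating the contours returned by cv::findContours as the connected components mentioned above and applying the area bounds of formula (7).

```cpp
// Sketch of step S5: find the connected regions of the binary image as external
// contours and keep only those whose area lies inside the bounds of formulas
// (6)-(7). cv::findContours stands in for the connected-component search.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<std::vector<cv::Point>> filterLampCandidates(const cv::Mat& binary)
{
    const double areaMin = 10.0, areaMax = 1000.0;   // THA(S_min), THA(S_max) from (7)

    std::vector<std::vector<cv::Point>> contours, kept;
    cv::findContours(binary.clone(), contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    for (const auto& c : contours) {
        double area = cv::contourArea(c);            // S_Area in formula (6)
        if (area > areaMin && area < areaMax)
            kept.push_back(c);                       // candidate marker-lamp region
    }
    return kept;
}
```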
S6, compute the Hu moment values of the contour sequence remaining after the non-marker-lamp targets have been rejected, match them against the marker-lamp image, identify the two marker lamps, and represent the position of each recognized marker lamp by the center of the minimum enclosing circle of the matched contour.
The Hu moment value range of a marker lamp is set as follows:
Hu_min < Hu < Hu_max      (8)
where Hu_min and Hu_max are the minimum and maximum of the Hu moment value, respectively.
In theory, the contour matched by its Hu moment value is unique. If, owing to interference or other causes, the matching result is not unique, or there is no matching contour, recognition stops, the frame is discarded, and the method returns to S1 to process the next frame until a unique marker contour is recognized.
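A sketch of S6 under stated assumptions: the lamp template contour and the acceptance bound (standing in for formula (8)) are illustrative, cv::matchShapes compares the Hu moments, and the frame is given up unless exactly one contour matches, as required above.

```cpp
// Sketch of step S6: compare each remaining contour with a lamp template contour
// via Hu moments (cv::matchShapes) and accept the result only if exactly one
// contour matches; otherwise the frame is given up, as the text requires. The
// recognized lamp position is the centre of the minimum enclosing circle of the
// matched contour.
#include <opencv2/opencv.hpp>
#include <vector>

bool locateLamp(const std::vector<std::vector<cv::Point>>& candidates,
                const std::vector<cv::Point>& lampTemplate,   // illustrative template contour
                cv::Point2f& lampCentre)
{
    const double matchBound = 0.1;   // assumed acceptance bound playing the role of (8)
    int matchIdx = -1, matchCount = 0;

    for (size_t i = 0; i < candidates.size(); ++i) {
        double d = cv::matchShapes(candidates[i], lampTemplate,
                                   cv::CONTOURS_MATCH_I1, 0.0);
        if (d < matchBound) { matchIdx = static_cast<int>(i); ++matchCount; }
    }
    if (matchCount != 1)
        return false;                // no match, or match not unique: discard this frame

    float radius = 0.f;
    cv::minEnclosingCircle(candidates[matchIdx], lampCentre, radius);
    return true;
}
```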
S7, use the pinhole imaging principle to convert the image coordinates of the two marker lamps into coordinates in the world coordinate system.
As shown in Fig. 3, let the position of the blue bow marker lamp in the image be P_s1(x_s1, y_s1) and the position of the green stern marker lamp in the image be P_s2(x_s2, y_s2). Using the pinhole imaging principle, the image coordinates of the two marker lamps are converted into the water-surface positions P_1(x_1, y_1) and P_2(x_2, y_2) in the world coordinate system. Since the coordinate conversion formulas of the two points are identical, the conversion from P_s1 to P_1 is taken as an example:
x_1 = \frac{H\left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, x_{s1} S}{2H - \left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s1} S \sin\theta}

y_1 = \frac{1}{2}\left(\sin\theta - \frac{L\cos\theta}{H}\right)\sin(2\theta)\, y_{s1} S + \frac{\left(2L + \left(\sin\theta - \frac{L\cos\theta}{H}\right)\sin(2\theta)\, y_{s1} S\right)\left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s1} S \sin\theta}{2\left(2H - \left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s1} S \sin\theta\right)}      (9)
In the formulas, H is the height of the camera above the water plane (in cm); L is the horizontal distance from the camera to the bank (in cm); θ is the angle between the focal plane X′O′Y′ and the horizontal plane (in °); W is the width of the observed water area on the bank (in cm); W′ is the horizontal width of the image (in pixels); and S is the image scale W/W′, written out explicitly in formula (10) below.
Similarly, for point P_2:
x_2 = \frac{H\left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, x_{s2}\frac{W}{W'}}{2H - \left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s2}\frac{W}{W'}\sin\theta}

y_2 = \frac{1}{2}\left(\sin\theta - \frac{L\cos\theta}{H}\right)\sin(2\theta)\, y_{s2}\frac{W}{W'} + \frac{\left(2L + \left(\sin\theta - \frac{L\cos\theta}{H}\right)\sin(2\theta)\, y_{s2}\frac{W}{W'}\right)\left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s2}\frac{W}{W'}\sin\theta}{2\left(2H - \left(\tan\theta - \frac{L}{H}\right)\sin(2\theta)\, y_{s2}\frac{W}{W'}\sin\theta\right)}      (10)
All of the above physical quantities can be measured directly or indirectly in the indoor environment.
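As a cross-check, a short sketch that transcribes the reconstructed formulas (9)-(10), assuming the scale S = W/W′ (cm per pixel); it is a sketch of the conversion under those assumptions, not a substitute for the geometry defined in Fig. 3.

```cpp
// Sketch of step S7: convert an image point (xs, ys) of a marker lamp into world
// coordinates (x, y) on the water plane, transcribing formulas (9)-(10) with the
// scale S = W / W'. H and L are in cm, theta in radians.
#include <cmath>

struct CameraSetup {
    double H;      // camera height above the water plane, cm
    double L;      // horizontal distance from the camera to the bank, cm
    double theta;  // angle between the focal plane X'O'Y' and the horizontal, rad
    double W;      // width of the observed water area on the bank, cm
    double Wp;     // horizontal image width in pixels (W' in the text)
};

void imageToWorld(const CameraSetup& c, double xs, double ys, double& x, double& y)
{
    const double S = c.W / c.Wp;                                   // image scale, cm per pixel
    const double A = (std::tan(c.theta) - c.L / c.H) * std::sin(2.0 * c.theta);
    const double B = (std::sin(c.theta) - c.L * std::cos(c.theta) / c.H) * std::sin(2.0 * c.theta);
    const double D = 2.0 * c.H - A * ys * S * std::sin(c.theta);   // shared denominator term

    x = c.H * A * xs * S / D;                                      // x of formula (9)/(10)
    y = 0.5 * B * ys * S
      + (2.0 * c.L + B * ys * S) * A * ys * S * std::sin(c.theta) / (2.0 * D);  // y of (9)/(10)
}
```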
S8, use the marker-lamp coordinates in the world coordinate system to calculate the bow heading angle at the current moment.
As shown in Fig. 4, the bow heading angle φ at the current moment is calculated from the bow marker-lamp coordinates P_1(x_1, y_1) and the stern marker-lamp coordinates P_2(x_2, y_2) obtained above:
\varphi = \begin{cases} \arcsin\dfrac{x_1-x_2}{\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}}, & x_1-x_2>0,\; y_1-y_2>0 \\ \pi-\arcsin\dfrac{x_1-x_2}{\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}}, & x_1-x_2>0,\; y_1-y_2<0 \\ 2\pi-\arcsin\dfrac{x_1-x_2}{\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}}, & x_1-x_2<0,\; y_1-y_2<0 \\ \pi+\arcsin\dfrac{x_1-x_2}{\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}}, & x_1-x_2<0,\; y_1-y_2>0 \end{cases}      (11)
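For reference, a compact numerical sketch of the bow heading computation; it resolves the quadrant with std::atan2 rather than spelling out the four arcsin cases, on the assumption that formula (11) is intended as the usual mapping of the stern-to-bow vector onto an angle in [0, 2π) measured from the +y axis.

```cpp
// Sketch of step S8: bow heading angle phi from the two lamp world positions.
// std::atan2 is used as a compact stand-in for the four arcsin quadrant cases of
// formula (11), assuming the angle is measured from the +y axis towards +x and
// wrapped into [0, 2*pi).
#include <cmath>

double bowHeadingAngle(double x1, double y1,   // bow lamp P1 (world coordinates)
                       double x2, double y2)   // stern lamp P2 (world coordinates)
{
    const double twoPi = 2.0 * std::acos(-1.0);
    double phi = std::atan2(x1 - x2, y1 - y2); // angle of the stern-to-bow vector
    if (phi < 0.0)
        phi += twoPi;                          // map (-pi, pi] onto [0, 2*pi)
    return phi;
}
```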
S9, during real-time positioning, apply S1-S8 to each acquired frame, and calculate the speed v and course angle ψ of the target ship from the two marker-lamp positions recognized in two adjacent frames.
As shown in Fig. 5, suppose the two lamp positions in the previous frame are bow marker lamp P_1′(x_1′, y_1′) and stern marker lamp P_2′(x_2′, y_2′), and the two lamp positions in the current frame are bow marker lamp P_1(x_1, y_1) and stern marker lamp P_2(x_2, y_2). Then:
v = \frac{\sqrt{(x_1+x_2-x_1'-x_2')^2+(y_1+y_2-y_1'-y_2')^2}}{2T}      (12)

\psi = \begin{cases} \arcsin\dfrac{x_1+x_2-x_1'-x_2'}{\sqrt{(x_1+x_2-x_1'-x_2')^2+(y_1+y_2-y_1'-y_2')^2}}, & x_{b2}-x_{b1}>0,\; y_{b2}-y_{b1}>0 \\ \pi-\arcsin\dfrac{x_1+x_2-x_1'-x_2'}{\sqrt{(x_1+x_2-x_1'-x_2')^2+(y_1+y_2-y_1'-y_2')^2}}, & x_{b2}-x_{b1}>0,\; y_{b2}-y_{b1}<0 \\ 2\pi-\arcsin\dfrac{x_1+x_2-x_1'-x_2'}{\sqrt{(x_1+x_2-x_1'-x_2')^2+(y_1+y_2-y_1'-y_2')^2}}, & x_{b2}-x_{b1}<0,\; y_{b2}-y_{b1}<0 \\ \pi+\arcsin\dfrac{x_1+x_2-x_1'-x_2'}{\sqrt{(x_1+x_2-x_1'-x_2')^2+(y_1+y_2-y_1'-y_2')^2}}, & x_{b2}-x_{b1}<0,\; y_{b2}-y_{b1}>0 \end{cases}      (13)
In formula (12), T is the interval between the two frames. Because T is very small (on the millisecond level), the calculated speed v and course angle ψ can be taken as the instantaneous speed and course angle of the ship at the current moment.
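A sketch of S9 under the same assumptions as the S8 sketch: the speed follows formula (12) from the displacement of the lamp midpoint, and std::atan2 is used in place of the quadrant cases of formula (13).

```cpp
// Sketch of step S9: speed v and course angle psi from the lamp midpoints of two
// consecutive frames. v follows formula (12); std::atan2 again stands in for the
// quadrant cases of formula (13). T is the frame interval (seconds), positions
// are in the world coordinate system (cm), so v comes out in cm/s.
#include <cmath>

void speedAndCourse(double x1, double y1, double x2, double y2,       // current-frame lamps
                    double x1p, double y1p, double x2p, double y2p,   // previous-frame lamps
                    double T, double& v, double& psi)
{
    const double twoPi = 2.0 * std::acos(-1.0);
    const double dx = (x1 + x2 - x1p - x2p) / 2.0;   // displacement of the lamp midpoint, x
    const double dy = (y1 + y2 - y1p - y2p) / 2.0;   // displacement of the lamp midpoint, y

    v = std::hypot(dx, dy) / T;                      // formula (12)

    psi = std::atan2(dx, dy);                        // course of the midpoint track
    if (psi < 0.0)
        psi += twoPi;                                // wrap into [0, 2*pi)
}
```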
Preferably, the method further comprises S10: displaying and storing the calculated ship sailing state information in real time; the sailing state information comprises the ship position, speed, course angle and bow heading angle.
The above embodiment is intended only to illustrate the design concept and features of the present invention, so that those skilled in the art can understand and implement it; the protection scope of the present invention is not limited to the above embodiment. Therefore, all equivalent variations or modifications made according to the principles and design ideas disclosed by the present invention fall within its protection scope.

Claims (5)

1. A monocular vision-based indoor water surface ship precise positioning system, characterized in that it comprises a monocular vision unit arranged on the bank, a recognition and positioning unit connected to the monocular vision unit, and a ship-end marker unit installed on the ship; wherein
the monocular vision unit acquires and outputs images of the ship sailing on the indoor water surface;
the recognition and positioning unit collects images, processes them, recognizes and positions the target ship, and calculates its sailing information;
the ship-end marker unit comprises two marker lamps mounted at the bow and stern; the two marker lamps are spherical and differ in color, so that they identify the bow and stern of the target ship, and the midpoint of the line connecting them represents the centroid of the target ship.
2. The monocular vision-based indoor water surface ship precise positioning system according to claim 1, characterized in that the colors of the two marker lamps are each one of red, green and blue.
3. A positioning method implemented with the monocular vision-based indoor water surface ship precise positioning system according to claim 2, characterized in that it comprises the following steps:
S1, correct the single-frame image obtained by the calibrated monocular camera and separate it into the single-channel images of the R, G and B channels;
S2, set thresholds on the gray value of each single-channel image;
S3, take the gray value of the single-channel image whose color matches a marker lamp as a reference, subtract the gray values of the other two single-channel images from it to obtain channel difference values, and set thresholds on the channel difference values;
S4, create a single-channel gray image of the same size as the acquired image, binarize it according to the thresholds set in S2 and S3, and obtain a binary image;
S5, extract the connected components of the binary image and reject non-marker-lamp targets with a connected-component area threshold;
S6, compute the Hu moment values of the contour sequence remaining after the non-marker-lamp targets have been rejected, match them against the marker-lamp image, identify the two marker lamps, and represent the position of each recognized marker lamp by the center of the minimum enclosing circle of the matched contour;
S7, use the pinhole imaging principle to convert the image coordinates of the two marker lamps into coordinates in the world coordinate system;
S8, use the marker-lamp coordinates in the world coordinate system to calculate the bow heading angle at the current moment;
S9, during real-time positioning, apply S1-S8 to each acquired frame, and calculate the speed and course angle of the target ship from the two marker-lamp positions recognized in two adjacent frames.
4. The positioning method according to claim 3, characterized in that in S6, if the matching result is not unique or there is no matching contour, recognition stops, the frame is discarded, and the method returns to S1 to process the next frame.
5. The positioning method according to claim 3, characterized in that the method further comprises S10: displaying and storing the calculated ship sailing state information in real time, the sailing state information comprising the ship position, speed, course angle and bow heading angle.
CN201510298563.8A 2015-06-03 2015-06-03 Monocular vision-based indoor water surface ship precise positioning system and method Expired - Fee Related CN104867158B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510298563.8A CN104867158B (en) 2015-06-03 2015-06-03 Monocular vision-based indoor water surface ship precise positioning system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510298563.8A CN104867158B (en) 2015-06-03 2015-06-03 Monocular vision-based indoor water surface ship precise positioning system and method

Publications (2)

Publication Number Publication Date
CN104867158A true CN104867158A (en) 2015-08-26
CN104867158B CN104867158B (en) 2017-09-29

Family

ID=53912973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510298563.8A Expired - Fee Related CN104867158B (en) 2015-06-03 2015-06-03 Monocular vision-based indoor water surface ship precise positioning system and method

Country Status (1)

Country Link
CN (1) CN104867158B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105241424A (en) * 2015-09-25 2016-01-13 小米科技有限责任公司 Indoor positioning method and intelligent management apparatus
CN105551037A (en) * 2015-12-10 2016-05-04 广州视源电子科技股份有限公司 User clothing size matching method, system and intelligent mirror
CN108133491A (en) * 2017-12-29 2018-06-08 重庆锐纳达自动化技术有限公司 A kind of method for realizing dynamic target tracking
CN108919800A (en) * 2018-06-22 2018-11-30 武汉理工大学 A kind of ship intelligently lines up navigation system
CN109901594A (en) * 2019-04-11 2019-06-18 清华大学深圳研究生院 A kind of localization method and system of weed-eradicating robot
CN110334701A (en) * 2019-07-11 2019-10-15 郑州轻工业学院 Collecting method based on deep learning and multi-vision visual under the twin environment of number
CN111596676A (en) * 2020-05-27 2020-08-28 中国科学院半导体研究所 Underwater Bessel light vision guiding method
CN114459423A (en) * 2022-01-24 2022-05-10 长江大学 Method for monocular measurement and calculation of distance of sailing ship
US11513525B2 (en) * 2018-11-29 2022-11-29 Lg Electronics Inc. Server and method for controlling laser irradiation of movement path of robot, and robot that moves based thereon

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101067557A (en) * 2007-07-03 2007-11-07 北京控制工程研究所 Environment sensing one-eye visual navigating method adapted to self-aid moving vehicle
CN102541417A (en) * 2010-12-30 2012-07-04 株式会社理光 Multi-object tracking method and system in virtual touch screen system
WO2014070483A1 (en) * 2012-11-02 2014-05-08 Qualcomm Incorporated Fast initialization for monocular visual slam
CN104331702A (en) * 2014-11-03 2015-02-04 黄辉 Image-recognition-based fresh tea leaf furcation number recognition method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101067557A (en) * 2007-07-03 2007-11-07 北京控制工程研究所 Environment sensing one-eye visual navigating method adapted to self-aid moving vehicle
CN102541417A (en) * 2010-12-30 2012-07-04 株式会社理光 Multi-object tracking method and system in virtual touch screen system
WO2014070483A1 (en) * 2012-11-02 2014-05-08 Qualcomm Incorporated Fast initialization for monocular visual slam
CN104331702A (en) * 2014-11-03 2015-02-04 黄辉 Image-recognition-based fresh tea leaf furcation number recognition method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HASHEN ASHRAFIUON ET AL.: "Sliding Mode Tracking Control of Surface Vessels", 《IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS》 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105241424A (en) * 2015-09-25 2016-01-13 小米科技有限责任公司 Indoor positioning method and intelligent management apparatus
CN105241424B (en) * 2015-09-25 2017-11-21 小米科技有限责任公司 Indoor positioning method and intelligent management apparatus
CN105551037A (en) * 2015-12-10 2016-05-04 广州视源电子科技股份有限公司 User clothing size matching method, system and intelligent mirror
CN108133491A (en) * 2017-12-29 2018-06-08 重庆锐纳达自动化技术有限公司 A kind of method for realizing dynamic target tracking
CN108919800A (en) * 2018-06-22 2018-11-30 武汉理工大学 A kind of ship intelligently lines up navigation system
US11513525B2 (en) * 2018-11-29 2022-11-29 Lg Electronics Inc. Server and method for controlling laser irradiation of movement path of robot, and robot that moves based thereon
CN109901594A (en) * 2019-04-11 2019-06-18 清华大学深圳研究生院 A kind of localization method and system of weed-eradicating robot
CN110334701A (en) * 2019-07-11 2019-10-15 郑州轻工业学院 Collecting method based on deep learning and multi-vision visual under the twin environment of number
CN110334701B (en) * 2019-07-11 2020-07-31 郑州轻工业学院 Data acquisition method based on deep learning and multi-vision in digital twin environment
CN111596676A (en) * 2020-05-27 2020-08-28 中国科学院半导体研究所 Underwater Bessel light vision guiding method
CN114459423A (en) * 2022-01-24 2022-05-10 长江大学 Method for monocular measurement and calculation of distance of sailing ship

Also Published As

Publication number Publication date
CN104867158B (en) 2017-09-29

Similar Documents

Publication Publication Date Title
CN104867158A (en) Monocular vision-based indoor water surface ship precise positioning system and method
CN105512628B (en) Vehicle environmental sensory perceptual system based on unmanned plane and method
CN103646249B (en) A kind of greenhouse intelligent mobile robot vision navigation path identification method
Milford et al. SeqSLAM: Visual route-based navigation for sunny summer days and stormy winter nights
CN100538723C (en) The inner river ship automatic identification system that multiple vision sensor information merges
CN110161485A (en) A kind of outer ginseng caliberating device and scaling method of laser radar and vision camera
CN102314602B (en) Shadow removal in image captured by vehicle-based camera using optimized oriented linear axis
CN101452292B (en) Fish glasses head omnidirectional vision aiming method based on sequence dual-color dot matrix type navigation mark
US20220024549A1 (en) System and method for measuring the distance to an object in water
CN111562791A (en) System and method for identifying visual auxiliary landing of unmanned aerial vehicle cooperative target
CN106863332B (en) Robot vision positioning method and system
CN110580044A (en) unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing
CN104463877A (en) Shoreline registration method based on information of radar image and electronic sea chart
CN110334625A (en) A kind of parking stall visual identifying system and its recognition methods towards automatic parking
CN108985274B (en) Water surface foreign matter identification method
US20210295060A1 (en) Apparatus and method for acquiring coordinate conversion information
Zhang et al. Research on unmanned surface vehicles environment perception based on the fusion of vision and lidar
CN105427284A (en) Fixed target marking method based on airborne android platform
CN109740584A (en) Automatic parking parking space detection method based on deep learning
CN112364707A (en) System and method for over-the-horizon sensing of intelligent vehicle on complex road conditions
CN107977960A (en) A kind of car surface scratch detection algorithm based on improved SUSAN operators
CN102156991A (en) Quaternion based object optical flow tracking method
CN116087982A (en) Marine water falling person identification and positioning method integrating vision and radar system
CN110322462B (en) Unmanned aerial vehicle visual landing method and system based on 5G network
CN110120073B (en) Method for guiding recovery of unmanned ship based on lamp beacon visual signal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170929

Termination date: 20180603

CF01 Termination of patent right due to non-payment of annual fee