CN107452038A - Target tracking method for complex waters based on AIS and active camera - Google Patents
Target tracking method for complex waters based on AIS and active camera
- Publication number
- CN107452038A CN107452038A CN201710632153.1A CN201710632153A CN107452038A CN 107452038 A CN107452038 A CN 107452038A CN 201710632153 A CN201710632153 A CN 201710632153A CN 107452038 A CN107452038 A CN 107452038A
- Authority
- CN
- China
- Prior art keywords
- cell
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a target tracking method for complex waters based on AIS and an active camera. The method comprises the following steps: 1) divide the observation area into n Cell units, and calibrate every camera of the active camera group for each Cell unit, obtaining the intrinsic and extrinsic parameters of each camera; 2) calibrate the Cell units in advance using AIS devices and store the calibration information; after the target enters a relevant Cell unit, determine the Cell unit where the target is located from the position information returned by AIS; 3) move the active camera group quickly so that it is aimed at the Cell region where the target is located, the active camera group containing m active cameras; 4) shoot clear video of the target and track it according to a tracking algorithm. The invention proposes a new shooting approach: the observation area is divided into Cell units, and the AIS system is combined with the Cell regions to determine the position and course of the target.
Description
Technical field
The present invention relates to target tracking technology, and more particularly to a target tracking method for complex waters based on AIS and an active camera.
Background art
In recent years, inland waterway transport has played an increasingly important role in the national economy. As the number of inland vessels grows, the navigation environment keeps deteriorating, and the safety of inland waterways becomes an ever more prominent issue, so monitoring complex inland waters is particularly important. The common monitoring modes at present are visual monitoring, radar monitoring and AIS monitoring. Radar monitoring is very widely applied, but it has an inherent shortcoming: it fails in heavy rain, fog and haze. Visual monitoring suffers from similar problems. Combining visual monitoring with AIS monitoring to watch ships can largely solve the problems above, effectively strengthen the tracking and monitoring of inland vessels, and reduce inland waterway accidents. Most video capture on the market today is passive sensing; active sensing has attracted more and more attention, but it is still rarely applied, mainly in tightly secured defence installations, because every time the camera of an active camera group is moved it must be recalibrated. Therefore, even where active cameras have been deployed, they have not delivered their greatest value, namely shooting clear pictures. The authors of the present invention combine the active camera, the AIS device and the "target-scene Cell division" (proposed in a patent filed by the authors) to monitor ships, which, to our knowledge, is proposed here for the first time. For the pre-divided Cell regions, the camera and AIS are calibrated for each Cell region before the active camera operates, and the calibration results are recorded and stored. This work ensures that the camera can shoot clear pictures; meanwhile, from the messages returned by the AIS transmitter, the heading of the ship can be determined, so that the active camera can track the ship and shoot clear images of it.
At present, the main methods or devices for capturing images are the following:
A. A patent by Nantong Shipping Vocational Technical College: an intelligent video surveillance system guided by radar or AIS tracking parameters (publication No. CN104184990A). The invention first locates the target with radar or AIS, then tracks and shoots the target with the equipment of a video surveillance station, the tracking equipment mainly being pan-tilt cameras. The advantage of this method is that AIS data and video surveillance are combined for shooting and the coverage is wide; however, it does no more than track and shoot the target with the camera. For the imaging parameters of the target and for a tracking algorithm over the target's path, the scheme gives no concrete solution, so it cannot capture high-resolution images and video of the target. The method uses the active camera only for its ability to change shooting pose, not for its photographic characteristics: it neither subdivides the target region nor calibrates it, so it cannot adjust the intrinsic and extrinsic camera parameters, shoot clear pictures of a specific part of the target, or track that specific part.
B. A patent by China Aerospace Times Electronics Corporation: a visual servo control method for UAV maneuvering-target localization and tracking (publication No. CN105353772A). The invention images the target with a single fixed camera and, by controlling the UAV pose, achieves high-precision continuous localization and tracking of a maneuvering target. The method needs no tracking gimbal or laser ranging equipment, effectively reducing payload volume and cost and improving the covertness of reconnaissance. It locates and tracks the target with a UAV and can obtain clear images of the target, but it cannot coordinate with ships and cannot automatically track and monitor a designated ship. Its main feature is covert reconnaissance, but that same feature means it can only capture partial images of the target.
C. A patent by the 28th Research Institute of China Electronics Technology Group Corporation: a CCTV ship video smooth tracking method (publication No. CN105430326A). The invention's feature is that CCTV cameras track and shoot the target ship together with AIS, solving the jitter problem during shooting. However, the method cannot shoot high-definition images of the target, nor does it adapt the tracking speed of the video to the target's speed. Finally, the invention relies on AIS positioning, which itself has a certain error, so its localization is not very accurate either.
Summary of the invention
The technical problem to be solved by the present invention is to address the above defects of the prior art by providing a target tracking method for complex waters based on AIS and an active camera.
The technical solution adopted by the present invention to solve the technical problem is: a target tracking method for complex waters based on AIS and an active camera, comprising the following steps:
1) divide the observation area into n Cell units, where n ≥ 2; calibrate every camera of the active camera group for each Cell unit, obtaining the intrinsic and extrinsic parameters of each camera;
2) calibrate the Cell units in advance using AIS devices and store the calibration information; after the target enters a relevant Cell unit, determine the Cell unit where the target is located according to the position information returned by AIS;
3) move the active camera group (containing m ≥ 1 active cameras) quickly so that it is aimed at the Cell region where the target is located;
4) shoot clear video of the target and track it according to a tracking algorithm.
According to the above scheme, the camera calibration in step 1) is as follows:
When each camera is aimed at each Cell region, the pose and angle values of the pan-tilt head are recorded and the camera calibration result is stored; the calibration result includes the intrinsic and extrinsic parameters of the camera.
The calibration method is as follows:
1.1) Establish the camera coordinate system and the image coordinate system
First establish the two basic coordinate systems, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system with its origin at the top-left corner of the photo and pixels as units; an image point is written [X_P Y_P]^T. The camera coordinate system has its origin at the camera's optical centre, the optical axis as its z-axis, and its x- and y-axes aligned with the x and y directions of the image coordinate system; the coordinate system obeys the right-hand rule, and a point is written [X_C Y_C Z_C]^T.
1.2) Establish the relation between the two coordinate systems
Since the camera obeys the pinhole model, the following relation holds:

[X_P  Y_P  1]^T = λ · K · [X_C  Y_C  Z_C]^T

where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method.
1.3) Establish the world coordinate system
Introduce the world coordinate system and establish its relation to the two coordinate systems above. The x-axis of the world coordinate system runs along the long side of the chessboard, the y-axis along the other side, and the z-axis perpendicular to the chessboard plane; a world point is written [X_W Y_W Z_W]^T.
The relation among the world, camera and image coordinate systems is as follows:

[X_P  Y_P  1]^T = λ · K · (R · [X_W  Y_W  Z_W]^T + t)

1.4) Compute the rotation matrix R and translation vector t from the camera to the chessboard
Obtain the coordinates of all chessboard corners in the world coordinate system (the i-th corner is [X_Wi Y_Wi 0]^T), find the corresponding points [X_Pi Y_Pi]^T in the image coordinate system, and, with the calibrated camera intrinsic matrix K, solve the above equation for R and t, where λ is the scale factor, R is a 3×3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3×1 vector representing the translation from the world coordinate system to the camera coordinate system. Solving yields R and t, and hence the optimal imaging parameters of the active camera group.
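As a numerical illustration of the pinhole relation above, the sketch below projects a chessboard corner from world coordinates into pixel coordinates. K, R, t and the corner position are all made-up values for illustration, not calibration results from the patent.

```python
import numpy as np

# Hypothetical intrinsic matrix K (focal lengths and principal point are invented values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Extrinsics: identity rotation and a translation putting the board 5 m in front of the camera.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

def project(world_point):
    """Project a world point to pixel coordinates via the pinhole model."""
    cam = R @ world_point + t      # world -> camera coordinates
    uvw = K @ cam                  # camera -> homogeneous image coordinates
    return uvw[:2] / uvw[2]        # divide by depth (the scale factor)

corner = np.array([0.1, 0.2, 0.0])  # a chessboard corner on the Z = 0 plane
print(project(corner))              # [336. 272.]
```

Solving the inverse problem (recovering R and t from many corner correspondences) is what Zhang's method does; the forward model above is only the consistency check behind it.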
According to the above scheme, step 2) is specifically: the Cell regions formed by the Cell units are calibrated in advance using AIS devices. When the water surface area occupied by the target is smaller than the area of a Cell unit, a target object fitted with an AIS device is driven into each Cell unit in turn; for each Cell unit, through repeated measurements while the target is in different Cell units, the resulting data are classified and stored. The storage method is: establish as many tables as there are Cell units, and store the data obtained in each Cell in the corresponding table (removing data with large deviations). From the data in each table, determine the maximum position range that each Cell can accommodate, the range comprising a longitude range and a latitude range. After the target enters a relevant Cell unit, the Cell unit where the target is located is determined from the position information returned by AIS. When the water surface area occupied by the target is larger than one Cell unit, the number of Cell units occupied by the target and all the Cell units where the target is located are determined from the target's length, width and position.
According to the above scheme, step 2) is specifically: the Cell units are calibrated using AIS devices to determine the longitude and latitude values of each cell unit, each cell unit being enclosed by the four edges joining four points. On a ship-observation website, the longitude and latitude of the corner points of each cell unit are determined; given the four corner points of each cell unit, the polygon function can be used to decide whether any point of the target lies inside that cell unit.
When a ship approaches the observation area, its longitude, latitude, course and speed are received via AIS; a running-track model of the ship is established to obtain a set of route points, each point of which is checked with the polygon function, giving the order in which, and the times at which, the ship passes through each cell unit.
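The "polygon function" test on the four corner points can be implemented with a standard ray-casting point-in-polygon check, sketched below. The corner coordinates are invented for illustration.

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is the (lon, lat) point inside the polygon of cell corners?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does the horizontal ray from pt cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical cell: a quadrilateral given by its four corner (lon, lat) values.
cell = [(114.30, 30.55), (114.32, 30.55), (114.32, 30.57), (114.30, 30.57)]
print(point_in_polygon((114.31, 30.56), cell))  # True: position inside the cell
print(point_in_polygon((114.35, 30.56), cell))  # False: outside the cell
```

Applying this test to every point of the route-point set yields the sequence of cell units the ship passes through.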
According to the above scheme, step 3) is specifically:
According to the Cell region where the target is located, look up the intrinsic and extrinsic camera parameters and the pan-tilt pose and angle values pre-computed and stored in step 1), and input them to the corresponding active cameras, so that active cameras equal in number to the Cell regions occupied by the target quickly move to aim at the Cell regions where the target is located and observe the target.
According to the above scheme, step 4) is specifically:
From the AIS data calibrated in advance, including the target's position, direction, size and speed, together with the real-time speed and heading data returned by AIS, the time at which the target will enter the next Cell unit can be derived. Before the target enters the next Cell unit, a calibrated active camera is aimed at the corresponding Cell unit; once the target enters that Cell unit, clear video of the target can be shot. These steps are repeated continuously, completing the target-tracking task.
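A minimal sketch of deriving the entry time into the next Cell unit from the AIS speed. The distance and speed values are illustrative, and the conversion assumes AIS reports speed over ground in knots.

```python
import math

KNOT_MS = 0.514444  # metres per second per knot

def seconds_to_next_cell(distance_m, speed_knots):
    """Time until the ship reaches the next cell boundary, from AIS speed over ground."""
    if speed_knots <= 0:
        return math.inf  # stationary ship never crosses the boundary
    return distance_m / (speed_knots * KNOT_MS)

# A ship 500 m from the next cell boundary, making 10 knots:
eta = seconds_to_next_cell(500.0, 10.0)
print(round(eta, 1))  # 97.2 -> the camera must be re-aimed within ~97 s
```

In practice the distance would come from the ship's AIS position and the calibrated cell-boundary coordinates; a course-dependent projection of the speed onto the track would refine the estimate.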
The beneficial effects of the present invention are:
1. The invention proposes a new shooting approach: using the division into Cell regions, the AIS system (AIS receiver and AIS transmitter) is combined with the Cell regions to determine the position and course of the target;
2. The method uses the AIS system together with active cameras to shoot images and track the target;
3. The AIS device periodically sends the target's course, speed, size (length and width) and position; from this information the monitoring system automatically computes the Cell regions the ship will reach next and assigns an appropriate number of cameras to track and shoot it.
Brief description of the drawings
The invention will be further described below with reference to the drawings and embodiments, in which:
Fig. 1 is a scene diagram of an embodiment of the present invention in which the active camera group observes the divided Cell regions;
Fig. 2 shows the motion directions and angles of an active camera of an embodiment of the present invention;
Fig. 3 shows the Cell regions occupied by the target and the Cell regions observed by the active camera group in an embodiment of the present invention;
Fig. 4 is a flow chart of an embodiment of the present invention.
Detailed description of the embodiments
In order to make the purpose, technical scheme and advantages of the present invention clearer, the present invention is further elaborated below with reference to embodiments. It should be understood that the specific embodiments described here only explain the present invention and are not intended to limit it.
As shown in Fig. 4, a target tracking method for complex waters based on AIS and an active camera comprises the following steps:
1) Divide the observation area into n cell units. The division principle is: the size of each cell unit is the maximum extent of scene that each active camera can observe (partial overlap is allowed, and the width of the overlap must be greater than the ship's beam). If the shooting angles of the cameras differ and the positions of the cells differ, the sizes of the divided cell units also differ; the size of a cell unit must not exceed the maximum observation range of the camera. The observation range of the camera lens being known, the Changjiang region to be observed can be divided into cells on a ship-observation website. When dividing into cell units, a certain overlap must exist between adjacent cell units: in the direction perpendicular to the ship's travel, the width of the overlap between adjacent cells must exceed the beam of the largest observed ship (the blue part in Fig. 3); in the direction of travel, the width of the overlap must exceed S1, where S1 is 1/4 of the length L of the largest observed ship (the red part in Fig. 3). Calibrate every camera of the active camera group for each cell unit, obtain the intrinsic and extrinsic parameters of each camera, and store the parameters in the system server.
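The overlap rule above fixes how many cell units are needed to cover a reach. The sketch below computes that count under assumed values for the camera's observable extent and the largest ship length (with the along-track overlap set to S1 = L/4); all numbers are illustrative.

```python
import math

def num_cells(reach_len_m, cell_len_m, overlap_m):
    """Cells needed to cover a reach when adjacent cells overlap by overlap_m."""
    stride = cell_len_m - overlap_m      # net new coverage added by each extra cell
    if reach_len_m <= cell_len_m:
        return 1
    return 1 + math.ceil((reach_len_m - cell_len_m) / stride)

L_max = 120.0               # length of the largest ship to observe (assumed)
overlap_along = L_max / 4   # S1: along-track overlap must exceed L/4
cell_len = 600.0            # camera's maximum observable extent (assumed)

print(num_cells(3000.0, cell_len, overlap_along))  # 6 cells for a 3 km reach
```

The same computation with the maximum beam as the overlap gives the cell count in the across-track direction.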
The camera calibration is as follows:
When each camera is aimed at each cell region, the pose and angle values of the pan-tilt head are recorded and the camera calibration result is stored; the calibration result includes the intrinsic and extrinsic parameters of the camera.
The calibration method is as follows:
1.1) Establish the camera coordinate system and the image coordinate system
First establish the two basic coordinate systems, the image coordinate system and the camera coordinate system. The image coordinate system is a rectangular coordinate system with its origin at the top-left corner of the photo and pixels as units; an image point is written [X_P Y_P]^T. The camera coordinate system has its origin at the camera's optical centre, the optical axis as its z-axis, and its x- and y-axes aligned with the x and y directions of the image coordinate system; the coordinate system obeys the right-hand rule, and a point is written [X_C Y_C Z_C]^T.
1.2) Establish the relation between the two coordinate systems
Since the camera obeys the pinhole model, the following relation holds:

[X_P  Y_P  1]^T = λ · K · [X_C  Y_C  Z_C]^T

where λ is a scale factor and K is the camera intrinsic matrix, obtained by Zhang Zhengyou's calibration method.
1.3) Establish the world coordinate system
Introduce the world coordinate system and establish its relation to the two coordinate systems above. The x-axis of the world coordinate system runs along the long side of the chessboard, the y-axis along the other side, and the z-axis perpendicular to the chessboard plane; a world point is written [X_W Y_W Z_W]^T.
The relation among the world, camera and image coordinate systems is as follows:

[X_P  Y_P  1]^T = λ · K · (R · [X_W  Y_W  Z_W]^T + t)

1.4) Compute the rotation matrix R and translation vector t from the camera to the chessboard
Obtain the coordinates of all chessboard corners in the world coordinate system (the i-th corner is [X_Wi Y_Wi 0]^T), find the corresponding points [X_Pi Y_Pi]^T in the image coordinate system, and, with the calibrated camera intrinsic matrix K, solve the above equation for R and t, where λ is the scale factor, R is a 3×3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3×1 vector representing the translation from the world coordinate system to the camera coordinate system. Solving yields R and t, and hence the optimal imaging parameters of the active camera group.
2) Because a ship sends AIS data only at fixed intervals, the ship's position cannot be monitored continuously from its AIS data; the monitoring is intermittent. Therefore, for the blind intervals between AIS reports, the time at which the ship enters a cell unit can be determined from the ship's speed and its distance to the nearest cell unit, and the cell unit it enters can be determined from the ship's track.
3) On the basis of the cell units divided in the previous step, calibrate the cell units (that is, determine the longitude and latitude values of each cell unit) using a ship-observation website (e.g. Ship News Net, Precious Ship Net); each cell unit is enclosed by the four edges joining four points. On the ship-observation website, the longitude and latitude of the corner points of each cell unit can be determined; given the four corner points of each cell unit, the polygon function can be used to decide whether any point lies inside that cell unit.
When a ship approaches the observation area, its longitude, latitude, course and speed are received via AIS. A running-track model of the ship, a set of route points, is established; each point of the set is checked with the polygon function, giving the sequence numbers of, and the times at which, the ship passes through each cell unit.
When the cell units are divided on the ship-observation website in step 3), each cell unit is numbered, the longitude and latitude of the four corner points of each cell unit are recorded, and the values are stored in the server.
The Cell regions are calibrated with the ship-observation website: since the region to be observed was divided into multiple cell units in the previous step, and the ship-observation website can determine the longitude and latitude range of each cell unit, the longitude and latitude range so obtained for each cell unit is stored in the system server. The position data (longitude and latitude) sent by the AIS equipment of the target ship and received by the system is compared with the longitude/latitude range of each cell unit, the comparison using the polygon function; if the position is determined to lie within a cell unit, the target ship is deemed to be in that cell unit.
4) Shoot clear video of the target ship: when the target ship enters the first cell unit, one active camera is aimed at the first cell unit to observe it, while another camera of the active camera group is aimed at the cell unit the ship is about to enter; as soon as the ship appears in the second cell unit, pictures of the ship are shot repeatedly. The above process is repeated until the ship completely leaves the observation area.
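The alternating shoot/pre-aim pattern described above can be sketched as a simple round-robin handoff between cameras. The cell numbering and camera indices are illustrative, not the patent's device configuration.

```python
def camera_handoff(cell_sequence, num_cameras=2):
    """Assign, per cell, the camera that shoots it and the camera pre-aimed at the next cell."""
    assignments = []
    for i, cell in enumerate(cell_sequence):
        cam = i % num_cameras             # camera currently shooting this cell
        next_cam = (i + 1) % num_cameras  # camera pre-aimed at the next cell
        assignments.append((cell, cam, next_cam))
    return assignments

# Ship predicted to pass cells 1..4; two cameras alternate shooting and pre-aiming.
for cell, cam, nxt in camera_handoff([1, 2, 3, 4]):
    print(f"cell {cell}: camera {cam} shoots, camera {nxt} pre-aims at the next cell")
```

With more cameras in the group, the same round-robin lets a camera be re-aimed two cells ahead while others keep shooting, matching the handoff described in the text.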
Step 4) is specifically:
According to the cell region where the target is located, look up the pre-computed intrinsic and extrinsic camera parameters stored in step 1), together with the pan-tilt angle values recorded when each camera of the camera group was aimed at each cell unit (each pan-tilt head 2-2 carries one camera 2-1, as shown in Fig. 2), input them to the corresponding active camera, and move the active camera to aim at the cell unit where the target is located to observe the target.
A device cooperating with this method is as follows: a target recognition and tracking device for complex waters based on AIS and an active camera. The device includes: an automatic identification system (AIS), an active camera group and an image processing unit. The AIS includes an AIS transmitting device and an AIS receiving device; the transmitting device is installed on the ship and is mainly used to send AIS information (as shown in Table 1), while the receiving device is placed on the bank and is mainly used to receive AIS information. The active camera group includes a controller and several active cameras; each active camera is fitted with a rotary head that adjusts its viewing angle, and the shooting range of each active camera matches one configured cell unit. The controller adjusts the rotary head of the corresponding active camera according to the detector's signal. The image processing unit mainly includes a computer with a wiring unit connected to it, and is mainly used to synthesize the photos shot by the active cameras. The present invention can actively perceive the target object, a ship: first, according to the positions of the active cameras and the size of the observed ship, the observation area is optimally divided into cell units; then, from the AIS messages returned by the shipborne AIS device, which mainly include the ship's size, course, speed and position, a running-track model of the ship is established, and from the track model the numbers of, and times at, all cell units the ship will pass through are determined. When the target appears in a cell unit, a calibrated active camera shoots clear images of that cell unit, while another camera of the group is aimed at the second cell unit the target ship will enter, so that clear images of the target are shot immediately after it enters. Each time the target passes through two cell units, the camera aimed at the first cell unit is re-aimed at the third cell unit the ship will pass, and the above process is repeated, achieving tracked shooting of the target ship. The whole process above achieves the purpose of recognizing and tracking the target ship.
Table 1. Ship dynamic information format
A specific embodiment:
S1. As in Fig. 1, the active camera group 3 selected in this example contains 4 active cameras placed in a vertical plane; each active camera works independently, and a suitable number of active cameras can be chosen to participate according to the size of the target.
S2. According to the characteristics of passive and active cameras, a Changjiang reach is selected as observation area 4, the region outside the observed reach being region 1. The reach is divided into 6 cell units, numbered 1 to 6, as shown in Fig. 3. The 4 active cameras are aimed in turn at the 6 cell regions, and the pan-tilt angle values when aiming at each cell region are recorded; meanwhile, each camera is calibrated when aimed at each cell region to obtain R and t, and the calibrated parameters are stored in the computer for retrieval at any time when shooting.
S3. Use a ship-observation website (e.g. Precious Ship Net, Ship News Net) to divide and calibrate the region to be measured. First determine the observation range of the active camera lens; then divide the observation area into cell units on the ship-observation website, look up the longitude and latitude of the four corner points of each cell unit, and classify and store this longitude and latitude information.
S4. When a ship approaches the observation area, its longitude, latitude, course and speed are received via AIS. A running-track model of the ship is established, and from the track model the sequence numbers of, and times at which, the ship passes through each cell unit are determined.
S5. When the target ship enters the first cell unit, an active camera carrying a pan-tilt head is aimed at the target cell unit to observe it, while another camera of the active camera group is aimed at the cell unit the ship is about to enter; when the ship appears in the second cell unit, pictures of the ship are shot repeatedly. The above process is repeated until the ship completely leaves the observation area.
It should be understood that those of ordinary skill in the art can make improvements or modifications according to the above description, and all such improvements and modifications fall within the protection scope of the appended claims of the present invention.
Claims (6)
1. A target tracking method for complex waters based on AIS and an active camera, characterized by comprising the following steps:
1) dividing the observation area into n Cell units, where n ≥ 2; calibrating every camera of the active camera group for each Cell unit, obtaining the intrinsic and extrinsic parameters of each camera;
2) calibrating the Cell units in advance using AIS devices and storing the calibration information; after the target enters a relevant Cell unit, determining the Cell unit where the target is located according to the position information returned by AIS;
3) moving the active camera group quickly so that it is aimed at the Cell region where the target is located, the number of active cameras in the active camera group being m, where m ≥ 1;
4) shooting clear video of the target and tracking it according to a tracking algorithm.
2. The target tracking method for complex waters based on AIS and an active camera according to claim 1, characterized in that the camera calibration in step 1) is as follows:
when each camera is aimed at each Cell region, the pose and angle values of the pan-tilt head are recorded and the camera calibration result is stored, the calibration result including the intrinsic and extrinsic parameters of the camera;
the calibration method is as follows:
1.1) establishing the camera coordinate system and the image coordinate system:
first establish the two basic coordinate systems, the image coordinate system and the camera coordinate system: the image coordinate system is a rectangular coordinate system with its origin at the top-left corner of the photo and pixels as units, an image point being written [X_P Y_P]^T; the camera coordinate system has its origin at the camera's optical centre, the optical axis as its z-axis, and its x- and y-axes aligned with the x and y directions of the image coordinate system, the coordinate system obeying the right-hand rule, a point being written [X_C Y_C Z_C]^T;
1.2) establishing the relation between the two coordinate systems:
since the camera obeys the pinhole model, the following relation holds
$$
\begin{bmatrix} X_P \\ Y_P \\ 1 \end{bmatrix}
= \lambda \cdot K \cdot
\begin{bmatrix} X_C \\ Y_C \\ Z_C \end{bmatrix}
$$
where λ is a scale factor and K is the camera intrinsic parameter matrix, obtained by Zhang Zhengyou's calibration method;
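As a minimal numerical sketch of this pinhole relation: for a point expressed in the camera frame, the scale factor is λ = 1/Z_C. The intrinsic matrix K below is an invented placeholder (the claim leaves the calibrated values unspecified), not a calibration result.

```python
import numpy as np

# K with made-up focal lengths (fx = fy = 800) and principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, p_cam):
    """Project a camera-frame point [X_C, Y_C, Z_C] to pixel coordinates [X_P, Y_P]."""
    uvw = K @ p_cam          # homogeneous image coordinates
    return uvw[:2] / uvw[2]  # divide by Z_C, i.e. apply the scale factor λ

print(project(K, np.array([0.1, -0.05, 2.0])))  # → [360. 220.]
```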
1.3) establishing the world coordinate system
A world coordinate system is introduced and its relations to the two coordinate systems above are established: in the world coordinate system, the x-axis is set along the long side of the checkerboard, the y-axis along its short side, and the z-axis perpendicular to the checkerboard plane; a point in the world coordinate system is denoted [X_W Y_W Z_W]^T;
the relations between the world coordinate system and the camera and image coordinate systems are as follows:
$$
\begin{bmatrix} X_C \\ Y_C \\ Z_C \end{bmatrix}
= \begin{bmatrix} R & t \end{bmatrix} \cdot
\begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix}
$$
$$
\begin{bmatrix} X_P \\ Y_P \\ 1 \end{bmatrix}
= \lambda \cdot K \cdot \begin{bmatrix} R & t \end{bmatrix} \cdot
\begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix}
$$
1.4) computing the rotation matrix R and translation matrix t from the camera to the checkerboard
The coordinates of all corner points of the checkerboard in the world coordinate system are obtained, the i-th corner coordinate being [X_Wi Y_Wi 0]^T; the corresponding point coordinates [X_Pi Y_Pi]^T in the image coordinate system are found at the same time, and, combined with the calibrated camera intrinsic matrix K, R and t are solved from the following equation:
$$
\begin{bmatrix} X_P \\ Y_P \\ 1 \end{bmatrix}
= \lambda \cdot K \cdot \begin{bmatrix} R & t \end{bmatrix} \cdot
\begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix}
$$
where λ is a scale factor, R is a 3 × 3 matrix representing the rotation from the world coordinate system to the camera coordinate system, and t is a 3 × 1 matrix representing the translation from the world coordinate system to the camera coordinate system; R and t are obtained from the above equation, thereby obtaining the optimal imaging parameters of the active imaging unit.
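Step 1.4) can be sketched in NumPy. Since the checkerboard corners satisfy Z_W = 0, the projection reduces to a planar homography H = λ·K·[r₁ r₂ t], which a direct linear transform (DLT) can estimate; in practice a library routine such as OpenCV's solvePnP would normally be used. All numbers below are synthetic test values, not calibration data.

```python
import numpy as np

def estimate_pose(K, world_xy, pixels):
    """Recover R, t from planar (Z_W = 0) correspondences via a DLT homography."""
    A = []
    for (X, Y), (u, v) in zip(world_xy, pixels):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    H = np.linalg.svd(np.asarray(A, float))[2][-1].reshape(3, 3)  # null vector
    M = np.linalg.inv(K) @ H       # proportional to [r1 r2 t]
    if M[2, 2] < 0:                # resolve the sign: board lies in front (t_z > 0)
        M = -M
    M /= np.linalg.norm(M[:, 0])   # fix the scale so that |r1| = 1
    r1, r2, t = M[:, 0], M[:, 1], M[:, 2]
    return np.column_stack([r1, r2, np.cross(r1, r2)]), t

# Synthetic check: identity rotation, t = [0.1, -0.2, 5.0] (invented values).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R_true, t_true = np.eye(3), np.array([0.1, -0.2, 5.0])
world = np.array([[x, y] for x in range(3) for y in range(3)], float)
pixels = []
for X, Y in world:
    p = K @ (R_true @ np.array([X, Y, 0.0]) + t_true)
    pixels.append(p[:2] / p[2])
R_est, t_est = estimate_pose(K, world, np.array(pixels))
print(np.allclose(R_est, R_true, atol=1e-5), np.allclose(t_est, t_true, atol=1e-5))
```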
3. The target tracking method for complex waters based on AIS and active cameras according to claim 1, characterized in that step 2) is specifically: the cell region formed by the cell units is calibrated in advance using an AIS device; when the water surface area occupied by the target object is smaller than the area of one cell unit, a target object equipped with an AIS device is run into each cell unit, and for each cell unit the data obtained by repeated measurements, taken while the target object is in the different cell units, are stored by category; the storage method is: a table is created for each cell unit, so that the number of tables equals the number of cell units, and the data acquired in different cells are stored in the corresponding cell tables; from the data in each table, the maximum position range that each cell can contain is determined, the range including longitude and latitude range values; after the target object enters a relevant cell unit, the cell unit where the target object is located is determined from the position information returned by the AIS; when the water surface area occupied by the target object is larger than one cell unit, the number of cell units occupied by the target object and all the cell units where the target object is located are determined from the length, width, and position of the target object.
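The per-cell storage and lookup described in this claim can be sketched as follows: repeated AIS fixes taken inside each cell are kept in that cell's table, each cell's maximum longitude/latitude range is derived from its table, and a new fix is classified against those ranges. The fixes and cell numbering below are invented examples.

```python
# Sketch of the claim-3 calibration tables (one table per cell unit).

def cell_ranges(tables):
    """tables: {cell_id: [(lon, lat), ...]} -> {cell_id: (lon_min, lon_max, lat_min, lat_max)}"""
    ranges = {}
    for cid, fixes in tables.items():
        lons = [f[0] for f in fixes]
        lats = [f[1] for f in fixes]
        ranges[cid] = (min(lons), max(lons), min(lats), max(lats))
    return ranges

def classify(ranges, lon, lat):
    """Return the cell whose calibrated range contains the AIS position."""
    for cid, (lo, hi, la, lb) in ranges.items():
        if lo <= lon <= hi and la <= lat <= lb:
            return cid
    return None

tables = {0: [(114.00, 30.00), (114.09, 30.05)],   # fixes recorded in cell 0
          1: [(114.10, 30.01), (114.19, 30.04)]}   # fixes recorded in cell 1
r = cell_ranges(tables)
print(classify(r, 114.15, 30.03))  # → 1
```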
4. The target tracking method for complex waters based on AIS and active cameras according to claim 1, characterized in that step 2) is specifically: the cell units are calibrated using an AIS device, the longitude and latitude values of each cell unit are determined, and each cell unit is enclosed by the four edges formed by connecting four points; during online ship observation, the longitude and latitude values of the corner points of each cell unit are determined, and from the four corner points of each cell unit, using the property of the polygon function, it can be determined whether any point of the target object lies within the cell unit;
when a ship approaches the observation area, the longitude, latitude, course, and speed of the ship are received via AIS, a running-trajectory model of the ship is established, and a set of route points is obtained; each point in the route point set is checked with the polygon function, whereby the order and times of the cell units that the ship passes through can be obtained.
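The "polygon function" relied on above is, in effect, a point-in-polygon test over a cell's four corner points. A ray-casting sketch follows; the corner coordinates are invented for illustration, and nothing here depends on the cell being rectangular.

```python
# Ray-casting point-in-polygon test over a cell's corner points.

def in_polygon(corners, lon, lat):
    """True if (lon, lat) lies inside the polygon given by corner points."""
    inside = False
    n = len(corners)
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge crosses the horizontal ray
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside   # each crossing toggles in/out
    return inside

cell = [(114.0, 30.0), (114.1, 30.0), (114.1, 30.1), (114.0, 30.1)]
print(in_polygon(cell, 114.05, 30.05))  # → True
print(in_polygon(cell, 114.25, 30.05))  # → False
```

Running every point of a route point set through this test, in order, yields the sequence of cells the ship traverses, as the claim describes.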
5. The target tracking method for complex waters based on AIS and active cameras according to claim 1, characterized in that step 3) is specifically:
according to the cell region where the target is located, the intrinsic and extrinsic parameter values of the cameras computed and stored in advance in step 1), together with the attitude and angle values of the pan-tilt heads, are looked up and input into the corresponding active cameras, so that the several active imaging units sharing the same cell region as the target rapidly move to aim at the cell region where the target is located and observe the target.
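A minimal sketch of this lookup step, assuming the step-1) calibration pass stored one pan/tilt/zoom record per (camera, cell unit) pair; the store contents and the `aim_at_cell` helper are invented for illustration.

```python
# Hypothetical calibration store filled during the step-1) calibration pass.
calib_store = {
    ("cam1", 0): {"pan": 10.0, "tilt": -5.0, "zoom": 2.0},
    ("cam1", 1): {"pan": 25.0, "tilt": -4.0, "zoom": 2.5},
}

def aim_at_cell(camera, cell_id, store):
    """Look up the stored pose and return the PTZ command aiming `camera` at `cell_id`."""
    pose = store[(camera, cell_id)]
    return f"PTZ {camera}: pan={pose['pan']} tilt={pose['tilt']} zoom={pose['zoom']}"

print(aim_at_cell("cam1", 1, calib_store))  # → PTZ cam1: pan=25.0 tilt=-4.0 zoom=2.5
```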
6. The target tracking method for complex waters based on AIS and active cameras according to claim 1, characterized in that step 4) is specifically:
from the AIS data calibrated in advance, including the position, direction, size, and speed of the target object, and from the real-time speed and bearing data of the target object returned by the AIS, the time at which the target object will enter the next cell unit can be derived; before the target enters the next cell unit, the calibrated active camera is aimed at the corresponding cell unit, so that after the target enters the cell unit a clear video image of the target can be captured; the above steps are repeated continually, thereby completing the target tracking task.
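The entry-time prediction in this claim can be sketched under a simple flat-earth, constant-velocity assumption: with the AIS speed and bearing, the time until the target crosses the next cell boundary follows directly. The boundary placement and the numbers below are invented.

```python
import math

def time_to_boundary(pos, speed_mps, bearing_deg, boundary_x):
    """Seconds until the eastward component of motion reaches x = boundary_x."""
    vx = speed_mps * math.sin(math.radians(bearing_deg))  # east component of velocity
    if vx <= 0:
        return None                 # not heading toward this boundary
    return (boundary_x - pos[0]) / vx

# Ship 200 m west of the next cell boundary, moving due east (bearing 90°) at 5 m/s.
print(time_to_boundary((0.0, 0.0), 5.0, 90.0, 200.0))  # → 40.0
```

Aiming the pre-calibrated camera this many seconds ahead of the crossing is what lets the camera be in position before the target arrives.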
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710632153.1A CN107452038A (en) | 2017-07-28 | 2017-07-28 | Complex water areas method for tracking target based on AIS and active video camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107452038A true CN107452038A (en) | 2017-12-08 |
Family
ID=60489730
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107452038A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108109179A (en) * | 2017-12-29 | 2018-06-01 | 天津科技大学 | Video camera attitude updating method based on pinhole camera modeling |
CN108366227A (en) * | 2018-01-30 | 2018-08-03 | 上海海事大学 | The application platform of unmanned plane in a kind of maritime affairs intelligence cruise |
CN109255820A (en) * | 2018-08-31 | 2019-01-22 | 武汉理工大学 | A kind of actively perceive apparatus and method based on unmanned boat |
CN112750104A (en) * | 2020-12-29 | 2021-05-04 | 广东鉴面智能科技有限公司 | Method and device for automatically matching optimal camera by monitoring ship through multiple cameras |
CN112857360A (en) * | 2021-03-22 | 2021-05-28 | 哈尔滨工程大学 | Ship navigation multi-information fusion method |
CN116309851A (en) * | 2023-05-19 | 2023-06-23 | 安徽云森物联网科技有限公司 | Position and orientation calibration method for intelligent park monitoring camera |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104184990A (en) * | 2014-06-03 | 2014-12-03 | 南通航运职业技术学院 | Navigation radar or AIS tracking parameter booted intelligent video monitoring system |
CN104809917A (en) * | 2015-03-23 | 2015-07-29 | 南通大学 | Ship real-time tracking monitoring method |
CN105353772A (en) * | 2015-11-16 | 2016-02-24 | 中国航天时代电子公司 | Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking |
CN105430326A (en) * | 2015-11-03 | 2016-03-23 | 中国电子科技集团公司第二十八研究所 | Smooth CCTV (Closed Circuit Television System) ship video tracking method |
CN106846284A (en) * | 2016-12-28 | 2017-06-13 | 武汉理工大学 | Active-mode intelligent sensing device and method based on cell |
Non-Patent Citations (3)
Title |
---|
YU, CHEN: "Research on Maritime CCTV Control Systems" (《海事CCTV控制系统的研究》), China Master's Theses Full-text Database, Information Science and Technology Series * |
ZHAO, HANG et al.: "A CCTV-based Ship Tracking and Monitoring Method in VTS Systems" (《一种VTS系统中CCTV对船舶跟踪监控方法》), Radar & ECM (《雷达与对抗》) * |
JIN, ZHI: "Research on an AIS-based Visual Ship Tracking Control System for Controlled River Reaches" (《基于AIS的控制河段船舶视觉跟踪控制系统研究》), China Master's Theses Full-text Database, Information Science and Technology Series * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20171208 |