CN114898593B - Track acquisition method, track acquisition system and server - Google Patents

Track acquisition method, track acquisition system and server

Info

Publication number
CN114898593B
CN114898593B (application CN202210374001.7A)
Authority
CN
China
Prior art keywords
perception information
information
perception
target
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210374001.7A
Other languages
Chinese (zh)
Other versions
CN114898593A (en)
Inventor
黄旭艳
张云飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Yunzhou Intelligence Technology Ltd
Original Assignee
Zhuhai Yunzhou Intelligence Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Yunzhou Intelligence Technology Ltd filed Critical Zhuhai Yunzhou Intelligence Technology Ltd
Priority to CN202210374001.7A priority Critical patent/CN114898593B/en
Publication of CN114898593A publication Critical patent/CN114898593A/en
Application granted granted Critical
Publication of CN114898593B publication Critical patent/CN114898593B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 3/00 Traffic control systems for marine craft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/42 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for mass transport vehicles, e.g. buses, trains or aircraft
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Ocean & Marine Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a track acquisition method, a track acquisition device, a server and a computer storage medium. The method comprises the following steps: acquiring at least one first perception information stream of a boat end and at least one second perception information stream of a shore end, wherein the first perception information stream comprises first attribute information of a water target perceived by the boat end, and the second perception information stream comprises second attribute information of a water target perceived by the shore end; determining at least one perception information group based on the two kinds of perception information streams, wherein each perception information group comprises one first perception information stream and one second perception information stream, and the first and second perception information streams in the same group correspond to the same water target; and, for each perception information group, fusing the first perception information stream and the second perception information stream in the group to obtain track information of the water target corresponding to the group. The method can improve the accuracy of fitting the track of a water target.

Description

Track acquisition method, track acquisition system and server
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a track acquisition method, a track acquisition system, a server, and a computer readable storage medium.
Background
At present, water targets are generally tracked by shore-based detection, that is, by shore-based radar stations, electro-optical monitoring equipment and the like. However, such equipment is limited by occlusion from the terrain of the water area, so blind areas exist during tracking, whole-course tracking of a water target cannot be achieved, and the accuracy of track fitting is low.
Disclosure of Invention
The application provides a track acquisition method, a track acquisition system, a server and a computer readable storage medium, which can improve the accuracy of fitting the track of a water target.
In a first aspect, the present application provides a track acquisition method, applied to a server, where the track acquisition method includes:
acquiring at least one first perception information stream of a boat end and at least one second perception information stream of a shore end, wherein the first perception information stream comprises first attribute information of a water target perceived by the boat end, and the second perception information stream comprises second attribute information of the water target perceived by the shore end;
determining at least one perception information group from the at least one first perception information stream and the at least one second perception information stream, wherein each perception information group comprises one first perception information stream and one second perception information stream, and the first and second perception information streams in the same perception information group correspond to the same water target;
and fusing, for each perception information group, the first perception information stream and the second perception information stream in the group to obtain track information of the water target corresponding to the group.
In a second aspect, the present application provides a server comprising:
the acquisition module is used for acquiring at least one first perception information stream of the boat end and at least one second perception information stream of the shore end, wherein the first perception information stream comprises first attribute information of the water target perceived by the boat end, and the second perception information stream comprises second attribute information of the water target perceived by the shore end;
the determining module is used for determining at least one perception information group from the at least one first perception information stream and the at least one second perception information stream, wherein each perception information group comprises one first perception information stream and one second perception information stream, and the water targets corresponding to the first perception information stream and the second perception information stream in the same perception information group are the same;
and the fusion module is used for fusing, for each perception information group, the first perception information stream and the second perception information stream in the group to obtain track information of the water target corresponding to the group.
In a third aspect, the present application provides a track acquisition system, including a shore end, a boat end, and a server, where the boat end and the shore end are respectively connected in communication with the server, and the server includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, performs the steps of the method of the first aspect described above.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by one or more processors, implements the steps of the method of the first aspect described above.
Compared with the prior art, the beneficial effects of the present application are as follows: at least one first perception information stream of a boat end and at least one second perception information stream of a shore end are first acquired, and at least one perception information group is then obtained by matching the two kinds of perception information streams, where each perception information group comprises one first perception information stream and one second perception information stream, and the two streams in the same group are directed at the same water target; the two perception information streams in the same perception information group are fused to obtain the track information of the water target corresponding to that group. In this method, the waterborne manoeuvring perception capability of the boat is combined with shore-end detection so that water targets are tracked jointly, which can effectively reduce the influence of shore-end blind areas on the tracking result and yields accurate and comprehensive track information, thereby improving the accuracy of track fitting.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the following briefly introduces the drawings needed in the description of the embodiments or the prior art. It is obvious that the drawings described below are only some embodiments of the present application, and other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic diagram of a track acquisition system according to an embodiment of the present application;
Fig. 2 is a schematic flow chart of a track acquisition method provided in an embodiment of the present application;
Fig. 3 is a schematic diagram of an electronic chart obtained by fusing the two kinds of perception information according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a virtual device of a server according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a physical device of a server according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions proposed in the present application, the following description is made by specific embodiments.
A track acquisition system provided in an embodiment of the present application is described below. Referring to fig. 1, fig. 1 shows an architecture schematic of a track acquisition system. The track acquisition system comprises three kinds of devices: the first is a shore base; the second is a boat, such as an unmanned boat; the third is a server. It can be appreciated that the track acquisition system may comprise a plurality of shore bases and a plurality of boats, where one shore base may be combined with at least one boat to track a water target within the detection range and acquire track information of the water target. Accordingly, a plurality of shore bases may be combined with a plurality of boats to form a plurality of detection ranges; for each combination, the process of tracking and detecting the water target is similar. For convenience of subsequent description, the following embodiments use one shore base combined with at least one boat as an example to describe the track acquisition method of the present application.
Both the shore base and the boat integrate a wireless communication device provided with a signal transceiving antenna. Thus, both the shore base and the boat can establish a wireless communication connection with the server through their integrated wireless communication devices. Optionally, the shore base and the boat may also establish a wireless communication connection with each other through their integrated wireless communication devices. Of course, where no wireless communication connection is established between the shore base and the boat, the two may communicate using the server as a bridge.
In the track acquisition process, the shore base and the boat each detect the water targets within their signal coverage. The data obtained at the two ends after detection can be processed locally to generate the corresponding perception information streams, which are then uploaded to the server; alternatively, the raw data can be uploaded to the server directly, and the server processes the data to generate the two kinds of perception information streams. In practice, the computing capacity of the two ends can be taken into account: for a boat end with weaker computing capacity, the obtained data can be uploaded directly and processed by the server to generate the first perception information stream of the boat end, while the data obtained at the shore end can be processed by the shore end itself to generate the second perception information stream, which is then uploaded to the server.
Optionally, the boat may integrate a radar detector, a camera, and an automatic identification system (Automatic Identification System, AIS). The boat can observe the water targets within its signal coverage through its integrated equipment or systems to obtain observation data. The boat or the server may then generate at least one first perception information stream based on the observation data. Specifically, the observation data include radar data, video data, AIS data, and the like, which are not limited herein.
Optionally, the shore base may integrate a radar detector, a camera, an AIS, and a positioning system, which may be, by way of example only, the Global Positioning System or the BeiDou satellite navigation system, without limitation. The shore base can detect the water targets within its signal coverage through its integrated equipment or systems to obtain detection data. The shore base or the server may then generate at least one second perception information stream based on the detection data. Specifically, the detection data include radar data, video data, AIS data, positioning data, and the like, which are not limited herein.
It will be appreciated that the shore base and the boat may each establish wired communication with their respective integrated physical devices. Taking the shore base and its integrated physical device (denoted the first physical device to distinguish it from the physical devices integrated on the boat) as an example, the shore base and the first physical device are connected to a preset control bus; that is, a communication connection based on the control bus is established between the shore base and the first physical device. By way of example only, the control bus may be a controller area network (Controller Area Network, CAN) bus or another type of control bus, which is not limited herein.
Based on the above track acquisition system, a track acquisition method provided in the embodiments of the present application is described below. Referring to fig. 2, the track acquisition method is applied to the server, and the following mainly describes the fusion process of the two kinds of perception information streams of the shore end and the boat end. The track acquisition method comprises the following steps:
step 201, at least one first sensing information flow of the ship end and at least one second sensing information flow of the shore end are obtained.
A perception information stream includes attribute information of a water target, where the attribute information may include perceived time, longitude, latitude, speed, heading, size, appearance, trajectory, identity, and the like. One perception information stream can be considered to correspond to one water target; accordingly, if a water target is continuously perceived, its perception information stream is continuously updated. Within the signal coverage of the boat end, assuming it can perceive at least one water target, at least one perception information stream can be obtained, which is denoted the first perception information stream to distinguish it from that of the shore end; similarly, within the signal coverage of the shore end, assuming it can perceive at least one water target, at least one perception information stream can also be obtained, which is denoted the second perception information stream.
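For illustration only, the following Python sketch shows one possible in-memory representation of a perception information stream built from the attribute fields listed above; the class names, field names and units are assumptions introduced here and are not taken from the patent.

```python
# Illustrative record types for a perception information stream; one stream per water target.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PerceptionSample:
    timestamp: float                    # perceived time, seconds since epoch (assumed unit)
    longitude: float                    # degrees
    latitude: float                     # degrees
    speed: float                        # assumed knots
    heading: float                      # degrees, 0 = true north
    size: Optional[float] = None        # e.g. estimated length in metres
    appearance: Optional[str] = None    # e.g. a visual descriptor from video
    identity: Optional[str] = None      # e.g. an AIS MMSI, if available

@dataclass
class PerceptionStream:
    source: str                                   # "boat" or "shore"
    target_id: Optional[str] = None               # server-assigned number, if any
    samples: List[PerceptionSample] = field(default_factory=list)

    def update(self, sample: PerceptionSample) -> None:
        """A continuously perceived target keeps appending new samples."""
        self.samples.append(sample)
```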
Step 202, determining at least one perception information group from the at least one first perception information stream and the at least one second perception information stream.
After the two kinds of perception information streams are obtained, in order to accurately fit the tracks of the water targets that can be perceived by both the boat end and the shore end, at least one perception information group can be obtained based on the two kinds of perception information streams. Each perception information group includes one first perception information stream and one second perception information stream, and the two streams in the group are directed at the same water target. That is, only a water target perceived by both the boat end and the shore end yields a corresponding perception information group, and based on the perception information group, the comprehensiveness and accuracy of the track information of that water target can be ensured.
It should be understood that, here, the matching of the two kinds of perception information streams is described under an ideal communication state in which there is no delay in acquiring the two streams; that is, the first perception information stream and the second perception information stream are time-synchronized during acquisition. For example, at time t+1, at least one first perception information stream for time t and at least one second perception information stream for time t may be acquired simultaneously. With this time-synchronization constraint, the two kinds of perception information streams in the same perception information group can be guaranteed to accurately point to the same water target, which improves the accuracy of the obtained track.
In some embodiments, in a practical application scenario, delay in wireless communication is unavoidable. Therefore, after the two kinds of perception information streams are obtained, they can be time-synchronized, so that the two streams in the same perception information group can still be guaranteed to be accurately directed at the same water target. For example, if at time t+1 the server has received the first perception information stream covering times 0 to t-1 and the second perception information stream covering times 0 to t, the two streams may be aligned time by time, and the perception information groups are then determined based on the aligned streams.
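The time alignment described above can be sketched as follows, assuming sample records shaped like the PerceptionSample objects from the earlier sketch; the nearest-timestamp rule and the one-second tolerance are illustrative assumptions, since the text only requires that the two streams be aligned in time.

```python
# Pair each boat-end sample with the closest-in-time shore-end sample, within a tolerance.
from bisect import bisect_left

def align_streams(boat_samples, shore_samples, max_gap_s=1.0):
    """boat_samples / shore_samples: lists of samples sorted by timestamp."""
    shore_times = [s.timestamp for s in shore_samples]
    pairs = []
    for b in boat_samples:
        i = bisect_left(shore_times, b.timestamp)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(shore_samples)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(shore_times[k] - b.timestamp))
        if abs(shore_times[j] - b.timestamp) <= max_gap_s:
            pairs.append((b, shore_samples[j]))   # time-aligned pair of samples
    return pairs
```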
Since the signal coverage of the boat end is generally smaller than that of the shore end, there is an intersection between the water targets perceived by the boat end and those perceived by the shore end. Assuming that the boat end can perceive n water targets, the shore end can perceive m water targets, and i water targets lie in the intersection, the server can match i perception information groups from the two kinds of information streams.
In the ideal case where the signal coverage of the shore end completely covers that of the boat end, it can be considered that n ≤ m and i = n. That is, after n first perception information streams and m second perception information streams are acquired, n perception information groups can be matched from the two kinds of streams.
However, in a practical application scenario the shore end perceives statically while the boat end perceives dynamically, so the signal coverage of the boat end may exceed that of the shore end. In this case, the quantitative relationship between the numbers of water targets the two ends can perceive is not fixed: some water targets perceivable by the boat end cannot be perceived by the shore end, and conversely some water targets perceivable by the shore end cannot be perceived by the boat end. That is, after n first perception information streams and m second perception information streams are acquired, i perception information groups may be obtained by matching the two kinds of streams, where n and m have no fixed quantitative relationship, but i is necessarily no greater than n and no greater than m.
Step 203, for each perception information group, fusing the first perception information stream and the second perception information stream in the group to obtain track information of the water target corresponding to the group.
In order to reduce the influence of shore-end blind areas on the tracking of water targets, the perception information streams of the shore end and the boat end can be matched and fused. Through the mutual matching and fusion of the two kinds of perception information streams, the static global perception capability of the shore end and the dynamic local perception capability of the boat end complement each other, which effectively overcomes the limitation that shore-end detection equipment is occluded by terrain and water areas and improves the accuracy of fitting the track of a water target. Specifically, during information fusion, only the two perception information streams corresponding to the same water target are fused. That is, the fusion operation is performed on the two perception information streams within one perception information group, which are directed at the same water target, so as to obtain the track information of the water target corresponding to that group.
In the embodiment of the present application, at least one first perception information stream of the boat end and at least one second perception information stream of the shore end are first acquired, and at least one perception information group is then obtained by matching the two kinds of streams, where each perception information group comprises one first perception information stream and one second perception information stream, and the two streams in the same group point to the same water target; the two perception information streams in the same group are fused to obtain the track information of the water target corresponding to that group. In this method, the waterborne manoeuvring perception capability of the boat is combined with shore-end detection to track water targets jointly, which can effectively reduce the influence of shore-end blind areas on the tracking result and yields more accurate and comprehensive track information, thereby improving the accuracy of track fitting.
For example only, referring to fig. 3, fig. 3 shows a schematic diagram of an electronic chart obtained by fusing the two kinds of perception information. In the figure, the shore end can perceive three water targets a, b and c, while the boat end can perceive c. The track information of c can be obtained by matching and fusing its first perception information stream and second perception information stream, so that the track of c is obtained by fitting. It will be appreciated that when a water target is closer to the shore end, the confidence of the second perception information stream is higher, and the track information of such a target, for example a in the figure, may be determined from the shore-end perception information alone. The track of c can be considered more accurate and comprehensive than the track of a.
In some embodiments, before the step 201, the method further includes:
Step A1, obtaining observation data of the boat end, and generating at least one first perception information stream based on the observation data.
Step A2, acquiring detection data of the shore end, and generating at least one second perception information stream based on the detection data.
The first perception information stream can be generated by the boat end or by the server from the observation data of the boat end; the second perception information stream can be generated by the shore end or by the server from the detection data of the shore end. It should be understood that generating a perception information stream can be understood as a process of recognizing the sensing data obtained from the various sensors.
In some embodiments, in order to obtain a more accurate first perception information stream, the step A1 specifically includes:
Step A11, coupling the observation data onto longitude and latitude coordinates with the boat end as the reference object, to obtain at least one first perception information stream.
When the boat end observes a water target within its signal coverage, it perceives the first attribute information of the water target with itself as the reference object. Taking the position information of a water target a as an example, the boat end can range the water target a through a distance sensor to obtain a distance value, which is observation data; similarly, the boat end can observe the azimuth of the water target a through an azimuth sensor to obtain an azimuth angle, which is also observation data. After the distance value and the azimuth angle are obtained, they can be coupled onto longitude and latitude coordinates by combining the boat end's positioning information about itself, so as to obtain the position information of the water target a expressed in longitude and latitude coordinates, that is, part of the information in the first perception information stream corresponding to the water target a.
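A minimal sketch of this coupling step, under the assumption of a simple flat-earth (equirectangular) approximation around the boat's own positioning fix; the patent does not prescribe a particular projection, so the constant and the function below are illustrative only and are adequate only at short ranges.

```python
# Convert a measured range and bearing, plus the boat's own lat/lon, into the target's lat/lon.
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, assumed constant

def couple_to_lat_lon(boat_lat_deg, boat_lon_deg, range_m, azimuth_deg):
    """azimuth_deg is a bearing measured clockwise from true north."""
    az = math.radians(azimuth_deg)
    d_north = range_m * math.cos(az)
    d_east = range_m * math.sin(az)
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(boat_lat_deg))))
    return boat_lat_deg + dlat, boat_lon_deg + dlon

# Example (hypothetical values): a target 1 500 m away on bearing 045 degrees
target_lat, target_lon = couple_to_lat_lon(22.27, 113.57, 1_500.0, 45.0)
```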
In some embodiments, in order to obtain a more accurate second perception information stream, the step A2 specifically includes:
Step A21, dividing the detection data into first detection data and second detection data based on data type.
Step A22, processing the first detection data based on a feature fusion method to obtain at least one local first perception information stream.
Step A23, processing the second detection data based on an identification fusion method to obtain at least one local second perception information stream.
For different data types, different methods can be adopted to improve the accuracy of the perception result. First, the detection data are divided based on data type into two classes, denoted the first detection data and the second detection data. The first detection data can be processed with a feature fusion method to obtain at least one local first perception information stream; the second detection data can be processed with an identification fusion method to obtain at least one local second perception information stream.
In some embodiments, the step A22 specifically includes:
Step A221, performing feature extraction on the first detection data obtained from a plurality of sensors, respectively, to obtain a first feature vector corresponding to each sensor.
Step A222, fusing all the obtained first feature vectors to obtain a fused feature vector.
Step A223, identifying the fused feature vector to obtain at least one local first perception information stream.
To facilitate feature extraction, the first detection data may be divided into different data groups by sensor, and feature extraction may then be performed on the first detection data group by group. For example, with 3 sensors, the first detection data corresponding to each sensor can form one group, giving 3 groups of data; feature extraction is then performed group by group to obtain the first feature vector corresponding to each sensor, that is, 3 first feature vectors. Of course, if two of the three sensors are of the same type, for example both are radar sensors, then considering that the same type of sensor yields the same type of data, the first detection data can instead be divided into groups by sensor type before feature extraction. After the first feature vector corresponding to each sensor (or sensor type) is obtained, these feature vectors can be fused into a single vector, namely the fused feature vector. At least one local first perception information stream can then be obtained by identifying the fused feature vector.
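A minimal sketch of this feature-level fusion, assuming concatenation as the vector-fusion rule and treating the per-sensor feature extractors and the recognizer as placeholder callables; none of these choices are mandated by the patent.

```python
# Feature-level (early) fusion: extract per-sensor features, fuse into one vector, recognize once.
import numpy as np

def feature_level_fusion(detections_by_sensor, extractors, recognizer):
    """detections_by_sensor: dict sensor_name -> raw first-detection data.
    extractors: dict sensor_name -> callable(raw) -> 1-D numpy feature vector.
    recognizer: callable(fused_vector) -> recognized target description."""
    first_vectors = [extractors[name](raw) for name, raw in detections_by_sensor.items()]
    fused_vector = np.concatenate(first_vectors)   # single fused feature vector
    return recognizer(fused_vector)                # identified result feeds the local first stream
```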
In some embodiments, the step A23 specifically includes:
Step A231, performing feature extraction on the second detection data obtained from the plurality of sensors, respectively, to obtain a second feature vector corresponding to each sensor.
Step A232, identifying each second feature vector, respectively, to obtain an identification result corresponding to each sensor.
Step A233, fusing all the obtained identification results to obtain at least one local second perception information stream.
The feature extraction process for the second detection data is similar to that for the first detection data, namely: the second detection data are divided into different data groups by sensor (or sensor type), and feature extraction is then performed group by group. However, the subsequent identification process differs from that for the first detection data: after the second feature vector corresponding to each sensor is obtained, each second feature vector is first identified to obtain an identification result corresponding to each sensor, and the identification results are then fused to obtain at least one local second perception information stream. It can be understood that if the second detection data are divided by sensor type, then in the identification process the second feature vector corresponding to each sensor type is identified to obtain the corresponding identification result, and all the obtained identification results are then fused to obtain at least one local second perception information stream.
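For comparison, a sketch of the identification-level (late) fusion: each sensor's feature vector is recognized separately and the per-sensor results are then combined. Majority voting is used here purely as an illustrative combination rule; the patent does not specify one.

```python
# Identification-level fusion: recognize per sensor, then fuse the recognition results.
from collections import Counter

def identification_level_fusion(detections_by_sensor, extractors, recognizers):
    """recognizers: dict sensor_name -> callable(feature_vector) -> class label."""
    results = []
    for name, raw in detections_by_sensor.items():
        second_vector = extractors[name](raw)              # per-sensor feature extraction
        results.append(recognizers[name](second_vector))   # per-sensor identification result
    label, _ = Counter(results).most_common(1)[0]          # fuse results (assumed: majority vote)
    return label                                           # feeds the local second stream
```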
In some embodiments, to ensure that the first and second perception information streams within the same perception information group correspond to the same water target, the perception information group may be determined based on identification information of the water target. The identification information may be any one of, or a combination of two or more of, longitude, latitude, speed, heading, size, appearance, trajectory and identity. For example, longitude and latitude are used as the identification information when determining a perception information group. For two time-synchronized perception information streams, suppose that at time t the first perception information stream N1 includes longitude 1 and latitude 1, and the second perception information stream M1 also includes longitude 1 and latitude 1; then N1 and M1 may be determined as one perception information group. In a practical application scenario, however, there are differences between the data perceived by the shore end and the boat end for the same water target; that is, M1 may actually contain longitude 2 and latitude 2 rather than longitude 1 and latitude 1. Consequently, if a perception information group were determined only when the identification information of the first and second perception information streams at the same time is completely consistent, it is very likely that no perception information group could be determined. Therefore, when determining the perception information groups, a similarity threshold may be set, and when the similarity between identification information 1 in a first perception information stream and identification information 2 in a second perception information stream is greater than the similarity threshold, the two streams may be determined as one perception information group. Taking the above scenario as an example: assume the similarity threshold is w, compute the similarity S1 between longitude 1 and longitude 2 and the similarity S2 between latitude 1 and latitude 2, and compare each with w; when S1 > w and S2 > w, N1 and M1 can be determined as one perception information group. A perception information group determined in this way ensures that the two perception information streams it contains are directed at the same water target, which improves the accuracy of track acquisition.
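A sketch of this grouping rule, assuming stream records like those in the earlier sketch and a position-based similarity measure; the particular similarity function, its scale, and the default threshold are assumptions, since the text only requires that the similarity exceed the threshold w.

```python
# Match boat-end and shore-end streams into perception information groups by similarity.
def position_similarity(a, b, scale_deg=0.01):
    """Similarity in [0, 1] based on latest lat/lon; scale_deg is an assumed normalisation."""
    err = max(abs(a.longitude - b.longitude), abs(a.latitude - b.latitude))
    return max(0.0, 1.0 - err / scale_deg)

def match_perception_groups(first_streams, second_streams, w=0.8):
    groups, used = [], set()
    for n in first_streams:                        # boat-end streams
        best_j, best_sim = None, w                 # require similarity strictly greater than w
        for j, m in enumerate(second_streams):     # shore-end streams
            if j in used:
                continue
            sim = position_similarity(n.samples[-1], m.samples[-1])
            if sim > best_sim:
                best_j, best_sim = j, sim
        if best_j is not None:
            used.add(best_j)
            groups.append((n, second_streams[best_j]))   # one perception information group
    return groups
```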
In some embodiments, to achieve continuous tracking of a water target, after the step 203, the method further includes:
if a target perception information group exists, sending a tracking instruction to the boat end so as to control the boat end to track the water target corresponding to the target perception information group, where the target perception information group is a perception information group whose second perception information stream has not been updated within a preset time period.
For a water target whose track information has been acquired, the shore end may for some reason fail to continue detecting it within a preset time period. For example, when the water target leaves the detection range of the shore end, or travels into a blind area of the shore end, the shore end cannot detect the target, so the second perception information stream in the corresponding perception information group is not updated for a period of time. Such a water target may be determined as a water target lost by the shore end, and its corresponding perception information group is the target perception information group. It will be appreciated that the server can determine which water target has been lost by identifying the target perception information group. When this occurs, the server can send a tracking instruction to the boat end based on the target perception information group, so as to control the boat end to keep tracking the corresponding water target through perception and recognition. While the boat end continuously tracks the water target lost by the shore end, the first perception information stream of that target can be transmitted back to the server in real time, thereby ensuring the integrity of the track of the water target lost by the shore end.
In some embodiments, the water targets for which track information has been obtained may be numbered, to facilitate further tracking and track fitting by the server. With the numbering, even if a certain water target is lost by the shore end, the server can inform the boat end of the number of the lost water target, so that the server can keep tracking that target through the boat end and combine the data returned by the boat end with the historical data corresponding to the number, so as to obtain comprehensive and accurate track information of the water target.
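The hand-over logic of the two preceding paragraphs can be sketched as follows; the 30-second preset period, the message format and the send_to_boat callback are assumptions for illustration, the only requirements taken from the text being that a stale second stream marks a target perception information group and that the tracking instruction carries the lost target's number.

```python
# If the shore-end (second) stream of a group is stale, instruct the boat end to keep tracking.
import time

LOST_PERIOD_S = 30.0   # preset time period (assumed value)

def check_and_dispatch(groups, send_to_boat, now=None):
    """groups: list of (first_stream, second_stream) pairs; send_to_boat: hypothetical sender."""
    now = time.time() if now is None else now
    for first_stream, second_stream in groups:
        last_shore_update = second_stream.samples[-1].timestamp
        if now - last_shore_update > LOST_PERIOD_S:
            # Target perception information group: the shore end has lost this water target.
            send_to_boat({
                "type": "tracking_instruction",
                "target_number": first_stream.target_id,   # server-assigned number of the target
            })
```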
In some embodiments, to improve the accuracy of track information acquisition, the first and second perception information streams in each perception information group may be fused with weighting. In the fusion process, the respective weights of the two perception information streams can be determined first, and the weighted fusion is then performed according to those weights. In particular, the weight of the second perception information stream may be determined based on the distance of the water target from the shore end.
A typical shore end can detect water targets within 10 to 20 nautical miles. Assuming the detectable range of the shore end is 20 nautical miles offshore, it can be divided into four intervals of 5 nautical miles each, namely [0, 5), [5, 10), [10, 15) and [15, 20]. When the distance between the water target and the shore end lies in [0, 5) nautical miles, the weight of the second perception information stream may lie in (0.75, 1]; when the distance lies in [5, 10) nautical miles, the weight may lie in (0.5, 0.75]; when the distance lies in [10, 15) nautical miles, the weight may lie in (0.25, 0.5]; and when the distance lies in [15, 20] nautical miles, the weight may lie in [0, 0.25].
To refine the weight values, the weight range within each interval can be further subdivided according to the distance, so as to obtain a more precise weight value. It can be understood that the weight values of the first perception information stream and the second perception information stream sum to 1, so once the weight of the second perception information stream is determined, the weight of the first perception information stream follows, which enables the weighted fusion of the two perception information streams and yields more accurate track information.
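A sketch of this distance-based weighting, assuming a linear fall-off of the shore-end weight from 1 at the shore to 0 at 20 nautical miles, which is consistent with (but more specific than) the four intervals given above; the fusion of latitude and longitude shown is only one example of weighted fusion.

```python
# Distance-dependent weight for the shore-end (second) stream; boat-end weight is the complement.
def shore_weight(distance_nm, max_range_nm=20.0):
    if distance_nm <= 0.0:
        return 1.0
    if distance_nm >= max_range_nm:
        return 0.0
    return 1.0 - distance_nm / max_range_nm   # linear interpolation within the 0-20 nm range

def fuse_position(boat_sample, shore_sample, distance_nm):
    w_shore = shore_weight(distance_nm)
    w_boat = 1.0 - w_shore                    # the two weights sum to 1
    lat = w_boat * boat_sample.latitude + w_shore * shore_sample.latitude
    lon = w_boat * boat_sample.longitude + w_shore * shore_sample.longitude
    return lat, lon

w = shore_weight(7.0)   # a target 7 nautical miles offshore gives shore-end weight 0.65
```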
Corresponding to the track acquisition method described in the above embodiments, fig. 4 shows a block diagram of the virtual device structure of the server 4 provided in the embodiment of the present application, and for convenience of explanation, only the portion relevant to the embodiment of the present application is shown.
Referring to fig. 4, the server 4 includes:
the acquisition module 41 is configured to acquire at least one first perception information stream of the boat end and at least one second perception information stream of the shore end, where the first perception information stream includes first attribute information of the water target perceived by the boat end, and the second perception information stream includes second attribute information of the water target perceived by the shore end;
the determining module 42 is configured to determine at least one perception information group from the at least one first perception information stream and the at least one second perception information stream, where each perception information group includes one first perception information stream and one second perception information stream, and the first and second perception information streams in the same perception information group correspond to the same water target;
and the fusion module 43 is configured to fuse, for each perception information group, the first perception information stream and the second perception information stream in the group, so as to obtain track information of the water target corresponding to the group.
Optionally, the server 4 may further include:
the first generation module is used for acquiring the observation data of the boat end and generating at least one first perception information stream based on the observation data;
the second generation module is used for acquiring the detection data of the shore end and generating at least one second perception information stream based on the detection data.
Optionally, the first generation module is specifically configured to couple the observation data onto longitude and latitude coordinates with the boat end as the reference object, to obtain at least one first perception information stream.
Optionally, the second perception information stream includes a local first perception information stream and a local second perception information stream, and the second generation module may include:
a dividing unit, configured to divide the detection data into first detection data and second detection data based on data type;
a first processing unit, configured to process the first detection data based on a feature fusion method to obtain at least one local first perception information stream;
and a second processing unit, configured to process the second detection data based on an identification fusion method to obtain at least one local second perception information stream.
Optionally, the first processing unit may include:
a first extraction subunit, configured to perform feature extraction on the first detection data obtained from a plurality of sensors, respectively, to obtain a first feature vector corresponding to each sensor;
a first fusion subunit, configured to fuse all the obtained first feature vectors to obtain a fused feature vector;
and a first identification subunit, configured to identify the fused feature vector to obtain at least one local first perception information stream.
Optionally, the second processing unit may include:
a second extraction subunit, configured to perform feature extraction on the second detection data obtained from the plurality of sensors, respectively, to obtain a second feature vector corresponding to each sensor;
a second identification subunit, configured to identify each second feature vector to obtain an identification result corresponding to each sensor;
and a second fusion subunit, configured to fuse all the obtained identification results to obtain at least one local second perception information stream.
Optionally, the server 4 may further include:
and the tracking module is used for sending a tracking instruction to the boat end if a target perception information group exists, so as to control the boat end to track the water target corresponding to the target perception information group, where the target perception information group is a perception information group whose second perception information stream has not been updated within a preset time period.
It should be noted that, because the content such as the information interaction and the execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
Fig. 5 is a schematic entity structure of a server according to an embodiment of the present application. As shown in fig. 5, the server 5 of this embodiment includes: at least one processor 50 (only one shown in fig. 5), a memory 51, and a computer program 52 stored in the memory 51 and executable on the at least one processor 50, the processor 50 implementing steps in any of the track acquisition method embodiments described above, such as steps 201-203 shown in fig. 2, when executing the computer program 52.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), the processor 50 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 51 may in some embodiments be an internal storage unit of the server 5, such as a hard disk or a memory of the server 5. The memory 51 may also be an external storage device of the server 5 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like, which are provided on the server 5.
Further, the memory 51 may also include both an internal storage unit of the server 5 and an external storage device. The memory 51 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps that may implement the various method embodiments described above.
Embodiments of the present application provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to perform steps that may be performed in the various method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the server, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and in part, not described or illustrated in any particular embodiment, reference is made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. A track acquisition method applied to a server, the track acquisition method comprising:
acquiring at least two first perception information streams of a boat end and at least two second perception information streams of a shore end, wherein the first perception information streams comprise first attribute information of a water target perceived by the boat end, and the second perception information streams comprise second attribute information of the water target perceived by the shore end;
determining at least two perception information groups in the at least two first perception information streams and the at least two second perception information streams, wherein each perception information group comprises one first perception information stream and one second perception information stream, and the water targets corresponding to the first perception information stream and the second perception information stream in the same perception information group are the same;
fusing, for each perception information group, the first perception information stream and the second perception information stream in the perception information group to obtain track information of a water target corresponding to the perception information group;
wherein, after obtaining the track information of the water target corresponding to the perception information group, the method further comprises the following steps:
setting a number for the water target for which the track information has been obtained;
if a target perception information group exists, sending a tracking instruction to the boat end so as to control the boat end to track a water target corresponding to the target perception information group, wherein the target perception information group is a perception information group whose second perception information stream has not been updated within a preset time period, and the tracking instruction comprises the number of the water target corresponding to the target perception information group.
2. The track acquisition method according to claim 1, further comprising, before the acquiring at least two first perception information streams of the ship end and at least two second perception information streams of the shore end:
acquiring observation data of the ship end, and generating the at least two first perception information streams based on the observation data; and
acquiring detection data of the shore end, and generating the at least two second perception information streams based on the detection data.
3. The track acquisition method according to claim 2, wherein the generating the at least two first perception information streams based on the observation data comprises:
coupling the observation data based on longitude and latitude coordinates, with the ship end as a reference object, to obtain the at least two first perception information streams.
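As a rough illustration of referencing ship-end observations to longitude and latitude (claim 3), the following sketch converts a hypothetical range/bearing observation into absolute coordinates with a flat-earth small-offset approximation; the observation format and the approximation itself are assumptions, not taken from the patent:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres


def observation_to_latlon(ship_lat_deg, ship_lon_deg, range_m, bearing_deg):
    """Convert a ship-relative observation (range, bearing from true north)
    into absolute latitude/longitude using a small-offset approximation."""
    brg = math.radians(bearing_deg)
    north_m = range_m * math.cos(brg)
    east_m = range_m * math.sin(brg)
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(ship_lat_deg))))
    return ship_lat_deg + dlat, ship_lon_deg + dlon
```
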
4. The track acquisition method according to claim 2, wherein the second perception information streams include local first perception information streams and local second perception information streams, and the generating the at least two second perception information streams based on the detection data comprises:
dividing the detection data into first detection data and second detection data based on data type;
processing the first detection data based on a feature fusion method to obtain at least two local first perception information streams; and
processing the second detection data based on an identification fusion method to obtain at least two local second perception information streams.
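A toy dispatcher for the split described in claim 4 might look as follows; the "kind" field and the assumption that raw measurements go to feature-level fusion while already-classified reports go to identification-level fusion are illustrative guesses:

```python
def split_by_type(detection_data):
    """Split shore-end detection data by data type: hypothetically, raw sensor
    measurements become first detection data (feature fusion) and
    pre-classified reports become second detection data (identification fusion)."""
    first_detection_data, second_detection_data = [], []
    for item in detection_data:
        (first_detection_data if item.get("kind") == "raw"
         else second_detection_data).append(item)
    return first_detection_data, second_detection_data
```
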
5. The track acquisition method according to claim 4, wherein the processing the first detection data based on the feature fusion method to obtain at least two local first perception information streams comprises:
respectively extracting features from the first detection data obtained from a plurality of sensors to obtain a first feature vector corresponding to each sensor;
fusing all the obtained first feature vectors to obtain a fused feature vector; and
identifying the fused feature vector to obtain the at least two local first perception information streams.
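The feature-level pattern of claim 5 (fuse first, identify once) can be sketched as below; the statistics-based feature extractor, concatenation as the fusion step, and the injected classify callable are all assumptions made for illustration:

```python
import numpy as np


def extract_features(sensor_data: np.ndarray) -> np.ndarray:
    """Hypothetical per-sensor feature extractor: a few simple statistics."""
    return np.array([sensor_data.mean(), sensor_data.std(), sensor_data.max()])


def feature_level_fusion(per_sensor_data, classify):
    """Claim-5-style flow: one feature vector per sensor, fuse them (here by
    concatenation) into a single vector, then identify on the fused vector."""
    first_vectors = [extract_features(d) for d in per_sensor_data]  # one per sensor
    fused_vector = np.concatenate(first_vectors)                    # fused feature vector
    return classify(fused_vector)                                   # identification on fused features
```
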
6. The track acquisition method according to claim 4, wherein the processing the second detection data based on the identification fusion method to obtain at least two local second perception information streams comprises:
respectively extracting features from the second detection data obtained from the plurality of sensors to obtain a second feature vector corresponding to each sensor;
respectively identifying each second feature vector to obtain an identification result corresponding to each sensor; and
fusing all the obtained identification results to obtain the at least two local second perception information streams.
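By contrast, the identification-level pattern of claim 6 (identify per sensor, then fuse the decisions) might be sketched like this, with majority voting standing in for whatever fusion rule an actual implementation would use:

```python
from collections import Counter


def identification_level_fusion(per_sensor_data, extract_features, classify):
    """Claim-6-style flow: identify each sensor's feature vector separately,
    then fuse the identification results (here by simple majority vote)."""
    results = [classify(extract_features(d)) for d in per_sensor_data]
    label, _count = Counter(results).most_common(1)[0]
    return label
```
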
7. A server, comprising:
an acquisition module, configured to acquire at least two first perception information streams of a ship end and at least two second perception information streams of a shore end, wherein the first perception information streams comprise first attribute information of a water target perceived by the ship end, and the second perception information streams comprise second attribute information of the water target perceived by the shore end;
a determining module, configured to determine at least two perception information groups from the at least two first perception information streams and the at least two second perception information streams, wherein each perception information group comprises one first perception information stream and one second perception information stream, and the first perception information stream and the second perception information stream in a same perception information group correspond to a same water target;
a fusion module, configured to fuse, for each perception information group, the first perception information stream and the second perception information stream in the perception information group to obtain track information of the water target corresponding to the perception information group; and
a tracking module, configured to set a number for the water target for which the track information has been obtained, and, if a target perception information group exists, send a tracking instruction to the ship end to control the ship end to track the water target corresponding to the target perception information group, wherein the target perception information group is a perception information group whose second perception information stream has not been updated within a preset time period, and the tracking instruction comprises the number of the water target corresponding to the target perception information group.
8. A track acquisition system, comprising a shore end, a ship end and a server, the ship end and the shore end each being in communication with the server, the server comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the track acquisition method according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the track acquisition method according to any one of claims 1 to 6.
CN202210374001.7A 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server Active CN114898593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210374001.7A CN114898593B (en) 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210374001.7A CN114898593B (en) 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server

Publications (2)

Publication Number Publication Date
CN114898593A CN114898593A (en) 2022-08-12
CN114898593B true CN114898593B (en) 2024-02-02

Family

ID=82716476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210374001.7A Active CN114898593B (en) 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server

Country Status (1)

Country Link
CN (1) CN114898593B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110322114A (en) * 2019-05-23 2019-10-11 平安城市建设科技(深圳)有限公司 Cell recommended method, device, equipment and storage medium based on big data
CN111060877A (en) * 2019-12-25 2020-04-24 智慧航海(青岛)科技有限公司 Data processing method for shore-based radar
CN111157982A (en) * 2019-11-20 2020-05-15 智慧航海(青岛)科技有限公司 Intelligent ship and shore cooperative target tracking system and method based on shore-based radar
CN111738484A (en) * 2020-04-28 2020-10-02 腾讯科技(深圳)有限公司 Method and device for selecting addresses of bus stops and computer readable storage medium
CN113296509A (en) * 2021-05-21 2021-08-24 上海海事大学 Autonomous trajectory tracking fusion control method for unmanned surface vessel
CN113359123A (en) * 2021-06-04 2021-09-07 华能国际电力江苏能源开发有限公司 System and method for protecting offshore wind farm submarine cable based on AIS and radar information fusion

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7182869B2 (en) * 2017-12-28 2022-12-05 古野電気株式会社 Target detection device

Also Published As

Publication number Publication date
CN114898593A (en) 2022-08-12

Similar Documents

Publication Publication Date Title
CN109343050B (en) Radar video monitoring method and device
EP2138956B1 (en) Adaptive match metric selection for automatic target recognition
KR101334804B1 (en) Integration method of satellite information and ship information for integrated ship monitoring
US20150241560A1 (en) Apparatus and method for providing traffic control service
CN105698800B (en) A kind of modified navigation navigation methods and systems
EP3555665B1 (en) Detection and elimination of gnss spoofing signals with pvt solution estimation
CN112911249B (en) Target object tracking method and device, storage medium and electronic device
CN110082790A (en) A kind of satellite the deception recognition methods, apparatus and system of facing moving terminal
CA2785384C (en) Method for classifying objects in an imaging surveillance system
KR102399539B1 (en) Method and apparatus for identifying an object
CN115598669A (en) Navigation multi-feature GNSS deception jamming detection method, system, equipment and medium
CN114898593B (en) Track acquisition method, track acquisition system and server
US11585943B2 (en) Detection and elimination of GNSS spoofing signals with PVT solution estimation
CN114556449A (en) Obstacle detection and re-identification method and device, movable platform and storage medium
WO2023275544A1 (en) Methods and systems for detecting vessels
Hadzagic et al. Hard and soft data fusion for maritime traffic monitoring using the integrated Ornstein-Uhlenbeck process
KR102558387B1 (en) System and method for providing motion information and size information of ship based on real-time radar image
CN115932834A (en) Anti-unmanned aerial vehicle system target detection method based on multi-source heterogeneous data fusion
Davidson et al. Sensor fusion system for infrared and radar
Ruscio et al. Information communication technology (ICT) tools for preservation of underwater environment: A vision-based Posidonia oceanica monitoring
Liu et al. Quadrotor visual navigation under GPS spoofing attack
Bhuyan et al. Tracking with multiple cameras for video surveillance
CN117974792B (en) Ship target detection positioning method based on vision and AIS data cooperative training
CN116592871B (en) Unmanned ship multi-source target information fusion method
Sathyadevan GPS Spoofing Detection in UAV Using Motion Processing Unit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant