CN114898593A - Track acquisition method, track acquisition system and server - Google Patents


Info

Publication number
CN114898593A
CN114898593A (application CN202210374001.7A)
Authority
CN
China
Prior art keywords: information, perception information, perception, stream, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210374001.7A
Other languages
Chinese (zh)
Other versions
CN114898593B (en)
Inventor
黄旭艳
张云飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Yunzhou Intelligence Technology Ltd
Original Assignee
Zhuhai Yunzhou Intelligence Technology Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Yunzhou Intelligence Technology Ltd
Priority to CN202210374001.7A
Publication of CN114898593A
Application granted
Publication of CN114898593B
Legal status: Active
Anticipated expiration

Classifications

    • G08G 3/00: Traffic control systems for marine craft
    • G06F 16/29: Geographical information databases (information retrieval of structured data)
    • H04W 4/029: Location-based management or tracking services
    • H04W 4/38: Services for collecting sensor information
    • H04W 4/42: Services for vehicles, for mass transport vehicles, e.g. buses, trains or aircraft
    • H04W 4/44: Services for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • Y02A 90/30: Assessment of water resources (technologies for adaptation to climate change)


Abstract

The application discloses a track acquisition method, a track acquisition system, a server, and a computer storage medium. The method comprises the following steps: acquiring at least one first perception information stream from a boat end and at least one second perception information stream from a shore end, where the first perception information stream contains first attribute information of a water target perceived by the boat end and the second perception information stream contains second attribute information of a water target perceived by the shore end; determining at least one perception information group from the two kinds of streams, where each group comprises one first perception information stream and one second perception information stream that correspond to the same water target; and, for each group, fusing its first and second perception information streams to obtain the track information of the corresponding water target. The method improves the accuracy of track fitting for targets on water.

Description

Track acquisition method, track acquisition system and server
Technical Field
The present application belongs to the technical field of data processing, and in particular, to a trajectory acquisition method, a trajectory acquisition system, a server, and a computer-readable storage medium.
Background
At present, shore-based detection is generally used to track targets on water; that is, the target is tracked by equipment such as shore-based radar stations and photoelectric monitoring. However, such equipment is limited by terrain occlusion around the water area, leaving blind zones during tracking: whole-course tracking of the water target cannot be achieved, and the accuracy of track fitting is therefore low.
Disclosure of Invention
The application provides a track acquisition method, a track acquisition system, a server, and a computer-readable storage medium that can improve the accuracy of track fitting for targets on water.
In a first aspect, the present application provides a track acquisition method applied to a server, the method comprising:
acquiring at least one first perception information stream from a boat end and at least one second perception information stream from a shore end, where the first perception information stream includes first attribute information of a water target perceived by the boat end, and the second perception information stream includes second attribute information of a water target perceived by the shore end;
determining at least one perception information group from the at least one first perception information stream and the at least one second perception information stream, where each perception information group includes one first perception information stream and one second perception information stream, and the two streams in the same group correspond to the same water target; and
for each perception information group, fusing the first perception information stream and the second perception information stream in that group to obtain track information of the water target corresponding to the group.
In a second aspect, the present application provides a server comprising:
an obtaining module, configured to obtain at least one first perception information stream from a boat end and at least one second perception information stream from a shore end, where the first perception information stream includes first attribute information of a water target perceived by the boat end, and the second perception information stream includes second attribute information of a water target perceived by the shore end;
a determining module, configured to determine at least one perception information group from the at least one first perception information stream and the at least one second perception information stream, where each perception information group includes one first perception information stream and one second perception information stream, and the two streams in the same group correspond to the same water target; and
a fusion module, configured to fuse, for each perception information group, the first perception information stream and the second perception information stream in that group to obtain track information of the water target corresponding to the group.
In a third aspect, the present application provides a track acquisition system including a shore end, a boat end, and a server, where the boat end and the shore end are each communicatively connected to the server; the server includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
Compared with the prior art, the application has the following beneficial effects. At least one first perception information stream from the boat end and at least one second perception information stream from the shore end are first acquired; the two kinds of streams are then matched to obtain at least one perception information group, each group containing one first perception information stream and one second perception information stream that point to the same water target; finally, the two streams in the same group are fused to obtain the track information of the water target corresponding to that group. By combining the boat's mobile on-water perception capability with shore-end detection, the water target is tracked jointly: the influence of shore-end blind zones on the tracking result is effectively reduced, more accurate and comprehensive track information is obtained, and the accuracy of track fitting is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a trajectory acquisition system provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a trajectory acquisition method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of an electronic chart obtained based on fusion of two kinds of perception information provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a virtual device of a server according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an entity apparatus of a server according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution proposed in the present application, the following description will be given by way of specific examples.
A track acquisition system provided in an embodiment of the present application is described below. Referring to fig. 1, fig. 1 shows the architecture of a track acquisition system. The system comprises devices at three ends: the first end is a shore base; the second end is a boat, such as an unmanned boat; and the third end is a server. It should be understood that the track acquisition system may include multiple shore bases and multiple boats: one shore base may be combined with at least one boat to track water targets within their detection range and acquire their track information. Accordingly, multiple shore bases may be combined with multiple boats to form multiple detection ranges; for each combination, the tracking and detection process is similar.
Both the shore base and the boat integrate wireless communication devices equipped with signal transceiving antennas, so each can establish a wireless connection to the server through its own wireless communication device. Alternatively, the shore base and the boat may establish a wireless connection directly with each other through their integrated wireless communication devices. Of course, where no direct wireless connection exists between the shore base and the boat, the two can communicate using the server as a bridge.
During track acquisition, the shore base and the boat each detect the water targets within their respective signal coverage. The data obtained at either end may be processed locally to generate the corresponding perception information stream, which is then uploaded to the server; alternatively, the raw data may be uploaded directly and processed by the server to generate the two kinds of perception information streams. In practice, the computing capacity of each end can be taken into account: for a boat end with weaker computing capacity, the raw data may be uploaded and processed by the server to generate the first perception information stream, while the data obtained at the shore end may be processed locally to generate the second perception information stream before it is uploaded to the server.
Optionally, the boat may integrate radar detectors, cameras, and a vessel Automatic Identification System (AIS). Through its integrated equipment or systems, the boat can observe the water targets within its signal coverage to obtain observation data, from which the boat or the server generates at least one first perception information stream. Specifically, the observation data may include radar data, video data, AIS data, and the like, without limitation.
Optionally, the shore base may integrate a radar detector, a camera, an AIS, and a positioning system; by way of example only, the positioning system may be a global positioning system or the BeiDou satellite navigation system, without limitation. Through its integrated equipment or systems, the shore base can detect the water targets within its signal coverage to obtain detection data, from which the shore base or the server generates at least one second perception information stream. Specifically, the detection data may include radar data, video data, AIS data, positioning data, and the like, without limitation.
It should be understood that the shore base or the boat may establish wired communication with its respective integrated physical devices. Taking a shore-based physical device (referred to as a first physical device, to distinguish it from those integrated on the boat) as an example: the shore base and the first physical device are both connected to a preset control bus; that is, a communication connection based on the control bus is established between them. By way of example only, the control bus may be a Controller Area Network (CAN) bus or another type of control bus, without limitation.
Based on the above track acquisition system, a track acquisition method provided in an embodiment of the present application is described below. Referring to fig. 2, the method is applied to a server and mainly describes the fusion of the two kinds of perception information streams from the shore end and the boat end. The track acquisition method comprises the following steps:
step 201, at least one first perception information flow of a boat end and at least one second perception information flow of a shore end are obtained.
A perception information stream contains attribute information of a water target; the attributes may include perception time, longitude, latitude, speed, heading, size, appearance, track, identity, and so on. One perception information stream can be regarded as corresponding to one water target; accordingly, as long as a water target keeps being perceived, its stream keeps being updated. Within the boat end's signal coverage, assuming the boat end perceives at least one water target, at least one perception information stream is obtained; it is denoted the first perception information stream to distinguish it from the shore end. Similarly, within the shore end's signal coverage, assuming at least one water target is perceived, at least one perception information stream is also obtained and denoted the second perception information stream.
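The stream-per-target bookkeeping described above can be sketched as follows; all class names and fields are illustrative assumptions, not structures defined by the application:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PerceptionRecord:
    """One perceived attribute sample of a water target (illustrative fields)."""
    timestamp: float    # perception time, seconds
    lon: float          # longitude, degrees
    lat: float          # latitude, degrees
    speed: float        # speed over ground, m/s
    heading: float      # course, degrees clockwise from north
    identity: str = ""  # e.g. an AIS identifier, when available

@dataclass
class PerceptionStream:
    """One stream corresponds to one water target and grows while it is perceived."""
    target_id: str
    records: List[PerceptionRecord] = field(default_factory=list)

    def update(self, rec: PerceptionRecord) -> None:
        # A continuously perceived target keeps appending records to its stream.
        self.records.append(rec)

first_stream = PerceptionStream("boat-target-1")  # a "first" stream from the boat end
first_stream.update(PerceptionRecord(0.0, 113.5000, 22.3000, 4.2, 90.0))
first_stream.update(PerceptionRecord(1.0, 113.5004, 22.3000, 4.2, 90.0))
print(len(first_stream.records))  # 2
```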
Step 202: determine at least one perception information group from the at least one first perception information stream and the at least one second perception information stream.
After the two kinds of perception information streams are obtained, in order to accurately fit the track of a water target perceivable by both the boat end and the shore end, the streams can be matched to obtain at least one perception information group. Each perception information group contains one first perception information stream and one second perception information stream, and the two streams in a group point to the same water target. That is to say, only a water target perceived by both the boat end and the shore end yields a corresponding perception information group; on this basis, the comprehensiveness and accuracy of that target's track information can be ensured.
It should be understood that the matching above assumes an ideal communication state: the two kinds of streams can be considered to be acquired without delay, i.e. the first and second perception information streams are time-synchronized during acquisition. For example, at time t+1, at least one first perception information stream for time t and at least one second perception information stream for time t can be acquired simultaneously. Under this time-synchronization constraint, the two streams in the same perception information group are guaranteed to point accurately to the same water target, improving the accuracy of the acquired track.
In some embodiments, delay in wireless communication is unavoidable in practical application scenarios. Therefore, after the two kinds of perception information streams are obtained, they can be time-synchronized so that the two streams in the same perception information group still point accurately to the same water target. For example, if a first perception information stream covering times 0 to t-1 and a second perception information stream covering times 0 to t are obtained at time t+1, the records of the two streams can be aligned time by time, and the perception information groups are then determined from the aligned streams.
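The time alignment just described can be sketched as follows, assuming each stream is a time-sorted list of (timestamp, observation) pairs; the 0.5 s tolerance is an illustrative choice, not a value from the application:

```python
def align_streams(first, second, tol=0.5):
    """Pair records of two streams whose timestamps differ by at most tol seconds."""
    pairs, i, j = [], 0, 0
    while i < len(first) and j < len(second):
        t1, t2 = first[i][0], second[j][0]
        if abs(t1 - t2) <= tol:
            pairs.append((first[i], second[j]))
            i, j = i + 1, j + 1
        elif t1 < t2:
            i += 1  # boat-end record has no shore-end counterpart at this time
        else:
            j += 1  # shore-end record has no boat-end counterpart at this time
    return pairs

boat = [(0.0, "a0"), (1.0, "a1"), (2.0, "a2")]
shore = [(0.1, "b0"), (1.9, "b2"), (3.0, "b3")]
aligned = align_streams(boat, shore)
print(len(aligned))  # records near t=0 and t=2 pair up; the rest are dropped
```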
Because the boat end and the shore end have different signal coverage, the sets of water targets they can perceive generally intersect. Assuming the boat end perceives n water targets, the shore end perceives m water targets, and i targets lie in the intersection, the server can obtain i perception information groups by matching the two kinds of streams.
In an ideal situation, assuming the shore end's signal coverage completely contains that of the boat end, it can be considered that n ≤ m and i = n. That is, after acquiring n first perception information streams and m second perception information streams, n perception information groups can be obtained from the two kinds of streams by matching.
In an actual application scenario, however, the shore end performs static perception while the boat end performs dynamic perception, so the boat end's signal coverage may extend beyond the shore end's. In this case there is no fixed quantitative relationship between the targets each end can perceive: some targets perceivable by the boat end cannot be perceived by the shore end, and vice versa. That is, after n first perception information streams and m second perception information streams are obtained, i perception information groups may be obtained by matching; n and m have no fixed relationship, but i can be no greater than n and no greater than m.
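One way the matching into perception information groups could work is nearest-neighbour gating on each stream's latest position; the gate distance, stream representation, and positions below are all made-up illustrations:

```python
import math

def latest_pos(stream):
    return stream["positions"][-1]  # (lon, lat) of the most recent record

def group_streams(boat_streams, shore_streams, gate_deg=0.01):
    """Pair each boat-end stream with the closest unused shore-end stream
    whose latest position lies within the gating distance (in degrees)."""
    groups, used = [], set()
    for b in boat_streams:
        best, best_d = None, gate_deg
        for k, s in enumerate(shore_streams):
            if k in used:
                continue
            (x1, y1), (x2, y2) = latest_pos(b), latest_pos(s)
            d = math.hypot(x1 - x2, y1 - y2)
            if d <= best_d:
                best, best_d = k, d
        if best is not None:
            used.add(best)
            groups.append((b["id"], shore_streams[best]["id"]))
    return groups  # i groups, with i <= n and i <= m as noted above

boat = [{"id": "n1", "positions": [(113.50, 22.30)]},
        {"id": "n2", "positions": [(113.60, 22.40)]}]
shore = [{"id": "m1", "positions": [(113.501, 22.300)]},
         {"id": "m2", "positions": [(113.70, 22.50)]}]
print(group_streams(boat, shore))  # [('n1', 'm1')]
```

Only n1 and m1 point to the same target here; n2 and m2 perceive targets outside the intersection and form no group.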
Step 203: for each perception information group, fuse the first and second perception information streams in the group to obtain track information of the water target corresponding to the group.
To reduce the influence of shore-end blind zones on tracking targets on water, the shore end's and the boat end's perception information streams can be matched and fused. Fusing the two kinds of streams combines the shore end's static global perception capability with the boat end's dynamic local perception capability, effectively compensating for the terrain occlusion that limits shore-end detection equipment and improving the accuracy of track fitting for targets on water. Specifically, during fusion it is only meaningful to fuse two streams that correspond to the same water target; that is, the two streams in one perception information group that point to the same water target are fused to obtain the track information of that group's water target.
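A minimal sketch of fusing the two streams in one perception information group, assuming the streams have already been time-aligned pairwise into (t, lon, lat) fixes; the equal 0.5/0.5 weights stand in for per-end confidence and are purely illustrative:

```python
def fuse_group(boat_fixes, shore_fixes, w_boat=0.5, w_shore=0.5):
    """Confidence-weighted average of paired fixes -> fused track points."""
    track = []
    for (t, lon1, lat1), (_, lon2, lat2) in zip(boat_fixes, shore_fixes):
        track.append((t,
                      w_boat * lon1 + w_shore * lon2,
                      w_boat * lat1 + w_shore * lat2))
    return track

boat = [(0, 113.500, 22.300), (1, 113.502, 22.300)]
shore = [(0, 113.502, 22.300), (1, 113.504, 22.300)]
track = fuse_group(boat, shore)
print(len(track))  # one fused track point per aligned pair
```

A real system would likely weight by sensor confidence or use a filter (e.g. a Kalman filter) rather than a fixed average; this only shows the per-group pairing structure.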
In the embodiment of the application, at least one first perception information stream from the boat end and at least one second perception information stream from the shore end are first obtained; the two kinds of streams are then matched to obtain at least one perception information group, each containing one first perception information stream and one second perception information stream that point to the same water target; finally, the two streams in the same group are fused to obtain comprehensive and accurate track information of the water target corresponding to that group. By combining the boat's mobile on-water perception capability with shore-end detection, the water target is tracked jointly, the influence of shore-end blind zones on the tracking result is effectively reduced, more accurate and comprehensive track information is obtained, and the accuracy of track fitting is improved.
By way of example only, referring to fig. 3, fig. 3 shows a schematic electronic chart obtained by fusing the two kinds of perception information. In the figure, the shore end can perceive three water targets a, b, and c, while the boat end can perceive c; the track information of c is obtained by matching the first and second perception information streams, and the track of c is then fitted. It is understood that when a water target is close to the shore end and the confidence of the second perception information stream is high, its track information can be determined from the shore end's perception alone, as for target a in the figure. The track of c can be considered more accurate and comprehensive than that of a.
In some embodiments, before step 201, the method further includes:
Step A1: acquire observation data from the boat end and generate at least one first perception information stream based on the observation data.
Step A2: acquire detection data from the shore end and generate at least one second perception information stream based on the detection data.
The first perception information stream may be generated by the boat end or by the server from the boat end's observation data; the second perception information stream may be generated by the shore end or by the server from the shore end's detection data. It should be understood that generating a perception information stream amounts to identifying targets in the sensing data obtained from the various sensors.
In some embodiments, in order to obtain a more accurate first perception information stream, step A1 specifically includes:
Step A11: with the boat end as the reference object, couple the observation data to longitude-latitude coordinates to obtain at least one first perception information stream.
When observing a water target within its signal coverage, the boat end perceives the target's first attribute information with itself as the reference object. Taking the position information of water target a as an example: the boat end measures the distance to target a with a range sensor, obtaining a distance value, and observes its direction with a bearing sensor, obtaining an azimuth angle; both are observation data. The distance value and azimuth angle can then be coupled to longitude-latitude coordinates by combining them with the boat end's own positioning information, yielding the position of target a expressed in longitude-latitude coordinates, i.e. part of the first perception information stream corresponding to target a.
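The coupling of a (distance, azimuth) observation to longitude-latitude coordinates can be sketched as follows; a flat-earth approximation over short ranges and a mean Earth radius are assumed, neither of which is specified by the application:

```python
import math

EARTH_R = 6_371_000.0  # assumed mean Earth radius, metres

def couple_to_latlon(boat_lon, boat_lat, range_m, bearing_deg):
    """Convert a (distance, azimuth) observation, referenced to the boat's own
    positioning fix, into the target's longitude/latitude."""
    br = math.radians(bearing_deg)      # bearing clockwise from true north
    north = range_m * math.cos(br)      # northward offset, metres
    east = range_m * math.sin(br)       # eastward offset, metres
    dlat = math.degrees(north / EARTH_R)
    dlon = math.degrees(east / (EARTH_R * math.cos(math.radians(boat_lat))))
    return boat_lon + dlon, boat_lat + dlat

# Target observed 1000 m due east of a boat positioned at (113.5 E, 22.3 N):
lon, lat = couple_to_latlon(113.5, 22.3, 1000.0, 90.0)
print(round(lon, 4), round(lat, 4))
```

For longer ranges a proper geodesic (e.g. WGS-84) forward computation would replace the flat-earth step; the coupling structure is the same.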
In some embodiments, in order to obtain a more accurate second perception information stream, step A2 specifically includes:
Step A21: divide the detection data into first detection data and second detection data based on data type.
Step A22: process the first detection data with a feature-fusion method to obtain at least one local first perception information stream.
Step A23: process the second detection data with an identification-fusion method to obtain at least one local second perception information stream.
Different data types can be processed with different methods to improve the accuracy of the perception result. The detection data is first divided by data type into two kinds, denoted first detection data and second detection data. The first detection data is processed with a feature-fusion method to obtain at least one local first perception information stream; the second detection data is processed with an identification-fusion method to obtain at least one local second perception information stream.
In some embodiments, the step a22 specifically includes:
step A221, feature extraction is performed on first detection data obtained from a plurality of sensors respectively, and a first feature vector corresponding to each sensor is obtained.
And step A222, fusing all the obtained first feature vectors to obtain a fused feature vector.
And step A223, identifying the fused feature vector to obtain at least one local first perception information stream.
To facilitate feature extraction, the first detection data may be divided into data groups, one group per sensor, and feature extraction then performed group by group. For example, with 3 sensors, the first detection data corresponding to each sensor forms one group, giving 3 groups of data; performing feature extraction on each group yields the first feature vector corresponding to each sensor, i.e., 3 first feature vectors. Of course, if two of the three sensors are of the same type (for example, both are radar sensors), the data they produce do not differ in type, so the first detection data may instead be divided into groups by sensor type before feature extraction. After the first feature vector corresponding to each sensor is obtained, these feature vectors may be fused into a single vector, i.e., the fused feature vector. By identifying the fused feature vector, at least one local first perception information stream can be obtained.
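Steps A221 to A223 amount to fusing at the feature level: concatenate the per-sensor feature vectors before running a single identification. A minimal sketch, with the feature extractor and identifier left as placeholder callables since the patent fixes neither:

```python
def feature_level_fusion(per_sensor_data, extract, identify):
    """Feature fusion: one first feature vector per sensor, concatenated
    into a single fused feature vector, identified once."""
    vectors = [extract(d) for d in per_sensor_data]  # first feature vector per sensor
    fused = [x for v in vectors for x in v]          # fused feature vector
    return identify(fused)                           # local first perception information
```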
In some embodiments, the step a23 specifically includes:
step A231, respectively performing feature extraction on the second detection data obtained from the plurality of sensors to obtain a second feature vector corresponding to each sensor.
And A232, respectively identifying each second feature vector to obtain an identification result corresponding to each sensor.
And step A233, fusing all the obtained identification results to obtain at least one local second perception information stream.
The feature extraction process for the second detection data is similar to that for the first detection data, namely: the second detection data is divided into data groups by sensor count or sensor type, and feature extraction is performed group by group. The subsequent identification process, however, differs from that of the first detection data: after the second feature vector corresponding to each sensor is obtained, each second feature vector may be identified separately to obtain the identification result corresponding to each sensor, and the identification results corresponding to the sensors are then fused to obtain at least one local second perception information stream. It can be understood that, if the second detection data is divided by sensor type, then in the identification process the second feature vector corresponding to each type of sensor may be identified to obtain an identification result per sensor type, and all the obtained identification results fused to obtain at least one local second perception information stream.
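By contrast, steps A231 to A233 fuse at the decision level: each sensor's feature vector is identified on its own and only the results are combined. In the sketch below, majority voting stands in for the fusion rule, which the text leaves unspecified:

```python
from collections import Counter

def identification_fusion(per_sensor_data, extract, identify):
    """Identification fusion: per-sensor identification, then fusion of the
    results (here by majority vote, an illustrative assumption)."""
    results = [identify(extract(d)) for d in per_sensor_data]  # one result per sensor
    label, _ = Counter(results).most_common(1)[0]
    return label
```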
In some embodiments, in order to ensure that the first and second perception information streams within the same perception information group correspond to the same aquatic target, the perception information group may be determined based on identification information of the aquatic target. The identification information may be any one, or a combination of two or more, of longitude, latitude, speed, heading, size, appearance, trajectory, and identity. Take longitude and latitude as the identification information. For two time-synchronized perception information streams, suppose that at time t the first perception information stream N1 contains longitude 1 and latitude 1, and the second perception information stream M1 also contains longitude 1 and latitude 1; then N1 and M1 may be determined as one perception information group. In practical application scenarios, however, the data perceived by the shore end and the boat end for the same aquatic target differ, i.e., M1 may contain longitude 2 and latitude 2 rather than longitude 1 and latitude 1. In other words, if identical identification information at the same time is required as the basis for determining a perception information group, it is very likely that no perception information group can be determined. Therefore, when determining the perception information group, a similarity threshold may be set: when the similarity between identification information 1 in a first perception information stream and identification information 2 in a second perception information stream is greater than the similarity threshold, the two perception information streams may be determined as one perception information group.
Still taking the above scenario as an example: assuming the similarity threshold is w, the similarity S1 between longitude 1 and longitude 2 and the similarity S2 between latitude 1 and latitude 2 can be calculated; the two similarities are then compared with w separately, and when S1 > w and S2 > w, N1 and M1 can be determined as one perception information group. A perception information group determined in this way ensures that the two perception information streams it contains point to the same aquatic target, improving the accuracy of track acquisition.
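The threshold test can be sketched as below. The similarity measure (closeness of the longitude/latitude values) and the field names are illustrative choices, since the text only requires that both similarities exceed w:

```python
def pair_streams(first_streams, second_streams, w):
    """Group a boat-end stream N and a shore-end stream M into a perception
    information group when both coordinate similarities exceed threshold w."""
    def sim(a, b):
        return 1.0 / (1.0 + abs(a - b))  # 1.0 when identical, falls off with distance

    groups = []
    for n in first_streams:
        for m in second_streams:
            if sim(n["lon"], m["lon"]) > w and sim(n["lat"], m["lat"]) > w:
                groups.append((n["id"], m["id"]))
    return groups
```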
In some embodiments, in order to achieve continuous tracking of the target on water, after step 203, the method further includes:
and if the target perception information group exists, sending a tracking instruction to the boat end to control the boat end to track the aquatic target corresponding to the target perception information group, wherein the target perception information group is a perception information group of which the second perception information flow is not updated in a preset time period.
For an aquatic target whose track information has been acquired, the shore end may, for some reason, fail to detect the target continuously for the preset time period. For example, when the aquatic target moves out of the shore end's detection range, or travels into the shore end's blind zone, the shore end cannot detect it, so the second perception information stream in the corresponding perception information group is not updated for that period. Such an aquatic target can be determined as one the shore end has lost, and its corresponding perception information group is the target perception information group. It can be understood that, by determining the target perception information group, the server can determine which aquatic target has been lost. When this occurs, the server can send a tracking instruction to the boat end based on the target perception information group, so as to control the boat end to continuously track, through perception and identification, the aquatic target corresponding to that group. While the boat end continuously tracks the aquatic target lost by the shore end, it can return the target's first perception information stream to the server in real time, thereby guaranteeing the completeness of the track of the lost target.
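Detecting the target perception information group reduces to a staleness check on the second stream. A sketch with assumed field names:

```python
def find_lost_targets(groups, now, timeout):
    """Return the aquatic targets whose perception information group is a
    target perception information group, i.e. whose second (shore-end)
    stream has not been updated within `timeout` seconds. The server would
    then send the boat end a tracking instruction for each of them."""
    return [g["target"] for g in groups
            if now - g["second_last_update"] > timeout]
```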
In some embodiments, an aquatic target for which trajectory information has been obtained may be assigned a number, so that the server can further track the target and fit its trajectory. With the number, even if the shore end loses a certain aquatic target, the server can inform the boat end of the number of the target the shore end has lost; the server can then keep tracking the target through the boat end, and combine the data returned by the boat end with the historical data corresponding to the number to obtain comprehensive and accurate trajectory information of the target.
In some embodiments, to improve the accuracy of the track information acquisition, the first perceptual information stream and the second perceptual information stream in each perceptual information group may be weighted and fused. In the fusion process, the respective weights of the two perceptual information streams may be determined first, and then the weighted fusion may be performed according to the corresponding weights. In particular, the weight of the second stream of perceptual information may be determined in dependence on the distance of the aquatic target from the shore end.
The shore end can typically detect aquatic targets within 10 to 20 nautical miles. Assuming the shore end can detect aquatic targets within 20 nautical miles, this range can be divided into four intervals of 5 nautical miles each: [0, 5), [5, 10), [10, 15) and [15, 20]. When the distance between the aquatic target and the shore end falls in [0, 5) nautical miles, the weight of the second perception information stream may fall in (0.75, 1]; in [5, 10) nautical miles, in (0.5, 0.75]; in [10, 15) nautical miles, in (0.25, 0.5]; and in [15, 20] nautical miles, in [0, 0.25].
To refine the weight values, the distance intervals can be subdivided further, and the weight range within each interval refined accordingly, yielding more precise weights. It can be understood that the weights of the first and second perception information streams sum to 1, so once the weight of the second perception information stream is determined, the weight of the first perception information stream follows; the two perception information streams are then weighted and fused to obtain more accurate track information.
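One way to refine the banded weights above is a linear ramp over the shore end's detection range: it reproduces the four band edges ([0, 5) nautical miles gives a weight near 1, [15, 20] a weight near 0) while giving a distinct value at every distance. The ramp itself is an assumption, as the text only requires finer-grained weights:

```python
def second_stream_weight(distance_nm, max_range_nm=20.0):
    """Weight of the shore-end (second) perception stream: 1 at the shore,
    falling linearly to 0 at the maximum detection range (assumed 20 nm)."""
    d = min(max(distance_nm, 0.0), max_range_nm)
    return 1.0 - d / max_range_nm

def weighted_fuse(first_value, second_value, distance_nm):
    """Weighted fusion of one attribute from the two streams; the two
    weights sum to 1, so the boat-end weight follows from the shore-end's."""
    w2 = second_stream_weight(distance_nm)
    w1 = 1.0 - w2
    return w1 * first_value + w2 * second_value
```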
Corresponding to the trajectory acquisition method described in the foregoing embodiments, fig. 4 shows a structural block diagram of the virtual apparatus of the server 4 provided in an embodiment of the present application; for convenience of description, only the portions related to the embodiment of the present application are shown.
Referring to fig. 4, the server 4 includes:
the acquiring module 41 is configured to acquire at least one first sensing information stream at the boat end and at least one second sensing information stream at the shore end, where the first sensing information stream includes first attribute information of the aquatic target sensed by the boat end, and the second sensing information stream includes second attribute information of the aquatic target sensed by the shore end;
a determining module 42, configured to determine at least one sensing information group in at least one first sensing information stream and at least one second sensing information stream, where each sensing information group includes one first sensing information stream and one second sensing information stream, and the aquatic targets corresponding to the first sensing information stream and the second sensing information stream in the same sensing information group are the same;
and the fusion module 43 is configured to fuse the first sensing information stream and the second sensing information stream in each sensing information group to obtain track information of the aquatic target corresponding to the sensing information group.
Optionally, the server 4 may further include:
the first generation module is used for acquiring observation information of the boat end and generating at least one first perception information flow based on the observation information;
and the second generation module is used for acquiring the detection information of the bank end and generating at least one second perception information stream based on the detection information.
Optionally, the first generating module is specifically configured to: and coupling observation data based on longitude and latitude coordinates by taking the boat end as a reference object to obtain at least one first perception information flow.
Optionally, the second perceptual information stream includes a local first perceptual information stream and a local second perceptual information stream, and the second generating module may include:
a dividing unit configured to divide the detection data into first detection data and second detection data based on the data type;
the first processing unit is used for processing the first detection data based on a feature fusion method to obtain at least one local first perception information stream;
and the second processing unit is used for processing the second detection data based on the identification fusion method to obtain at least one local second perception information stream.
Optionally, the first processing unit may include:
the first extraction subunit is used for respectively performing feature extraction on first detection data obtained from the plurality of sensors to obtain a first feature vector corresponding to each sensor;
the first fusion subunit is used for fusing all the obtained first feature vectors to obtain fusion feature vectors;
and the first identification subunit is used for identifying the fused feature vector to obtain at least one local first perception information stream.
Optionally, the second processing unit may include:
the second extraction subunit is used for respectively performing feature extraction on second detection data obtained from the plurality of sensors to obtain a second feature vector corresponding to each sensor;
the second identification subunit is used for respectively identifying each second feature vector to obtain an identification result corresponding to each sensor;
and the second fusion subunit is used for fusing all the obtained identification results to obtain at least one local second perception information stream.
Optionally, the server 4 may further include:
and the tracking module is used for sending a tracking instruction to the boat end if the target perception information group exists so as to control the boat end to track the aquatic target corresponding to the target perception information group, wherein the target perception information group is a perception information group of which the second perception information flow is not updated in a preset time period.
It should be noted that, for the information interaction and execution process between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the method embodiment of the present application, and thus reference may be made to the method embodiment section for details, which are not described herein again.
Fig. 5 is a schematic diagram of the physical structure of a server according to an embodiment of the present application. As shown in fig. 5, the server 5 of this embodiment includes: at least one processor 50 (only one is shown in fig. 5), a memory 51, and a computer program 52 stored in the memory 51 and executable on the at least one processor 50. When executing the computer program 52, the processor 50 implements the steps of any of the above trajectory acquisition method embodiments, such as steps 201 to 203 shown in fig. 2.
The processor 50 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may in some embodiments be an internal storage unit of the server 5, such as a hard disk or memory of the server 5. In other embodiments, the memory 51 may also be an external storage device of the server 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the server 5.
Further, the memory 51 may also include both an internal storage unit and an external storage device of the server 5. The memory 51 is used for storing an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing related hardware; the computer program can be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the server, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A track acquisition method is applied to a server, and is characterized by comprising the following steps:
acquiring at least one first perception information stream of a boat end and at least one second perception information stream of a shore end, wherein the first perception information stream comprises first attribute information of a water target perceived by the boat end, and the second perception information stream comprises second attribute information of the water target perceived by the shore end;
determining at least one sensing information group in the at least one first sensing information stream and the at least one second sensing information stream, wherein each sensing information group comprises a first sensing information stream and a second sensing information stream, and the aquatic targets corresponding to the first sensing information stream and the second sensing information stream in the same sensing information group are the same;
and for each perception information group, fusing the first perception information flow and the second perception information flow in the perception information group to obtain track information of the aquatic target corresponding to the perception information group.
2. The trajectory acquisition method of claim 1, further comprising, prior to acquiring at least one first perceptual information stream at a boat end and at least one second perceptual information stream at a shore end:
acquiring observation data of the boat end, and generating at least one first perception information flow based on the observation data;
and acquiring detection data of the bank end, and generating at least one second perception information flow based on the detection data.
3. The trajectory acquisition method of claim 2, wherein said generating at least one of said first streams of perceptual information based on said observation data comprises:
and coupling the observation data based on longitude and latitude coordinates by taking the boat end as a reference object to obtain at least one first perception information flow.
4. The trajectory acquisition method of claim 2, wherein the second stream of perceptual information comprises a local first stream of perceptual information and a local second stream of perceptual information, the generating at least one of the second streams of perceptual information based on the detection data comprising:
dividing the detection data into first detection data and second detection data based on the data type;
processing the first detection data based on a feature fusion method to obtain at least one local first perception information flow;
and processing the second detection data based on an identification fusion method to obtain at least one local second perception information stream.
5. The trajectory acquisition method of claim 4, wherein the processing the first detection data based on the feature fusion method to obtain at least one of the local first perceptual information streams comprises:
respectively extracting features of the first detection data obtained from the plurality of sensors to obtain a first feature vector corresponding to each sensor;
fusing all the obtained first feature vectors to obtain fused feature vectors;
and identifying the fusion characteristic vector to obtain at least one local first perception information flow.
6. The trajectory acquisition method according to claim 4, wherein the processing the second detection data based on the recognition fusion method to obtain at least one local second perceptual information stream comprises:
respectively extracting features of the second detection data obtained from the plurality of sensors to obtain a second feature vector corresponding to each sensor;
respectively identifying each second feature vector to obtain an identification result corresponding to each sensor;
and fusing all the obtained recognition results to obtain at least one local second perception information flow.
7. The trajectory acquisition method according to any one of claims 1 to 6, wherein after the fusing the first perceptual information stream and the second perceptual information stream in the perceptual information sets for each of the perceptual information sets to obtain trajectory information of the aquatic target corresponding to the perceptual information set, the method further comprises:
and if the target perception information group exists, sending a tracking instruction to the boat end to control the boat end to track the aquatic target corresponding to the target perception information group, wherein the target perception information group is a perception information group of which the second perception information flow is not updated in a preset time period.
8. A server, comprising:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring at least one first perception information stream of a boat end and at least one second perception information stream of a shore end, the first perception information stream comprises first attribute information of a water target perceived by the boat end, and the second perception information stream comprises second attribute information of the water target perceived by the shore end;
the determining module is used for determining at least one sensing information group in the at least one first sensing information stream and the at least one second sensing information stream, wherein each sensing information group comprises a first sensing information stream and a second sensing information stream, and the aquatic targets corresponding to the first sensing information stream and the second sensing information stream in the same sensing information group are the same;
and the fusion module is used for fusing the first perception information flow and the second perception information flow in the perception information group aiming at each perception information group to obtain the track information of the aquatic target corresponding to the perception information group.
9. A trajectory acquisition system, comprising a shore end, a boat end and a server, wherein the boat end and the shore end are respectively in communication connection with the server, and the server comprises a memory, a processor and a computer program stored in the memory and operable on the processor, wherein the processor, when executing the computer program, implements the trajectory acquisition method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the trajectory acquisition method according to any one of claims 1 to 7.
CN202210374001.7A 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server Active CN114898593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210374001.7A CN114898593B (en) 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server


Publications (2)

Publication Number Publication Date
CN114898593A true CN114898593A (en) 2022-08-12
CN114898593B CN114898593B (en) 2024-02-02

Family

ID=82716476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210374001.7A Active CN114898593B (en) 2022-04-11 2022-04-11 Track acquisition method, track acquisition system and server

Country Status (1)

Country Link
CN (1) CN114898593B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190204416A1 (en) * 2017-12-28 2019-07-04 Furuno Electric Co., Ltd. Target object detecting device, method of detecting a target object and computer readable medium
CN110322114A (en) * 2019-05-23 2019-10-11 平安城市建设科技(深圳)有限公司 Cell recommended method, device, equipment and storage medium based on big data
CN111060877A (en) * 2019-12-25 2020-04-24 智慧航海(青岛)科技有限公司 Data processing method for shore-based radar
CN111157982A (en) * 2019-11-20 2020-05-15 智慧航海(青岛)科技有限公司 Intelligent ship and shore cooperative target tracking system and method based on shore-based radar
CN111738484A (en) * 2020-04-28 2020-10-02 腾讯科技(深圳)有限公司 Method and device for selecting addresses of bus stops and computer readable storage medium
CN113296509A (en) * 2021-05-21 2021-08-24 上海海事大学 Autonomous trajectory tracking fusion control method for unmanned surface vessel
CN113359123A (en) * 2021-06-04 2021-09-07 华能国际电力江苏能源开发有限公司 System and method for protecting offshore wind farm submarine cable based on AIS and radar information fusion



Similar Documents

Publication Publication Date Title
US10641906B2 (en) GPS jammer and spoofer detection
US20150241560A1 (en) Apparatus and method for providing traffic control service
KR101334804B1 (en) Integration method of satellite information and ship information for integrated ship monitoring
CN103175525A (en) Radar image simulation system and method based on electronic chart and navigation data
CN110889380B (en) Ship identification method and device and computer storage medium
Kazimierski Problems of data fusion of tracking radar and AIS for the needs of integrated navigation systems at sea
US11879983B2 (en) Location method using GNSS signals
CN113033439B (en) Method and device for data processing and electronic equipment
CN111860215B (en) Target object position determining method, terminal device and navigation system
CN112904390A (en) Positioning method, positioning device, computer equipment and storage medium
US11481920B2 (en) Information processing apparatus, server, movable object device, and information processing method
CN114898593A (en) Track acquisition method, track acquisition system and server
KR102102398B1 (en) Apparatus and method for making navigation performance evaluation in real time
CN111474536A (en) Intelligent ship autonomous positioning system and method based on shore-based radar system
Kaniewski et al. Visual-based navigation system for unmanned aerial vehicles
CN107941220B (en) Unmanned ship sea antenna detection and navigation method and system based on vision
Liu et al. Towards intelligent navigation in future autonomous surface vessels: developments, challenges and strategies
Davidson et al. Sensor fusion system for infrared and radar
CN116774252B (en) Navigation deception jamming detection method based on single receiver pseudo-range variation
Sathyadevan GPS Spoofing Detection in UAV Using Motion Processing Unit
CN115128598B (en) Behavior identification method based on fusion of visual perception and radar perception and terminal equipment
WO2023074014A1 (en) Vessel monitoring device, vessel monitoring method, and program
Thiagarajah et al. Localization for ships during automated docking using a monocular camera
CN112347218B (en) Unmanned ship environment map generation method and unmanned ship sensing system
KR102666053B1 (en) Apparatus and method for estimating location

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant