CN117274847A - Intelligent navigation brain system based on deep fusion of shipping technology and operation method thereof - Google Patents


Info

Publication number
CN117274847A
CN117274847A (application CN202311165518.6A)
Authority
CN
China
Prior art keywords
data, navigation, ship, module, intelligent
Prior art date
Legal status (assumed by Google; not a legal conclusion)
Pending
Application number
CN202311165518.6A
Other languages
Chinese (zh)
Inventor
徐明强
宋亮
尚雨廷
胡筱渊
鲁明洋
彭国均
Current Assignee (list may be inaccurate; not verified by legal analysis)
Cttic Communiation Big Data Shanghai Technology Co ltd
Original Assignee
Cttic Communiation Big Data Shanghai Technology Co ltd
Priority date (assumed; not a legal conclusion): 2023-09-11
Filing date: 2023-09-11
Application filed by Cttic Communiation Big Data Shanghai Technology Co ltd
Priority to CN202311165518.6A
Publication of CN117274847A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content

Abstract

The application provides an intelligent navigation brain system based on deep fusion of shipping technologies, together with an operation method, aiming to deliver comprehensively integrated, intelligent and efficient navigation support and decision-making. The system comprises a data access module, a data processing and database-building module, a network service module, a user-side module and an AR (augmented reality) function; it spatially fuses data such as navigation environment information, ship information and hydro-meteorological information and displays them on a unified human-machine interaction platform. The invention effectively extends the navigator's visual perception and the information dimensions available for assessing the ship's navigation situation, provides comprehensive and intuitive auxiliary information for driving the ship safely, and greatly improves the accuracy and timeliness with which the navigator grasps the navigation elements around the ship under adverse navigation conditions.

Description

Intelligent navigation brain system based on deep fusion of shipping technology and operation method thereof
Technical Field
The invention relates to the technical field of shipping and data fusion, in particular to an intelligent navigation brain system based on deep fusion of shipping technology and an operation method thereof.
Background
With the development of shipping services and advances in technology, intelligent navigation systems are becoming a focus of attention in the shipping field. However, current navigation systems are often single-technology applications that lack comprehensive integration and deep fusion, which limits the safety, efficiency and intelligence of ships. There is therefore a need for an intelligent navigation brain system based on the deep fusion of new-generation shipping technologies to address these issues.
In recent years, the rapid development of augmented reality (AR) technology has brought new opportunities for innovation to the field of ship navigation. AR technology can fuse virtual information with the real environment so that a user can intuitively perceive and understand navigation information, providing a more intuitive, real-time and personalized navigation experience. At present, the adjustment, transformation and upgrading of China's ship industry structure is urgent. Meanwhile, digitalization, networking and intelligence have become important trends of future technological transformation, and ship construction is developing towards intelligent design, intelligent products, refined management and information integration. Chinese shipbuilding enterprises urgently need to improve their digital design capability and their degree of equipment automation and intelligence, optimize and re-engineer existing shipbuilding processes, continuously promote the deep fusion of informatization and industrialization, improve integrated digital design capability, and vigorously develop key intelligent manufacturing equipment.
Through research, experts and scholars in the international navigation field have gradually found that the growing number of new sensors brings a large amount of multi-source navigation information, much of it repeated and redundant, which frequently confuses the officer's navigation decisions and can even prevent timely and correct manoeuvring decisions. Developing an intelligent navigation brain system based on deep fusion of shipping technology, and an operation method thereof, is therefore a technical problem to be solved.
Disclosure of Invention
In view of the defects of the prior art, an intelligent navigation brain system based on deep fusion of shipping technology, and an operation method thereof, are provided. Multi-source, complex and diverse navigation information is effectively integrated; through spatial transformation and information fusion, this information is displayed intuitively and uniformly together with the real video observed by the cameras; navigation factors such as speed and heading are comprehensively analysed with efficient computer data processing; navigation danger information is predicted in advance; and the safety of ship navigation is effectively improved.
In order to achieve the above object, in one aspect, the present invention provides an intelligent navigation brain system based on deep fusion of shipping technology, which is characterized in that: the system comprises a data access module, a data processing and library building module, a network service module and a user side module;
In the data access module, the intelligent navigation brain system follows the output interface standards of the various navigation devices and systems, accesses all kinds of navigation information in real time, and feeds it into the host server of the intelligent navigation brain system through the shipboard network. The data processing and database-building module is responsible for classifying and storing the collected navigation data, performing risk assessment on the relevant navigation data according to the requirements of ship driving and water-area safety monitoring, and processing data including CCTV video data, AIS ship data, ARPA radar data, hydro-meteorological data and notices to mariners through spatial transformation and data fusion. The network service module is responsible for sending the processed navigation data to the client for data feedback. The user-side module comprises a shipboard user AR display system and a remote shore-based user AR display system.
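The four-module division above can be illustrated with a minimal data-flow sketch. This is not the patented implementation, only a hedged illustration of how tagged records might move from access through processing to the network service; all function names and record structures are assumptions.

```python
def data_access(raw_feeds):
    """Data access module: normalise each sensor feed into tagged records."""
    return [{"source": name, "payload": payload} for name, payload in raw_feeds]

def process_and_store(records, database):
    """Data processing and database-building module: classify records by
    source and store them; risk assessment and fusion would happen here."""
    for rec in records:
        database.setdefault(rec["source"], []).append(rec["payload"])
    return database

def network_service(database, sources):
    """Network service module: package the requested categories for a client."""
    return {s: database.get(s, []) for s in sources}
```

A client (the user-side module) would then poll `network_service` for the categories it displays, e.g. AIS targets for the AR overlay.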
Further, the data access module comprises a GPS navigator, an anemometer, a depth finder, a log, a compass, CCTV video camera equipment and a TCP Socket client connected to the server.
Further, the CCTV video camera equipment comprises four IP cameras with high-definition, long-range imaging performance: one is installed at the middle of the compass deck to film the water-area navigation environment ahead of the bow; one is installed at each of the port and starboard endpoints of the compass deck to film the navigation environments on the two sides of the ship; and one is installed towards the stern, near the stern flagpole, to film the water-area navigation environment astern. The four cameras are deployed on the ship's intranet; the intelligent navigation brain system presents the main picture via AR monitoring and, in cooperation with service functions such as navigation information fusion and risk warning, can realize large-scale navigation safety monitoring of the full 360-degree scene around the ship.
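Whether four cameras actually cover a 360-degree scene depends on each camera's horizontal field of view, which the patent does not state. A small sketch (assumed headings of 0, 90, 180 and 270 degrees for the bow, starboard, stern and port cameras) can check a layout on a one-degree grid:

```python
def covers_full_circle(headings_deg, hfov_deg):
    """Check whether cameras at the given true headings, each with a
    horizontal field of view of hfov_deg degrees, jointly cover all
    bearings (sampled on a 1-degree grid)."""
    half = hfov_deg / 2.0
    for bearing in range(360):
        # Smallest signed angular difference between bearing and each heading.
        if not any(abs((bearing - h + 180.0) % 360.0 - 180.0) <= half
                   for h in headings_deg):
            return False
    return True
```

With the assumed layout, each lens needs at least a 90-degree horizontal field of view for gap-free coverage.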
Further, the data processing and database-building module extracts the position data and attribute information of targets such as navigation marks, channels, water-depth points and shorelines from the chart data, and a chart target module displays them in the video image according to a video-enhanced image processing method; an AIS target module displays, in the same way, the AIS ship position data and attribute information acquired via the NMEA protocol; an ARPA target module displays the radar ARPA ship position data and attribute information acquired via the NMEA protocol; a route module visualizes the route information entered by the user; a user-mark data module allows marking or annotation according to the user's personal needs; and a video image integrating module fuses the video captured by the ship's cameras with the other navigation data, providing more comprehensive and intuitive navigation information and environmental perception.
Further, the shipboard user AR display system includes a wheelhouse AR display system, a mobile AR display system and a captain/chief-officer cabin AR display system; the remote shore-based user AR display system comprises a shore-based AR display system served through a satellite or public-network interface.
On the other hand, the invention provides an operation method of the intelligent navigation brain system, which is characterized in that: the method comprises the following steps:
step S1: and (3) data access: the GPS navigator acquires ship position information data, the anemograph acquires wind speed and wind direction data, the depth finder acquires water depth data, the log acquires ship position and speed data, the compass acquires ship heading data, and the CCTV video camera acquires ship periphery video and image data;
step S2: data processing and library building: the data processing and library building module firstly collects and evaluates the multi-source navigation data and integrates the available data; integrating the multisource multiscale navigation data by adopting various means including space reference transformation, data format conversion and attribute coding correspondence, thereby completing multisource navigation data integration;
step S3: the data after fusion is transmitted back to the user terminal module through the network service module;
step S4: after receiving the data, the user terminal finally displays the data on a user terminal computer or a mobile phone in an AR augmented reality mode, and powerful support is provided for ship safety control decision.
Further, in the step S2, the specific steps include:
step S201: performing video image recognition training on the system, and performing feature extraction and sliding window scanning on the training image;
step S202: modeling the extracted object features, putting the model into the model training again, and performing post-processing on the primary detection result;
step S203: and (3) performing machine learning and training on the model, integrating the feature learning and the classifier into a frame to solve the problems of poor visibility object and intelligent video image recognition of the object without navigation attribute information auxiliary matching and fusion under the complex navigation condition.
Further, in the step S4, the specific steps include:
acquiring high-precision position data of the camera used for augmented reality; realizing image spatialization based on the high-precision position data; superimposing elements such as ships, navigation marks and channel lines in the spatialized image space; and displaying the attribute information of the elements.
Compared with the prior art, the navigation system and its operation method provide comprehensively integrated, intelligent and efficient navigation support and decision-making: they effectively integrate multi-source, complex navigation information through spatial transformation and information fusion, process and analyse the acquired data efficiently according to the navigation state of the ship, display the data intuitively and uniformly, and effectively improve the safety of ship navigation.
Drawings
The invention and its features and advantages will become more apparent from reading of the detailed description of non-limiting embodiments, given with reference to the following drawings. Like numbers refer to like parts throughout. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
FIG. 1 is a schematic diagram of a smart navigation brain system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of AR augmented reality for a brain system for intelligent navigation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a mobile object video image intelligent recognition technology of an intelligent navigation brain system according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only some, not all, of the embodiments and examples of the present invention; they are provided so that this disclosure will be thorough and complete and will fully convey the technical disclosure to those skilled in the art.
In the following detailed description, numerous specific details are set forth in order to provide a more thorough understanding of the invention. However, well-known algorithms and models are not shown in detail, in order to avoid obscuring the principles of the present invention.
Examples
Referring to fig. 1, the present embodiment provides an intelligent navigation brain system based on deep fusion of shipping technology, which includes a data access module 1 for accessing various navigation information, a data processing and database building module 2 for performing spatial fusion and processing on navigation data, a network service module 3 for data transmission and network communication, and a client module 4 for visualizing data information.
The data access module 1 comprises a GPS navigator, an anemometer, a depth finder, a log, a compass, CCTV (closed-circuit television) video camera equipment and a TCP Socket client connected to the server. The GPS (Global Positioning System) navigator obtains ship position data, the anemometer obtains wind speed and direction data, the depth finder obtains water depth data, the log obtains ship position and speed data, the compass obtains ship heading data, and the CCTV video cameras obtain video and image data of the ship's surroundings.
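Marine sensors of this kind conventionally emit NMEA 0183 sentences over the shipboard network, which is presumably what the TCP Socket client receives. As an illustrative sketch (not code from the patent), a standard `$GPGGA` position sentence could be decoded like this:

```python
def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert an NMEA ddmm.mmmm (or dddmm.mmmm) field to signed decimal degrees."""
    dot = value.index(".")
    degrees = float(value[:dot - 2])   # everything before the minutes
    minutes = float(value[dot - 2:])   # mm.mmmm
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str) -> dict:
    """Extract position and fix quality from a $GPGGA sentence
    (checksum verification omitted for brevity)."""
    fields = sentence.split("*")[0].split(",")
    return {
        "time_utc": fields[1],
        "lat": nmea_to_decimal(fields[2], fields[3]),
        "lon": nmea_to_decimal(fields[4], fields[5]),
        "fix_quality": int(fields[6]),
    }
```

The anemometer, depth finder, log and compass would similarly be read from their MWV, DBT, VHW and HDT sentences.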
As a preferred technical solution, further: the CCTV video camera equipment comprises four IP (internet protocol) cameras with high definition and long-distance camera shooting performance, wherein one IP camera is arranged at the middle position of a compass deck and used for shooting the water area navigation environment in the bow direction, one IP camera is arranged at the end points of the left and right sides of the compass deck, two IP cameras are respectively arranged at the end points of the left and right sides of the compass deck and used for shooting the water area navigation environment in the left and right sides of a ship, and one IP camera is arranged near the stern direction of the compass deck or the stern flagpole and used for shooting the water area navigation environment in the stern direction; the four groups of cameras adopt an intranet deployment mode, an intelligent navigation brain system displays and presents a main picture by AR (Augmented Reality ) monitoring, and the intelligent navigation brain system is matched with navigation information fusion, risk warning and other service functions, so that large-scale navigation safety monitoring of a large scene in a 360-degree range near a ship can be realized.
As a preferred technical solution, further: the data processing and library building module 2 extracts position data and attribute information of targets such as navigation marks, navigation channels, water depth points, shorelines and the like in the chart data, and displays the chart targets in the video image according to a video enhanced image processing method; the data processing and library building module 2 is an AIS (Automatic Identification System ) target module which is displayed in a video image according to a video enhanced image processing method according to AIS (automatic identification system) ship position data and attribute information acquired by NEMA (National Marine Electronics Association, american national marine electronics Association) protocol; the data processing and library building module 2 integrates the ship position data and attribute information of the radar APRA (AUTOMATIC RADAR PLOTTING AIDS, automatic radar plotting instrument) acquired by NEMA protocol, the ARPA target module displayed in the video image, the route module for carrying out data visualization according to the route information input by the user, the user mark data module for marking or identifying according to the personal requirement of the user, and the video image integrating module for integrating the video image acquired by the ship camera with other navigation data to provide more comprehensive and visual navigation information and environment perception.
The network service module 3 is responsible for sending the processed and built navigation data to the client for data feedback.
The user-side module 4 comprises two parts: shipboard users and remote shore-based users. As an example, the shipboard user modules mainly include a wheelhouse AR display system, a mobile AR display system and a captain/chief-officer cabin AR display system; the remote shore-based user module mainly comprises a shore-based AR display system served through a satellite or public-network interface. The ship's navigation state can thus be viewed in real time from multiple viewpoints, such as the ship's own viewpoint and shore-based viewpoints, and users can participate in giving safe-operation instructions to the ship when necessary.
As shown in fig. 2, the principle of the intelligent navigation brain system is to connect and superimpose the real world and a virtual world to generate an experience beyond three dimensions. The system integrates and analyses the ship's various navigation information resources, fuses the processed navigation information into video shot in real time, and combines the virtual world formed by this information with the real world captured by the video. It performs instant risk assessment and risk warning on the information according to the ship's navigation safety requirements, thereby providing a real-time, unified, intuitive and valuable interactive visual navigation interface for the officer on watch, related driving management personnel and remote shore-based departments, forming an important component of the navigation brain and a powerful aid to ship safety control decisions. Augmented reality is realized by preprocessing the video image and matching feature points: the video image is first analysed to capture and spatialize object marks, and then feature points of the object marks are established and matched with the navigation information resources.
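Fusing navigation information into the live video amounts to projecting geo-referenced targets into pixel coordinates. The patent does not disclose its projection model; a simple angular pinhole approximation (the image size, fields of view and camera heading below are assumptions for illustration) conveys the idea:

```python
import math

def project_target(east, north, up, cam_heading_deg,
                   img_w=1920, img_h=1080, hfov_deg=60.0, vfov_deg=34.0):
    """Project a target given in camera-centred east/north/up metres onto
    pixel coordinates using an angular pinhole approximation. Returns
    None when the target lies outside the camera's field of view."""
    bearing = math.degrees(math.atan2(east, north))        # 0 deg = true north
    rel_bearing = (bearing - cam_heading_deg + 180.0) % 360.0 - 180.0
    dist = math.hypot(east, north)
    elevation = math.degrees(math.atan2(up, dist))
    if abs(rel_bearing) > hfov_deg / 2 or abs(elevation) > vfov_deg / 2:
        return None
    x = img_w / 2 + (rel_bearing / (hfov_deg / 2)) * (img_w / 2)
    y = img_h / 2 - (elevation / (vfov_deg / 2)) * (img_h / 2)
    return int(round(x)), int(round(y))
```

An AIS target dead ahead at the horizon lands at the image centre; attribute labels (name, course, speed) would then be drawn at the returned pixel.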
The operation method of the intelligent navigation brain system comprises the following steps:
step S1: and (3) data access: the GPS navigator acquires ship position information data, the anemograph acquires wind speed and wind direction data, the depth finder acquires water depth data, the log acquires ship position and speed data, the compass acquires ship heading data, and the CCTV video camera acquires ship periphery video and image data;
step S2: data processing and library building: the data processing and library building module firstly collects and evaluates the multi-source navigation data and integrates the available data; integrating the multisource multiscale navigation data by adopting various means including space reference transformation, data format conversion and attribute coding correspondence, thereby completing multisource navigation data integration;
step S3: the data after fusion is transmitted back to the user terminal module through the network service module;
step S4: after receiving the data, the user terminal finally displays the data on a user terminal computer or a mobile phone in an AR augmented reality mode, and powerful support is provided for ship safety control decision. As an example, step S4 specifically includes the following procedure: acquiring high-precision position data of a shooting head in augmented reality; based on the high-precision position data, realizing image spatialization; superposing elements such as ships, navigation marks, channel lines and the like in the spatialization image space; attribute information of the element is displayed.
As a preferred technical solution, further: as shown in fig. 3, the implementation steps of the intelligent recognition technology for the video image of the mobile object are as follows:
(1) First, perform video image recognition training on the system, carrying out feature extraction and sliding-window scanning on the training images;
(2) Model the extracted object features, feed the results back into model training, and post-process the preliminary detection results;
(3) Finally, perform machine learning and training on the model, integrating feature learning and the classifier into one framework, to solve the problem of intelligently recognizing in video images, under complex navigation conditions, poorly visible objects and objects that lack navigation attribute information for auxiliary matching and fusion.
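Steps (1) to (3) describe a classic sliding-window detection pipeline. The sketch below illustrates only the window scan, with a toy mean-intensity score standing in for the trained feature-learning classifier; every name and parameter here is an assumption for illustration:

```python
def sliding_windows(width, height, win, stride):
    """Yield the top-left corner of every window position over an image."""
    for y in range(0, height - win + 1, stride):
        for x in range(0, width - win + 1, stride):
            yield x, y

def mean_intensity(patch):
    """Toy feature: mean pixel value (a stand-in for learned features)."""
    return sum(sum(row) for row in patch) / (len(patch) * len(patch[0]))

def detect(image, win, stride, score_fn, threshold):
    """Scan `image` (a 2-D list of pixel intensities) with a sliding
    window, score each patch with `score_fn`, and keep the windows
    whose score reaches `threshold`. A trained classifier and
    non-maximum suppression would replace these pieces in practice."""
    h, w = len(image), len(image[0])
    hits = []
    for x, y in sliding_windows(w, h, win, stride):
        patch = [row[x:x + win] for row in image[y:y + win]]
        if score_fn(patch) >= threshold:
            hits.append((x, y))
    return hits
```

Swapping `mean_intensity` for a learned scoring function is what step (3)'s unified feature-learning/classifier framework amounts to in this sketch.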
In summary, the application provides an intelligent navigation brain system based on deep fusion of shipping technologies, together with an operation method, aiming to deliver comprehensively integrated, intelligent and efficient navigation support and decision-making. The system comprises a data access module, a data processing and database-building module, a network service module, a user-side module and an AR (augmented reality) function; it spatially fuses data such as navigation environment information, ship information and hydro-meteorological information and displays them on a unified human-machine interaction platform. The invention effectively extends the navigator's visual perception and the information dimensions available for assessing the ship's navigation situation, provides comprehensive and intuitive auxiliary information for driving the ship safely, and greatly improves the accuracy and timeliness with which the navigator grasps the navigation elements around the ship under adverse navigation conditions.
The preferred embodiments of the present invention have been described above. It is to be understood that the invention is not limited to the specific embodiments described above, wherein devices and structures not described in detail are to be understood as being implemented in a manner common in the art; any person skilled in the art can make many possible variations and modifications to the technical solution of the present invention or modifications to equivalent embodiments without departing from the scope of the technical solution of the present invention, using the methods and technical contents disclosed above, without affecting the essential content of the present invention. Therefore, any simple modification, equivalent variation and modification of the above embodiments according to the technical substance of the present invention still fall within the scope of the technical solution of the present invention.

Claims (8)

1. An intelligent navigation brain system based on deep fusion of shipping technology, characterized in that: the system comprises a data access module, a data processing and database-building module, a network service module and a user-side module; in the data access module, the intelligent navigation brain system follows the output interface standards of the various navigation devices and systems, accesses all kinds of navigation information in real time, and feeds it into the host server of the intelligent navigation brain system through the shipboard network; the data processing and database-building module is responsible for classifying and storing the collected navigation data, performing risk assessment on the relevant navigation data according to the requirements of ship driving and water-area safety monitoring, and processing data including CCTV video data, AIS ship data, ARPA radar data, hydro-meteorological data and notices to mariners through spatial transformation and data fusion; the network service module is responsible for sending the processed navigation data to the client for data feedback; the user-side module comprises a shipboard user AR display system and a remote shore-based user AR display system.
2. The intelligent navigation brain system based on deep fusion of shipping technology according to claim 1, wherein the data access module comprises a GPS navigator, an anemometer, a depth finder, a log, a compass, CCTV video camera equipment and a TCP Socket client connected to the server.
3. The intelligent navigation brain system based on deep fusion of shipping technology according to claim 2, wherein the CCTV video camera equipment comprises four IP cameras with high-definition, long-range imaging performance: one is installed at the middle of the compass deck to film the water-area navigation environment ahead of the bow; one is installed at each of the port and starboard endpoints of the compass deck to film the navigation environments on the two sides of the ship; and one is installed towards the stern, near the stern flagpole, to film the water-area navigation environment astern; the four cameras are deployed on the ship's intranet, and the intelligent navigation brain system presents the main picture via AR monitoring in cooperation with service functions such as navigation information fusion and risk warning.
4. The intelligent navigation brain system based on deep fusion of shipping technology according to claim 1 or 2, wherein the data processing and database-building module extracts the position data and attribute information of targets such as navigation marks, channels, water-depth points and shorelines from the chart data and displays these chart targets in the video image according to a video-enhanced image processing method; an AIS target module displays, in the same way, the AIS ship position data and attribute information acquired via the NMEA protocol; an ARPA target module displays the radar ARPA ship position data and attribute information acquired via the NMEA protocol; a route module visualizes the route information entered by the user; a user-mark data module allows marking or annotation according to the user's personal needs; and a video image integrating module fuses the video captured by the ship's cameras with the other navigation data.
5. The intelligent navigation brain system based on deep fusion of shipping technology according to claim 1 or 2, wherein the shipboard user AR display system comprises a wheelhouse AR display system, a mobile AR display system and a captain/chief-officer cabin AR display system; the remote shore-based user AR display system comprises a shore-based AR display system served through a satellite or public-network interface.
6. A method of operating an intelligent navigation brain system according to any one of claims 1 to 5, wherein: the method comprises the following steps:
step S1: and (3) data access: the GPS navigator acquires ship position information data, the anemograph acquires wind speed and wind direction data, the depth finder acquires water depth data, the log acquires ship position and speed data, the compass acquires ship heading data, and the CCTV video camera acquires ship periphery video and image data;
step S2: data processing and library building: the data processing and library building module firstly collects and evaluates the multi-source navigation data and integrates the available data; integrating the multisource multiscale navigation data by adopting various means including space reference transformation, data format conversion and attribute coding correspondence, thereby completing multisource navigation data integration;
Step S3: the fused data are transmitted back to the user terminal module through the network service module;
Step S4: after receiving the data, the user terminal displays them on a computer or mobile phone in AR (augmented reality) form, providing strong support for ship safety control decisions.
7. The method for operating an intelligent navigation brain system according to claim 6, wherein step S2 specifically comprises:
Step S201: performing video image recognition training for the system, carrying out feature extraction and sliding-window scanning on the training images;
Step S202: modelling the extracted object features, feeding the features back into model training, and post-processing the preliminary detection results;
Step S203: carrying out machine learning training on the model, integrating feature learning and the classifier into a single framework, so as to solve the problem of intelligent video image recognition, under complex navigation conditions, of poorly visible targets and of targets lacking navigation attribute information for auxiliary matching and fusion.
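The sliding-window scan of step S201 can be sketched as below. The feature (mean patch intensity) and the threshold classifier are deliberately toy stand-ins, assumptions made for illustration; the patent's actual detector would use learned features and a trained classifier as steps S202 and S203 describe.

```python
# Hypothetical sliding-window detector: enumerate fixed-size windows over
# an image, score each patch with a feature extractor, keep high scores.

def sliding_windows(width, height, win, stride):
    """Yield the top-left corners of all win x win windows."""
    for y in range(0, height - win + 1, stride):
        for x in range(0, width - win + 1, stride):
            yield x, y

def mean_intensity(image, x, y, win):
    """Toy 'feature': mean pixel value of the patch (stand-in for a
    learned feature such as HOG or a CNN embedding)."""
    total = sum(image[y + j][x + i] for j in range(win) for i in range(win))
    return total / (win * win)

def detect(image, win=2, stride=1, threshold=0.5):
    """Return corners of windows whose feature score exceeds the threshold."""
    h, w = len(image), len(image[0])
    return [(x, y) for x, y in sliding_windows(w, h, win, stride)
            if mean_intensity(image, x, y, win) > threshold]

# A 4x4 toy 'image' with a bright 2x2 blob in the middle:
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
hits = detect(img)  # only the window fully covering the blob scores > 0.5
```

Step S202's post-processing (e.g. non-maximum suppression over overlapping hits) would then prune this candidate list before it is fed back into training.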
8. The method for operating an intelligent navigation brain system according to claim 6, wherein step S4 specifically comprises:
acquiring high-precision position data of the camera used for augmented reality; realizing image spatialization based on the high-precision position data; superimposing elements such as ships, navigation marks and channel lines in the spatialized image space; and displaying the attribute information of these elements.
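The image spatialization of step S4 can be sketched as below: convert a target's geographic position into metres relative to the camera, then map its bearing onto a pixel column of the video frame. Both helpers are illustrative assumptions (flat-earth offsets and a horizontal-only pinhole mapping); a production system would use a full camera model with attitude and lens calibration.

```python
import math

def enu_from_geodetic(lat, lon, lat0, lon0, R=6371000.0):
    """East/north offsets (metres) of a target relative to the camera at
    (lat0, lon0) -- an equirectangular approximation, adequate over the
    few-kilometre ranges of a bridge AR overlay."""
    east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R
    return east, north

def project_to_pixel(east, north, heading_deg, hfov_deg, img_w):
    """Map the target's bearing relative to the camera heading onto a
    horizontal pixel column; None when outside the field of view."""
    bearing = math.degrees(math.atan2(east, north))   # 0 deg = due north
    rel = (bearing - heading_deg + 180) % 360 - 180   # wrap to [-180, 180)
    half = hfov_deg / 2
    if abs(rel) > half:
        return None
    return int((rel + half) / hfov_deg * (img_w - 1))

# A target 1 km dead ahead of a north-facing camera with a 60-degree
# horizontal field of view lands at the centre column of a 1920 px frame:
px = project_to_pixel(0.0, 1000.0, 0.0, 60.0, 1920)
```

Each superimposed element (ship, navigation mark, channel-line vertex) is run through this projection, and its attribute label is drawn at the resulting pixel position.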
CN202311165518.6A 2023-09-11 2023-09-11 Intelligent navigation brain system based on deep fusion of shipping technology and operation method thereof Pending CN117274847A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311165518.6A CN117274847A (en) 2023-09-11 2023-09-11 Intelligent navigation brain system based on deep fusion of shipping technology and operation method thereof

Publications (1)

Publication Number Publication Date
CN117274847A true CN117274847A (en) 2023-12-22

Family

ID=89213486




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination