CN111862389B - Intelligent navigation perception and augmented reality visualization system - Google Patents


Info

Publication number
CN111862389B
CN111862389B (application CN202010702462.3A)
Authority
CN
China
Prior art keywords
data
ship
module
environment sensing
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010702462.3A
Other languages
Chinese (zh)
Other versions
CN111862389A (en)
Inventor
马勇
邱倩倩
赵玉蛟
李昊
毛晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT
Priority to CN202010702462.3A
Publication of CN111862389A
Application granted
Publication of CN111862389B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0816 Indicating performance data, e.g. occurrence of a malfunction
    • G07C 5/0825 Indicating performance data, e.g. occurrence of a malfunction using optical means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B 34/00 Vessels specially adapted for water sports or leisure; Body-supporting devices specially adapted for water sports or leisure
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B 79/00 Monitoring properties or operating parameters of vessels in operation
    • B63B 79/10 Monitoring properties or operating parameters of vessels in operation using sensors, e.g. pressure sensors, strain gauges or accelerometers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 21/00 Measuring or testing not otherwise provided for
    • G01D 21/02 Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/088 Non-supervised learning, e.g. competitive learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0808 Diagnosing performance data
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0816 Indicating performance data, e.g. occurrence of a malfunction
    • G07C 5/0833 Indicating performance data, e.g. occurrence of a malfunction using audio means
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0841 Registering performance data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Chemical & Material Sciences (AREA)
  • Ocean & Marine Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Combustion & Propulsion (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an intelligent navigation perception and augmented reality visualization system, which belongs to the field of intelligent ships and comprises a ship external environment sensing module, a ship internal environment sensing module, a data processing and storing module, a data display module and an information input/output interface. The ship external environment sensing module acquires water surface traffic conditions, meteorological conditions and the ship's heading; the ship internal environment sensing module detects the ship's navigation state, hull condition, fuel and the like; the data processing and storing module organically fuses and stores the data acquired by the two sensing modules; and the data display module presents the relevant content vividly and intuitively in the augmented reality display module. The system monitors the ship in all respects to provide data support for an intelligent ship control system, and realizes functions such as acquisition of ship navigation environment information, acquisition of ship navigation state information, fusion and storage of perception data, and decision assistance.

Description

Intelligent navigation perception and augmented reality visualization system
Technical Field
The invention belongs to the field of intelligent ships, and particularly relates to an intelligent navigation perception and augmented reality visualization system.
Background
At present, the number of cruise-ship passengers keeps rising and their average age keeps falling; cruise-ship carrying capacity has reached new highs, and the diversification of sailing destinations makes the latent safety hazards of cruise ships increasingly prominent.
With the development of automation theory and artificial intelligence technology, ship intelligence is regarded as an important measure for improving waterway transportation in the future. An intelligent ship must comprehensively sense its surrounding situation and act on that information in a timely and effective way to guarantee navigation safety. The development of comprehensive sensing technology for intelligent ships brings convenience to intelligent cruise ships. An intelligent navigation sensing and visualization system combines sensors of different types to comprehensively sense the external environment of ship navigation and the internal state of the ship, provides visual information in real time, offers a reference for ship drivers and managers, assists decision-making, and provides a safety guarantee for ship navigation and the operation of shipborne equipment; it helps improve crew alertness, safeguards safety, and helps avoid water traffic accidents.
Therefore, how to realize an intelligent navigation perception and augmented reality visualization system is a technical problem that urgently needs to be solved.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides an intelligent navigation sensing and augmented reality visualization system which can realize functions such as acquisition of ship navigation environment information, acquisition of ship navigation state information, fusion and storage of perception data, and decision assistance.
To achieve the above object, the present invention provides an intelligent navigation sensing and augmented reality visualization system, comprising: the system comprises a ship external environment sensing module, a ship internal environment sensing module, a data processing and storing module, a data display module and an information input/output interface;
the ship external environment sensing module is used for acquiring the water surface traffic condition, the meteorological condition and the ship navigation direction so as to sense the ship navigation environment information in real time;
the ship internal environment sensing module is used for detecting the ship navigation state, the ship body condition and fuel oil so as to obtain ship navigation state information in real time;
the data processing and storing module is used for fusing and storing the data acquired by the ship external environment sensing module and the ship internal environment sensing module;
the data display module is used for displaying relevant data through the augmented reality display module and the display screen to carry out all-round monitoring on the ship;
the information input/output interface is a general input/output interface and is used for exporting data.
Preferably, the external environment sensing module of the ship comprises an electronic chart information and display system, an electronic position finder, an electronic compass, a navigation radar, a depth finder, an anemorumbometer and a visibility sensor; the internal environment sensing module of the ship comprises an electronic fuel metering device, an electronic clinometer, ship draft measuring equipment and a navigational speed and course measuring device.
Preferably, the data processing and storing module comprises an external data processing module, an internal data processing module, a data fusion module and a data storing module;
the external data processing module is used for processing the sensor data in the ship external environment sensing module;
the internal data processing module is used for processing the sensor data in the ship internal environment sensing module;
the data fusion module is used for fusing external data and internal data processed by the external data processing module and the internal data processing module;
the data storage module is used for storing the fused data.
Preferably, the external data processing module is configured to set a threshold for each sensor's data in the ship external environment sensing module, and to send out a visual and/or audible alarm signal when any item of data exceeds its corresponding threshold;
the internal data processing module is configured to set a threshold for each sensor's data in the ship internal environment sensing module, and to send out a visual and/or audible alarm signal when any item of data exceeds its corresponding threshold.
Preferably, the data fusion module is configured to fuse the data acquired by the ship external environment sensing module and the ship internal environment sensing module according to a preset format, and store the fused data through the data storage module.
Preferably, the data storage module stores the perception data in a format of time-coordinate-course-water depth-wind speed-wind direction-visibility-fuel quantity-ship attitude-draught-navigation speed-navigation range-radar detection target information.
Preferably, the information input/output interface is configured to receive a data retrieval instruction input by a user, extract data indicated by the data retrieval instruction from the data storage module in a time-information format, and display the extracted data through the data display module.
Preferably, the data display module is used for displaying nearby obstacle information and information influencing ship navigation safety when the ship navigates in the augmented reality display module, and displaying the electronic chart and other temporarily unimportant data on the display screen.
Preferably, the augmented reality display module is used for displaying warning information, and if the existing data exceed a safety threshold, targeted warning information is displayed.
Preferably, the augmented reality display module establishes a graphical model for the radar detection data, and displays nearby obstacle information and attitude information of the ship in real time during navigation of the ship in the augmented reality display module.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
the system can monitor ship navigation environment information and ship self state information in real time, and fuse and visually present various sensor data, so that ship driver alertness is improved, water traffic accident rate is reduced, and support is provided for cruise safety.
Drawings
Fig. 1 is a schematic structural diagram of an intelligent navigation sensing and augmented reality visualization system according to an embodiment of the present invention;
fig. 2 is a diagram illustrating a normal state and a warning state of an augmented reality display module according to an embodiment of the present invention;
fig. 3 is a ship augmented reality visualization effect diagram provided by the embodiment of the invention;
fig. 4 is a schematic diagram of fusion performed by using a deep neural network according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example one
Fig. 1 is a schematic structural diagram of an intelligent navigation sensing and augmented reality visualization system according to an embodiment of the present invention, including: the system comprises a ship external environment sensing module, a ship internal environment sensing module, a data processing and storing module, a data display module and an information input/output interface, and can realize functions of ship navigation environment information acquisition, ship navigation state information acquisition, sensing data fusion and storage, decision assistance and the like;
the ship external environment sensing module is used for acquiring water surface traffic conditions, meteorological conditions and ship navigation directions, sensing ship navigation environment information in real time and avoiding water traffic accidents;
the ship internal environment sensing module is used for detecting the ship navigation state, the ship body condition, fuel oil and the like, acquiring ship navigation state information in real time, avoiding the occurrence of ship operation faults and comprehensively guaranteeing the navigation safety of the cruise ship;
the data processing and storing module can fuse and store the data acquired by the ship external environment sensing module and the ship internal environment sensing module;
the data display module comprises an augmented reality display module and a display screen. The augmented reality display module displays the data acquired by the sensing modules so that the driver can keep a lookout while staying informed of the various navigation data; it displays obstacle information and the ship's own attitude information in real time during navigation, effectively preventing accidents caused by missing the best moment to react. The display screen displays the electronic chart and related content, so that ship drivers and managers can monitor the ship in all respects and discover problems in time;
the information input/output interface adopts a general input/output interface and is used for importing historical data and exporting and backing up the data.
Furthermore, the external environment sensing module of the ship comprises an electronic chart information and display system, an electronic position finder, an electronic compass, a navigation radar, a depth finder, an anemorumbometer, a visibility sensor and the like.
As a preferred embodiment, the external environment sensing module of the ship may include 2 sets of electronic chart information and display systems, 2 electronic position indicators, 2 electronic compasses, 2 navigation radars, 1 depth finder, 1 anemorumbometer, and 1 visibility sensor.
Further, the internal environment sensing module of the ship comprises an electronic fuel metering device, an electronic inclinometer, a ship draft measuring device, a navigational speed and navigational distance measuring device and the like.
As a preferred embodiment, the internal environment sensing module of the ship comprises 1 electronic fuel metering device, 2 electronic inclinometers, 6 ship draft measuring devices and 1 set of navigational speed and navigational distance measuring device.
Furthermore, the data processing and storing module comprises an external data processing module, an internal data processing module, a data fusion module and a data storing module;
the external data processing module is used for processing sensor data in the ship external environment sensing module;
the internal data processing module is used for processing the sensor data in the ship internal environment sensing module;
the data fusion module is used for fusing the external data and the internal data processed by the external data processing module and the internal data processing module;
in the embodiment of the invention, a deep neural network is adopted for data fusion, the network can integrate data information of various different high-dimensional heterogeneous modes, the internal relation among the data is effectively disclosed, and the strong noise in the sample is thinned by utilizing the L1 and L2 regularization, so that accurate multi-source perception data is obtained.
The deep neural network is based on a deep auto-encoder structure and comprises five parts: sub-networks, a data fusion layer, a data characterization layer, a hidden layer and a data sparsification layer. Each sub-network is responsible for extracting high-level abstract representations from the different modal data supplied by the sensors; because different modalities differ in complexity, the designed sub-network structures also differ. The data fusion layer searches for relations between the different data modalities and fuses the set of refined high-level abstract features extracted at the top layer of each sub-network. The sub-networks and the data fusion layer form the root network, and training of the root network is divided into two stages: an unsupervised independent-modality pre-training stage and a supervised multi-modality joint perception stage. In the unsupervised independent-modality pre-training stage, each hidden layer is trained with the current observation data as input $x$; a weight matrix $W_1$ and the mapping function $s(\cdot)$ apply a linear transformation followed by the encoding map to obtain the hidden feature expression
$h = s(W_1 x + b)$.
Then, since the feature expression must be able to reconstruct the original data, the decoding stage applies a weight matrix $W_2$ and the mapping function $s(\cdot)$ to obtain the reconstructed input
$\hat{x} = s(W_2 h + b')$.
The optimization target is then the reconstruction error between the original input and the reconstructed input:
$L_{rec} = \| x - \hat{x} \|^2$.
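The per-layer pre-training objective described above can be sketched in a few lines of NumPy. This is a minimal illustration of the encode/decode/reconstruction-error step, not the patent's implementation; the layer sizes and random initialization are assumptions.

```python
import numpy as np

def sigmoid(z):
    # the mapping function s(.) used for both encoding and decoding
    return 1.0 / (1.0 + np.exp(-z))

def reconstruction_error(x, W1, b1, W2, b2):
    """One autoencoder layer: encode x into the hidden feature expression,
    decode it back, and return the squared reconstruction error that serves
    as the pre-training objective."""
    h = sigmoid(W1 @ x + b1)       # encoding: h = s(W1 x + b)
    x_hat = sigmoid(W2 @ h + b2)   # decoding: x_hat = s(W2 h + b')
    return float(np.sum((x - x_hat) ** 2))

# Illustrative shapes: an 8-dimensional observation, 4 hidden units
rng = np.random.default_rng(0)
x = rng.random(8)
W1, b1 = rng.normal(size=(4, 8)) * 0.1, np.zeros(4)
W2, b2 = rng.normal(size=(8, 4)) * 0.1, np.zeros(8)
err = reconstruction_error(x, W1, b1, W2, b2)
```

In actual pre-training, `err` would be minimized by gradient descent over `W1`, `W2` and the biases, one hidden layer at a time.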
after the pre-training of the independent modes is completed, the whole network needs to be subjected to a multi-mode joint perception process. As shown in fig. 4, the data fusion layer is connected with the subnetworks to which all the modalities belong by weight values, and the parameters are adjusted simultaneously with the subnetworks completing the pre-training in the prediction process. In order to extract the mode independent features with the same structure from the original data of different modes, the weight is connected and shares the same weight T in the training process. Fine tuning phase, defining h m And performing optimization training on the model for the uppermost neuron of the mth mode by using a back propagation algorithm. The loss function is defined as:
Figure GDA0003822870010000066
Figure GDA0003822870010000067
where m denotes the number of modalities, N denotes the number of training samples, y (i) Represents a sample x (i) True values of the states of (c), for example: true values of states such as navigation speed, relative distance, h m The uppermost neuron representing the mth modality,
Figure GDA0003822870010000068
is the predicted value of the model to the state. b root Representing an offset vector, T i And an ith row vector representing the weight matrix T. In the fine adjustment process of the whole network, iterative adjustment is carried out in turnParameters of the respective sub-networks. And only adjusting the parameters of one sub-network each time, fixing the parameters of other sub-networks, and adjusting the network to which the next mode belongs after the weight value is updated until all the modes are adjusted. The data fusion layer is only used for joint fine tuning of the root network, and the layer is cancelled after network training is completed.
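The alternating, one-sub-network-at-a-time fine-tuning schedule can be sketched as plain Python. The gradient step itself is stubbed out behind a caller-supplied `update_fn`, which is a placeholder assumption, not the patent's training code; the sketch only shows the freeze/adjust rotation over modalities.

```python
def fine_tune(subnetworks, epochs, update_fn):
    """Alternate over modalities: adjust one sub-network's parameters
    while all the others stay fixed, then move to the next modality."""
    log = []
    for epoch in range(epochs):
        for m, net in enumerate(subnetworks):
            # only sub-network m is trainable in this step
            frozen = tuple(k for k in range(len(subnetworks)) if k != m)
            update_fn(net)  # e.g. one backprop pass on the joint loss
            log.append((epoch, m, frozen))
    return log

# Illustrative run with three modalities and a no-op update
schedule = fine_tune(["radar", "weather", "attitude"], epochs=2,
                     update_fn=lambda net: None)
```

The returned `schedule` records, for each step, which modality was adjusted and which were held fixed.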
The data characterization layer receives the output of the root network and projects it, via the hidden layer, into a low-dimensional feature space; the sparsification layer uses unsupervised and supervised information jointly to optimize the weight parameters $W$, removing noise and redundancy to obtain the relevant physical quantities, which are then classified by a classifier to obtain the required accurate data, finally realizing deep-learning-based multi-modal data processing. When training the top three-layer network, the final loss function is defined as:
$L = L_{diss} + L_{gen} + \beta \left( \| W \|_1 + \| W \|_2^2 \right)$
The loss function is divided into three parts: a supervised discrimination loss $L_{diss}$, an unsupervised generation loss $L_{gen}$, and a regularization term, with $\beta$ a coefficient. Minimizing the mean-square discrimination loss
$L_{diss} = \frac{1}{N} \sum_{i=1}^{N} \| y^{(i)} - \hat{y}^{(i)} \|^2$
gives the finally extracted multi-modal fusion features strong discriminability. On the other hand, the extracted fusion features must retain strong generative capability alongside the strong discriminative capability. The generation loss measures the reconstruction error between the input $x'$ and the output $\hat{x}'$; a small reconstruction loss means that the extracted fusion features preserve much of the original information. The reconstruction loss $L_{gen}$ is defined as:
$L_{gen} = \| x' - s(W' s(W x' + b) + b') \|^2$
where $s(\cdot)$ is the sigmoid function and $b$, $b'$ are bias terms.
In order to avoid overfitting of the model while smoothing its parameters, the two regularization terms L1 and L2 are introduced to sparsify the weight parameters $W$. This over-fitting prevention eliminates the influence of noise in the sensor observations on the true state values, so that accurate scene-state data are obtained.
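The three-part objective above, discrimination loss plus generation loss plus L1/L2 regularization, can be sketched numerically. The equal weighting of the L1 and L2 terms inside the beta factor is an assumption (the text only names a single coefficient beta), and all numbers are illustrative.

```python
import numpy as np

def total_loss(y_true, y_pred, x_in, x_rec, W, beta=1e-3):
    """Combined objective: supervised discrimination loss L_diss,
    unsupervised generation (reconstruction) loss L_gen, and an
    L1 + L2 regularization term on the weight parameters W."""
    l_diss = np.mean(np.sum((y_true - y_pred) ** 2, axis=-1))  # mean-square state error
    l_gen = np.sum((x_in - x_rec) ** 2)                        # reconstruction error
    reg = np.sum(np.abs(W)) + np.sum(W ** 2)                   # ||W||_1 + ||W||_2^2
    return l_diss + l_gen + beta * reg

# Illustrative values: 3 samples, 2 state dimensions, a 2x2 weight matrix
W = np.array([[0.5, -0.25], [0.0, 1.0]])
loss = total_loss(np.ones((3, 2)), np.zeros((3, 2)),
                  np.ones(2), np.full(2, 0.5), W)
```

With these inputs the discrimination term contributes 2.0, the generation term 0.5, and the regularizer 3.0625 scaled by beta, so a small beta leaves the supervised and reconstruction terms dominant.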
The data storage module is used for storing the fused data.
Furthermore, the external data processing module is used for respectively setting threshold values for data of each sensor in the ship external environment sensing module and sending out visual and/or auditory alarm signals when certain data exceeds the corresponding threshold value;
the internal data processing module is used for respectively setting threshold values for each sensor data in the ship internal environment sensing module and sending out visual and/or audible alarm signals when certain data exceeds the corresponding threshold value.
In the embodiment of the present invention, the threshold value may be determined according to the type of data acquired by each sensor.
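The per-sensor threshold check can be sketched as a simple lookup. The sensor names, units, limits, and the choice of which sensors alarm on low rather than high readings are all illustrative assumptions, not values from the patent.

```python
# Illustrative per-sensor safety thresholds (names, units and values assumed)
THRESHOLDS = {
    "wind_speed_mps": 20.0,
    "visibility_m": 1000.0,
    "water_depth_m": 5.0,
    "fuel_level_pct": 10.0,
}

# For these sensors a *low* reading is the dangerous direction
LOW_IS_BAD = {"visibility_m", "water_depth_m", "fuel_level_pct"}

def check_alarms(readings):
    """Return the sensors whose readings cross their threshold, i.e. the
    ones that should trigger a visual and/or audible alarm signal."""
    alarms = []
    for name, value in readings.items():
        limit = THRESHOLDS.get(name)
        if limit is None:
            continue
        breached = value < limit if name in LOW_IS_BAD else value > limit
        if breached:
            alarms.append(name)
    return sorted(alarms)

alarms = check_alarms({"wind_speed_mps": 24.5, "visibility_m": 800.0,
                       "water_depth_m": 12.0, "fuel_level_pct": 55.0})
```

Here high wind and low visibility breach their limits, while depth and fuel stay in the safe range.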
And the data fusion module is used for fusing the data acquired by the ship external environment sensing module and the ship internal environment sensing module according to a preset format and storing the fused data through the data storage module.
Further, the data storage module stores perception data in a format of time-coordinate-course-water depth-wind speed-wind direction-visibility-fuel quantity-ship posture-draught-navigation speed-navigation range-radar detection target information.
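The "time-coordinate-course-..." record layout above can be sketched as a flat, ordered structure. The field names are paraphrases of the listed format and the types and sample values are assumptions for illustration only.

```python
from dataclasses import dataclass, astuple

@dataclass
class PerceptionRecord:
    """One fused perception record, following the field order named in the
    text: time, coordinate, course, water depth, wind speed, wind direction,
    visibility, fuel quantity, ship attitude, draught, speed, range,
    radar-detected target information."""
    time: str
    coordinate: tuple          # (latitude, longitude)
    course_deg: float
    water_depth_m: float
    wind_speed_mps: float
    wind_direction_deg: float
    visibility_m: float
    fuel_quantity_l: float
    ship_attitude: tuple       # (roll, pitch) in degrees
    draught_m: float
    speed_kn: float
    range_nm: float
    radar_targets: list        # detected target descriptors

# Illustrative record with made-up values
rec = PerceptionRecord("2020-07-21T08:00:00Z", (30.6, 114.3), 95.0, 18.2,
                       6.1, 210.0, 9500.0, 4200.0, (0.5, -0.2), 7.8,
                       14.0, 120.5, [])
```

Keeping the fields in a fixed order makes each stored record directly serializable as one line of the time-information format the processing modules later query.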
Furthermore, the external data processing module and the internal data processing module can independently analyze any single variable in the time-information format, including the movement of all radar-detected targets and changes in wind speed and direction, visibility, ship attitude, fuel quantity and the like, so that information such as ship handling behavior, changes in ship performance, energy consumption and meteorological changes can be analyzed.
Furthermore, the external data processing module and the internal data processing module can set thresholds for the different variables, judge whether each type of data exceeds its threshold, and issue visual and audible alarm signals when the ship faces conditions such as excessive speed, severe weather, grounding, insufficient fuel or collision risk.
Further, if any data exceeds the safety threshold, the data display module acquires the data and displays warning information in the augmented reality display module.
Further, the input/output interface is connected with the sensors in the ship external environment sensing module and the ship internal environment sensing module in a wired mode.
Further, the information input/output interface supports import/export of history data, and import/export of external data.
Further, the data display module can display nearby obstacle information and information with a large influence on the ship navigation safety when the ship navigates in the augmented reality display module, and display an electronic chart and other temporarily unimportant data on the display screen.
Further, the augmented reality display module can be used for showing warning information, if there is data to exceed the safety threshold, can show pertinent alarm information.
Furthermore, the augmented reality display module can establish a graphical model for the radar detection data, and obstacle information nearby and attitude information of the ship during navigation of the ship are displayed in the augmented reality display module in real time.
In an embodiment of the invention, operation of the system comprises the following steps: the ship external environment sensing module senses ship navigation environment information and imports it into the external data processing module; the ship internal environment sensing module senses ship navigation state information and imports it into the internal data processing module; the data fusion module fuses the data and stores them in the data storage module. When there is no user operation, the external and internal data processing modules extract the most recently stored data from the data storage module, judge whether data such as wind speed, visibility, water depth and obstacle distance exceed their thresholds, and issue visual and audible alarm signals if they do. When a user operates the system, the information input/output interface extracts the corresponding information from the data storage module according to the user's request. The data are transmitted to the data display module, which selects different presentation modes according to the data type and the urgency of the situation: if the situation is urgent, urgent alarm information is shown on the augmented reality display module; otherwise, part of the data is selected for display there according to its importance, and the remaining data are shown on the display screen.
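The display-selection step of this operating loop, urgent or important data to the augmented reality display and the rest to the screen, can be sketched as a small routing function. The importance scores, the 0.8 cutoff, and the item labels are illustrative assumptions.

```python
def route_to_displays(items):
    """Split fused data items between the AR display and the display screen.
    Each item is (name, importance in [0, 1], urgent flag)."""
    ar_display, screen = [], []
    for name, importance, urgent in items:
        if urgent or importance >= 0.8:   # safety-critical: show in the AR view
            ar_display.append(name)
        else:                             # e.g. electronic chart, history data
            screen.append(name)
    return ar_display, screen

ar, screen = route_to_displays([
    ("collision_warning", 1.0, True),
    ("nearby_obstacle", 0.9, False),
    ("electronic_chart", 0.3, False),
    ("fuel_history", 0.1, False),
])
```

Collision warnings and nearby obstacles land in the AR view, while the chart and history data go to the screen, mirroring the urgency-based selection described above.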
Example two
As shown in fig. 1, the system includes a ship external environment sensing module, a ship internal environment sensing module, a data processing and storing module, a data display module and an information input/output interface;
the ship external environment sensing module comprises an electronic chart information and display system, an electronic position finder, an electronic compass, a navigation radar, a depth finder, an anemorumbometer and a visibility sensor;
the ship internal environment sensing module comprises an electronic fuel metering device, an electronic clinometer, ship draft measuring equipment and a speed and range measuring device;
the data processing and storing module is divided into an external data processing module and an internal data processing module, and can fuse and store data acquired by the ship external environment sensing module and the ship internal environment sensing module;
as shown in fig. 2, the data display module can display nearby obstacle information and information that has a large impact on the safety of the ship navigation and related warning signals in the augmented reality display module, and display an electronic chart and other temporarily unimportant data on the display screen.
In the embodiment of the invention, the electronic chart information and display system and the speed and range measuring device are arranged in the ship's cockpit; the anemorumbometer, visibility sensor, electronic position finder, electronic compass and navigation radar are arranged on the deck; the depth finder is arranged at the bottom of the bow; the electronic fuel metering device is arranged in the oil storage compartment; the electronic clinometer is arranged amidships; and the ship draft measuring equipment is arranged at the bow, amidships and at the stern, on both the port and starboard sides.
The data fusion module extracts the most recently stored data from the data storage module at regular intervals, judges whether any value exceeds its normal range, and if so gives visual and auditory alarm signals to warn the ship's drivers to take corresponding precautions.
As shown in fig. 3, the augmented reality display device in the data display module displays the ship's attitude in a three-dimensional visual form, together with the obstacles around the ship detected by the radar and information such as speed, wind speed and direction and fuel quantity during sailing, and shows further alarm signals according to whether the data exceed their thresholds. The display screen shows data such as the electronic chart, visibility, coordinates, course, time, water depth, draught and voyage.
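The split between the augmented reality display and the ordinary display screen described above can be sketched as a simple routing rule. The importance scores, item names and cut-off values below are illustrative assumptions, not values from the patent:

```python
# Sketch of routing fused data items either to the AR display or to the
# ordinary display screen, by importance and urgency (values hypothetical).

def route_display(items, urgent):
    """Split data items between the AR display and the display screen.

    items:  dict name -> (value, importance in [0, 1])
    urgent: True when an emergency alarm is active
    """
    ar_view, screen = {}, {}
    for name, (value, importance) in items.items():
        # Urgent situations lower the bar so more safety-relevant items
        # reach the AR view; otherwise only the most important ones do.
        if importance >= (0.5 if urgent else 0.8):
            ar_view[name] = value
        else:
            screen[name] = value
    return ar_view, screen

items = {
    "obstacle_distance": (120.0, 0.95),      # safety-critical
    "ship_attitude": ((1.2, -0.4), 0.9),
    "electronic_chart": ("chart-tile-07", 0.3),
    "visibility": (800.0, 0.6),
}

ar_view, screen = route_display(items, urgent=False)
# Obstacle distance and ship attitude reach the AR display; the chart
# and visibility stay on the ordinary screen.
```

With `urgent=True` the cut-off drops, so a mid-importance item such as visibility would move onto the AR display as well.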
It should be noted that, according to the implementation requirement, each step/component described in the present application can be divided into more steps/components, and two or more steps/components or partial operations of the steps/components can be combined into new steps/components to achieve the purpose of the present invention.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (7)

1. An intelligent navigation perception and augmented reality visualization system, characterized in that it comprises: a ship external environment sensing module, a ship internal environment sensing module, a data processing and storing module, a data display module and an information input/output interface;
the ship external environment sensing module is used for acquiring water surface traffic conditions, meteorological conditions and the ship's navigation direction, so as to sense ship navigation environment information in real time;
the ship internal environment sensing module is used for detecting the navigation state, hull condition and fuel of the ship, so as to obtain ship navigation state information in real time;
the data processing and storing module is used for fusing and storing the data acquired by the ship external environment sensing module and the ship internal environment sensing module;
the data display module is used for displaying relevant data through the augmented reality display module and the display screen, so as to monitor the ship comprehensively;
the information input/output interface is a general input/output interface and is used for exporting data;
the data processing and storing module comprises an external data processing module, an internal data processing module, a data fusion module and a data storing module;
the external data processing module is used for processing the sensor data in the ship external environment sensing module; the internal data processing module is used for processing the sensor data in the ship internal environment sensing module; the data fusion module is used for fusing external data and internal data processed by the external data processing module and the internal data processing module;
the data fusion module adopts a deep neural network for data fusion; the deep neural network consists of five parts: sub-networks, a data fusion layer, a data characterization layer, a hidden layer and a data sparse layer; each sub-network is responsible for extracting a high-level abstract representation of a different modality of sensor input data; the data fusion layer searches for relations among the different data modalities and fuses the refined high-level abstract features extracted at the uppermost layer of each sub-network; the sub-networks and the data fusion layer form a root network, and the data characterization layer, the hidden layer and the data sparse layer form an upper network, which is connected by weights to the sub-networks of all modalities; the data characterization layer receives the input of the sub-networks and the data fusion layer and, after passing through the hidden layer, projects it into a low-dimensional feature space; the sparse layer then jointly uses unsupervised and supervised information to optimize the weight parameter W, removing noise and redundancy to obtain the relevant physical quantities, which are classified by a classifier to obtain the required accurate data, finally realizing multi-modal data processing based on deep learning; in training of the upper network, the final loss function is defined as:
L = L_diss + β · L_gen
where L_diss is the supervised discrimination loss, L_gen is the unsupervised generation loss, and β is a coefficient;
the supervised discrimination loss is
L_diss = (1/N) Σ_{i=1}^{N} ( y^{(i)} − ŷ^{(i)} )²
where N denotes the number of training samples, M denotes the number of modalities, y^{(i)} represents the true state value of the i-th sample, and ŷ^{(i)} is the state value predicted by the model,
ŷ^{(i)} = s( T h^{(i)} + b_root ),  h^{(i)} = [ h_1^{(i)}; …; h_M^{(i)} ]
where h_m is the uppermost neuron of the m-th modality, T_i denotes the i-th row vector of the weight matrix T, and b_root denotes a bias vector;
the unsupervised generation loss measures the reconstruction error between the input x'^{(i)} to be reconstructed and the network output x̂^{(i)}:
L_gen = (1/N) Σ_{i=1}^{N} || x'^{(i)} − x̂^{(i)} ||²
a smaller reconstruction loss means that more of the original information is retained in the extracted fusion features; in the hidden layer, the weight matrix W_1 and a mapping function f perform a transformation that encodes the input data x and maps it into the feature representation of the hidden layer,
h = f(x) = s( W_1 x + b )
W_2 represents the weight matrix of the decoding stage, whose mapping function g produces the reconstruction
x̂ = g(h) = s( W_2 h + b' )
where s(·) represents a sigmoid function, b and b' are bias terms, and W_1 and W_2 are weight parameters;
the data storage module stores the fused data in a format of time-coordinate-course-water depth-wind speed-wind direction-visibility-fuel quantity-ship attitude-draught-speed-range-radar detection target information;
the augmented reality display module establishes a graphical model and displays, in real time while the ship navigates, nearby obstacle information and the ship's own attitude information.
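The combined loss in claim 1 can be sketched numerically for a single modality. The following NumPy sketch is illustrative only and not part of the claims: the squared-error forms, toy dimensions, random seed and β value are all assumptions.

```python
import numpy as np

def sigmoid(z):
    # s(·) from the claim: elementwise logistic function
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy dimensions (assumed): 5 samples, 4 input features, 3 hidden units,
# 2 state outputs, one modality.
x = rng.normal(size=(5, 4))                         # sensor input
y = rng.integers(0, 2, size=(5, 2)).astype(float)   # true state values y(i)

W1 = rng.normal(scale=0.1, size=(4, 3))  # encoding weights (hidden layer)
b  = np.zeros(3)                         # encoding bias b
W2 = rng.normal(scale=0.1, size=(3, 4))  # decoding-stage weights
b2 = np.zeros(4)                         # decoding bias b'
T  = rng.normal(scale=0.1, size=(3, 2))  # discrimination weights T
b_root = np.zeros(2)                     # bias vector b_root

h     = sigmoid(x @ W1 + b)       # h = s(W1 x + b): hidden representation
x_hat = sigmoid(h @ W2 + b2)      # reconstruction  = s(W2 h + b')
y_hat = sigmoid(h @ T + b_root)   # predicted state = s(T h + b_root)

N = x.shape[0]
L_diss = np.sum((y - y_hat) ** 2) / N   # supervised discrimination loss
L_gen  = np.sum((x - x_hat) ** 2) / N   # unsupervised generation loss

beta = 0.5                              # assumed weighting coefficient
L = L_diss + beta * L_gen               # combined training loss
```

Minimizing L trades off prediction accuracy (L_diss) against how much input information the fused features retain (L_gen), with β controlling the balance.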
2. The system of claim 1, wherein the ship external environment sensing module comprises an electronic chart information and display system, an electronic position finder, an electronic compass, a navigation radar, a depth finder, an anemorumbometer and a visibility sensor; the ship internal environment sensing module comprises an electronic fuel metering device, an electronic clinometer, ship draft measuring equipment and a speed and range measuring device.
3. The system of claim 2, wherein the external data processing module is configured to set a threshold for each sensor data item in the ship external environment sensing module and to issue a visual and/or audible alarm signal when any data value exceeds its corresponding threshold;
the internal data processing module is configured to set a threshold for each sensor data item in the ship internal environment sensing module and to issue a visual and/or audible alarm signal when any data value exceeds its corresponding threshold.
4. The system according to claim 3, wherein the data fusion module is configured to fuse the data acquired by the ship external environment sensing module and the ship internal environment sensing module according to a preset format, and store the fused data through the data storage module.
5. The system according to claim 4, wherein the information input/output interface is configured to receive a data retrieval command input by a user, extract data indicated by the data retrieval command from the data storage module in a time-information format, and display the extracted data through the data display module.
6. The system of claim 1, wherein the data display module is configured to display, in the augmented reality display module, nearby obstacle information and information affecting navigation safety during the voyage, and to display the electronic chart and other temporarily unimportant data on the display screen.
7. The system of claim 6, wherein the augmented reality display module is configured to display warning information, and to display a targeted warning message when any data item exceeds its safety threshold.
CN202010702462.3A 2020-07-21 2020-07-21 Intelligent navigation perception and augmented reality visualization system Active CN111862389B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010702462.3A CN111862389B (en) 2020-07-21 2020-07-21 Intelligent navigation perception and augmented reality visualization system


Publications (2)

Publication Number Publication Date
CN111862389A CN111862389A (en) 2020-10-30
CN111862389B true CN111862389B (en) 2022-10-21

Family

ID=73001280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010702462.3A Active CN111862389B (en) 2020-07-21 2020-07-21 Intelligent navigation perception and augmented reality visualization system

Country Status (1)

Country Link
CN (1) CN111862389B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113942623A (en) * 2021-09-18 2022-01-18 武汉理工大学 Intelligent navigation platform and navigation method
CN116125996B (en) * 2023-04-04 2023-06-27 北京千种幻影科技有限公司 Safety monitoring method and system for unmanned vehicle
CN117818851B (en) * 2024-03-04 2024-05-24 成都锦城学院 Ship monitoring system and method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013012277A2 (en) * 2011-07-21 2013-01-24 한국해양연구원 Augmented reality system using transparent display for ship and method for enabling same
CN106970387A (en) * 2017-04-19 2017-07-21 武汉理工大学 A kind of cruiseway Traffic flow detection method based on AIS and Radar Data Fusion
CN108550281A (en) * 2018-04-13 2018-09-18 武汉理工大学 A kind of the ship DAS (Driver Assistant System) and method of view-based access control model AR
CN109725310A (en) * 2018-11-30 2019-05-07 中船(浙江)海洋科技有限公司 A kind of ship's fix supervisory systems based on YOLO algorithm and land-based radar system
CN110673600A (en) * 2019-10-18 2020-01-10 武汉理工大学 Unmanned ship-oriented automatic driving integrated system
CN111025295A (en) * 2019-11-22 2020-04-17 青岛海狮网络科技有限公司 Multi-ship cooperative sensing data fusion system and method based on shore-based radar

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6687583B1 (en) * 1999-12-15 2004-02-03 Yacht Watchman International Vessel monitoring system
CN106327610B (en) * 2016-08-27 2018-08-14 南通中远海运川崎船舶工程有限公司 A kind of arctic navigation intelligent ship
CN106372750A (en) * 2016-08-30 2017-02-01 深圳远航股份有限公司 Sailing management method and system
CA3061410C (en) * 2017-04-25 2023-03-21 Bae Systems Plc Watercraft
CN107749093B (en) * 2017-09-01 2020-05-05 上海海事大学 Optimized ship state information data structure and transmission and recording method thereof
CN108873799B (en) * 2018-06-29 2021-07-27 南京海联智能科技有限公司 Shipborne intelligent driving auxiliary terminal
CN109636921A (en) * 2018-12-17 2019-04-16 武汉理工大学 Intelligent vision ship sensory perceptual system and data processing method based on cloud platform
CN210895576U (en) * 2019-12-27 2020-06-30 江苏恒澄交科信息科技股份有限公司 Cloud black box ship navigation data recording system for inland river shipping
CN111339229B (en) * 2020-02-24 2023-04-18 交通运输部水运科学研究所 Ship autonomous navigation aid decision-making system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PSO-based BP neural network-Markov prediction model for ship traffic flow; Fan Qingbo, Jiang Fucai, Ma Quandang, Ma Yong; Journal of Shanghai Maritime University; 2018-06-30; full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Ma Yong

Inventor after: Qiu Qianqian

Inventor after: Zhao Yujiao

Inventor after: Li Hao

Inventor after: Mao Chen

Inventor before: Ma Yong

Inventor before: Mao Chen

Inventor before: Zhao Yujiao

GR01 Patent grant