CN115237730A - System and method for verifying performance of autonomous navigation ship based on mixed reality - Google Patents


Info

Publication number
CN115237730A
Authority
CN
China
Prior art keywords
information
ship
virtual object
autonomous navigation
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111410213.8A
Other languages
Chinese (zh)
Inventor
郑致润
金武燮
文庆德
朴润庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020210141244A external-priority patent/KR20220145741A/en
Application filed by Electronics and Telecommunications Research Institute ETRI filed Critical Electronics and Telecommunications Research Institute ETRI
Publication of CN115237730A publication Critical patent/CN115237730A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3457Performance evaluation by simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality


Abstract

A system for verifying the performance of an autonomous navigation vessel based on mixed reality is provided. The system includes: a test scenario generation unit that generates a mixed-reality-based test scenario for verifying the performance of an autonomous navigation vessel; a mixed reality data generation unit that generates mixed reality data by reflecting virtual object information and actual object information corresponding to the test scenario; a sensor information transmission unit that transmits virtual object information including sensor information generated to match the virtual object information; a judgment information collection unit that collects situation awareness information, navigation path information, and vessel control information of the autonomous navigation vessel determined in response to the virtual object information, together with actual object information and augmented virtual object information; and an autonomous navigation vessel navigation performance analysis unit that analyzes performance information of the autonomous navigation vessel based on the information collected by the judgment information collection unit.

Description

System and method for verifying performance of autonomous navigation ship based on mixed reality
Technical Field
The invention relates to an autonomous navigation ship performance verification system and method based on mixed reality.
Background
Methods of verifying the performance of a ship have mainly relied on model hulls manufactured to correspond to an actual hull. For example, a prior invention (Korean Laid-Open Patent Publication No. 10-2020-0072052) proposed a method of using a model ship to test the loading conditions of an actual hull. Another proposal (Korean Laid-Open Patent Publication No. 10-2015-0134648) calculates the correlation between the motion of an icebreaking model ship and the ice load by analyzing the ship's movement in an ice basin. Besides model-ship methods, a method has also been proposed of selecting a sailing route in an actual sea area and systematically analyzing the information generated while sailing that route (Korean Laid-Open Patent Publication No. 10-2020-0063524).
The ship performance test methods proposed to date focus on evaluating the performance of the hull and its navigation characteristics. An autonomous navigation ship, however, additionally requires performance tests of its autonomous navigation functions.
Because ships move slowly, testing the functions of an autonomously sailing vessel takes considerable time. Moreover, testing the functions needed for autonomous navigation, such as situation awareness, route planning, and collision avoidance, requires generating many encounter situations with other ships, which costs significant time and money. During such tests, if a specific function fails to operate normally, a collision with another ship may occur, possibly causing environmental pollution. A safer and more economical method for testing the autonomous navigation functions of an autonomously sailing vessel is therefore needed.
Disclosure of Invention
(problems to be solved by the invention)
The present invention aims to provide a mixed-reality-based system and method for verifying the performance of an autonomous navigation ship, which apply mixed reality to verify the functions required for the ship to determine and follow a course by itself, such as situation awareness, path planning, and collision avoidance.
However, the problems to be solved by the present invention are not limited to the problems described above, but other problems may also exist.
(measures taken to solve the problems)
A mixed-reality-based autonomous navigation ship performance verification system according to a first aspect of the present invention, directed to solving the above problems, is provided with a communication module, a memory, and a processor, and, when executed by the processor, includes: a test scenario generation unit that generates a mixed-reality-based test scenario for verifying the performance of an autonomous navigation ship; a mixed reality data generation unit that generates mixed reality data by reflecting virtual object information and actual object information corresponding to the test scenario; a sensor information transmission unit that transmits virtual object information including sensor information generated to match the virtual object information; a judgment information collection unit that collects situation awareness information, navigation path information, and vessel control information of the autonomous navigation vessel determined in response to the virtual object information, together with the actual object information and augmented virtual object information; and an autonomous navigation ship navigation performance analysis unit that analyzes performance information of the autonomous navigation ship based on the information collected by the judgment information collection unit.
In addition, a mixed-reality-based autonomous navigation ship performance verification method according to a second aspect of the present invention includes: generating a mixed-reality-based test scenario for verifying the performance of an autonomous navigation vessel; generating mixed reality data by reflecting virtual object information and actual object information corresponding to the test scenario; transmitting virtual object information, including sensor information generated to match the virtual object information, to the autonomous navigation vessel; collecting situation awareness information, navigation path information, and vessel control information of the autonomous navigation vessel determined in response to the virtual object information, together with other-vessel navigation information and augmented virtual object information; and analyzing performance information of the autonomous navigation vessel based on the collected information.
A computer program of another aspect of the present invention, which is directed to solving the above-described problems, executes a mixed reality-based autonomous navigation ship performance verification method in conjunction with a computer as hardware, and is stored in a computer-readable recording medium.
Other specific aspects of the invention are included in the detailed description and the accompanying drawings.
(Effect of the invention)
Because an autonomous navigation ship has no operator, the functions of its situation awareness system, which senses the surrounding situation, and its automatic navigation system, which determines the route, must be verified in addition to the ship's control functions. However, actually operating multiple ships to verify this performance requires much time and expense, and a misjudgment by the autonomous navigation system carries a risk of collision with other ships.
An embodiment of the present invention aimed at solving these problems has the following advantage: by using mixed reality to verify the functions required when an autonomous navigation ship determines its own course and navigates among both virtual and actual ships, such as situation awareness, path planning, and collision avoidance, performance verification can be performed more safely and economically.
In addition, an embodiment of the present invention has the following advantage: similar voyage information is retrieved from existing ship voyage records based on the characteristics of the ship, and characteristics of the voyage path (ship density, voyage characteristics, environmental information, international maritime collision avoidance regulations, and the like) are reflected, so that representative scenarios for testing the functions of the autonomously sailing vessel can be generated automatically.
Meanwhile, an embodiment of the present invention has the following advantages: the accuracy of a situation awareness system can be analyzed on an autonomous vessel that is actually sailing by additionally providing virtual object information to various sensors that can be applied to the autonomous vessel.
In addition, an embodiment of the present invention has the following advantage: the performance of the autonomous navigation algorithm can be reported at a predetermined level or score by using a deep neural network to analyze indexes such as compliance with international maritime collision avoidance regulations, a collision risk evaluation index, a ship maneuverability evaluation index, a navigation route index, and the accuracy of the situation awareness system, all derived by comparative analysis of the navigation paths of the autonomous navigation ship and actual ships.
Various effects of the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood from the following description by a person of ordinary skill.
Drawings
Fig. 1 is a diagram for schematically illustrating mixed reality in an embodiment of the present invention.
FIG. 2 is a diagram for schematically illustrating a mixed reality based autonomous navigation vessel performance verification system of an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a mixed-reality-based autonomous navigation vessel performance verification system according to an embodiment of the invention.
Fig. 4 is a diagram for explaining the test scenario generation unit in the embodiment of the present invention.
FIG. 5 is a diagram more particularly illustrating the mixed reality based autonomous navigation vessel performance verification system shown in FIG. 3.
Fig. 6 is a diagram for explaining the mixed reality data generating unit in an embodiment of the present invention.
Fig. 7a and 7b are diagrams for explaining a specific embodiment of the system for verifying the performance of an autonomous navigation vessel based on mixed reality.
FIG. 8 is a flow chart of a method for verifying the performance of an autonomous vessel based on mixed reality, in accordance with an embodiment of the present invention.
(description of symbols)
10-a condition sensing system, 100-an autonomous navigation ship performance verification system, 110-a test scene generating part, 120-a mixed reality data generating part, 130-a sensor information transmitting part, 140-a judgment information collecting part, and 150-an autonomous navigation ship navigation performance analyzing part.
Detailed Description
The advantages and features of the present invention, and the methods of accomplishing the same, will become apparent from the following detailed description of the embodiments when taken in conjunction with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below but can be embodied in various forms different from each other, and the embodiments are provided only for completeness of disclosure of the present invention and to inform the scope of the present invention to a person having ordinary skill in the art of the present invention and the present invention is defined only by the scope of the claims.
The terms used in this specification are for describing the embodiments and are not intended to limit the present invention. In this specification, the singular forms also include the plural forms unless the context clearly indicates otherwise. The terms "comprises" and/or "comprising," as used in this specification, do not exclude the presence or addition of one or more constituent elements other than those mentioned. Throughout the specification, the same reference numerals refer to the same constituent elements, and "and/or" includes every combination of one or more of the mentioned elements. Although "first", "second", and the like are used to describe various constituent elements, these elements are obviously not limited by such terms, which serve only to distinguish one element from another. Therefore, a first element mentioned below may also be a second element within the technical scope of the present invention.
Unless otherwise defined, all terms (including technical and scientific terms) used in this specification are to be used in a meaning commonly understood by one of ordinary skill in the art. In addition, terms defined in commonly used dictionaries should not be interpreted ideally or excessively unless they are explicitly and specifically defined.
The invention relates to an autonomous navigation ship performance verification system 100 and method based on mixed reality.
An autonomously sailing vessel is a vessel that can determine a route and sail without human intervention. An autonomous navigation vessel includes: a situation awareness system 10 that senses the surrounding situation using various sensors, an automatic navigation system 20 that determines a route in consideration of surrounding situation information, ocean conditions, and the like, a dynamic positioning system 30 that controls the position of the vessel so that it follows the determined route, and a propulsion control system 40 that controls the vessel's propulsion power. In the description of the present invention, unless otherwise specified, references to an autonomously sailing vessel encompass the situation awareness system 10, the automatic navigation system 20, the dynamic positioning system 30, and the propulsion control system 40.
In a conventional ship test, an operator senses the surrounding situation and determines the route, so control functions such as the dynamic positioning system 30 and the propulsion control system 40 are tested while operating the ship. Testing these control functions requires neither multiple other vessels nor various forms of navigation paths, and there is no risk of collision with other vessels during the test.
In the case of an autonomous navigation ship, however, there is no operator, so the functions of the situation awareness system 10 and the automatic navigation system 20 must be tested in addition to the ship's control functions. Testing them requires scenes in which other ships, obstacles, and the like move along various paths in various environments. Yet operating multiple other vessels to test an autonomously navigating vessel requires much time and expense, and a misjudgment by the automatic navigation system 20 risks collision with other vessels.
Recently, mixed Reality (MR) that further enhances interactive functions with a user by integrating Augmented Reality (AR) and Virtual Reality (VR) is receiving increasing attention. Mixed reality enables a user to see and hear content that does not exist in reality while directly feeling the real situation.
In an embodiment of the invention, test scenarios for testing the functions required when the autonomous navigation ship determines its own course and navigates, such as situation awareness, path planning, and collision avoidance, can be generated automatically, and the performance of the autonomous navigation ship is verified through mixed-reality scenarios in which both virtual ships and actual ships exist. The embodiment therefore has the advantage of testing autonomous navigation functions more conveniently.
Hereinafter, an autonomous navigation ship performance verification system 100 based on mixed reality according to an embodiment of the present invention will be described with reference to fig. 1 to 8.
Fig. 1 is a diagram for schematically illustrating mixed reality in an embodiment of the present invention.
When the performance of autonomous navigation ship A is verified while it navigates an actual sea area according to a specific scenario, other actual ships C may be present in the navigation area.
In addition to such actual ships C, an embodiment of the present invention automatically generates a test scenario, generates a plurality of virtual vessels B1, B2, B3 matching that scenario, and transfers them to the situation awareness system 10 of autonomously sailing vessel A, so that vessel A judges the virtual vessels B1, B2, B3 to exist in the actual space.
An embodiment of the present invention may generate multiple pieces of virtual object information according to the test scenario; the virtual object information may include not only virtual ships but also various kinds of obstacle information. That is, "virtual object" covers every virtually generated object, including virtual ships, virtual obstacles, and the like.
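As a concrete illustration of the "virtual object" concept described above, the sketch below models virtual object information as a simple record of type, position, and motion. All field names and sample values are illustrative assumptions; the patent does not specify a data format.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    """Illustrative record for one virtual object in a test scenario.

    Field names are assumptions; the patent does not define a schema.
    """
    obj_id: str
    obj_type: str        # e.g. "vessel", "buoy", "obstacle"
    lat: float           # position, decimal degrees
    lon: float
    heading_deg: float   # course over ground, degrees
    speed_kn: float      # speed, knots

# A scenario is then simply a collection of such objects.
scenario = [
    VirtualObject("B1", "vessel", 35.10, 129.04, 90.0, 12.0),
    VirtualObject("B2", "vessel", 35.12, 129.06, 270.0, 8.5),
    VirtualObject("OBS1", "obstacle", 35.11, 129.05, 0.0, 0.0),
]
vessels = [o for o in scenario if o.obj_type == "vessel"]
print(len(vessels))  # 2
```

Per-sensor information (radar echoes, camera renderings, AIS messages) would then be derived from such records, as described in the following sections.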
An embodiment of the present invention utilizing such a mixed reality has the advantage of being able to test the function of an autonomously sailing vessel in various test scenarios while avoiding the risk of collision with other vessels.
FIG. 2 is a diagram for schematically illustrating a mixed reality based autonomous navigation vessel performance verification system 100 in accordance with an embodiment of the present invention.
The mixed-reality-based autonomous navigation vessel performance verification system 100 of an embodiment of the present invention automatically generates a test scenario for verifying the performance of an autonomous navigation vessel, and provides the autonomous vessel with virtual object information and environmental information corresponding to the generated scenario.
The system 100 then collects the navigation information of the autonomous vessel and analyzes its performance based on that information. Here, the navigation paths chosen by experts for multiple test scenarios and the paths determined by the autonomous vessel may be used as training and validation data for an artificial intelligence algorithm in order to derive the ship's performance.
The performance of the autonomous navigation ship can be evaluated based on compliance with the International Regulations for Preventing Collisions at Sea (COLREGs), a collision risk evaluation index, a ship maneuverability evaluation index, a navigation path index, the object recognition accuracy of the situation awareness system, and the like. The performance is provided as a prescribed score (e.g., 0-100 points) or as one of several level classifications (basic/intermediate/advanced).
FIG. 3 is a block diagram of an autonomous navigation vessel performance verification system 100 based on mixed reality for illustrating an embodiment of the invention.
The mixed-reality-based autonomous navigation ship performance verification system 100 of an embodiment of the present invention includes: a test scenario generation unit 110, a mixed reality data generation unit 120, a sensor information transmission unit 130, a judgment information collection unit 140, and an autonomous navigation ship navigation performance analysis unit 150.
The test scenario generation unit 110 automatically generates a mixed reality-based test scenario for verifying the performance of an autonomous vessel. In this case, the test scenario is characterized by being provided based on a mixed reality in which real space information and virtual space information are integrated. The test scenario generated by the test scenario generation unit 110 is passed to the mixed reality data generation unit 120.
The mixed reality data generation unit 120 generates mixed reality data by reflecting virtual object information and actual object information corresponding to the test scenario. The virtual object information may include ships, buoys, obstacles, and the like. The mixed reality data generation unit 120 can also generate environmental information (virtual weather conditions, etc.), described later.
Since the situation awareness system 10 of an autonomously navigating vessel generally uses various sensors, such as AIS (Automatic Identification System), radar, microphones, and cameras (color image, distance image, infrared image, etc.), the mixed reality data generation unit 120 first renders the virtual objects corresponding to the test scenario into three-dimensional space and then generates virtual object information conforming to each sensor, using the position information of the respective sensors of the situation awareness system 10.
For a sensor that observes the actual physical space, such as a camera or radar of the situation awareness system 10, producing the same effect as a virtual object actually present in that space requires accurately computing the three-dimensional spatial information used to generate the virtual object and modeling the generated object to match the real one.
In this case, the AIS information and the sound information for acoustic events generated from the virtual object information may be delivered to the situation awareness system 10 through actual devices such as an AIS transmitter or a speaker, rather than being injected directly into the sensors of the autonomous navigation ship.
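The preceding paragraphs note that sensor-conforming virtual object information must be computed from each sensor's position. As a minimal sketch of that idea (an assumption for illustration, not the patent's implementation), the function below converts a virtual object's geographic position into the range and relative bearing that a shipboard radar at a given pose would report, using a short-range flat-earth approximation.

```python
import math

def radar_return(sensor_lat, sensor_lon, sensor_heading_deg,
                 obj_lat, obj_lon):
    """Approximate range (m) and relative bearing (deg) of a virtual
    object as seen by a radar at the given pose.

    Uses an equirectangular approximation, adequate only for short
    ranges; a real implementation would use the sensor's exact
    geodetic model and antenna mounting offsets.
    """
    R = 6_371_000.0  # mean Earth radius, metres
    north = math.radians(obj_lat - sensor_lat) * R
    east = (math.radians(obj_lon - sensor_lon)
            * math.cos(math.radians(sensor_lat)) * R)
    rng = math.hypot(north, east)
    true_bearing = math.degrees(math.atan2(east, north)) % 360.0
    rel_bearing = (true_bearing - sensor_heading_deg) % 360.0
    return rng, rel_bearing

# A virtual vessel ~1.1 km due north of a radar heading north:
rng_m, brg = radar_return(35.0, 129.0, 0.0, 35.01, 129.0)
print(round(rng_m), brg)  # 1112 0.0
```

The same pose information could feed a camera model (perspective projection) or an AIS message generator for the other sensor types mentioned above.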
The sensor information transmission unit 130 transmits virtual object information including sensor information generated in accordance with the virtual object information to the situation awareness system 10.
The judgment information collection unit 140 collects the result information of the autonomously sailing ship, namely the situation awareness information that its situation awareness system 10 determines in response to the virtual object information, the sailing route information determined by the automatic navigation system 20, and the ship control information from the dynamic positioning system 30 and the propulsion control system 40. It also collects the actual object information and the augmented virtual object information, and transfers all of this to the autonomous navigation ship navigation performance analysis unit 150.
The sensor information transmission unit 130 and the judgment information collection unit 140 cooperate with the physical systems of the autonomously sailing ship (the situation awareness system 10, automatic navigation system 20, dynamic positioning system 30, propulsion control system 40, and the like) to transmit the sensor data the ship requires and to collect the result values of each of its systems.
The situation awareness system 10 of the autonomous navigation ship analyzes virtual object information including sensor information, sets a navigation path of the autonomous navigation ship using the recognized situation information, and navigates, thereby enabling evaluation of performance according to a test scenario of the autonomous navigation ship.
The autonomous navigation ship navigation performance analysis unit 150 analyzes the performance information of the autonomous navigation ship based on the information collected by the judgment information collection unit 140. That is, it receives the situation awareness information, navigation path information, and ship control information that the autonomous navigation ship produces while navigating with the mixed reality data, combines them with the actual object information and augmented virtual object information to form navigation information, and compares the test scenario with the actual navigation information to test the ship's performance.
As described above, the autonomous navigation ship voyage performance analysis unit 150 may generate performance analysis results based on detailed indicators such as object recognition accuracy of the situation awareness system 10, economy and risk of the voyage route, and whether or not the international maritime collision avoidance rule of the voyage route is complied with, and may provide the performance analysis results in a predetermined score or autonomous navigation level by integrating these.
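The integration of detailed indicators into a single score or autonomy level might look like the sketch below. The patent describes using a deep neural network for this analysis; the weighted average here is only a simple stand-in to illustrate combining several indices into one rating, and the indicator names, weights, and level thresholds are all assumptions.

```python
def overall_performance(indicators, weights=None):
    """Combine per-index scores (each 0-100) into an overall score and
    a coarse level. A weighted mean is used here purely for
    illustration; the patent's analysis unit uses a deep neural
    network instead.
    """
    if weights is None:
        weights = {k: 1.0 for k in indicators}  # equal weighting
    total_w = sum(weights[k] for k in indicators)
    score = sum(indicators[k] * weights[k] for k in indicators) / total_w
    if score >= 80:
        level = "advanced"
    elif score >= 50:
        level = "intermediate"
    else:
        level = "basic"
    return round(score, 1), level

# Hypothetical per-index results for one test run:
indicators = {
    "colregs_compliance": 90,
    "collision_risk": 75,
    "maneuverability": 80,
    "route_quality": 70,
    "recognition_accuracy": 85,
}
print(overall_performance(indicators))  # (80.0, 'advanced')
```

This mirrors the document's idea of reporting either a 0-100 score or a level classification from the same underlying indices.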
Fig. 4 is a diagram for explaining the test scenario generation unit 110 in one embodiment of the present invention.
The test scenario generating part 110 in an embodiment of the present invention may include a test scenario automatic generating part 111 and a test scenario managing part 112.
The test scenario automatic generation unit 111 automatically generates a test scenario for verifying the performance of the autonomous navigation ship.
Although various test scenarios are required in order to verify performance under various conditions of an autonomously navigating vessel, there are limitations to the manual generation of test scenarios by humans.
Therefore, the test scenario automatic generation unit 111 collects other ships' voyage information from existing vessel traffic service (VTS) systems and from various existing databases in which ship voyage information is recorded.
Next, the test scenario automatic generation unit 111 searches the collected voyage information for records that match the characteristic information of the ship. Because a ship's navigation characteristics depend closely on its size and type, the unit searches for voyage information matching the characteristic information (ship type, size, specifications, etc.) of the autonomously navigating vessel.
Since the retrieved voyage information includes various scenarios, it is necessary to derive a representative test scenario for performance verification.
Therefore, the test scenario automatic generation unit 111 clusters the retrieved voyage information according to predetermined voyage characteristics, dividing it into a plurality of voyage-information groups, and automatically generates diverse test scenarios that cover various situations based on those groups.
Here, as an example, the predetermined voyage characteristics include the density of ships around the route, compliance with international maritime collision avoidance regulations in the voyage records, the type of navigation area, environmental information (wind, current, visibility, etc.), and voyage characteristics (voyage distance, average speed, etc.); the test scenario automatic generation unit 111 defines these characteristics and performs the clustering.
In this case, various methods used in the pattern recognition field, such as k-means clustering and spectral clustering, can be applied as the clustering method. When such a clustering algorithm is used with k clusters, k representative test scenarios are generated automatically.
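As an illustrative sketch of this clustering step, the group navigation information can be derived by running a plain k-means over per-voyage feature vectors. The feature columns and values below are assumptions for illustration, not data from the disclosure:

```python
import numpy as np

def kmeans(features, k, iters=100, seed=0):
    """Plain k-means: returns (centroids, labels) for a voyage feature matrix.
    A real implementation would normalize the columns first, since raw units
    (e.g. voyage distance in miles) otherwise dominate the distance metric."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Assign each voyage record to its nearest centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old centroid if a cluster goes empty.
        new = np.array([features[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Hypothetical voyage features: [ship density, COLREG compliance rate,
# navigation area type code, wind speed, average speed, voyage distance]
voyages = np.array([
    [0.8, 0.90, 1,  5.0, 12.0,  30.0],
    [0.7, 0.95, 1,  4.0, 11.0,  28.0],
    [0.1, 0.60, 2, 15.0, 18.0, 200.0],
    [0.2, 0.50, 2, 14.0, 17.0, 190.0],
])
centroids, labels = kmeans(voyages, k=2)
# Each centroid then parameterizes one representative test scenario.
```

Each of the k centroids summarizes one group of similar voyages, which is what the disclosure calls group navigation information.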
In this way, when test scenarios reflecting the characteristics of the autonomous navigation vessel under performance verification are automatically generated, the test scenario management unit 112 verifies the generated test scenarios by a predetermined algorithm or an administrator and stores the verified test scenarios. During verification, part of a test scenario may of course be modified according to requirements and the like.
Fig. 5 is a diagram more particularly illustrating the mixed reality based autonomous navigation vessel performance verification system 100 shown in fig. 3.
When a test scenario in which the characteristics of the autonomous navigation ship as the performance verification target are reflected is generated in the test scenario generation unit 110, the test scenario is supplied to the mixed reality data generation unit 120.
The mixed reality data generating unit 120 receives the test scene, generates virtual object information corresponding to the test scene, generates sensor information to which the virtual object information is added in the real space, and transmits the sensor information to the sensor information transmitting unit 130.
Fig. 6 is a diagram for explaining the mixed reality data generating unit 120 according to an embodiment of the present invention.
The mixed reality data generator 120 includes a real object information manager 121, a virtual object information manager 122, and a sensor information generator 123.
The actual object information management unit 121 collects actual object information (obstacles, other vessels, and the like) existing in the current sea area (real space) of the autonomous navigation vessel.
The actual object information management unit 121 includes a surrounding situation information collection unit 121a and a ship course prediction unit 121b.
The surrounding situation information collection unit 121a collects and manages surrounding situation information from the sensors (AIS, radar, CCTV (closed-circuit television), and the like) of the autonomous navigation ship. The surrounding situation information collection unit 121a collects actual space information around the autonomous navigation ship through these sensors, receives the sensor information generated by the sensor information generation unit 123, and maps the actual space information to the surrounding situation information for management. The virtual object information mapped to the actual space information in this manner is transmitted to the autonomous navigation ship voyage performance analysis unit 150 and the ship course prediction unit 121b.
The ship course prediction unit 121b predicts a course of the autonomous navigation ship based on the surrounding situation information collected and mapped by the surrounding situation information collection unit 121 a. The predicted course is passed to the virtual object information management unit 122, and whether or not the international maritime collision avoidance rule is complied with is verified by the international maritime collision avoidance rule verification unit (COLREG rule verification unit) 122b.
The virtual object information management unit 122 compares the actual object information (surrounding situation information) with the virtual object defined in the test scenario to generate and manage virtual object information.
The virtual object information management unit 122 first performs a function of comparing the test scene with the actual object information and selecting a virtual object as follows: that is, virtual objects (ships, floats, terrain, etc.) that are needed in the test scenario but do not exist in real space and need to be generated. At this time, the generated virtual object is managed in a manner of being mapped to the actual scene. Here, the actual scene refers to an actually executed scene such as a test scene that has been executed previously.
The virtual object information management unit 122 may include a virtual object path generation unit 122a and an international sea collision avoidance rule verification unit 122b.
The virtual object path generating unit 122a performs the following functions: that is, a function of determining the current position and the movement path of the virtual object in consideration of the actual object information, the virtual object information, and the current position information of the autonomous navigation ship in the test scene (actual scene) applied at the previous time.
If the moving route of the virtual object is determined, the international marine collision avoidance rule verification unit 122b determines whether the international marine collision avoidance rule for the generated moving route is complied with, and if it is determined that the international marine collision avoidance rule is not complied with, the moving route is newly generated by the virtual object route generation unit 122 a.
The sensor information generation unit 123 generates sensor information (installation position, sensor characteristics, and the like) for the virtual object information in accordance with the sensor information applied to the autonomous navigation ship. That is, the sensor information generating unit 123 converts the virtual object information so as to conform to the sensor mounted on the verification target autonomous navigation vessel, using the virtual object information and the sensor information of the autonomous navigation vessel.
As one example, the sensor information generating part 123 generates virtual object information in a form of collecting the virtual object information by the sensor after mapping the virtual object information to the real space, and transfers the virtual object information thus generated to the sensor information transferring part 130.
At this time, the generated virtual object information may be transmitted through real space or, as one example, through network communication. As another example, if AIS information for a virtual other vessel is transmitted in real space by an AIS transmitter, the situation awareness system 10 of the autonomous sailing vessel can perceive that AIS information. Image information, however, cannot be transmitted through real space; in that case, the virtual object information is augmented onto the real image, converted into a live-action image, and transmitted to the situation awareness system 10 through the sensor information transmitting unit 130.
In an embodiment of the present invention, the sensor information generating part 123 may include an image information generating unit 123a, an AIS information generating unit 123b, a radar information generating unit 123c, an acoustic signal generating unit 123d, and an environment information generating unit 123e.
The image information generating unit 123a includes a ship position comparing part 1231a, an image modeling part 1232a, and an image generating part 1233a as an embodiment.
The ship position comparing part 1231a analyzes the virtual object information at the current time and the position of the autonomous ship using the path information of the virtual object.
The image modeling part 1232a generates an image in which the virtual object information is reflected based on the sensor setting information and the sensor characteristic information. That is, the image modeling part 1232a performs a function of rendering the shape of the three-dimensional virtual object viewed from the viewpoint of the actual camera into a two-dimensional image using, for example, characteristic (resolution, angle of view, lens focal length, etc.) information of the camera and position and direction information of the virtual object.
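The rendering performed by the image modeling part can be illustrated with a minimal pinhole-camera projection that maps a virtual object's 3-D points into pixel coordinates. The resolution and field of view below are assumed values; a real implementation would also apply the camera's full pose, distortion model, and hidden-surface handling:

```python
import numpy as np

def project(points_cam, f_px, cx, cy):
    """Pinhole projection: camera-frame points (x right, y down, z forward)
    to pixel coordinates. Points behind the camera (z <= 0) are dropped."""
    p = np.asarray(points_cam, dtype=float)
    p = p[p[:, 2] > 0]
    u = f_px * p[:, 0] / p[:, 2] + cx
    v = f_px * p[:, 1] / p[:, 2] + cy
    return np.stack([u, v], axis=1)

# Assumed camera: 1920x1080, ~60 degree horizontal field of view,
# so focal length in pixels is about 960 / tan(30 degrees).
f = 960 / np.tan(np.radians(30))
# Corner points of a virtual ship's bounding box 100 m ahead of the camera
# (30 m beam across, 10 m tall), expressed in the camera frame.
bbox = [[-15, 2, 100], [15, 2, 100], [-15, -8, 100], [15, -8, 100]]
pixels = project(bbox, f, cx=960, cy=540)
```

The projected pixel footprint is what the two-dimensional rendered image of the virtual object occupies from the actual camera's viewpoint.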
The image generating part 1233a converts the image generated by the image modeling part 1232a into a live-action image. Since the generated image mixes an actual image with computer graphics, it is converted into a live-action image so that it can be used as the input of an actual camera. For this conversion, an image-to-image translation method based on a Generative Adversarial Network (GAN) or the like may be used.
As an embodiment, the AIS information generating unit 123b includes a ship information analyzing part 1231b, an AIS information generating part 1232b, and an AIS information transmitting part 1233b.
The ship information analysis part 1231b analyzes the information of other ships generated as virtual objects. That is, the ship information analysis part 1231b analyzes the current position of the other ship serving as a virtual object and its previous voyage information, calculates the current ship's latitude, longitude, Rate of Turn (ROT), Speed Over Ground (SOG), and true heading, and transmits them to the AIS information generating part 1232b.
The AIS information generating part 1232b generates AIS information based on the information analyzed by the ship information analysis part 1231b, formatting the analyzed information as data in the AIS message format. In addition, the AIS information generating part 1232b analyzes the state of the current ship and determines the AIS message transmission cycle.
The AIS information transmitting part 1233b transmits the message using the physical AIS transmitter in conformity with the transmission cycle determined by the AIS information generating part 1232b.
On the other hand, in the case of AIS information, it is obvious that it can be transmitted to an autonomous sailing ship through network communication, as with other sensor information.
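A minimal sketch of the kinematic analysis and transmission-cycle decision described above follows. The flat-earth approximation and the simplified reporting schedule (loosely modeled on the Class-A intervals of ITU-R M.1371) are assumptions; a real AIS message would additionally be encoded into the binary message format before transmission:

```python
import math

def kinematics(lat0, lon0, hdg0, lat1, lon1, hdg1, dt_s):
    """Derive SOG (knots), ROT (deg/min), and true heading from two
    successive virtual-ship fixes dt_s seconds apart (flat-earth
    approximation, adequate over a few seconds)."""
    dlat_m = (lat1 - lat0) * 111_320.0                       # metres per degree latitude
    dlon_m = (lon1 - lon0) * 111_320.0 * math.cos(math.radians(lat0))
    sog_kts = math.hypot(dlat_m, dlon_m) / dt_s * 1.943844   # m/s -> knots
    rot_deg_min = (hdg1 - hdg0) / dt_s * 60.0
    return sog_kts, rot_deg_min, hdg1

def report_interval_s(sog_kts, turning):
    """Speed-dependent AIS reporting interval (simplified Class-A schedule)."""
    if sog_kts <= 0.1:
        return 180.0                       # at anchor or moored
    if sog_kts <= 14:
        return 3.3 if turning else 10.0
    if sog_kts <= 23:
        return 2.0 if turning else 6.0
    return 2.0

# Virtual ship moving due east along a parallel, fixes 10 s apart.
sog, rot, hdg = kinematics(35.0, 129.0000, 90.0, 35.0, 129.0010, 90.0, 10.0)
interval = report_interval_s(sog, turning=abs(rot) > 5)
```

The computed SOG, ROT, and heading correspond to the fields the ship information analysis part passes to the AIS information generating part, and the interval drives the AIS information transmitting part's schedule.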
As an embodiment, the radar information generating unit 123c includes a virtual object information analyzing part 1231c, a radar object image synthesizing part 1232c, and a radar information analyzing part 1233c.
The virtual object information analysis part 1231c analyzes the virtual object information around the autonomous ship. In the case of radar, even if there are a plurality of objects, only information on the nearest object is generated for objects located in the same direction. Therefore, the virtual object information analysis part 1231c performs the following functions: analyzing the virtual object information (position, size, shape, etc.) around the autonomous navigation ship, analyzing whether objects overlap from the viewpoint of the autonomous navigation ship, and specifying the objects to be generated by the radar object image synthesizing part 1232c.
The radar object image synthesizing part 1232c synthesizes the analyzed virtual object information into a radar image. The radar object image synthesizing part 1232c performs a function of generating image information of the same format as the corresponding radar image using the virtual object information (position, size, shape, etc.) and the radar information of the autonomous navigation ship.
The radar information analysis part 1233c extracts object information from the synthesized radar image.
In one embodiment, the acoustic signal generating unit 123d includes a ship track analyzing unit 1231d, an acoustic signal generating unit 1232d, and an acoustic signal transmitting unit 1233d.
The ship route analysis part 1231d analyzes the route of another ship serving as a virtual object. The international maritime collision avoidance rules stipulate that a vessel's intention is expressed by combinations of short and long tones of the siren installed on the vessel. For example, a ship intending to overtake another generates two long tones followed by one short tone, meaning "I intend to overtake your vessel on your starboard side."
Therefore, the ship track analysis part 1231d analyzes the generated navigation path of the virtual ship to determine whether it corresponds to a situation defined in the international marine collision avoidance rules, and decides accordingly whether a siren sound needs to be generated.
If it is determined that a siren sound needs to be generated, the sound signal generating part 1232d generates a siren sound conforming to the international maritime collision avoidance rules, that is, a combination of short and long tones, based on the collected siren sound signal of the ship.
Thereafter, the acoustic signal transmitting part 1233d transmits the generated siren sound (acoustic signal) to the autonomous navigation ship through the speaker.
On the other hand, in the case of an acoustic signal, it is needless to say that the acoustic signal can be transmitted to the autonomous navigation ship through network communication, similarly to other sensor information.
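The blast patterns involved can be sketched as timed sine bursts. The short (about one second) and prolonged (four to six seconds) blast durations follow the customary COLREG convention, while the gap length, sample rate, and tone frequency are assumed values:

```python
import numpy as np

SHORT, PROLONGED = 1.0, 5.0   # blast durations in seconds (customary COLREG values)
GAP = 1.0                     # assumed silence between blasts

SIGNALS = {
    # Overtaking intentions (two prolonged blasts plus short blasts)
    "overtake_starboard": [PROLONGED, PROLONGED, SHORT],
    "overtake_port":      [PROLONGED, PROLONGED, SHORT, SHORT],
    # Manoeuvring signals
    "turn_starboard": [SHORT],
    "turn_port":      [SHORT, SHORT],
    "astern":         [SHORT, SHORT, SHORT],
}

def synthesize(signal, rate=8000, freq=300.0):
    """Render a blast pattern as a mono sample array of sine bursts
    separated by silence, suitable for playback through a speaker."""
    pattern = SIGNALS[signal]
    chunks = []
    for i, dur in enumerate(pattern):
        t = np.arange(int(dur * rate)) / rate
        chunks.append(0.8 * np.sin(2 * np.pi * freq * t))
        if i < len(pattern) - 1:
            chunks.append(np.zeros(int(GAP * rate)))
    return np.concatenate(chunks)

samples = synthesize("overtake_starboard")   # 5 + 1 + 5 + 1 + 1 = 13 s of audio
```

The resulting sample array is what the acoustic signal transmitting part would feed to the speaker facing the autonomous navigation ship.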
The environment information generating unit 123e includes an environment information analyzing part 1231e and an environment information transmitting part 1232e as an embodiment.
The environment information analyzing part 1231e generates virtual environment information based on the environment information defined in the test scenario, and the environment information transmitting part 1232e transmits the generated environment information.
When an autonomous sailing ship takes environmental information into account in determining a route, the environmental information is either generated directly by sensors installed on the actual ship, or current weather information and weather forecast information are collected over the Internet from sources such as the National Oceanic and Atmospheric Administration (NOAA) or a national weather service.
Therefore, the environmental information analyzing part 1231e performs a function of analyzing the environmental information defined in the test scenario to virtually generate information of each environmental sensor currently installed in the autonomous navigation ship, current weather information, and forecast information. The environmental information thus generated is supplied to the autonomous navigation ship by the environmental information transmitting part 1232e through the network communication line together with other sensor information.
Referring again to fig. 5, the sensor information transmitting unit 130 receives virtual object information transmitted by network communication or the like from the mixed reality data generating unit 120, transmits sensor information included in the virtual object information to each sensor of the autonomous navigation vessel, and transmits the virtual object information to the situation awareness system 10.
The situation awareness system 10 transmits to the automatic navigation system 20 the sensor information mixed with virtual object information, that is, the sensor information in which the virtual object information received through the sensor information transmitting unit 130 is reflected; the automatic navigation system 20 then determines a navigation route and controls the engine and the ship for autonomous navigation based on this sensor information.
At this time, the judgment information collecting part 140 receives the determined voyage route information and the ship control information, and transmits these pieces of information to the autonomous voyage ship voyage performance analyzing part 150.
The autonomous navigation ship voyage performance analysis unit 150 includes a voyage information collection unit 151, a voyage information analysis unit 152, and a voyage performance analysis unit 153.
The navigation information collection unit 151 receives the situation awareness information, the navigation route information, and the ship control information transmitted from the determination information collection unit 140. The navigation information collection unit 151 receives the real object information (obstacle, navigation information of other vessel, etc.) and the augmented virtual object information transmitted from the mixed reality data generation unit 120.
The navigation information analyzing unit 152 time-synchronizes the pieces of information collected by the navigation information collection unit 151 and groups the synchronized pieces of information. The navigation information analyzing unit 152 can distinguish, by sensor type, the information on other vessels acquired by the autonomous vessel at a specific time, the augmented object information, and the situation awareness information perceived by the situation awareness system 10, and can thereby analyze the per-sensor accuracy and overall accuracy of the situation awareness system 10. The navigation information analyzing unit 152 can also generate navigation information by combining the position information and control information of the autonomous vessel.
The voyage performance analysis unit 153 compares and analyzes the actual voyage information of the autonomous voyage ship and the voyage route information determined according to the test scenario using various factors, and evaluates the performance information of the autonomous voyage ship.
The navigation performance analysis unit 153 compares the navigation paths of the autonomous navigation ship and the actual ship to derive analysis indexes including international marine collision avoidance rule compliance, a collision risk evaluation index, a ship maneuverability evaluation index, a navigation path index, and the object recognition accuracy of the situation awareness system 10. The derived analysis indexes are input to a predetermined artificial intelligence algorithm trained in advance, and performance information expressed as a predetermined score or one of a plurality of levels is acquired as the analysis result.
At this time, the main evaluation items for international maritime collision avoidance rule compliance are the avoidance timing and the avoidance maneuver. The avoidance timing is evaluated by the distance and time remaining between the two vessels at the start of the avoidance maneuver, and the avoidance maneuver is evaluated by whether the avoidance direction conforms to the rules and whether the avoidance angle and deceleration are large enough to be perceived by the other vessel.
The collision risk evaluation index uses the proximity and collision probability of the ships, based on a quantitative evaluation module for marine traffic safety assessment recommended by the International Association of Marine Aids to Navigation and Lighthouse Authorities (IALA).
In this case, the ship proximity component of the collision risk evaluation index indicates the safety of the closest distance between the two ships, and the collision probability component indicates whether the collision probability between the two ships is 10⁻⁴ or higher.
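The proximity of two ships is commonly quantified through the closest point of approach (CPA) and the time to reach it (TCPA). The following sketch assumes straight-line motion at constant speed and is an illustration, not the IALA evaluation module itself:

```python
import math

def cpa_tcpa(p_own, v_own, p_tgt, v_tgt):
    """Closest point of approach between two ships assuming constant
    velocity. Positions in metres, velocities in m/s.
    Returns (cpa_distance_m, time_to_cpa_s)."""
    rx, ry = p_tgt[0] - p_own[0], p_tgt[1] - p_own[1]   # relative position
    vx, vy = v_tgt[0] - v_own[0], v_tgt[1] - v_own[1]   # relative velocity
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                 # identical velocities: range never changes
        return math.hypot(rx, ry), 0.0
    # Time minimizing |r + v*t|, clamped to 0 if the CPA is already past.
    t = max(0.0, -(rx * vx + ry * vy) / v2)
    return math.hypot(rx + vx * t, ry + vy * t), t

# Own ship heading east at 6 m/s; target 2 km to the north heading south at 4 m/s.
dist, t = cpa_tcpa((0, 0), (6, 0), (0, 2000), (0, -4))
```

Comparing the CPA distance against a safety threshold gives the proximity component, while the collision probability component would additionally model the uncertainty of both tracks.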
Next, the ship maneuverability evaluation index evaluates the rudder, the engine, and the remaining control margin: the rudder item measures the average rudder usage and average rate of turn (ROT) during an avoidance maneuver, the engine item measures the average ship speed (kts) and per-minute speed change during the avoidance maneuver, and the remaining control margin refers to the control margin left in the rudder and engine during the avoidance maneuver.
Next, the navigation path index analyzes the course difference and economy: the course difference reflects the difference between the path defined in the scenario and the actual navigation path, and the economy reflects voyage time, fuel consumption, distance traveled, and the like.
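The course difference can be sketched as a mean cross-track error between the scenario-defined path and the actual track. This is an illustrative measure, not necessarily the exact index used in the disclosure:

```python
import math

def path_deviation(planned, actual):
    """Mean distance (metres) from each actual fix to the nearest segment
    of the planned waypoint path -- a simple course-difference measure."""
    def seg_dist(p, a, b):
        # Distance from point p to segment a-b, clamping to the endpoints.
        ax, ay = b[0] - a[0], b[1] - a[1]
        denom = ax * ax + ay * ay
        t = 0.0 if denom == 0 else max(
            0.0, min(1.0, ((p[0] - a[0]) * ax + (p[1] - a[1]) * ay) / denom))
        return math.hypot(p[0] - (a[0] + t * ax), p[1] - (a[1] + t * ay))

    return sum(min(seg_dist(p, planned[i], planned[i + 1])
                   for i in range(len(planned) - 1))
               for p in actual) / len(actual)

# Scenario path runs due east; the autonomous ship bulges up to 100 m off it.
planned = [(0, 0), (1000, 0), (2000, 0)]
actual = [(0, 0), (500, 50), (1000, 100), (1500, 50), (2000, 0)]
dev = path_deviation(planned, actual)   # mean cross-track error in metres
```

The economy components (voyage time, fuel, distance) would be accumulated separately from the ship control information and compared against the scenario's expected values.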
Next, the accuracy of the situation awareness system 10 is evaluated as the difference between the surrounding situation information (actual object information and virtual object information) detected while the autonomous navigation ship navigates and the navigation information of other ships and virtual object information generated from the test scenario by the mixed reality data generating unit 120.
On the other hand, as one example of an artificial intelligence algorithm for analyzing the performance of an autonomously sailing vessel, a deep neural network may be used. The network that evaluates performance learns from navigation records generated by a number of navigators with varied experience, using ground truth values generated by simulation as learning data. Since the amount of learning data is limited, various data augmentation techniques (signal transformation, GAN-based signal generation, etc.) can be utilized.
Fig. 7a and 7b are diagrams for explaining a specific embodiment of the system 100 for verifying the performance of an autonomous navigation vessel based on mixed reality.
Fig. 7a shows an example in which the sensor information delivery part 130 and the judgment information collection part 140 are embodied as separate autonomous navigation ship agents (agents) in the mixed reality-based autonomous navigation ship performance verification system 100. In the embodiment of fig. 7a, the autonomous ship to be a performance test target is specifically implemented as a system in which only the autonomous ship agent is provided, and the test scenario generation unit 110, the mixed reality data generation unit 120, and the autonomous ship navigation performance analysis unit 150 of the autonomous ship performance verification system 100 are disposed at positions separate from the autonomous ship.
Fig. 7b is a specific embodiment in which the autonomous-vessel agent and the autonomous-vessel performance verification system 100 are integrated, and the entire autonomous-vessel performance verification system 100 is disposed on an autonomous vessel to be a performance test target.
The method for verifying the performance of an autonomous navigation ship based on mixed reality according to an embodiment of the present invention is described below with reference to fig. 8.
FIG. 8 is a flow chart of a method for verifying the performance of an autonomous vessel based on mixed reality, in accordance with an embodiment of the present invention.
On the other hand, the steps shown in fig. 8 may be understood as being performed by the mixed reality based autonomous navigation vessel performance verification system 100 described above, but are not necessarily limited thereto.
First, a mixed reality-based test scenario for verifying the performance of an autonomously navigating vessel is generated (S110). In step S110, the characteristics of the autonomous navigation ship under performance verification and the navigation logs of other ships stored in existing databases are analyzed to generate k representative test scenarios reflecting the density of ships around the route, compliance with the marine collision avoidance rules in the navigation logs, the type of navigation area and environmental information, and the characteristics of the navigation information.
Next, the navigation state and the surrounding situation of the autonomous navigation ship at the current time T are analyzed to generate mixed reality data in which virtual object information and actual object information corresponding to the test scene are reflected (S120).
Next, virtual object information including sensor information generated according to the virtual object information is transferred to the autonomous navigation ship (S130), and the situation awareness information, navigation path information, and ship control information of the autonomous navigation ship determined according to the virtual object information, together with the navigation information of other ships and the augmented virtual object information, are collected (S140). Steps S120 to S140 are repeated until the voyage ends (S150).
Thereafter, when the voyage ends, the performance information of the autonomously sailing ship is analyzed based on the collected information (S160). In step S160, the navigation paths of the autonomous navigation ship and the test scenario are comparatively analyzed based on analysis indexes including international maritime collision avoidance rule (COLREG) compliance, the collision risk evaluation index, the ship maneuverability evaluation index, the navigation path index, and the object recognition accuracy of the situation awareness system 10, in order to evaluate performance.
On the other hand, in the above description, the steps S110 to S160 may be further divided into more steps or combined into fewer steps according to the specific embodiment of the present invention. Further, a part of the steps may be omitted or the order between the steps may be changed as necessary. On the other hand, the contents of the mixed reality based autonomous navigation vessel performance verification system 100 of fig. 1 to 7b may also be applied to the mixed reality based autonomous navigation vessel performance verification method of fig. 8.
The above-described autonomous navigation ship performance verification method based on mixed reality according to an embodiment of the present invention may be embodied as a program (or an application program) stored in a medium to be executed in conjunction with a computer as hardware.
The program may include code encoded in a computer language such as C, C++, JAVA, Ruby, or machine language, readable by a processor (CPU) of the computer through a device interface of the computer, so that the computer executes the method implemented as the program by reading it. Such code may include functional code defining the various functions required to perform the above-described method, and may include control code relating to the execution order the processor of the computer must follow in executing those functions. In addition, such code may further include memory-reference-related code indicating which location (address) of the computer's internal or external memory holds the additional information or media the processor needs to execute the functions. Furthermore, when the processor of the computer needs to communicate with another computer or server located at a remote site to perform the functions, the code may further include communication-related code specifying how the computer's communication module is to communicate with the remote computer or server and what information or media are to be transmitted and received during the communication.
The storage medium is not a medium that stores data for a short time such as a register, a cache memory, a memory, or the like, but refers to a medium that stores data semi-permanently and can be read (read) by a device. Specifically, as examples of the above-mentioned storage medium, there are ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like, but are not limited thereto. That is, the program may be stored in various recording media in various servers accessible to the computer or in various recording media in the computer of the user. In addition, the above medium may be distributed in computer systems connected through a network and stores computer readable codes in a distributed manner.
The foregoing description of the present invention is intended to be illustrative, and those skilled in the art will appreciate that it can be easily modified into other specific forms without changing the technical idea or essential features of the present invention. The embodiments described above are therefore to be understood as illustrative in all respects and not restrictive. For example, the components described as unitary may be implemented in a distributed manner, and similarly, the components described as distributed may be implemented in a combined manner.
The scope of the present invention is defined by the appended claims, rather than the foregoing detailed description, and all changes and modifications that come within the meaning and range of equivalency of the claims are to be construed as being embraced therein.

Claims (14)

1. A mixed reality-based autonomous navigation ship performance verification system, which is provided with a communication module, a memory and a processor and is executed by the processor, is characterized by comprising:
a test scenario generation unit that generates a test scenario based on mixed reality for verifying the performance of an autonomous navigation ship;
a mixed reality data generating unit that generates mixed reality data by reflecting virtual object information and actual object information that match the test scene;
a sensor information transmitting unit that transmits virtual object information including sensor information generated according to the virtual object information;
a judgment information collection unit that collects the situation awareness information, navigation path information, and ship control information of the autonomous navigation vessel determined according to the virtual object information, together with the actual object information and the augmented virtual object information; and
and an autonomous navigation ship navigation performance analysis unit that analyzes performance information of the autonomous navigation ship based on each piece of information collected by the determination information collection unit.
2. The mixed reality based autonomous navigation vessel performance verification system of claim 1,
the test scenario generation unit includes a test scenario automatic generation unit that collects ship travel information collected from the ship traffic management system and other ship travel information that has been collected and stored, searches the collected travel information for travel information that matches characteristic information of the own ship, clusters the searched travel information according to predetermined travel characteristics to generate a plurality of pieces of group travel information, and generates a plurality of corresponding test scenarios based on the generated group travel information.
3. The mixed reality based autonomous navigation vessel performance verification system of claim 2,
the test scenario generation unit includes a test scenario management unit that verifies a plurality of generated test scenarios by a predetermined algorithm or an administrator and stores the verified test scenarios.
4. The mixed reality based autonomous navigation vessel performance verification system of claim 1,
the mixed reality data generation unit includes:
an actual object information management unit that collects actual object information existing in a current sea area of the autonomous navigation vessel;
a virtual object information management unit that compares the actual object information with the test scenario to generate virtual object information; and
and a sensor information generating unit that generates sensor information for the virtual object information so as to match information applied to the sensor of the autonomous navigation vessel.
5. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the virtual object information management unit includes:
a virtual object path generation unit that generates a movement path in consideration of the actual object information, the virtual object information in the test scenario applied at the previous time, and the current position information of the autonomous navigation vessel;
and an international maritime collision avoidance rule verification unit that verifies whether the generated movement path complies with the international maritime collision avoidance rules.
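The rule verification of claim 5 might be sketched as a crude encounter classifier in the spirit of COLREGS Rules 13-15. The angular thresholds follow the conventional 112.5°/crossing-sector geometry, but the function names and the simplified decision logic are assumptions of this sketch, not the patent's method:

```python
def classify_encounter(own_course, other_course, rel_bearing):
    """Crude encounter classification in the spirit of COLREGS Rules 13-15.

    Angles in degrees; rel_bearing is the other vessel's bearing relative
    to own heading (0 = dead ahead). Thresholds are simplified assumptions.
    """
    course_diff = abs((own_course - other_course + 180) % 360 - 180)
    if 112.5 < rel_bearing < 247.5:
        return "overtaking"        # other vessel approaches from abaft the beam
    if course_diff > 174:          # nearly reciprocal courses
        return "head-on"
    return "crossing"

def own_ship_must_give_way(encounter, rel_bearing):
    """Rule 14: head-on -> alter course; Rule 15: give way to starboard traffic."""
    if encounter == "head-on":
        return True
    if encounter == "crossing":
        return 0 <= rel_bearing <= 112.5   # other ship on our starboard side
    return False
```

A virtual object path that makes the give-way vessel stand on (or vice versa) would then be flagged as non-compliant and regenerated.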
6. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the actual object information management unit includes:
a surrounding situation information collection unit that collects surrounding situation information from the sensors of the autonomous navigation ship and manages the sensor information of the sensor information generation unit by mapping it to the surrounding situation information;
and a ship navigation path prediction unit that predicts a navigation path of the autonomous navigation ship based on the mapped surrounding situation information.
7. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generating unit maps the virtual object information onto the current navigation sea area, which is a real space, and generates sensor information in which the virtual object information is mixed, in the form obtained from each of the sensors.
8. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generating section includes an image information generating unit,
the image information generation unit includes: a ship position comparison unit that analyzes the virtual object information at the current time and the position of the autonomous navigation ship; an image modeling unit that generates an image in which the virtual object information is reflected, based on sensor setting information and sensor characteristic information; and an image generation unit that converts the generated image into a live-action image.
9. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generating section includes an AIS information generating unit,
the AIS information generating unit includes: a ship information analysis unit that analyzes the current position and previous navigation information of another ship serving as the virtual object to calculate the latitude, longitude, ROT, SOG, and true course of that ship; an AIS information generating unit that generates data in the AIS message format from the information obtained by the ship information analysis unit; and an AIS information transmission unit that transmits the AIS message through a physical AIS transmitter according to the transmission cycle determined by the AIS information generating unit.
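The kinematic part of claim 9 (deriving SOG, course, and ROT for a virtual ship) can be sketched from two successive position fixes. This uses a flat-earth small-step approximation and omits the field quantization and scaling that the AIS message format (ITU-R M.1371) prescribes; the function name and signature are this sketch's assumptions:

```python
import math

def sog_cog_rot(p1, p2, dt_s, hdg1, hdg2):
    """Derive SOG (knots), course over ground (deg), and ROT (deg/min) of a
    virtual ship from two successive (lat, lon) fixes in degrees.

    Flat-earth small-step approximation; AIS field scaling is omitted.
    """
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    R = 6371000.0
    dx = (lon2 - lon1) * math.cos((lat1 + lat2) / 2) * R   # east displacement, m
    dy = (lat2 - lat1) * R                                 # north displacement, m
    sog_knots = math.hypot(dx, dy) / dt_s * 1.94384
    cog = math.degrees(math.atan2(dx, dy)) % 360
    rot = ((hdg2 - hdg1 + 180) % 360 - 180) / (dt_s / 60)  # signed, shortest turn
    return sog_knots, cog, rot
```

The AIS information generating unit would then pack these values into a position-report message and hand it to the physical transmitter at the rule-determined reporting interval.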
10. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generation section includes a radar information generation unit,
the radar information generation unit includes: a virtual object information analysis unit that analyzes the virtual object information; a radar object image synthesizing unit that synthesizes the analyzed virtual object information into a radar image; and a radar information analysis unit that extracts object information from the synthesized radar image.
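The synthesis step of claim 10 can be sketched as painting a virtual target into a plan-position radar image at its range and bearing from the own ship. The 3x3 blip, the Cartesian image layout, and the function name are assumptions of this sketch; a real echo model (beamwidth, pulse length, radar cross-section) is deliberately omitted:

```python
import math

def inject_radar_blip(image, rng_m, bearing_deg, max_range_m, strength=255):
    """Paint a virtual target into a square Cartesian radar image (list of rows).

    The own ship sits at the image centre; the blip is a small 3x3 patch
    whose intensity never lowers an existing (real) echo.
    """
    n = len(image)
    half = n // 2
    scale = half / max_range_m                      # pixels per metre
    b = math.radians(bearing_deg)
    col = half + int(rng_m * math.sin(b) * scale)   # east  -> +columns
    row = half - int(rng_m * math.cos(b) * scale)   # north -> -rows
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            r, c = row + dr, col + dc
            if 0 <= r < n and 0 <= c < n:
                image[r][c] = max(image[r][c], strength)
    return image
```

The radar information analysis unit of the claim would then run ordinary target extraction over the combined image, so downstream processing cannot distinguish real echoes from injected ones.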
11. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generating unit includes an acoustic signal generating unit,
the acoustic signal generation unit includes: a ship track analysis unit that analyzes the navigation route of another ship serving as the virtual object and compares it with the international maritime collision avoidance rules to determine whether a corresponding whistle sound signal should be generated; an acoustic signal generating unit that, when the comparison indicates that a whistle sound signal is required, generates a whistle sound conforming to the international maritime collision avoidance rules based on collected ship whistle acoustic signals; and an acoustic signal transmission unit that transmits the generated whistle acoustic signal.
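The rule lookup in claim 11 amounts to mapping a detected manoeuvre to the corresponding COLREGS Rule 34 manoeuvring signal. The blast patterns below are the ones Rule 34 actually prescribes for power-driven vessels in sight of one another; the manoeuvre labels and function name are this sketch's assumptions:

```python
# Illustrative mapping of manoeuvres to COLREGS Rule 34 manoeuvring signals.
# 'S' = short blast (~1 s), 'L' = prolonged blast (4-6 s).
RULE_34_SIGNALS = {
    "turn_starboard":    "S",      # one short blast
    "turn_port":         "SS",     # two short blasts
    "astern_propulsion": "SSS",    # three short blasts
    "doubt_or_danger":   "SSSSS",  # at least five short and rapid blasts
}

def whistle_signal_for(manoeuvre):
    """Return the blast pattern to synthesise, or None if no signal is required."""
    return RULE_34_SIGNALS.get(manoeuvre)
```

The acoustic signal generating unit would synthesise the returned pattern from recorded whistle samples and hand it to the transmission unit.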
12. The mixed reality based autonomous navigation vessel performance verification system of claim 4,
the sensor information generating section includes an environmental information generating unit,
the environment information generating unit includes: an environment information analysis unit that generates virtual environment information based on environment information defined in the test scenario; and an environment information transmission unit that transmits the generated environment information.
13. The mixed reality based autonomous navigation vessel performance verification system of claim 1,
the autonomous navigation ship navigation performance analysis unit includes a navigation performance analysis unit that compares the navigation routes of the autonomous navigation ship and an actual ship based on the collected information to derive analysis indices including compliance with the international maritime collision avoidance rules, a collision risk evaluation index, a ship maneuverability evaluation index, a navigation route index, and the object recognition accuracy of the condition sensing system, inputs the derived analysis indices into a predetermined, pre-trained artificial intelligence algorithm, and obtains, as the analysis result, performance information expressed as a predetermined score or one of a plurality of levels.
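The scoring described in claim 13 feeds the analysis indices to a pre-trained AI model; as a transparent stand-in, the sketch below collapses the same five indices with a weighted sum and maps the score onto discrete levels. The index names, weights, value range ([0, 1], 1 = best), and level thresholds are all illustrative assumptions:

```python
def performance_score(indices, weights=None):
    """Collapse the claim-13 analysis indices into a 0-100 score and a level.

    A plain weighted sum stands in for the patent's learned model.
    """
    weights = weights or {
        "colregs_compliance": 0.30,
        "collision_risk":     0.25,
        "maneuverability":    0.15,
        "route_quality":      0.15,
        "object_recognition": 0.15,
    }
    score = 100 * sum(weights[k] * indices[k] for k in weights)
    levels = ["D", "C", "B", "A"]                 # 25-point bands
    return score, levels[min(int(score // 25), 3)]
```

Replacing the weighted sum with a trained regressor or classifier over the same index vector would recover the behaviour the claim actually describes.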
14. A mixed reality-based autonomous navigation ship performance verification method comprises the following steps:
a step of generating a mixed reality-based test scenario for verifying the performance of an autonomous navigation vessel;
a step of generating mixed reality data by reflecting virtual object information and actual object information corresponding to the test scenario;
a step of transmitting, to the autonomous navigation vessel, the virtual object information including sensor information generated to correspond to the virtual object information;
a step of collecting condition perception information, navigation path information, and vessel control information of the autonomous navigation vessel, together with other-vessel navigation information and augmented virtual object information determined in correspondence with the virtual object information;
and a step of analyzing performance information of the autonomous navigation vessel based on the collected information.
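The five steps of method claim 14 can be sketched as a single verification loop. All five callables are hypothetical stand-ins for the units named in the claims; only the control flow comes from the claim itself:

```python
def run_verification(scenario_gen, mr_data_gen, ship_link, collector, analyzer):
    """End-to-end sketch of the claim-14 verification method.

    scenario_gen : yields mixed reality-based test scenarios      (step 1)
    mr_data_gen  : builds mixed reality data for a scenario       (step 2)
    ship_link    : sends virtual object/sensor info to the ship   (step 3)
    collector    : gathers the ship's perception/path/control     (step 4)
    analyzer     : derives performance information per scenario   (step 5)
    """
    results = []
    for scenario in scenario_gen():
        mr_data = mr_data_gen(scenario)
        ship_link(mr_data["virtual_objects"])
        observed = collector()
        results.append(analyzer(observed, scenario))
    return results
```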
CN202111410213.8A 2021-04-22 2021-11-23 System and method for verifying performance of autonomous navigation ship based on mixed reality Pending CN115237730A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20210052576 2021-04-22
KR10-2021-0052576 2021-04-22
KR10-2021-0141244 2021-10-21
KR1020210141244A KR20220145741A (en) 2021-04-22 2021-10-21 System and method for evaluating the performance of autonomous ships using mixed reality

Publications (1)

Publication Number Publication Date
CN115237730A true CN115237730A (en) 2022-10-25

Family

ID=83665936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111410213.8A Pending CN115237730A (en) 2021-04-22 2021-11-23 System and method for verifying performance of autonomous navigation ship based on mixed reality

Country Status (1)

Country Link
CN (1) CN115237730A (en)

Similar Documents

Publication Publication Date Title
CN110221546B (en) Virtual-real integrated ship intelligent control system test platform
CN111830990B (en) Autonomous navigation control management system for large unmanned ship
CN104730949A (en) Affective user interface in an autonomous vehicle
KR20220145741A (en) System and method for evaluating the performance of autonomous ships using mixed reality
Naus Drafting route plan templates for ships on the basis of AIS historical data
CN113156947A (en) Method for planning path of ship in dynamic environment
CN111459132A (en) Evaluation method and system for navigation function of ship
CN109911140A (en) A kind of water-area navigation information enhancement device, system and method
KR102110939B1 (en) Apparatus and method for virtual ship traffic reproduction
Copping et al. Likelihood of a marine vessel accident from wind energy development in the Atlantic
Vagale et al. Evaluation of path planning algorithms of autonomous surface vehicles based on safety and collision risk assessment
CN110444046A (en) A kind of restricted waters non conflicting can meet ship cluster Situation analysis method
Gil et al. Semi-dynamic ship domain in the encounter situation of two vessels
Woerner COLREGS-compliant autonomous collision avoidance using multi-objective optimization with interval programming
Bolbot et al. A method to identify and rank objects and hazardous interactions affecting autonomous ships navigation
CN111824357B (en) Test method, test device, electronic equipment and computer readable storage medium
CN115237730A (en) System and method for verifying performance of autonomous navigation ship based on mixed reality
Aarsæther et al. Adding the human element to ship manoeuvring simulations
KR101728603B1 (en) Three dimension ship maneuvering simulator available on the pc using google map and enc
CN113470435B (en) Method and terminal for generating intelligent ship test scene of natural anti-marine environment
CN113885533B (en) Unmanned driving method and system of unmanned boat
Wang et al. Complex encounter situation modeling and prediction method for unmanned ships based on bounded rational game
Olindersson et al. Development of a software to identify and analyse marine traffic situations
Akkermann et al. Scenario-based V&V in a maritime co-simulation framework
Last Analysis of automatic identification system data for maritime safety

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination