EP3893223A1 - Detection capability verification for vehicles

Detection capability verification for vehicles

Info

Publication number
EP3893223A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
dcv
server
detection capability
vessel
Prior art date
Legal status
Withdrawn
Application number
EP20168606.0A
Other languages
English (en)
French (fr)
Inventor
Jarkko JÄRVINEN
Kalle Kettunen
Jan Karsten
Jaakko Saarela
Kristian Vaajala
Vilhelm Backman
Jonne Pohjankukka
Pasi Pitkänen
Eetu Kummala
Anu Peippo
Jere Laaksonen
Jari Uggeldahl
Jukka Moisala
Current Assignee
Vtt Senseway Oy
Original Assignee
Vtt Senseway Oy
Priority date
Filing date
Publication date
Application filed by Vtt Senseway Oy
Priority to EP20168606.0A
Publication of EP3893223A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G3/00 Traffic control systems for marine craft

Definitions

  • Various example embodiments generally relate to the field of navigation systems for vehicles.
  • some example embodiments relate to verifying or improving detection capabilities of autonomous, semi-autonomous, or non-autonomous vehicles.
  • Navigation systems may be based on information acquired locally at vehicles.
  • Each vehicle may comprise a positioning system to determine a location of the vehicle.
  • Each vehicle may be further equipped with sensors to determine information about nearby vehicles or other objects. However, reliability of the locally detected information may not be sufficient for all applications.
  • Example embodiments of the present disclosure enable verifying or improving reliability of local detections in vehicle navigation systems. These benefits may be achieved by the features of the independent claims. Further implementation forms are provided in the dependent claims, the description, and the drawings.
  • an apparatus comprises means for receiving, from a client associated with a vehicle, information associated with at least one object detected by the vehicle and/or a location of the vehicle; means for obtaining verification information associated with the at least one object or the location of the vehicle from at least one data source; and means for determining a detection capability indicator for the vehicle based on the verification information and at least one of: the information associated with the at least one object or the location of the vehicle.
  • a method comprises receiving, from a client associated with a vehicle, information associated with at least one object detected by the vehicle and/or a location of the vehicle; obtaining verification information associated with the at least one object or the location of the vehicle from at least one data source; and determining a detection capability indicator for the vehicle based on the verification information and at least one of: the information associated with the at least one object or the location of the vehicle.
  • a computer program is configured, when executed by an apparatus, to cause the apparatus at least to: receive, from a client associated with a vehicle, information associated with at least one object detected by the vehicle and/or a location of the vehicle; obtain verification information associated with the at least one object or the location of the vehicle from at least one data source; and determine a detection capability indicator for the vehicle based on the verification information and at least one of: the information associated with the at least one object or the location of the vehicle.
  • an apparatus comprises at least one processor; and at least one memory including computer program code; the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to: receive, from a client associated with a vehicle, information associated with at least one object detected by the vehicle and/or a location of the vehicle; obtain verification information associated with the at least one object or the location of the vehicle from at least one data source; and determine a detection capability indicator for the vehicle based on the verification information and at least one of: the information associated with the at least one object or the location of the vehicle.
  • an apparatus comprises means for detecting a location of a vehicle; means for detecting at least one object; means for determining information associated with the at least one object; means for transmitting the information associated with the at least one object or the location of the vehicle to a server; and means for receiving, from the server, a detection capability indicator associated with the vehicle.
  • a method comprises detecting a location of a vehicle; detecting at least one object; determining information associated with the at least one object; transmitting the information associated with the at least one object or the location of the vehicle to a server; and receiving, from the server, a detection capability indicator associated with the vehicle.
  • a computer program is configured, when executed by an apparatus, to cause the apparatus at least to: detect a location of a vehicle; detect at least one object; determine information associated with the at least one object; transmit the information associated with the at least one object or the location of the vehicle to a server; and receive, from the server, a detection capability indicator associated with the vehicle.
  • an apparatus comprises at least one processor; and at least one memory including computer program code; the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to: detect a location of a vehicle; detect at least one object; determine information associated with the at least one object; transmit the information associated with the at least one object or the location of the vehicle to a server; and receive, from the server, a detection capability indicator associated with the vehicle.
  • a centralized traffic management approach may be applied to control and/or monitor a plurality of vehicles.
  • decisions may be made based on information received from multiple data sources, including the vehicles, and therefore verification of received information may be desired in order to avoid making navigation decisions based on unreliable information.
  • a server may receive information associated with at least one object detected by a vehicle. Furthermore, the server may receive an indication of a location of the vehicle. The server may obtain verification information associated with the object or the vehicle from other data source(s) and determine a detection capability indicator for the vehicle based on the received object and/or location information and the verification information. The server may provide an indication of the detection capability indicator to a client associated with the vehicle or control navigation of the vehicle. Navigation control data may be determined based on at least the detection capability indicator at the server or the client to enable safe traffic control taking into account different and dynamically varying detection capabilities of vehicles.
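  • As a rough illustration of this server-side flow, the following is a minimal sketch; the function and field names (handle_client_report, collect_verification_info, and so on) are hypothetical and not part of the disclosure:

```python
# Hypothetical sketch of the server-side flow described above; all names
# are illustrative only.

def handle_client_report(server, client_id: str, report: dict) -> None:
    # 1. Receive object detections and/or the vehicle location from the client.
    detections = report.get("objects", [])
    location = report.get("location")

    # 2. Obtain verification information from other data sources
    #    (other vehicles, sensor stations, AIS, VTS, ...).
    verification = server.collect_verification_info(location, detections)

    # 3. Determine a detection capability indicator and provide it, possibly
    #    together with navigation control data, to the client.
    indicator = server.compute_capability_indicator(detections, location, verification)
    server.send(client_id, {"detection_capability_indicator": indicator})
```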
  • FIG. 1 illustrates an example of a maritime navigation control system 100, according to an example embodiment. Even though examples have been described using maritime navigation as an example, it is appreciated that example embodiments may be applied also to other navigation systems, for example ground traffic navigation systems, involving for example a plurality of connected cars, air traffic navigation systems, space traffic navigation systems, or the like.
  • the maritime navigation control system 100 may comprise at least one vessel 110.
  • the vessel 110 may comprise a DCV (detection capability verification) client 112 and at least one detection model 114.
  • the vessel 110 may further comprise sensor(s) for detecting location of the vessel 110, other vessels 116 or objects 118, environmental conditions, or the like.
  • Vessels 110, 116 are provided as examples of vehicles.
  • the DCV client 112 and the detection model 114 may be provided as separate devices and/or software modules at the vessel 110. However, it is also possible that DCV client 112 and the detection model 114 are integrated within a single device and/or software module.
  • the DCV client 112 and/or the detection model 114 may be for example integrated within a control system of the vessel 110.
  • the DCV client 112 and/or the detection model 114 may be embodied separate from the vessel 110, for example as application software on a device, such as for example a mobile phone, a tablet, a laptop, a radar module, or the like.
  • the DCV client 112 may provide a user interface to enable a user 111, for example a captain of the vessel 110, to interact with the DCV client 112. For example, a captain of the vessel 110 may provide his/her estimation of an object to the DCV client 112 via the user interface.
  • the DCV client 112 may be configured to communicate with a DCV server 130 to exchange information about the detected objects, environmental conditions, or the like.
  • the DCV server 130 may provide the DCV client 112 with corrected and/or additional information about the detected object. Communication between the DCV client 112 and the DCV server 130 may be implemented in any suitable manner, for example over a satellite or radio link.
  • the DCV server 130 may comprise one or more communication interfaces for receiving data from a plurality of data sources.
  • the data sources may comprise the vessel 110, the one or more other vessels 116, one or more sensor stations 120, or one or more other data sources 140.
  • the plurality of vessels 110, 116 may comprise vessel(s) registered to the DCV server 130 or vessel(s) not registered to the DCV server 130.
  • the DCV server 130 may receive information from the vessels 110, 116 over a radio link, for example over a VHF (very high frequency) channel.
  • the information may be transmitted by the vessels 110, 116 for example based on the automatic identification system (AIS).
  • the one or more sensor stations 120 may comprise any suitable type of sensors for acquiring information about vessels 110, 116 or other object(s) 118 within the range of the sensor stations 120.
  • the sensor stations 120 may comprise camera(s), microphone(s), onshore or offshore radar(s), lidar(s), or the like.
  • the camera(s) may comprise visible light camera(s), infrared camera(s), or in general wide-spectrum camera(s) capable of capturing electromagnetic radiation beyond the wavelength range of visible light.
  • the sensor stations 120 may be provided at fixed locations at a sufficient distance from expected routes of the vessels 110, 116.
  • Sensor stations 120 may also be mobile; for example, any of the different types of sensors may be provided on a drone.
  • the sensor stations 120 may provide data to the DCV server 130 over any suitable communication interface, for example a wired interface, a wireless local area network connection such as for example Wi-Fi, or over an internet connection provided for example by a cellular radio access network (RAN) node 124, which may provide cellular data services for example in accordance with the 4G LTE standard specified by the 3rd Generation Partnership Project (3GPP).
  • the DCV server 130 may be implemented as a cloud service.
  • the maritime navigation control system 100 may further comprise at least one satellite 122, which may be for example configured to receive AIS data from the vessels 110, 116 and provide the AIS data to an AIS server.
  • the AIS data may comprise a maritime mobile service identity (MMSI) of a vessel 110, 116, a navigation status (e.g. at anchor, cruising with/without engine etc.), speed and/or course over ground, heading, vessel type, vessel dimensions, draught, or the like.
  • the satellite 122 may also operate as a sensor station. For example, images captured by the satellite 122 may be provided to the DCV server 130.
  • the maritime navigation control system 100 may further comprise other data sources 140.
  • an AIS receiver or server may operate as another data source by providing AIS information of the vessels 110, 116 to the DCV server 130.
  • Other data sources 140 may also include external sources of traffic data, such as for example a vessel traffic service (VTS).
  • Information received from the vessels 110, 116 or the other data sources 140 may be used as verification information for verifying information locally detected at vessel 110 and/or to estimate detection capabilities of the vessel 110.
  • the DCV server 130 may update navigation control parameters of the vessel 110. The same approach may be applied also to the other vessels 116.
  • the DCV server 130 may be autonomous or it may be operated or supervised by a user 131.
  • user 131 may be enabled to monitor and/or modify the navigation control parameters determined by the DCV server 130.
  • a user may be requested to accept navigation control data determined or updated by the DCV server 130 before transmission to the vessels 110, 116.
  • FIG. 2 illustrates an example embodiment of an apparatus 200 configured to practice one or more example embodiments.
  • the apparatus 200 may comprise for example a computing device such as for example a server device, a client device, a mobile phone, a tablet computer, a laptop, a sensor station 120 or the like.
  • apparatus 200 may comprise a vehicle, such as for example the vessel 110, or a device integrated into or associated with a vehicle.
  • While apparatus 200 is illustrated as a single device, it is appreciated that, wherever applicable, functions of apparatus 200 may be distributed to a plurality of devices.
  • the apparatus 200 may comprise at least one processor 202.
  • the at least one processor may comprise, for example, one or more of various processing devices, such as for example a co-processor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the apparatus may further comprise at least one memory 204.
  • the memory may be configured to store, for example, computer program code or the like, for example operating system software and application software.
  • the memory may comprise one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination thereof.
  • the memory may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices, or semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
  • Apparatus 200 may further comprise a communication interface 208 configured to enable the apparatus 200 to transmit and/or receive information to/from other apparatuses.
  • the communication interface 208 may be configured to provide at least one wireless radio connection, such as for example a 3GPP mobile broadband connection (e.g. 3G, 4G, 5G).
  • the communication interface may be configured to provide one or more other types of connections, for example a wireless local area network (WLAN) connection such as for example standardized by the IEEE 802.11 series or the Wi-Fi Alliance; a short range wireless network connection such as for example a Bluetooth, NFC (near-field communication), or RFID connection; a wired connection such as for example a local area network (LAN) connection, a universal serial bus (USB) connection or an optical network connection, or the like; or a wired Internet connection.
  • Communication interface 208 may comprise, or be configured to be coupled to, at least one antenna to transmit and/or receive radio frequency signals.
  • One or more of the various types of connections may be also implemented as separate communication interfaces, which may be coupled or configured to be coupled to a plurality of antennas.
  • the communication interface may also comprise an internal communication interface within a system, such as for example a data bus, for example within a navigation control module of the vessel 110.
  • the apparatus 200 may further comprise a user interface 210 comprising an input device and/or an output device.
  • the input device may take various forms, such as a keyboard, a touch screen, or one or more embedded control buttons.
  • the output device may for example comprise a display, a speaker, a vibration motor, or the like.
  • some component or components of the apparatus, such as for example the at least one processor and/or the memory, may be configured to implement this functionality.
  • this functionality may be implemented using program code 206 comprised, for example, in the memory 204.
  • the apparatus comprises a processor or processor circuitry, such as for example a microcontroller, configured by the program code when executed to execute the embodiments of the operations and functionality described.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), System-on-a-Chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and Graphics Processing Units (GPUs).
  • the apparatus comprises means for performing at least one method described herein.
  • the means comprises the at least one processor, the at least one memory including program code configured to, when executed by the at least one processor, cause the apparatus to perform the method.
  • FIG. 3 illustrates an example of improving reliability of local detection by satellite and shore-side detections, according to an example embodiment.
  • the vessel 110 may be navigating in an area 310 comprising also another vessel 116.
  • the vessel 110 may be equipped with sensors for performing local detections, for example a GNSS receiver to determine the location of the vessel 110 based on signals received from GNSS satellites 320, and a radar and a camera to detect the other vessel 116, or the like.
  • a probability for detecting the other vessel 116 with sufficient accuracy, for example with respect to location, direction, size, or type of the other vessel 116, may be 50 %.
  • the vessel 110 may comprise a DCV client 112, which may communicate with a DCV cloud 330, for example by open-sea connectivity via a satellite 322 or short-sea connectivity via an access point or a base station 324.
  • Open-sea, or alternatively deep-sea, may refer to a region beyond the range of shore-side communication stations. It is however understood that satellite communication may be applied also within the range of short-sea connectivity.
  • Sensor station 326 may perform shore-side detections in the area 310 and communicate information of detected objects to the DCV cloud 330.
  • the probability of detecting the other vessel 116 with sufficient accuracy by shore-side detection is 50 %.
  • Satellite 328 operating as another sensor station may perform satellite detections in the area 310, for example by a camera, and communicate information of detected objects to the DCV cloud 330.
  • the probability of detecting the other vessel 116 with sufficient accuracy by satellite detection is 20 %.
  • the DCV client 112 may receive the locally detected information from the sensor(s) of the vessel 110 and deliver the locally detected information to the DCV cloud 330, for example over open-sea connectivity or short-sea connectivity.
  • the probability of detecting the other vessel 116 with sufficient accuracy is higher than each individual probability. For example, using the multiple information sources, the probability of detecting the other vessel 116 with sufficient accuracy may be 90 %.
  • the different detections may be also fused at the DCV cloud 330 to improve accuracy of the detections.
  • the DCV cloud 330 may communicate the received or fused information to the DCV client 112 and thereby the probability of sufficient detection, when relying on the multiple available detections, is increased to 90 % also at vessel 110. Accuracy of detecting location of the vessel 110 by GNSS, or other positioning system, may be improved in a similar fashion.
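  • As a back-of-the-envelope check: if, purely for illustration, the shipboard, shore-side, and satellite detections were statistically independent, the probability that at least one source detects the other vessel 116 would be

$$P = 1 - \prod_{i}(1 - p_i) = 1 - (1 - 0.5)(1 - 0.5)(1 - 0.2) = 0.8,$$

i.e. already 80 % before any fusion; fusing the detections at the DCV cloud 330, as described above, may raise the effective probability further, for example to the 90 % mentioned.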
  • the detection model 114 may underestimate a size of the other vessel 116.
  • the DCV cloud 330 may already be aware of the size, heading, and/or speed of the other vessel 116 with a high confidence level. This information may be communicated to the vessel 110 to compensate for the limited local detection capability.
  • the DCV client 112 is enabled to verify and/or improve its understanding of the current situation.
  • FIG. 4 illustrates an example of a neural network, according to an example embodiment.
  • a neural network may comprise a computation graph with a plurality of layers.
  • neural network 400 may comprise an input layer, one or more hidden layers, and an output layer. Nodes of the input layer, i_1 to i_n, may be connected to one or more of the m nodes of the first hidden layer, n_11 to n_1m. Nodes of the first hidden layer may be connected to one or more of the k nodes of the second hidden layer, n_21 to n_2k. It is appreciated that even though the example neural network 400 illustrates two hidden layers, a neural network may apply any number and any type of hidden layers. Neural network 400 may further comprise an output layer.
  • Nodes of the last hidden layer, in the example of FIG. 4 the nodes of the second hidden layer, may be connected to one or more nodes of the output layer, o_1 to o_j. It is noted that the number of nodes may be different for each layer of the network.
  • a node may be also referred to as a neuron, a computation unit, or an elementary computation unit.
  • FIG. 5 illustrates an example of an elementary computation unit, according to an example embodiment.
  • the elementary computation unit may comprise a node 501, which may be configured to receive one or more inputs, a_1 to a_n, from one or more nodes of one or more previous layers and compute an output based on the input values received.
  • the node 501 may also receive feedback from one or more nodes of one or more subsequent layers.
  • Inputs may be associated with parameters to adjust the influence of a particular input on the output. For example, weights w_1 to w_n associated with the inputs a_1 to a_n may be used to multiply the input values a_1 to a_n.
  • the node 501 may be further configured to combine the inputs into an output, or an activation.
  • the node 501 may be configured to sum the modified input values.
  • a bias or offset b may be also applied to add a constant to the combination of modified inputs.
  • Weights and biases may be learnable parameters. For example, when the neural network is trained for a particular task, the values of the weights and biases associated with different inputs and different nodes may be updated such that an error associated with performing the task is reduced to an acceptable level.
  • an activation function f() may be applied to control when and how the node 501 provides the output. The output may be provided to nodes of one or more following layers of the network, and/or to one or more nodes of one or more previous layers of the network.
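  • In other words, the output, or activation, of the node 501 may be expressed as

$$o = f\Big(\sum_{i=1}^{n} w_i a_i + b\Big),$$

where a_1 to a_n are the inputs, w_1 to w_n the corresponding weights, b the bias, and f(·) the activation function.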
  • a neural network may be characterized by the structure of the network and/or values of the weights.
  • a neural network may be therefore adapted to different environments, or domains, by modifying the weights.
  • the detection model 114 deployed at the vessel 110 may be updated to meet local requirements by updating at least a subset of the weights.
  • the local detection model may have been trained with local training data and therefore it may provide better detection capability at a particular local region.
  • FIG. 6 illustrates an example of an artificial intelligence based detection model, according to an example embodiment.
  • the detection model 600 comprises a convolutional image classification network.
  • a convolutional layer may perform convolutional operations to extract information from input data, for example an image 602, to form a plurality of feature maps 606.
  • a feature map may be generated by applying a filter or a kernel to a subset of input data, for example block 604 in image 602, and sliding the filter through the input data to obtain a value for each element of the feature map.
  • the filter may comprise a matrix or a tensor, which may be for example multiplied with the input data to extract features corresponding to that filter.
  • a plurality of feature maps may be generated based on applying a plurality of filters.
  • a further convolutional layer may take as input the feature maps from a previous layer and apply the same filtering principle on the feature maps 606 to generate another set of feature maps 608. Weights of the filters may be adapted to different domains, similar to parameters of neural network 400. Similar to node 501, an activation function may be applied to the output of the filter(s).
  • the convolutional neural network may further comprise one or more other types of layers, such as for example fully connected layers 610, after and/or between the convolutional layers. An output may be provided by an output layer 612, which in this example comprises a classification layer.
  • Outputs of the detection model 600 may comprise a probability distribution, for example a vector 614, where each element of the vector 614 may indicate a probability of the input image belonging to a particular class, such as for example the classes of cargo ships, yachts, canoes, or the like.
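  • A minimal sketch of such a convolutional classifier, assuming PyTorch; the layer sizes, input resolution, and example classes are illustrative, not specified by the disclosure:

```python
# Minimal illustrative sketch of a convolutional image classifier in the
# spirit of detection model 600; assumes PyTorch.
import torch
import torch.nn as nn

CLASSES = ["cargo ship", "yacht", "canoe"]        # example classes from the text

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # filters slide over image 602 -> feature maps 606
    nn.ReLU(),                                    # activation applied to filter outputs
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # further feature maps 608 from feature maps 606
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(CLASSES)),        # fully connected layer 610
    nn.Softmax(dim=1),                            # classification output layer 612
)

image = torch.rand(1, 3, 64, 64)                  # dummy 64x64 RGB input image
probabilities = model(image)                      # probability vector 614, one value per class
```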
  • the detection model(s) 114 may operate on different types of sensor data, for example image data, video data, radar data, lidar data, sonar data, or the like, and be configured for various tasks such as for example, detecting an object, a size of the object, a distance to the object, a location of the object, a classification or a type of the object, a speed and/or a heading of the object, or the like.
  • the detection model(s) 114 may be purely deterministic or comprise deterministic algorithms in addition to artificial intelligence. For example, objects and/or their properties may be determined based on technologies such as computer vision, pattern recognition, sensor signal processing, or the like.
  • FIG. 7 illustrates an example of a sequence diagram for determining a detection capability indicator for a vessel, according to an example embodiment.
  • the DCV client 112 may register at the DCV server 130.
  • the DCV client 112 may for example transmit a DCV registration request to the DCV server 130.
  • the DCV registration request may comprise information identifying the DCV client 112 and/or the vessel 110.
  • the DCV registration request may comprise information of the vessel 110 or environmental context information of the vessel 110.
  • the registration request may comprise an identity of the vessel 110, an indication of sensor(s) available at the vessel 110 and/or types of the sensor(s), an indication of detection model(s) 114 and/or type(s) of detection model(s) 114 available at the vessel 110.
  • the registration request may comprise environmental context information such as for example a characterization of current weather conditions, visibility, illuminance, wind speed, wave height, time of day, or the like.
  • the registration request may be transmitted as one or more messages. Alternatively, similar information may be provided in other control message(s) during or after the DCV registration 701.
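  • Purely as an illustration, such a registration request might carry fields like the following; the disclosure does not define a concrete message format, so all field names and values here are assumptions:

```python
# Hypothetical content of a DCV registration request (701); field names and
# values are illustrative only.
dcv_registration_request = {
    "client_id": "dcv-client-0001",            # identifies the DCV client 112
    "vessel_id": "MMSI-230123456",             # identifies the vessel 110
    "sensors": ["camera", "radar", "lidar"],   # sensor(s) available at the vessel
    "detection_models": ["object_detector"],   # detection model(s) 114 and/or their types
    "environment": {                           # environmental context information
        "weather": "light rain",
        "visibility_km": 4.0,
        "illuminance_lux": 50,
        "wind_speed_mps": 7.5,
        "wave_height_m": 1.2,
        "time_of_day": "night",
    },
}
```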
  • the DCV server 130 may register the DCV client 112 and/or the vessel 110, for example in response to receiving the DCV registration request.
  • the DCV server 130 may respond with a DCV registration response to confirm the registration to DCV client 112.
  • the DCV server 130 may further store the received information associated with the vessel 110 and/or the environmental context information.
  • the environmental context information may be associated with a time stamp, for example the time of receiving the environmental context information from the DCV client 112.
  • the DCV server 130 may transmit a verification dataset to the DCV client 112.
  • the DCV client 112 may receive the verification dataset from the DCV server 130.
  • the verification dataset may comprise at least one object.
  • the object may be provided as a virtual test object to evaluate detection capability of the detection model 114.
  • the verification dataset may for example comprise emulated or simulated sensor data for at least one sensor of the vessel 110, for example a camera, a lidar, a radar, a sonar, or the like.
  • the verification dataset may comprise virtual computer generated content or real content captured or recorded in advance, for example by sensor(s) of other vessels 116.
  • the verification data may also comprise a combination of real and virtual content, for example one or more virtual test objects embedded within the real content.
  • the verification dataset may be associated with particular environmental conditions, for example to test detection capability of the vessel 110 in the particular conditions.
  • the verification dataset may further comprise one or more ground-truth labels for the at least one object included in the verification dataset, for example to enable the DCV client 112 to estimate detection capability of the vessel 110 or to retrain or fine-tune the detection model 114 locally at the vessel 110.
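  • For illustration only, one entry of such a verification dataset could be structured as follows; the format and field names are assumptions, not taken from the disclosure:

```python
# Hypothetical structure of a verification dataset entry; sensor data may be
# emulated, recorded in advance, or a mix of real and virtual content.
verification_entry = {
    "sensor": "camera",
    "data": "frames/foggy_night_0042.png",   # emulated or pre-recorded sensor data
    "environment": {"visibility_km": 0.5, "time_of_day": "night"},
    "ground_truth": [                        # labels for the embedded test object(s)
        {"class": "cargo ship", "location": (60.151, 24.955), "heading_deg": 85},
    ],
}
```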
  • If the vessel 110 comprises a plurality of detection models, a respective plurality of verification datasets may be transmitted to the DCV client 112.
  • detection models 114 configured to detect speed and type of an object may use the same video data as the verification dataset.
  • the DCV server 130 may select the verification dataset(s) based on the indication of the sensor(s) and/or detection model(s) 114 received from the DCV client 112, for example as part of the DCV registration 701.
  • the DCV server 130 may further request the DCV client 112 to execute the detection model(s) 114 based on the verification dataset(s).
  • a verification may be based on known real objects in proximity of the vessel 110 and therefore provision of the verification dataset(s) may be optional.
  • the verification may be based on fixed objects with known locations, but also based on dynamic objects reported to the DCV server 130 by a plurality of vessels 110, 116 or other data sources 140 with sufficient confidence level such that the dynamic objects can be used as reference objects for verification purposes.
  • the DCV server 130 may request the DCV client 112 to execute the detection model 114 based on real sensor data.
  • the request may for example indicate a location and/or a sensing direction for performing the detections. If there are multiple detection models 114, the DCV server 130 may indicate a plurality of locations and/or sensing directions associated with respective sensors and/or detection models 114.
  • the locations may be near known real objects, for example sea marks or landmarks, and considered as reference locations for detecting the real objects.
  • the DCV client 112 may provide input data, for example the received verification data or real sensor data, to the detection model 114.
  • the detection model 114 may be executed to detect at least one object. Furthermore, the detection model 114 may determine information associated with the object(s) detected in the input data, for example information associated with location, heading, speed, velocity, size, or classification of the object(s), as described above.
  • the object(s) may for example comprise another vessel 116, a sea mark, and/or a landmark.
  • the detection model 114 may provide an output comprising the determined information associated with the detected object(s) to the DCV client 112.
  • the DCV client 112 may detect or retrieve location of the vessel 110.
  • the DCV client 112 may receive the location data from a positioning system of the vessel 110, for example a GNSS receiver.
  • the DCV client 112 may detect the location itself.
  • the DCV client 112 may be configured to request the location from a positioning system embedded in the mobile phone.
  • any suitable means for determining location of the vessel 110, or an apparatus associated therewith, may be used.
  • the object information and/or the location information of the vessel 110 may be provided to the DCV server 130.
  • the DCV client 112 may transmit the information associated with the at least one object or the location of the vessel 110 to the DCV server 130.
  • the DCV server 130 may receive, from the DCV client 112, the information associated with the at least one object detected by the vessel 110 and/or location of the vessel 110.
  • the DCV client 112 may further transmit, and the DCV server 130 may receive, environmental context information associated with detection of the at least one object.
  • the DCV client 112 may indicate weather conditions, time of day, illuminance, visibility, or other environmental context information associated with capture of the sensor data that was input to the detection model 114.
  • the environmental context information may comprise any information that is relevant for reliability of any detection.
  • the DCV server 130 may receive verification information from at least one sensor station 120.
  • the verification information may comprise information associated with the at least one object, which is similar to the information provided by the DCV client 112, but which has been detected by the at least one sensor station 120.
  • the DCV server 130 may receive verification information from at least one other data source 140, such as for example an AIS server or a vessel traffic system (VTS).
  • the verification information may comprise information associated with the at least one object, which is similar to the information provided by the DCV client 112, but which has been detected or gathered by the at least one other data source 140.
  • the DCV server 130 may obtain verification information associated with the at least one object, for example the vessels 110, 116 or the other object 118, from at least one data source.
  • the at least one data source may comprise at least one other vessel 116, at least one sensor station 120, a vessel traffic system, or the like.
  • a vessel traffic system may be an external traffic management function, which may maintain situational traffic information and control navigation of vessels 110, 116 within a region.
  • the verification information may be retrieved from a local data source, for example a memory of the DCV server 130, for example if the verification information comprises the verification dataset.
  • the DCV server 130 may determine a detection capability indicator for the vessel 110 based on the information associated with the at least one object or the location of the vessel 110 and the verification information.
  • the DCV server 130 may for example compare the information associated with the object(s) received from the vessel 110 and the sensor stations 120 or other data sources 140. Alternatively, or additionally, the DCV server 130 may compare the location of the vessel 110 received from the vessel 110 and location(s) of the vessel 110 received from the sensor stations 120 or the other data sources 140.
  • the DCV server 130 may for example calculate an average, or a weighted average, of the locations of the vessel 110 and determine a detection capability indicator for the vessel 110 based on a distance to the average location.
  • the DCV server 130 may determine, for example based on majority voting, which detections are correct.
  • the detection capability may then be determined, for example, as a ratio between the number of objects correctly detected by the vessel 110 and the total number of detected objects, or the number of objects that the vessel 110 should have detected.
  • the detection capability indicator may comprise an index value, ranging for example from 1 to 100, where value 100 may indicate the highest detection capability.
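  • A minimal sketch of such a ratio-based indicator on a 1 to 100 scale; the exact mapping and the handling of the empty case are assumptions:

```python
# Minimal sketch of the ratio-based detection capability indicator described
# above; the scaling to a 1-100 index is illustrative.

def detection_capability_indicator(correct: int, expected: int) -> int:
    """Ratio of objects correctly detected by the vessel to the number of
    objects the vessel should have detected, mapped to an index 1..100."""
    if expected == 0:
        return 100  # nothing to detect: no evidence of degraded capability
    return max(1, round(100 * correct / expected))

# Example: 9 of 10 reference objects detected correctly -> indicator value 90.
assert detection_capability_indicator(9, 10) == 90
```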
  • the DCV server 130 may determine the detection capability indicator for the vessel 110 based on the environmental context information. For example, if the information of the at least one object received from the DCV client 112 at 707 is associated with poor visibility, the DCV server 130 may take this condition into account when determining the detection capability indicator and give a higher detection capability to the vessel 110 than to another vessel that provides the same detection results in good visibility. This enables a detection capability indicator of the vessel 110 to be determined without extensive influence of external factors such as for example weather conditions or time of day.
  • the detection capability indicator may be for example scaled based on the current environmental context information with respect to historical data of object information detected in different environmental conditions.
  • the DCV server 130 may associate the detection capability indicator with the environmental context information. For example, instead of scaling the detection capability indicator based on current environmental conditions, the DCV server 130 may determine the detection capability indicator without taking into account the environmental context information received from the DCV client 112. However, the resulting detection capability indicator may be associated, for example in the memory of DCV server 130, with the environmental context information. This way the DCV server 130 may obtain information of detection capabilities of each vessel 110, 116 in different environmental conditions. For example, the DCV server 130 may determine a first value of the detection capability indicator for vessel 110 for detections at night time and a second value of the detection capability indicator for detections at day time. Therefore, a currently applicable detection capability indicator of the vessel 110 may be dependent on various environmental conditions.
  • the DCV server 130 may store the detection capability indicator(s).
  • the DCV server 130 may for example store a detection capability history for one or more vessels 110, 116, or sensor(s) of the vessels 110, 116.
  • the stored detection capability history may be used to detect a trend of detection capability. For example, if a detection capability indicator decreases for a predetermined number of times and/or a predetermined amount, the DCV server 130 may transmit the DCV client a notification of decreased or decreasing detection capability.
  • the DCV client 112 may be for example provided with a warning if the determined detection capability indicator has decreased from a previous determination.
  • the DCV server 130 may send the determined detection capabilities to the DCV client 112, which may store the detection capability history and/or determine a trend of detection capability, similar to the DCV server 130.
  • the DCV client 112 may further provide a notification to user 111, for example the captain of the vessel 110, about the decreased or decreasing detection capability.
  • the verification dataset provided by the DCV server 130 may be also associated with environmental context information.
  • the emulated sensor data included in the verification dataset may simulate or be captured in particular environmental conditions such as for example at a foggy or rainy night. Therefore, the DCV server 130 may associate a detection capability indicator determined based on objects detected in the verification dataset, by the detection model 114, with the environmental context information of the verification dataset.
  • the DCV server 130 may request the DCV client 112 to execute the detection model 114 for a plurality of verification datasets corresponding to different environmental conditions. As a result, the DCV server 130 may obtain a plurality of detection capability indicators associated with different environmental context information.
  • the detection capability indicator may be associated with at least one sensor of the vessel 110.
  • the DCV server 130 may for example request the DCV client 112 to execute the detection model 114 based on input data of one or more particular sensors.
  • the DCV client 112 may indicate which sensor(s) were used to obtain input data for the detection model 114.
  • the DCV server 130 may then associate the subsequently determined detection capability indicator with indication(s) of the relevant sensor(s).
  • the DCV server 130 may use this information for example to monitor capabilities of different sensors over time and notify DCV client 112 of deterioration of detection capability, or potential contamination or malfunction of different sensors. Therefore, according to an example embodiment, the DCV server 130 may transmit a notification of a degraded performance of at least one sensor to the DCV client 112.
  • the DCV server 130 may further determine whether a value of the detection capability indicator is sufficient. For example, a vessel 110 may be required to recognize objects in the local environment, or local objects such as for example sea marks, with a sufficient reliability. In some situations, if the vessel 110 does not have sufficient detection capability, the detection capability may be improved or navigation of the vessel 110 may be restricted in order to enable safe traffic control.
  • the DCV server 130 may determine or be configured with threshold(s) for the detection capability. Different thresholds may be determined or configured for different environmental conditions, traffic situations, regions, or the like. For example, if it is a sunny day, a lower visual detection capability may be considered sufficient. If there are only a few vessels in the same region, a lower radar detection capability may be considered sufficient. If the vessel 110 is located on a difficult navigation route, higher detection capabilities may be required.
  • the DCV server 130 may transmit the detection capability indicator to the DCV client 112. This enables the DCV client 112 to be informed about the detection capability of the vessel 110, or at least one sensor of the vessel 110.
  • the DCV client 112 may receive the detection capability indicator. Alternatively, the DCV client 112 may determine the detection capability information itself, as will be further described below.
  • the detection capability indicator may be used to determine a confidence level for one or more subsequent detections. For example, if a detection capability indicator for a positioning system of the vessel 110 is low, this may be indicated as a low confidence level when reporting the location of the vessel 110 to the DCV server 130. Similarly, a confidence level may be associated with the object information reported to the DCV server 130, for example at 716. The DCV server 130 may use confidence levels associated with different detections, for example when processing the verification information. For example, detections with a low confidence level may be discarded or given a low weight. A low confidence level may also be taken into account for one or more dependent detections.
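  • One simple way to use such confidence levels, sketched here as an assumption (the disclosure does not fix a particular weighting scheme), is a confidence-weighted average that discards detections below a threshold:

```python
# Illustrative confidence-weighted fusion of reported (lat, lon) locations;
# the threshold and the weighting scheme are assumptions.

def fuse_locations(reports, min_confidence=0.2):
    """Each report is ((lat, lon), confidence). Low-confidence reports are
    discarded; the rest contribute in proportion to their confidence."""
    kept = [(loc, c) for loc, c in reports if c >= min_confidence]
    if not kept:
        return None
    total = sum(c for _, c in kept)
    lat = sum(loc[0] * c for loc, c in kept) / total
    lon = sum(loc[1] * c for loc, c in kept) / total
    return (lat, lon)

fused = fuse_locations([((60.10, 24.90), 0.9), ((60.12, 24.94), 0.6),
                        ((61.00, 25.50), 0.1)])  # low-confidence report dropped
```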
  • the navigation control data may be determined based on at least one confidence level associated with the information of the at least one object detected by the vessel 110, the location of the vessel 110, and/or the verification information.
  • the DCV server 130 may transmit model update information to the DCV client 112.
  • the model update information may comprise a local detection model.
  • the local detection model may comprise a detection model that has been designed or trained for use at the current region of the vessel 110. Providing a new local detection model enables the detection capability of the vessel 110 to be significantly improved.
  • the DCV server 130 may transmit an update to the detection model 114 of the vessel 110 or calibration data for the detection model 114 of the vessel 110.
  • An update of the detection model 114 may comprise new values for at least a subset of weights, or other parameters, of the detection model 114.
  • the update information may comprise calibration data for the detection model 114.
  • Detections performed by the vessel 110, or by particular sensor(s) of the vessel 110, may exhibit a systematic error, such as for example an offset in a location of a detected object, for example due to disruption of positioning signals.
  • the calibration data may therefore comprise data to adjust outputs of the sensor(s) and/or the detection model, for example to eliminate or reduce the systematic error. Transmission of the local detection model, the update, or the calibration data may be in response to determining that the value of the detection capability indicator is not sufficient.
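  • A minimal sketch of applying such calibration data, assuming the systematic error is a constant position offset (the disclosure only speaks of reducing a systematic error, so the offset form is an assumption):

```python
# Illustrative application of calibration data that removes a constant
# position offset from detected object locations.

def apply_calibration(detected, offset):
    """Subtract a known systematic (lat, lon) offset, e.g. one caused by
    disrupted positioning signals, from a detected position."""
    return (detected[0] - offset[0], detected[1] - offset[1])

corrected = apply_calibration((60.1523, 24.9550), (0.0011, -0.0007))
```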
  • the detection model 114 of the vessel 110 may be updated at the DCV server 130.
  • the DCV client 112 may transmit its detection model 114 to the DCV server 130.
  • the DCV server 130 may update the detection model 114 received from the DCV client 112 and provide an updated detection model to the DCV client 112 at 712.
  • the DCV server 130 may retrain or fine-tune the detection model 114 of the vessel 110 based on training data that is representative of the local conditions or local domain.
  • the DCV client 112 may update or replace the detection model 114 based on the model update information received from DCV server 130.
  • the updated detection model may be executed to detect objects, similar to 704.
  • the updated detection model 114 may provide information of the detected objects to the DCV client 112, similar to 705.
  • the DCV client 112 may transmit the object information output by the updated detection model, location information of the vessel 110, and/or other information to the DCV server 130, similar to 707.
  • the DCV client 112 may update the detection model 114 based on the verification dataset provided at 702.
  • the DCV client 112 may for example use the ground-truth labels associated with objects of the verification dataset to retrain or fine-tune the detection model 114. This enables the detection capability of the vessel 110 to be improved locally at the vessel 110. Transmission of the model update information at 712 may be therefore optional.
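  • A sketch of such local fine-tuning, assuming PyTorch and a labelled batch from the verification dataset; updating only the final layer is shown as one way of adapting a subset of the weights to the local domain:

```python
# Illustrative local fine-tuning of detection model 114 using ground-truth
# labels from the verification dataset; assumes PyTorch. The tiny stand-in
# model and the single training step are for illustration only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128), nn.ReLU(),
                      nn.Linear(128, 3))      # stand-in for detection model 114

for param in model[:3].parameters():          # freeze all but the last layer
    param.requires_grad = False

optimizer = torch.optim.Adam(model[3].parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

images = torch.rand(8, 3, 64, 64)             # emulated sensor data (702)
labels = torch.randint(0, 3, (8,))            # ground-truth labels (702)

optimizer.zero_grad()
loss = loss_fn(model(images), labels)         # error on the verification data
loss.backward()
optimizer.step()                              # update the unfrozen weights
```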
  • the DCV client 112 may transmit the determined detection capability indicator to the DCV server 130.
  • the DCV server 130 may determine a detection capability indicator based on the received object information and the verification information, for which an update may also be received from the at least one sensor station 120 and/or the at least one other data source 140.
  • Operations 712 to 717 describe updating the detection model 114 and verifying detection capability of the updated detection model. It is however noted that updating the detection model 114 is optional and other measures may be applied to ensure safe traffic management. For example, navigation of the vessel 110 may be restricted instead of updating of the detection model 114. This approach may be applied for example if the vessel 110 does not have sufficient resources, for example memory resources or computational power, to execute the updated model, or if the vessel 110 is otherwise incompatible with the updated model.
  • the DCV server 130 may determine navigation control data for the vessel 110 based on the detection capability indicator and/or a current environmental context of the vessel 110.
  • the DCV server 130 may further transmit the navigation control data to the DCV client 112.
  • the navigation control data may comprise navigation instructions or restrictions for the vessel 110.
  • the navigation control data may comprise at least one of: a restriction or a permission to enter a region, a restriction or a permission to perform automatic docking to a harbor, a requirement for piloting or remote piloting, a maximum allowed speed, a request or a command to enter a safe navigation mode, a request or a command to transmit an indication of low detection capability, or a request or a command to provide a visual indication of the low detection capability.
  • Transmitting, for example broadcasting, or providing a visual indication of the low detection capability may be used to inform other vessel(s) 116 about the low detection capability.
  • a safe navigation mode may refer to a set of predetermined navigation parameters, for example a limp state, which enables the vessel 110 to be safely routed, for example, to a closest harbor.
  • the navigation control data may further comprise an indication of an allowed navigation route and/or parameters of the allowed navigation route.
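  • For illustration, the navigation control data could be represented as a structure along these lines; the field names and types are assumptions, not taken from the disclosure:

```python
# Hypothetical representation of navigation control data (718); the fields
# mirror the items listed above.
from dataclasses import dataclass, field
from typing import Optional, List, Tuple

@dataclass
class NavigationControlData:
    region_entry_allowed: Optional[bool] = None        # restriction/permission to enter a region
    automatic_docking_allowed: Optional[bool] = None   # automatic docking to a harbor
    piloting_required: bool = False                    # piloting or remote piloting
    max_speed_knots: Optional[float] = None            # maximum allowed speed
    safe_navigation_mode: bool = False                 # request/command to enter a safe mode
    indicate_low_capability: bool = False              # transmit/visually indicate low capability
    allowed_route: List[Tuple[float, float]] = field(default_factory=list)  # route borders/waypoints
```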
  • the DCV client 112 may receive the navigation control data from the DCV server 130.
  • the DCV server 130 may take into account the current environmental context of the vessel 110 when determining the navigation control data. As discussed above, a plurality of detection capability indicators corresponding to different environmental conditions may be determined for the vessel 110. When determining the navigation control data, the DCV server 130 may use the detection capability that is associated with current environmental context information of the vessel 110, for example current visibility or intensity of rainfall. This enables the DCV server 130 to optimize traffic management with respect to efficiency and safety in different environmental conditions.
  • determining the navigation control data for the vessel 110 may be further based on detection capability indicators determined for one or more other vessels 116. For example, even if detection capability of the vessel 110 is sufficient, navigation of the vessel 110 may be more strictly controlled if there are other vessels 116 around that do not have sufficient detection capabilities.
  • the DCV server 130 may transmit the navigation control data to the DCV client 112.
  • the DCV server 130 may further transmit an indication of environmental conditions applicable for the navigation control data.
  • the DCV server 130 may transmit multiple instances of the navigation control data, where each instance of the navigation control data may correspond to different environmental conditions.
  • the DCV client 112 may determine the detection capability indicator, for example based on the ground-truth labels or other verification information associated with the detected objects.
  • the DCV client 112 may further determine the navigation control data for the vessel 110, similar to 718. This enables local verification of detection capabilities at the vessel 110.
  • the DCV client 112 may store the determined detection capability indicator or a detection capability history.
  • the DCV client 112 may further use the determined detection capability history for determining a trend of detection capability and/or notifying a user 111 or the DCV server 130 about a change or the trend in the detection capability, as described above.
  • the DCV client 112 may control navigation of the vessel 110 according to the navigation control data.
  • the DCV client 112 may provide the navigation control data to a navigation system of the vessel 110.
  • the navigation system may comprise an autonomous or a semi-autonomous navigation system such as for example an automatic steering control (ASC) system.
  • the automatic steering control system may comprise an autopilot or other means to automatically control navigation of the vessel 110.
  • the automatic steering control system may be configured to control speed and direction of the vessel 110 and have access to various vessel sub-systems such as for example steering or motor control.
  • the navigation control data may be displayed to user 111, for example the captain of the vessel 110, in order to enable the user 111 to monitor the automatic navigation or to manually navigate the vessel 110 according to the navigation control data.
  • the navigation system may configure or reconfigure its automatic steering control system based on the received navigation control data.
  • the DCV client 112 may determine the instance of navigation control data to be applied based on the current environmental context of the vessel 110. For example, in case of high illuminance the DCV client 112 may determine to use an instance of navigation control data associated with high illuminance. This enables the vessel 110 to be navigated more efficiently in good environmental conditions.
  • FIG. 8 illustrates an example of a visualization of navigation control data, according to an example embodiment.
  • the navigation control data may for example indicate borders 801, 802 of an allowed navigation route for the vessel 110.
  • the navigation control data may further indicate at least one location, for example within the allowed navigation route.
  • the at least one location may be associated with at least one navigation control parameter.
  • the at least one location may for example comprise at least one checkpoint 803 to 807 or at least one geographical zone 808.
  • a checkpoint may comprise a discrete location or a line, for example between the borders 801, 802 of the allowed navigation route.
  • the allowed navigation route, and/or parameters thereof, may be determined taking into account the detection capabilities of the vessel 110, other vessels 116 and/or their detection capabilities, and/or environmental context information such as for example current weather conditions or weather forecast information. For example, a width of the allowed navigation route may be determined based on the detection capability indicator(s). A wider allowed navigation route may be allocated to a vessel with better detection capabilities. This provides more freedom for an autonomous or semi-autonomous vessel or a captain of a manually navigated vessel to make navigation decisions.
  • the navigation control data enables the DCV server 130, or alternatively the DCV client 112, to control navigation of the vessel 110 such that current detection capability of the vessel 110 is taken into account.
  • the at least one navigation control parameter may comprise at least one of: a requested entry time at a checkpoint; a requested direction, a course over ground, and/or speed after crossing a checkpoint or at a geographical zone; or a maximum allowed speed after crossing a checkpoint or at a geographical zone.
  • a vessel with low detection capabilities may be controlled more strictly by determining more checkpoints or indicating a lower maximum allowed speed in a region, for example the zone 808, which may be close to object 118.
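  • The following sketch illustrates one hypothetical encoding of such navigation control data, covering borders 801, 802, a checkpoint such as 803, a speed-limited zone such as 808, and a route width scaled by the detection capability indicator; all field names and the scaling rule are assumptions for illustration:

```python
# Hypothetical data structure for the navigation control data of FIG. 8.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Checkpoint:
    position: Tuple[float, float]          # (lat, lon), e.g. checkpoint 803
    entry_time: Optional[str] = None       # requested entry time
    max_speed_kn: Optional[float] = None   # maximum allowed speed after crossing

@dataclass
class NavigationControlData:
    border_left: List[Tuple[float, float]]   # polyline for border 801
    border_right: List[Tuple[float, float]]  # polyline for border 802
    checkpoints: List[Checkpoint] = field(default_factory=list)
    zones: List[dict] = field(default_factory=list)  # e.g. speed-limited zone 808

def route_width(base_width_m, dcv_indicator):
    """Allocate a wider allowed route to a vessel with a better detection
    capability indicator (assumed normalised to [0, 1])."""
    return base_width_m * (0.5 + 0.5 * dcv_indicator)

ncd = NavigationControlData(
    border_left=[(60.10, 24.90), (60.12, 24.95)],
    border_right=[(60.09, 24.92), (60.11, 24.97)],
    checkpoints=[Checkpoint((60.105, 24.93), entry_time="12:30Z", max_speed_kn=10.0)],
    zones=[{"center": (60.11, 24.96), "radius_m": 300, "max_speed_kn": 4.0}],
)
print(route_width(400, dcv_indicator=0.9))  # -> 380.0
```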
  • detection capabilities of different types of vehicles may be verified during operation, maintenance, or inspection.
  • One example application is certification or standardization of autonomous vehicles, semi-autonomous vehicles, smart vehicles, or sensors, devices, or systems associated with such vehicles, for example an autonomy engine of a vehicle.
  • a laptop or other device comprising the DCV server 130 may be connected to the vehicle, and the detection model(s) 114 of the vehicle may be executed based on corresponding verification dataset(s) to verify detection capability of the vehicle or the sensor(s).
  • the DCV server 130 may provide a certification of the vehicle and/or one or more sensors, for example in response to determining that a detection capability of the vehicle and/or the sensor(s) is sufficient. The same approach may also be applied when determining whether a vehicle imported from abroad has sufficient detection capabilities for local traffic. Furthermore, updates of the detection model(s) 114 may be performed when local environmental conditions change, for example when the appearance of road signs or other navigation assets changes. In such a scenario, it may be beneficial to verify detection capabilities of vehicles before initiating large-scale detection model updates or vehicle recalls, in order to determine which vehicles need an update.
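  • A minimal sketch of such a verification run is given below, assuming a simple accuracy metric and a fixed certification threshold; a real DCV server 130 could use richer, per-sensor metrics:

```python
# Sketch of the certification check: execute the vehicle's detection model
# on a verification dataset and certify if the resulting detection
# capability exceeds a threshold. The model and metric are stand-ins.
def certify(detection_model, verification_dataset, threshold=0.9):
    correct = 0
    for sample in verification_dataset:
        detected = detection_model(sample["sensor_data"])   # emulated sensor data
        correct += int(detected == sample["ground_truth"])  # ground-truth label
    capability = correct / len(verification_dataset)
    return capability, capability >= threshold

# Example with a trivial stand-in model and a two-sample dataset
dataset = [
    {"sensor_data": "echo-A", "ground_truth": "buoy"},
    {"sensor_data": "echo-B", "ground_truth": "vessel"},
]
model = {"echo-A": "buoy", "echo-B": "vessel"}.get
print(certify(model, dataset))  # -> (1.0, True)
```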
  • a sensor manufacturer may deploy DCV client 112 in sensor modules of vehicles. For example, a radar manufacturer may provide DCV client 112 in its radar modules.
  • a DCV server 130 may be configured to determine detection capability indicator(s) for the sensor(s) equipped with the DCV client 112. Reliability of detections performed based on corresponding sensor(s) may be estimated based on the detection capability indicator(s).
  • FIG. 9 illustrates an example of a method 900 for determining a detection capability indicator at a server, according to an example embodiment.
  • the method may comprise receiving, from a client associated with a vehicle, information associated with at least one object detected by the vehicle and/or a location of the vehicle.
  • the method may comprise obtaining verification information associated with the at least one object or the location of the vehicle from at least one data source.
  • the method may comprise determining a detection capability indicator for the vehicle based on the verification information and at least one of: the information associated with the at least one object or the location of the vehicle.
  • the method may further comprise transmitting the detection capability indicator to the client.
  • the method may further comprise determining navigation control data for the vehicle based on the detection capability indicator.
  • the method may further comprise transmitting the navigation control data to the client.
  • the method may further comprise receiving, from the client, environmental context information associated with detection of the at least one object.
  • the method may further comprise determining the detection capability indicator for the vehicle based on the environmental context information.
  • the method may further comprise associating the detection capability indicator with the environmental context information.
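  • As an illustration of how the server-side determination could combine reported detections, verification information, and environmental context (the position-error metric and its 100 m normalisation below are assumptions, not part of the embodiment):

```python
# Minimal sketch of the DCV server's indicator computation: compare reported
# object locations against verification information and map the mean
# position error to a [0, 1] indicator associated with the context.
import math

def position_error_m(reported, verified):
    """Rough flat-earth distance in metres between two (lat, lon) points."""
    dlat = (reported[0] - verified[0]) * 111_320
    dlon = (reported[1] - verified[1]) * 111_320 * math.cos(math.radians(verified[0]))
    return math.hypot(dlat, dlon)

def detection_capability_indicator(reported_objects, verified_objects, context=None):
    errors = [position_error_m(r, v) for r, v in zip(reported_objects, verified_objects)]
    indicator = max(0.0, 1.0 - (sum(errors) / len(errors)) / 100.0)
    return {"indicator": round(indicator, 3), "context": context}

print(detection_capability_indicator(
    reported_objects=[(60.1001, 24.9002)],
    verified_objects=[(60.1000, 24.9000)],
    context={"illuminance": "high", "sea_state": 2},
))  # -> {'indicator': 0.843, 'context': {...}} (approximately)
```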
  • the at least one data source may comprise at least one of: at least one other vehicle, at least one sensor station, or a vehicle traffic system.
  • the at least one sensor station may comprise a camera, a radar, or a lidar.
  • the method may further comprise transmitting a verification dataset comprising the at least one object to the client.
  • the method may further comprise transmitting, to the client, for example in response to determining that a value of the detection capability indicator is not sufficient, at least one of: a local detection model; an update to a detection model of the vehicle; or calibration data for the detection model of the client.
  • the method may further comprise determining navigation control data for the vehicle based on the detection capability indicator and/or a current environmental context of the vehicle.
  • the information associated with the at least one object may comprise at least one of: a location of the at least one object, a heading of the at least one object, a speed of the at least one object, a velocity of the at least one object, a size of the at least one object, or a classification of the at least one object.
  • the at least one object may comprise at least one of: another vehicle, a sea mark, or a landmark.
  • the verification dataset may comprise emulated sensor data for at least one sensor of the vehicle.
  • the at least one sensor of the vehicle may comprise at least one of: a camera, a lidar, a radar, or a sonar.
  • the verification dataset may comprise one or more ground-truth labels for the at least one object.
  • the detection capability indicator may be associated with the at least one sensor of the vehicle.
  • FIG. 10 illustrates an example of a method 1000 for receiving a detection capability indicator at a client, according to an example embodiment.
  • the method may comprise detecting a location of a vehicle.
  • the method may comprise detecting at least one object.
  • the method may comprise determining information associated with the at least one object.
  • the method may comprise transmitting the information associated with the at least one object or the location of the vehicle to a server.
  • the method may comprise receiving, from the server, a detection capability indicator associated with the vehicle.
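  • The exchange between the DCV client 112 and the DCV server 130 could, for example, be serialized as sketched below; the wire format and all field names are hypothetical and shown for illustration only:

```python
# Hypothetical report/response payloads for the client-server exchange.
import json

report = {
    "vehicle_id": "vessel-110",
    "location": {"lat": 60.100, "lon": 24.900, "timestamp": "2020-04-08T12:00:00Z"},
    "detections": [
        {"class": "vessel", "lat": 60.105, "lon": 24.930,
         "heading_deg": 45.0, "speed_kn": 9.5, "size_m": 80},
    ],
}
print(json.dumps(report))      # sent by the DCV client 112

response = {"detection_capability_indicator": 0.87,
            "context": {"illuminance": "high"}}
print(json.dumps(response))    # returned by the DCV server 130
```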
  • the detection capability indicator may be associated with environmental context information.
  • the method may further comprise determining navigation control data for the vehicle based on the detection capability indicator and/or a current environmental context of the vehicle.
  • the information associated with the at least one object may comprise at least one of: a location of the at least one object, a heading of the at least one object, a speed of the at least one object, a velocity of the at least one object, a size of the at least one object, or a classification of the at least one object.
  • the at least one object may comprise at least one of: another vehicle, a sea mark, or a landmark.
  • the method may further comprise receiving, from the server, a verification dataset comprising the at least one object.
  • the method may further comprise executing a detection model on the verification dataset to detect the at least one object.
  • the verification dataset may comprise emulated sensor data for at least one sensor of the vehicle.
  • the at least one sensor of the vehicle may comprise at least one of: a camera, a lidar, a radar, or a sonar.
  • the verification dataset may comprise one or more ground-truth labels for the at least one object.
  • the method may further comprise determining the detection capability indicator based on comparing the information associated with the at least one object with the one or more ground-truth labels.
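  • A minimal sketch of this comparison, assuming per-object classification labels and equal weighting of all ground-truth objects (both assumptions, not defined by the embodiment):

```python
# Client-side comparison of detections against the ground-truth labels of
# the verification dataset; unmatched ground-truth objects count as misses.
def client_indicator(detections, ground_truth_labels):
    hits = sum(
        1 for obj_id, label in ground_truth_labels.items()
        if detections.get(obj_id) == label
    )
    return hits / len(ground_truth_labels)

ground_truth = {"obj-1": "vessel", "obj-2": "sea mark", "obj-3": "landmark"}
detections = {"obj-1": "vessel", "obj-2": "vessel"}  # obj-3 missed
print(client_indicator(detections, ground_truth))    # -> 0.33 (approximately)
```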
  • the detection capability indicator may be associated with the at least one sensor of the vehicle.
  • An apparatus may be configured to perform or cause performance of any aspect of the method(s) described herein.
  • a computer program may comprise instructions for causing, when executed, an apparatus to perform any aspect of the method(s) described herein.
  • an apparatus may comprise means for performing any aspect of the method(s) described herein.
  • the means comprises at least one processor and memory including program code, the at least one processor and the program code being configured to, when executed by the at least one processor, cause performance of any aspect of the method(s).
  • although subjects may be referred to as 'first' or 'second' subjects, this does not necessarily indicate any order or importance of the subjects; such attributes may be used solely for the purpose of distinguishing between subjects.

EP20168606.0A 2020-04-08 2020-04-08 Detection capability verification for vehicles Withdrawn EP3893223A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20168606.0A EP3893223A1 (de) 2020-04-08 2020-04-08 Detection capability verification for vehicles

Publications (1)

Publication Number Publication Date
EP3893223A1 true EP3893223A1 (de) 2021-10-13

Family

ID=70227927

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20168606.0A Withdrawn EP3893223A1 (de) 2020-04-08 2020-04-08 Erkennungsfähigkeitsverifizierung für fahrzeuge

Country Status (1)

Country Link
EP (1) EP3893223A1 (de)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2610636A1 (de) * 2011-12-29 2013-07-03 Windward Ltd. Bereitstellung einer nahezu Echtzeit-Meeresüberwachung anhand von Satellitenbildmaterial und externen Daten
US9183711B2 (en) * 2010-08-03 2015-11-10 Selex Sistemi Integrati S.P.A. Anti-piracy system for the maritime navigation in critical areas, and device for data extraction from on board sensors
US20180204458A1 (en) * 2014-03-04 2018-07-19 Waymo Llc Reporting Road Event Data and Sharing with Other Vehicles
US20200050893A1 (en) * 2018-08-10 2020-02-13 Buffalo Automation Group Inc. Training a deep learning system for maritime applications

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210224

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20231101