WO2021184134A1 - System for monitoring and identifying actions of objects, and real-time management of said objects based on the identified actions, enabling the detection of risk situations and increasing the safety of operation of the equipment and people involved - Google Patents

System for monitoring and identifying actions of objects, and real-time management of said objects based on the identified actions, enabling the detection of risk situations and increasing the safety of operation of the equipment and people involved Download PDF

Info

Publication number
WO2021184134A1
WO2021184134A1 (PCT/CL2020/050025)
Authority
WO
WIPO (PCT)
Prior art keywords
point
time
structural module
video
points
Prior art date
Application number
PCT/CL2020/050025
Other languages
English (en)
Spanish (es)
Inventor
Gabriel Eugenio PAIS CERNA
Original Assignee
Axion Spa
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Axion Spa filed Critical Axion Spa
Priority to PCT/CL2020/050025 priority Critical patent/WO2021184134A1/fr
Publication of WO2021184134A1 publication Critical patent/WO2021184134A1/fr

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/123Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/123Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/127Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams to a central station ; Indicators in a central station

Definitions

  • The present invention is framed within operations that require the coordination of vehicle dispatch, mainly in mining operations, although it can be used in any industry where general optimization of vehicle dispatch is required: goods-transportation industries such as mail or courier companies, as well as public transport companies such as buses or subways, or any multi-vehicle fleet that requires coordination to meet the objectives imposed by the specific industry, whether reducing operating times, increasing the transport of passengers or cargo, reducing waiting times, etc.
  • This choice should not be considered a limitation in the applications of the present invention, but should be considered as an example to represent the main elements of the real-time monitoring system that allows projecting the productivity of each monitored vehicle, this information being useful for a decision maker in different production operations.
  • The quality of information available to build a system that optimizes equipment monitoring has been complex to implement as a standard, given that at the time of this application only complex and expensive technologies exist, which do not capture said information with sufficient quality, quantity and granularity.
  • The present invention aims to develop a system that allows the acquisition of video, its wireless or cable transmission, and its analysis by computer vision algorithms for the measurement of Key Performance Indicators (KPIs) of the unit sub-operations that constitute the loading and transport operation at mining sites, as well as of the auxiliary equipment that participates in the operation.
  • KPIs: Key Performance Indicators
  • the autonomous video recording of the different operations of the site will allow accidents to be recorded.
  • This real-time record of the operation allows a decision maker to review the security situation(s) and update security protocols if necessary.
  • This autonomous video recording also allows the operation of the site to be reviewed remotely by expert personnel when the system raises alerts for safety indicators that deviate from their normal operating values.
  • WO2020000383A1 describes systems and methods for object detection aimed at achieving real-time detection with low latency.
  • The system comprises a first CPU that performs general tasks and a second CPU that handles the image frames coming from a camera and computes the data captured by it. It is further indicated that the system is capable of interpreting the data and recognizing the objects in an image.
  • WO2019203921A1 describes an object recognition system that uses integral channel feature detectors. The system extracts a candidate target region from a given image and then generates a modified confidence score based on the location and detection height of the candidate target. The candidate regions are classified using neural networks, yielding classified objects. It is stated that if the classified object is a system target, a device can be controlled based on that target.
  • WO2019162241A1 describes an object detection system based on neural networks.
  • The described system incorporates a network configured to receive a depth image formatted as RGB as input, and computes output data indicative of at least one characteristic of an object.
  • The system is configured to receive that output data and compute predictions about the location of a region in the received depth image that includes an object, together with an object class. It is further indicated that this system operates in real time.
  • US2019066304A1 describes systems and methods for the segmentation of objects detected in a view delivered as input by a camera in real time.
  • The object segmentation system works such that, upon receiving input data, the system takes images from the video camera, and the images are then processed using a machine-learning algorithm to identify or recognize one or more objects.
  • US2018189573A1 describes a system for target detection and tracking, which includes technologies to detect pedestrian and/or vehicle movements in a real-world environment, manages static and dynamic occlusions, and continues to track targets that move across the fields of view of multiple cameras.
  • US2016224837A1 describes a method and a real-time system for the recognition of objects and people's faces.
  • The system uses data from multiple videos or cameras to collect information about the location of the objects to be detected; this information is then transmitted to a web-based distributor.
  • the system is adaptive and the data is heuristically analyzed.
  • US2012243730A1 describes a collaborative object analysis system.
  • The collaborative object analysis capability allows a group of cameras to analyze an object collaboratively, even when the object is in motion. Analysis of an object can include identifying the object, tracking it while it is in motion, analyzing one or more of its characteristics, and the like.
  • A camera is configured to discover the capability information of one or more neighboring cameras and, based on said information, to generate one or more actions to be performed by those neighboring cameras to facilitate object analysis.
  • The collaborative object analysis capability also enables additional functions related to object analysis, such as alert functions, archive functions (for example, storage of captured video, object-tracking information, object-recognition information, etc.), and the like.
  • The solution consists of a system that makes it possible to monitor industrial cargo vehicles and auxiliary vehicles in real time and, in particular, to identify actions of said vehicles in order to manage and project different KPIs and adherence to the shift plan in loading and transport.
  • The invention proposes an autonomous system comprising the following components: (1) a control system, (2) a telemetry system, (3) a high-capacity image analysis system, and (4) a structural module.
  • Figure 1. This figure compares existing monitoring systems with the present invention, called BERTUS.
  • the characteristics that were evaluated were the following: origin of registration, constant monitoring of procedures, real time, constant registration, data quality, alarm for accident prevention, and complete fleet (the entire fleet is considered).
  • The structural module that houses the system, the high-capacity telemetry system and the image acquisition system comprises the following components: (a) a directional WiFi antenna that provides a 5 GHz signal; (b) a solar kit (solar panel plus deep-cycle battery); (c) a microcomputer; and (d) high-resolution IP cameras.
  • FIG. 3 This figure shows a summary diagram of the components of the present invention and how they work in two different configurations.
  • Monitoring of a zone of the mining site (primary crushing) by an IP camera located in the structural module (BERTUS cam-mobile) is shown.
  • Said primary crushing activity is in line of sight with respect to the control system; therefore, using a single structural module that has a directional antenna providing 5 GHz WiFi, it is possible to transmit the captured information to the control system.
  • the configuration of the invention is exemplified when the monitoring point does not have a clear line of sight with respect to the control system.
  • FIG. 4 This figure shows an image of the object visualization software where the identified object is observed and the different parameters of said object are shown.
  • this solution is complementary to current dispatch systems, as it is focused on monitoring and transmitting the sub-unit operations of loading, transportation and auxiliary vehicles online.
  • Some of these sub-operations are: shovel waiting time, shovel loading time at the face, loaded-shovel maneuvering time, shovel unloading time, truck departure time, truck travel time by route section, truck arrival time at destination, operation start time, rest time, number of trucks in queue and number of stoppages.
  • The sum of the times of these sub-unit operations constitutes the time of the unit operation, and the online monitoring of each of them delivers online tools to the dispatcher for making real-time decisions on how to allocate resources; they can also be integrated as input to current dispatch-recommendation software.
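The relationship just described (sub-unit operation times summing to the unit-operation time, then checked against KPI targets) can be sketched in Python. All names, durations and thresholds below are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical sub-unit operation times, in seconds. The keys mirror some of
# the sub-operations listed above; the values are invented for illustration.
SUB_OPERATIONS = {
    "shovel_waiting": 45.0,
    "shovel_loading": 120.0,
    "loaded_shovel_maneuvering": 30.0,
    "shovel_unloading": 60.0,
    "truck_departure": 15.0,
}

def unit_operation_time(sub_ops):
    """The unit-operation time is the sum of its sub-unit operation times."""
    return sum(sub_ops.values())

def is_substandard(measured, target, tolerance=0.10):
    """Flag a KPI whose measured value exceeds the target by more than 10%."""
    return measured > target * (1.0 + tolerance)

total = unit_operation_time(SUB_OPERATIONS)
print(total)                       # 270.0
print(is_substandard(total, 250))  # False (270 is within 250 * 1.10 = 275)
```

A dispatcher-facing tool would feed such aggregates, per truck and per shift, into the dispatch-recommendation software mentioned above.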
  • the technical effects expected and effectively achieved by the present invention are: (1) increased safety for people at mining sites, (2) increased effective use of loading, transportation and auxiliary vehicles, and (3) increase in the effective performance of the operation, which contributes to an increase in the production of the mining sites where said system is implemented.
  • the invention comprises a video-based real-time monitoring system comprising at least the following components:
  • High-capacity telemetry system. This component handles the transmission of large volumes of online video from the location of the camera to a control center, which must be within a radius of 2.5 km.
  • The high-capacity telemetry system and the image analysis system are housed in a structural module that has the following components: (a) a solar kit (solar panel plus deep-cycle battery); (b) a directional WiFi antenna that provides a 5 GHz signal; (c) a microcomputer; and (d) high-resolution IP cameras.
  • The structural module may be a four-wheeled cart comprising each of the elements (a), (b), (c) and (d) named above.
  • said structural module corresponds to that of figure 2 of the present application.
  • The cart can be exchanged for any other object that allows each of said components to be supported simultaneously.
  • the elements of the system necessary to carry out this activity acquire the following configuration:
  • the monitoring system must have only one control system
  • The high-capacity telemetry system is housed in a single structural module that consists of the following components: (a) a solar kit (solar panel plus deep-cycle battery); (b) a directional WiFi antenna that provides a 5 GHz signal; and (c) high-resolution IP cameras;
  • An image analysis system comprising only a microcomputer, which is connected to the high-resolution IP camera in the structural module described in point (2).
  • The system works as follows:
    a) The IP cameras capture images at a rate of at least 30 frames per second;
    b) The microcomputer processes the image information obtained at each instant during the video recording by the IP camera and generates a text file with the quantitative information of the image;
    c) The text file generated in point b) is sent to the control system through the 5 GHz WiFi connection provided by the directional antenna;
    d) In the control center, the information is integrated with the information from the other target points, and is consolidated, analyzed and projected to check whether these values are sub-standard with respect to previously established KPIs;
    e) If there are abnormal KPI values, an alarm is generated that alerts decision makers to the abnormal value and shows the video of the situation associated with said alarm;
    f) In response to this alert, the decision maker may request the viewing of the video for a period of time greater than that shown in the alert video, and may also monitor the operation online by video; and
    g) The decision maker, in response to this alert and subsequent visualization of the situation, will carry out the corresponding actions.
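The microcomputer-to-control-center flow described above (a compact per-frame text record sent over WiFi, then checked against KPI targets) can be sketched as follows. The record fields, detection labels and thresholds are assumptions made for illustration, not part of the patent:

```python
import json
import time

# Illustrative KPI target: raise an alarm if more trucks than this are queued.
KPI_TARGETS = {"trucks_in_queue": 3}

def summarize_frame(detections):
    """Step (b): reduce one frame's detections to a small quantitative record.

    Returns a JSON string, standing in for the kilobyte-sized text file the
    microcomputer would send over the 5 GHz WiFi link.
    """
    record = {
        "timestamp": time.time(),
        "trucks_in_queue": sum(1 for d in detections if d == "truck_queued"),
    }
    return json.dumps(record)

def check_alarms(record_text):
    """Steps (d)-(e): return the KPIs whose values exceed their targets."""
    record = json.loads(record_text)
    return [k for k, target in KPI_TARGETS.items() if record.get(k, 0) > target]

msg = summarize_frame(["truck_queued"] * 5 + ["shovel"])
print(check_alarms(msg))  # ['trucks_in_queue'] -> the decision maker is alerted
```

In the patented system the alarm would also surface the associated video clip; this sketch covers only the quantitative path from frame summary to alarm.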
  • the monitoring system must have only one control system
  • The high-capacity telemetry system is housed in a single structural module that consists of the following components: (a) a solar kit (solar panel plus deep-cycle battery); (b) a directional WiFi antenna that provides a 5 GHz signal; and (c) high-resolution IP cameras;
  • A second structural module that consists of the following components: (a) a solar kit (solar panel plus deep-cycle battery) and (b) a directional WiFi antenna that provides a 5 GHz signal.
  • The IP cameras capture images at a rate of at least 30 frames per second;
  • The microcomputer processes the image information obtained at each instant by the IP camera and generates a text file with a size on the order of kilobytes;
  • The information is integrated with the information from the other target points, and is consolidated, analyzed and projected to check whether these values are sub-standard with respect to previously established KPIs;
  • In response to this alert, the decision maker may request the viewing of the video for a period of time greater than that shown in the alert video, and may also monitor the operation online by video;
  • the monitoring system must have only one control system
  • The high-capacity telemetry system is housed in a single structural module that consists of the following components: (a) a solar kit (solar panel plus deep-cycle battery); (b) a directional WiFi antenna that provides a 5 GHz signal; and (c) high-resolution IP cameras;
  • The IP cameras capture images at a rate of at least 30 frames per second;
  • The microcomputer processes the image information obtained at each instant by the IP camera and generates a text file with a size on the order of kilobytes;
  • A structural module that is at a distance less than or equal to 2.5 km in a straight line, without obstacles in its path, sends the text file obtained in point b) to the control system through the 5 GHz WiFi connection provided by the directional antenna of said structural module;
  • In response to this alert, the decision maker may request the viewing of the video for a period of time greater than that shown in the alert video, and may also monitor the operation online by video;
  • The actions to be identified for the mining objects are the following: shovel waiting time, shovel loading time at the face, loaded-shovel maneuvering time, shovel unloading time, truck exit time, truck travel time per route section, truck arrival time at destination, operation start time, rest time, number of trucks in queue and number of stoppages, among others.
  • The safety-related actions to be identified are the following: equipment in a prohibited area, person in a prohibited place, equipment in a prohibited place, person near equipment, among others.
  • The structural modules are as many as the job requires, and can number between 1 and 1,000.
  • The time between when an image is taken and its arrival at the dispatch and control center is between 0.01 and 120 seconds.
  • said time is preferably between 0.01 second and 15 seconds.
  • the system described in the present invention makes it possible to simultaneously identify a plurality of different objects.
  • The radioelectric spectrum is divided into frequency bands that are licensed to third parties for their exclusive use in order to minimize unnecessary interference.
  • Certain bands are released for general use. For example, the 2.4 GHz and 5 GHz bands are released almost everywhere in the world and are used by the IEEE 802.11 standards widely adopted by the Wi-Fi Alliance. These include IEEE 802.11n at 2.4 GHz, which reaches speeds of 300 Mbps, and IEEE 802.11ac at 5 GHz, which achieves theoretical speeds of 1.3 Gb/s. Both standards have been implemented in a multitude of equipment, with 802.11ac being the less crowded today and the one with less radio-noise interference.
  • Computer vision is defined as all the methods for acquiring, processing and analyzing images, or series of images, with the aim of producing useful information for decision-making.
  • Computer vision is used, among other applications, to autonomously control robots: drones, autonomous cars, mechanical arms, among others.
  • the identification of objects and the detection of events can be highlighted.
  • Object identification is based on locating specific objects in an image or video.
  • Image filtering, where filters are used to eliminate unnecessary noise (Gaussian, median or bilateral filters) or to recover certain important characteristics (contour detectors or other features). Depending on the complexity of the object or the environment, in some cases it is possible to identify the object using only successive filters and techniques based on the appearance and characteristics of the object.
  • Algorithm training, where many images with and without the object are required to train a classifier that can distinguish when the object is in the scene. The images used in this way must be previously labeled with the object.
  • Using the classifier: once trained, the classifier can be applied to new images outside the training set, obtaining the location of the object or objects in the image. It is also possible to obtain a particular state of the object under study; for example, the state of deterioration of a car or the model to which it belongs.
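As a rough illustration of the image-filtering step described above, the following pure-Python sketch applies a 3x3 median filter, one of the filters named (Gaussian, median, bilateral), to remove an isolated noise pixel. A production system would use a computer-vision library; this version only demonstrates the idea on a tiny grayscale image represented as a list of lists:

```python
def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3 neighbourhood.

    Median filtering removes impulse ("salt-and-pepper") noise while
    preserving edges better than simple averaging. Border pixels are copied
    unchanged for simplicity.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(
                img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            )
            out[y][x] = window[4]  # median of the 9 neighbourhood values
    return out

noisy = [
    [10, 10, 10, 10],
    [10, 255, 10, 10],  # 255 is an isolated noise pixel
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
clean = median_filter_3x3(noisy)
print(clean[1][1])  # 10 -> the impulse noise is gone
```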
  • Event detection is a technique that makes it possible to recognize when a certain event occurs in time, for example, whether the object went out of view range or stopped somewhere.
  • The methods usually used are based on calculating distance measurements between pixels of two or more images over time and creating motion vectors that can provide data on the event to be monitored.
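A minimal sketch of this event-detection idea follows: the distance between consecutive frames is measured as the mean absolute per-pixel difference, and the object is declared "stopped" once that distance stays below a threshold for several frames in a row. Frames are 1-D pixel lists for brevity, and the threshold values are illustrative assumptions:

```python
def frame_distance(a, b):
    """Mean absolute per-pixel difference between two frames."""
    return sum(abs(p - q) for p, q in zip(a, b)) / len(a)

def detect_stop(frames, threshold=2.0, still_frames=3):
    """Return the frame index at which the scene has been still for
    `still_frames` consecutive steps, or None if no stop is detected."""
    still = 0
    for i in range(1, len(frames)):
        if frame_distance(frames[i - 1], frames[i]) < threshold:
            still += 1
        else:
            still = 0
        if still >= still_frames:
            return i
    return None

moving = [[0, 0, 0], [50, 0, 0], [0, 50, 0]]   # pixels change: motion
stopped = [[0, 50, 0]] * 4                     # identical frames: no motion
print(detect_stop(moving + stopped))  # 5 -> stop confirmed at frame index 5
```

The same structure generalizes to "object left the field of view" by applying the distance measure only inside the object's tracked region.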
  • High-capacity telemetry in an industrial environment. Wireless communication between the camera and the processing center requires an understanding of electromagnetic waves and digital communications. Communication must be continuous and high-bandwidth (~60 Mb/s) in hostile terrain (meaning line of sight is not always available), which represents a challenge in terms of calculating the link budget, choosing equipment and choosing the network architecture (taking into account that it must be expandable to more cameras).
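To illustrate the link-budget calculation mentioned above, a back-of-the-envelope free-space path loss estimate can be made for a 5 GHz link at the 2.5 km maximum range the system specifies. The transmit power and antenna gains below are assumptions for illustration, not values from the patent, and real terrain would add losses on top of free-space conditions:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55,
    with distance in metres and frequency in Hz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def received_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz):
    """Simplified link budget: TX power plus antenna gains minus path loss."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)

loss = fspl_db(2500, 5e9)  # ~114.4 dB over the 2.5 km maximum range
# Assumed 20 dBm transmitter with 23 dBi directional antennas on both ends:
rx = received_power_dbm(20, 23, 23, 2500, 5e9)
print(round(loss, 1), round(rx, 1))  # 114.4 -48.4
```

A receive level around -48 dBm leaves a comfortable margin over typical 802.11ac receiver sensitivities, which is consistent with using high-gain directional antennas for this hop; fade margins and obstruction losses would still need to be budgeted in practice.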
  • Image analysis. In computer vision there is a great challenge in analyzing the main characteristics of the images and programming the algorithms necessary to identify the objects and events to be tracked. Many of these algorithms need particular adjustments depending on the characteristics of the problem to be solved. Identifying these characteristics and implementing them is a task that takes time. Additionally, some techniques require a training phase where the system must be fed with pre-processed data; the preparation of this data is also a difficulty to take into account.
  • The main areas that must be mastered are: image processing and analysis, machine learning, and event identification.
  • the differentiating attributes are as follows:
  • Video information from each loading point and transport route in the mine facilitates risk prevention and accident analysis for the improvement and updating of work procedures.
  • Variable 2: robustness of the signal, that is, the percentage of data received in relation to the data sent.
  • the present invention has application in all industries that require optimization in the dispatch of transport equipment.
  • The present invention has been described with a particular focus on the mining industry, without this implying that the applicant waives the other applications that the invention may have, such as, for example, the dispatch of fleets of buses, airplanes, ships, trucks, or other forms of transportation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Transportation (AREA)
  • Economics (AREA)
  • Automation & Control Theory (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Mechanical Engineering (AREA)
  • Quality & Reliability (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)

Abstract

The solution consists of a system for monitoring industrial loading vehicles and auxiliary vehicles in real time and, in particular, for identifying actions of said vehicles in order to manage and project different KPIs and adherence to the shift plan with regard to loading and transport. In view of the above, the invention relates to an autonomous system comprising the following components: (1) a control system, (2) a telemetry system, (3) a high-capacity image analysis system, and (4) a structural module.
PCT/CL2020/050025 2020-03-19 2020-03-19 Système de surveillance et d'identification d'actions d'objets ; et gestion en temps réel desdits objets sur la base des actions identifiées, qui permet la détection de situations de risque augmentant la sécurité du fonctionnement des équipements et des personnes impliquées WO2021184134A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CL2020/050025 WO2021184134A1 (fr) 2020-03-19 2020-03-19 Système de surveillance et d'identification d'actions d'objets ; et gestion en temps réel desdits objets sur la base des actions identifiées, qui permet la détection de situations de risque augmentant la sécurité du fonctionnement des équipements et des personnes impliquées

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CL2020/050025 WO2021184134A1 (fr) 2020-03-19 2020-03-19 Système de surveillance et d'identification d'actions d'objets ; et gestion en temps réel desdits objets sur la base des actions identifiées, qui permet la détection de situations de risque augmentant la sécurité du fonctionnement des équipements et des personnes impliquées

Publications (1)

Publication Number Publication Date
WO2021184134A1 true WO2021184134A1 (fr) 2021-09-23

Family

ID=77769506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CL2020/050025 WO2021184134A1 (fr) 2020-03-19 2020-03-19 Système de surveillance et d'identification d'actions d'objets ; et gestion en temps réel desdits objets sur la base des actions identifiées, qui permet la détection de situations de risque augmentant la sécurité du fonctionnement des équipements et des personnes impliquées

Country Status (1)

Country Link
WO (1) WO2021184134A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201829A1 (en) * 2009-02-09 2010-08-12 Andrzej Skoskiewicz Camera aiming using an electronic positioning system for the target
US20140244096A1 (en) * 2013-02-27 2014-08-28 Electronics And Telecommunications Research Institute Apparatus and method for cooperative autonomous driving between vehicle and driver
US20170139411A1 (en) * 2015-11-16 2017-05-18 Polysync Technologies, Inc. Autonomous Vehicle Platform and Safety Architecture
CL2018001513A1 (es) * 2015-12-15 2018-10-19 Freeport Mcmoran Inc Vehicle-speed-based analysis
CL2018003557A1 (es) * 2018-12-10 2019-02-15 Axion Spa Real-time monitoring and dispatch system for equipment, enabling the detection of risk situations and increasing the safety of the operation of equipment and people involved.


Similar Documents

Publication Publication Date Title
US10489982B2 (en) Device, system and method for controlling a display screen using a knowledge graph
US10552687B2 (en) Visual monitoring of queues using auxillary devices
KR102260120B1 (ko) Deep learning-based behavior recognition apparatus and driving method thereof
DE112018006556T5 (de) Training a machine learning model with digital audio and/or video
CN103280108B (zh) Bus safety early-warning system based on visual perception and the Internet of Vehicles
Zear et al. Intelligent transport system: A progressive review
Narzt et al. Be-in/be-out with bluetooth low energy: Implicit ticketing for public transportation systems
CN110035390B (zh) Tunnel safety management method, device and system based on a UWB-positioning dynamic fence
DE102013019488A1 (de) Image capture with privacy protection
US11669909B1 (en) Vehicle inspection systems and methods
JP2014176092A5 (fr)
CN110602449A (zh) Vision-based intelligent monitoring system and method for construction safety in large scenes
US20190073735A1 (en) Enhanced alert/notification system for law enforcement identifying and tracking of stolen vehicles and cargo
DE112018007122T5 (de) Providing availability indicators in a shared-vehicle environment
CN107045807A (zh) On-ground vehicle collision avoidance using shared vehicle hazard sensor data
DE112015000123T5 (de) System for sharing information between pedestrians and a driver
Garibotto et al. White paper on industrial applications of computer vision and pattern recognition
Badura et al. Intelligent traffic system: Cooperation of MANET and image processing
Kumar et al. Convergence of IoT, Blockchain, and Computational Intelligence in Smart Cities
CN117135172A (zh) IoT-based smart city accident rescue method and system
US20230334675A1 (en) Object tracking integration method and integrating apparatus
Mahale et al. Vehicle and passenger identification in public transportation to fortify smart city indices
Reis et al. Network management by smartphones sensors thresholds in an integrated control system for hazardous materials transportation
WO2021184134A1 (fr) System for monitoring and identifying actions of objects, and real-time management of said objects based on the identified actions, enabling the detection of risk situations and increasing the safety of operation of the equipment and people involved
KR101714723B1 (ko) Method and service server for providing passenger density information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925400

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20925400

Country of ref document: EP

Kind code of ref document: A1