US20240174274A1 - Obstacle detection for a rail vehicle - Google Patents


Info

Publication number
US20240174274A1
Authority
US
United States
Prior art keywords
rail vehicle
objects
determining
data
track
Prior art date
Legal status
Pending
Application number
US18/552,489
Inventor
Andrew Palmer
Egzone Ademi
Christian Wendt
Wolfgang WAIZENEGGER
Andreas Wiratanaya
Christoph Zimmermann
Jost Schnee
Jonathan Pilborough
Current Assignee
HAYS AG
Siemens Mobility GmbH
Original Assignee
HAYS AG
Siemens Mobility GmbH
Priority date
Filing date
Publication date
Application filed by HAYS AG and Siemens Mobility GmbH
Assigned to HAYS AG. Assignors: WAIZENEGGER, Wolfgang
Assigned to ABIDAT GMBH. Assignors: ADEMI, Egzone; WENDT, Christian
Assigned to Siemens Mobility GmbH. Assignors: PILBOROUGH, Jonathan; WIRATANAYA, Andreas; SCHNEE, Jost; PALMER, Andrew; ZIMMERMANN, Christoph
Assigned to Siemens Mobility GmbH. Assignor: HAYS AG
Assigned to Siemens Mobility GmbH. Assignor: ABIDAT GMBH
Publication of US20240174274A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B61: RAILWAYS
    • B61L: GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00: Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04: Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
    • B61L23/041: Obstacle detection
    • B61L15/00: Indicators provided on the vehicle or train for signalling purposes
    • B61L15/0072: On-board train data handling
    • B61L2205/00: Communication or navigation systems for railway traffic
    • B61L2205/04: Satellite based navigation systems, e.g. global positioning system [GPS]

Definitions

  • the invention relates to a method for obstacle detection for a rail vehicle.
  • the invention also relates to an obstacle detection device.
  • the invention relates to a rail vehicle.
  • This notification is particularly important in regions in which a view of the rails is impeded, e.g. due to curves, vegetation, buildings or walls, and the trains do not have enough braking distance to come to a standstill before the obstacle due to the inevitably late detection thereof.
  • the images are converted into a depth image or into a point cloud, allowing generic objects to be detected.
  • the 3D position of the detected object is compared with a detected course of the line in three dimensions in order to determine whether the object represents a potential danger or a potential obstacle.
  • Camera-based approaches have the disadvantage that they require suitable light conditions in order to function. At night in particular, RGB cameras are unable to provide usable results. Therefore alternative camera technologies are usually applied, e.g. thermal imaging cameras, in order to improve the performance of the system under all possible light conditions.
  • Lidar-based solutions are likewise already used. However, these solutions do not function for object detection if the tracks are not visible, whether this is due to the course of the line or due to external visibility restrictions such as e.g. weather conditions, snow coverage, etc. In both cases, the 3D information supplied by the sensors is compared with the course of the line that has been detected by means of camera images, in order to determine whether obstacles are present.
  • Lidar-based rail detection functions on the basis of the highly reflective metallic surface of the rails and only at a short distance from a rail vehicle. It is moreover likewise dependent on the sensor being able to capture the course of the line.
  • the position of the rail vehicle can be estimated by means of location-finding algorithms and on the basis of a combination of sensors, e.g. GNSS, IMU, radar, odometry, balises, etc.
  • Landmarks can take the form of e.g. parts of the fixed rail infrastructure such as e.g. signals, signs or sets of points.
  • the use of map-based approaches, by means of which it is theoretically possible to increase the accuracy of position-finding, has failed in the past because the determination of the position, and in particular the orientation, of the rail vehicle relative to the map must be extremely accurate in order to determine whether a detected object is located on the stretch of track. For example, a slight error of e.g. 0.1 percent in the determination of the orientation of the rail vehicle results in a positional deviation of 1 m at a range of 600 m.
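The sensitivity cited above can be checked with a short back-of-the-envelope calculation (an illustrative sketch only; the function name is not from the patent). The lateral deviation at a given range is approximately the range multiplied by the tangent of the heading error, so a heading error of roughly 1.7 milliradians already produces a 1 m offset at 600 m.

```python
import math

def lateral_deviation(range_m: float, heading_error_rad: float) -> float:
    """Lateral offset at a given range caused by a small heading error."""
    return range_m * math.tan(heading_error_rad)

# A heading error of only ~1.7 milliradians (1/600 rad) already shifts the
# projected track position by about 1 m at a range of 600 m.
offset = lateral_deviation(600.0, 1.0 / 600.0)
```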
  • the object is therefore to provide a method and an apparatus for obstacle detection for rail vehicles, offering greater reliability, resilience and precision than previous approaches to the solution.
  • This object is achieved by a method for obstacle detection for a rail vehicle according to claim 1, an obstacle detection device according to claim 10, and a rail vehicle according to claim 11.
  • a monitoring region in front of the rail vehicle is determined as a function of a position and a course of a track on which the rail vehicle is travelling and a position and dimensions of a monitoring profile of the rail vehicle.
  • the monitoring profile comprises a clearance gage of the rail vehicle.
  • the clearance gage comprises a maximum vehicle cross section or a maximum vehicle gage as a function of the proper motion of the rail vehicle, which in turn depends on the curve radius and the true speed.
  • the monitoring profile can also comprise an extended clearance gage, which includes e.g. a predefined safety zone around the actual core region or the actual clearance gage.
  • the monitoring profile can also be divided into zones at different distances around the rail vehicle.
  • the clearance gage is surrounded by a so-called emergency profile and the emergency profile by a warning profile.
  • provision can be made for incremental countermeasures such as e.g. the output of warning signals, regular braking or emergency braking.
  • the course of the track is understood to be the trajectory of the track and the spatial or geographical course thereof.
  • sensor data for determining the position and orientation of the rail vehicle is captured and the position and spatial course of the track on which the rail vehicle is travelling are determined on the basis of an alignment of the determined position and orientation of the rail vehicle with digital map data.
  • the captured sensor data is therefore used for auto-location-finding of the rail vehicle. If the position and orientation of the rail vehicle are known with sufficient precision, a position of the track section that the train will be travelling on and a position of the monitoring profile of the rail vehicle can also be determined therefrom. If e.g. the position or orientation of the rail vehicle captured by the sensor data is only imprecisely known due to possible variations in the sensor data, this imprecise data can be aligned with the digital map data. If e.g. a position of the rail vehicle does not lie precisely on a track, the position can be corrected such that the rail vehicle is located on the track.
  • the measured orientation of the rail vehicle can likewise vary from an orientation of a track section on which the rail vehicle, according to its sensor-based position data, must be located.
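The described correction of an imprecise position onto the track can be sketched as snapping the position fix to the nearest point of the mapped track polyline (an illustrative 2D sketch; the function names and the polyline simplification are assumptions, not from the patent):

```python
import math

def closest_point_on_segment(p, a, b):
    """Orthogonal projection of point p onto segment a-b, clamped to the segment."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / denom))
    return (ax + t * abx, ay + t * aby)

def snap_to_track(position, track_polyline):
    """Correct a noisy position fix to the nearest point on the mapped track."""
    candidates = [closest_point_on_segment(position, a, b)
                  for a, b in zip(track_polyline, track_polyline[1:])]
    return min(candidates, key=lambda q: math.dist(position, q))
```

A real implementation would additionally constrain the orientation to the local track direction, as the text notes for the vehicle heading.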
  • information which is captured by on-board sensors, in particular positional information relating to stationary objects in the environment of the rail vehicle, e.g. landmarks or the rail section in front of the rail vehicle, can be aligned with map data in order to reduce inaccuracies when determining the position and the orientation of the rail vehicle.
  • objects in an environment of the rail vehicle are captured by sensors of the rail vehicle.
  • the sensors for capturing the environment are designed to provide images, i.e. they must depict the environment and offer sufficient resolution for obstacles to be captured and detected as such.
  • the images can be realized as 2D images or 3D images. Specific embodiments are explained in detail below. Provision is also made for determining which objects are potential collision obstacles by checking whether and to what extent they overlap with the monitoring region.
  • a reaction or response to those objects categorized as potential obstacles is determined as a function of geometric and kinetic object attributes.
  • the check whether and to what extent the captured objects overlap with the monitoring region, and the determination of the object attributes, can take place both as part of the object capture and after subsequent tracking of the detected objects.
  • a situational evaluation can be e.g. confirmed or revised and the intersection of the tracked objects with the monitoring region can be determined more precisely or a change of the intersection can be captured on the basis of the dynamic proper motion of the objects.
  • the reaction or response is usually determined by specifications in the rail region, which specify e.g. minimum object sizes, overlap areas, etc. that must be reached in order to trigger a braking operation.
  • the typical procedure is to calculate the service braking distance and the emergency braking distance of the rail vehicle as a function of the current speed of the rail vehicle and other information relating to the rail vehicle such as e.g. its weight and the incline or the gradient of the line section ahead.
  • Objects of sufficient size and having a sufficiently large area of overlap with the monitoring region are then treated as collision obstacles and a response or reaction is determined on the basis of the interval to the collision obstacle and the braking distance of the rail vehicle. If e.g. the interval to a collision obstacle is less than the service braking distance, the emergency brake can be activated.
  • an acoustic warning can be transmitted to the collision obstacle by means of the signal hooter of the rail vehicle. If the interval to the collision obstacle is in a range in which a standard braking maneuver is initiated, it is possible either to initiate such a braking maneuver or, if the interval to the collision obstacle is greater than the service braking distance by a predefined threshold value, to refrain from a braking maneuver initially and merely activate the signal hooter.
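The graded reaction described above can be sketched as a simple decision rule (a minimal illustration; the function name, the string labels and the horn-only margin parameter are assumptions, not part of the patent):

```python
def choose_response(distance_m: float, service_braking_m: float,
                    horn_only_margin_m: float) -> str:
    """Pick an incremental countermeasure from the interval to the obstacle."""
    if distance_m < service_braking_m:
        return "emergency_brake"      # too close for a regular stop
    if distance_m > service_braking_m + horn_only_margin_m:
        return "signal_hooter"        # warn first, refrain from braking
    return "service_brake"            # standard braking maneuver
```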
  • the nature of the response can be modified on the basis of specific partial volumes with which the collision obstacle overlaps in the monitoring region, e.g. a warning volume or an emergency volume, and the detected properties of collision obstacles.
  • the obstacle information can also be transmitted to a remote operations control center in order to draw general attention to the obstacle.
  • a remote operations control center can also issue additional instructions in situations in which an automatic rail vehicle is not allowed to make a decision independently.
  • a rail vehicle can send sensor data to the operations control center after coming to a halt in front of a detected collision obstacle. There, a person properly assesses the collision obstacle and decides whether the rail vehicle can run over the collision obstacle without danger, or whether the collision obstacle has been moved aside sufficiently that the rail vehicle can continue onwards.
  • the sensor data transmitted to the operations control center can enable a human operator to establish which device could be used to clear the collision obstacle.
  • a device may comprise e.g. a cutting tool for a fallen tree or a piece of construction machinery for a rock.
  • the inventive method can be supplemented by the use of infrastructure-based obstacle detection systems. Since these systems are static, they can easily monitor a section of a rail string for collision obstacles. Such systems can advantageously be installed in regions with poor visibility such as e.g. regions with blind spots and regions with a high risk of collision such as e.g. railway stations. They can also be used to monitor the entrances of tunnels. In this case, they can not only monitor whether a potential collision obstacle is present on the line section but also whether an object such as e.g. a person or an animal has entered a tunnel, and can therefore transmit a warning that a collision obstacle might be present in the tunnel. This information can then be used to influence the strategy of a rail vehicle, e.g. to trigger a braking maneuver or select a lower speed for the rail vehicle.
  • a collision danger is advantageously determined with greater precision and greater reliability than is possible using conventional procedures, thereby improving the safety and the effectiveness of automatic rail traffic.
  • the range of the collision obstacle detection is increased since this depends in particular on the accuracy of the measurement of the orientation of the rail vehicle and the orientation of the sensors that are used to detect the collision obstacle.
  • this way of determining the course of the line can also be realized for line sections which are not straight or have reduced visibility of the course of the line, e.g. due to snow coverage or limited conditions of visibility.
  • the inventive method allows the detection of generic objects on the basis of 3D data. This is advantageous in particular over methods which are based on machine learning and by means of which only specific classes of objects can therefore be detected and identified.
  • the inventive obstacle detection device has a monitoring region determination unit for determining a monitoring region in front of a rail vehicle.
  • the determination of the monitoring region, in particular the position and extent thereof, is effected as a function of a position and a course of a track on which the rail vehicle is travelling, and a position and dimensions of a monitoring profile of the rail vehicle.
  • the determination of the monitoring region is effected by means of capturing sensor data, e.g. GNSS data, and determining the position and orientation of the rail vehicle on the basis of this sensor data.
  • the position and the course of the track on which the rail vehicle is travelling are then determined on the basis of an alignment of the determined position and orientation of the rail vehicle with digital map data.
  • the inventive obstacle detection device also includes a sensor unit for the sensory capture of objects in the environment of the rail vehicle.
  • Any suitable sensors for capturing, locating and possibly identifying objects can be used as sensor units, e.g. cameras, lidar systems or radar systems.
  • the inventive obstacle detection device further comprises an obstacle determination unit for determining which objects are possible collision obstacles, specifically by checking whether and to what extent these objects overlap with the monitoring region.
  • the inventive obstacle detection device has a response determination unit for determining a reaction to the objects that are categorized as potential collision obstacles. The determination of a reaction is effected as a function of the previously cited geometric and kinetic attributes of the objects.
  • the inventive obstacle detection device has the same advantages as the inventive method for obstacle detection for a rail vehicle.
  • the inventive rail vehicle has the inventive obstacle detection device and a control device for controlling a driving strategy of the rail vehicle as a function of a reaction which is determined by the obstacle detection device.
  • the inventive rail vehicle has the same advantages as the inventive obstacle detection device.
  • Some components of the inventive obstacle detection device can, possibly after adding hardware systems such as e.g. a sensor unit, be embodied primarily in the form of software components. This relates in particular to the monitoring region determination unit, the obstacle determination unit and the response determination unit.
  • these components can in principle also be realized partly in the form of programmable hardware, e.g. FPGAs or similar, especially when particularly high-speed calculations are required.
  • the required interfaces can take the form of software interfaces, e.g. when only a transfer of data from other software components is required.
  • they can also take the form of hardware-based interfaces which are activated by suitable software.
  • a largely software-based realization has the advantage that computing systems which are already present in a rail vehicle can also be upgraded easily by means of a software update after possibly adding further hardware elements, e.g. additional sensor units, in order to operate in the inventive manner.
  • the object of the invention is also achieved by a corresponding computer program product comprising a computer program which can be loaded directly into a storage device of such a computing system, said computer program having program sections for executing the inventive method steps that can be realized by means of software when the computer program is executed in the computing system.
  • Such a computer program product can, in addition to the computer program, optionally comprise additional elements such as e.g. documentation and/or additional components including hardware components such as e.g. hardware keys (dongles, etc.) for using the software.
  • the computer program can be provided on a computer-readable medium, e.g. a memory stick, hard disc or other transportable or permanently installed data medium.
  • the computing unit can have e.g. one or more interworking microprocessors or similar.
  • in the inventive method for obstacle detection for a rail vehicle, in order to achieve even greater precision when locating obstacles, provision is made for calibrating the position measurement and orientation measurement of the rail vehicle by determining positional data relating to landmarks or other distinctive features in an environment of the rail vehicle on the basis of sensor data from the environment.
  • the positional data that is determined in relation to the landmarks is then compared with the corresponding positional data relating to the landmarks in the digital map data. If the values of the positional data based on the sensor data and on the digital map data differ from each other, this variation is used as a calibration value in order to make an adjustment to the position determination and orientation determination, i.e. the auto-location-finding of the rail vehicle and the orientation measurement thereof, with reference to the sensor data.
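A minimal sketch of this landmark-based calibration, assuming 2D positions and a simple averaged offset between sensed and mapped landmark positions (the function names and the averaging choice are illustrative, not from the patent):

```python
def landmark_correction(sensed_landmarks, map_landmarks):
    """Average offset between sensed landmark positions and their map positions."""
    n = len(sensed_landmarks)
    dx = sum(m[0] - s[0] for s, m in zip(sensed_landmarks, map_landmarks)) / n
    dy = sum(m[1] - s[1] for s, m in zip(sensed_landmarks, map_landmarks)) / n
    return (dx, dy)

def apply_correction(position, correction):
    """Shift the sensor-based position estimate by the calibration value."""
    return (position[0] + correction[0], position[1] + correction[1])
```

In practice the orientation would be calibrated in the same way, e.g. from the bearing to several landmarks rather than from a pure translation offset.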
  • the precision of the position and orientation of the rail vehicle is advantageously improved further.
  • the accuracy is particularly important when determining the position and orientation of the rail vehicle because, at greater distance, even a small inaccuracy when determining this data can cause a significant variation in the specification of a position of a possible collision obstacle.
  • the attributes of the objects preferably comprise at least one of the following attribute types: an object is characterized e.g. by its approximate dimensions and area, or by information indicating which partial volume it overlaps, or by the interval from the rail vehicle to the object, or by the current speed of the rail vehicle.
  • the monitoring profile, in particular the extended clearance gage of the rail vehicle, can comprise e.g. a plurality of differently dimensioned partial profiles which are nested one within the other and are each assigned a different danger level. According to the partial profile, or the partial volume assigned to the respective partial profile, into which the detected object has already intruded, and as a function of the distance of the object from the rail vehicle, an incremental reaction of the rail vehicle can be triggered.
  • a warning signal can be emitted initially and, if the object intrudes into an inner volume or if the object moves closer to the rail vehicle, or in the case of a corresponding speed and direction of motion of the object and/or of the rail vehicle, a braking maneuver can be triggered.
  • a monitoring region is determined on the basis of a digital map or land map, and the position and orientation of the rail vehicle are determined in the digital map, preferably on the basis of a combination of sensor data.
  • Sensor data which can return information about the environment of the rail vehicle, as well as intervals and orientations of objects relative to the rail vehicle, is advantageously combined with map data in order thereby to determine the position and orientation of the rail vehicle. If the rail vehicle is situated at a specific position relative to a position indicated in the digital map, the position of the rail vehicle can then be specified exactly.
  • an estimated position and estimated orientation for the rail vehicle can initially be determined with the aid of sensor data which includes a degree of inaccuracy.
  • sensor data preferably comprises at least one of the following data types:
  • the feature data comprises at least one of the following data types:
  • a known position of landmarks or the track can advantageously be used to determine a variation of a position measurement and/or orientation measurement by the sensors and to undertake a corresponding adjustment.
  • various sensor data or sensor data of various types is advantageously combined in order to determine the position and orientation of the rail vehicle in the digital map.
  • a multisensor fusion method is therefore applied.
  • said combination comprises filtering by means of a Kalman filter or particle filter and/or graph-based methods and/or sensor-specific filtering.
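As an illustration of the filtering step, a scalar Kalman measurement update can be written in a few lines (a textbook sketch, not the patent's implementation; real multisensor fusion would use a multivariate state covering position and orientation):

```python
def kalman_update(x, p, z, r):
    """One measurement update of a scalar Kalman filter.

    x, p: current state estimate and its variance
    z, r: new measurement and its variance
    """
    k = p / (p + r)             # Kalman gain: weight of the new measurement
    x_new = x + k * (z - x)     # blend prior estimate and measurement
    p_new = (1.0 - k) * p       # variance shrinks after each update
    return x_new, p_new

# Fusing two position sources: a rough prior (e.g. GNSS) and a more
# precise measurement (e.g. odometry against the map).
x, p = 100.0, 4.0               # prior position with variance 4 m^2
x, p = kalman_update(x, p, 101.0, 1.0)
```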
  • the status of the rail vehicle is preferably restricted to its movement along the track.
  • the accuracy of the position-finding and the determination of the orientation of the rail vehicle can be further improved thus.
  • the position of the rail vehicle is restricted by the movement of the contact points, i.e. the wheels, along the track.
  • the orientation of the fixed vehicle body can, due to the mounting on rotatable running gear and as a result of the spring suspension, vary to some extent from the orientation of the track at the contact points.
  • in the inventive method for obstacle detection for a rail vehicle, provision is preferably also made for individual sensors to be calibrated with the digital map in order to correct effects of vibration or drift in the extrinsic calibration.
  • the initially estimated position and orientation of the rail vehicle or the sensor can be used as a starting point, and the position and orientation of the sensor in the map can then be made more precise with reference to landmarks and/or other features that have been detected in the environment of the rail vehicle.
  • landmarks can be used to specify more accurately the position and orientation of the sensor, while the detected rail string can be used to refine the determination of the orientation of the sensor.
  • the calibration therefore also preferably comprises the following steps:
  • the step of the sensory capture of objects in the environment of the rail vehicle most preferably comprises determining a position and a size of the objects in the environment on the basis of sensor data.
  • This data can advantageously be used for the purpose of a danger assessment and for the purpose of determining an adequate reaction to the obstacle that is present.
  • a point cloud can be generated in particular from raster-type scanning of the environment. Such an option is applied e.g. when a lidar system is deployed for the purpose of object detection.
  • the point cloud is then broken down into individual objects by means of a so-called clustering method or a cluster analysis. This is understood to mean a method for detecting similarity structures in usually relatively large pools of data. These groups of measured points found thus and having similar properties are referred to as clusters, and the group assignment as clustering.
  • the cluster analysis is intended to identify new groups in the data.
  • a clustering method for clustering on the basis of the spatial distances of the points comprises e.g. a Euclidean distance clustering algorithm.
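Euclidean distance clustering of this kind can be sketched as a simple region-growing pass over the point cloud (an illustrative pure-Python sketch; a production system would use an optimized library with spatial indexing):

```python
import math

def euclidean_cluster(points, eps):
    """Group points connected by chains of pairwise distances below eps."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            # points still unvisited and within eps of the current point
            neighbours = [j for j in unvisited
                          if math.dist(points[i], points[j]) <= eps]
            for j in neighbours:
                unvisited.discard(j)
            cluster.extend(neighbours)
            frontier.extend(neighbours)
        clusters.append([points[k] for k in cluster])
    return clusters
```

Each returned cluster then corresponds to one candidate object that is checked against the monitoring region.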
  • the individual objects are then checked to determine whether they have an overlap region with the monitoring region of the rail vehicle. It is also advantageously possible to identify and separate generic objects that cannot be detected using a machine-learning based method for specific object classes.
  • a closest object point within the monitoring profile is determined, said object point lying closest to the rail vehicle as measured along the course of the line.
  • all object points are grouped together in a 2D plane by combining all projection planes, said projection planes running through each object point and being oriented orthogonally to the track.
  • a projection plane is determined which runs through the closest object point relative to the vehicle along the rail string and is oriented orthogonally to the track on which the rail vehicle is travelling. All object points of the cluster are then projected onto the projection plane. In this way, all detectable points of a detected object lie on a plane which represents the shortest interval from the object to the rail vehicle.
  • the projection of the object thus generated can then be compared with a sectional area of the monitoring region comprising the projection plane. It is moreover possible to determine geometric object attributes of the object or of the point cloud. Such object attributes comprise the area, the dimensions and the position of the object or of the entire point cloud relative to the rail vehicle, and the area of the subset of the point cloud within the monitoring region or the partial volumes. For the purpose of specifying these object attributes, it is possible e.g. to form a bounding box around this point cloud, to calculate a convex envelope, or to create a grid layout map.
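The projection onto the plane orthogonal to the track, and the bounding-box attribute, can be sketched as follows (an illustrative simplification with plain tuples; the names are assumptions, not from the patent):

```python
def project_onto_plane(points, plane_point, track_dir):
    """Project 3D object points onto the plane through plane_point that is
    orthogonal to the track direction (the plane of shortest interval)."""
    norm = sum(c * c for c in track_dir) ** 0.5
    d = tuple(c / norm for c in track_dir)
    projected = []
    for p in points:
        # signed along-track offset of the point relative to the plane
        along = sum((p[i] - plane_point[i]) * d[i] for i in range(3))
        # remove the along-track component, keeping the cross-section position
        projected.append(tuple(p[i] - along * d[i] for i in range(3)))
    return projected

def bounding_box(points):
    """Axis-aligned bounding box (per-axis minima and maxima) of a point set."""
    dims = range(len(points[0]))
    mins = tuple(min(p[i] for p in points) for i in dims)
    maxs = tuple(max(p[i] for p in points) for i in dims)
    return mins, maxs
```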
  • the dataset that must be processed can advantageously be limited to that part of an object which is situated in the vicinity of the rail region or the monitoring region. Furthermore, greater accuracy can be achieved by directly calculating an overlap volume or an overlap area on the initial point cloud than by first performing a modeling step for the complete object, e.g. forming a bounding box, and performing the calculation on the basis of the model.
  • the dynamic behavior of a possible collision object can also be captured by tracking this object.
  • measured data relating to the object e.g. its position, speed, size and sectional areas can be processed with the aid of a filter, e.g. a Kalman filter or a particle filter.
  • the dynamic attributes of an object such as e.g. speed or acceleration do not have to be measured directly, but can be derived from a plurality of positions at different time points.
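Deriving a dynamic attribute such as speed from positions at successive time points amounts to a finite difference (a minimal sketch; the (t, x, y) sample layout is an assumption):

```python
def derive_velocity(track_history):
    """Estimate object velocity from the two most recent (t, x, y) samples."""
    (t0, x0, y0), (t1, x1, y1) = track_history[-2], track_history[-1]
    dt = t1 - t0
    return ((x1 - x0) / dt, (y1 - y0) / dt)
```

In a full tracker this raw estimate would be smoothed by the Kalman or particle filter mentioned above rather than used directly.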
  • An intersection of the tracked object with the monitoring region can also be determined during the tracking.
  • Object attributes such as e.g. the distance, the height and width, the sectional area of the projected point cloud with the monitoring region, but also typical tracking attributes such as e.g. speed and confidence, are also used to assess the collision danger for this variant.
  • FIG. 1 shows a flow diagram which illustrates a method for obstacle detection for a rail vehicle according to an exemplary embodiment of the invention
  • FIG. 2 shows a schematic representation of a scenario for obstacle detection by a rail vehicle, where the actual orientation of a rail vehicle varies from the determined orientation
  • FIG. 3 shows a schematic representation of determining a projection plane of the object points of a cluster according to an exemplary embodiment of the invention
  • FIG. 4 shows a schematic representation of a projection of an object, wherein some of the object points lie in a sectional area of the monitoring region,
  • FIG. 5 shows a schematic 2D representation of a bounding box around that part of the object which lies within the sectional area of the monitoring region comprising the projection plane,
  • FIG. 6 shows a schematic 2D representation of a convex envelope around that part of the object which lies within the sectional area of the monitoring region comprising the projection plane,
  • FIG. 7 shows a schematic representation of a grid layout map which comprises the sectional area of the monitoring region comprising the projection plane
  • FIG. 8 shows a schematic representation of an obstacle detection device according to an exemplary embodiment of the invention.
  • FIG. 9 shows a schematic representation of a rail vehicle according to an exemplary embodiment of the invention.
  • FIG. 1 shows a flow diagram 100 which illustrates a method for obstacle detection for a rail vehicle 2 (see FIG. 2 ) according to an exemplary embodiment of the invention.
  • in step 1.I, for a process which monitors the surroundings of the rail vehicle 2, information relating to a monitoring profile PR, e.g. a clearance gage, and in particular its position PS relative to the rail vehicle 2, is provided together with digital map data LK for obstacle detection, said information and data being required subsequently for the purpose of determining a position and dimensions of a monitoring region V_U.
  • the map data LK and the profile data PR can be stored in a database in combined form, but can also be obtained separately from different sources.
  • sensor data SDL is captured for the purpose of auto-location-finding of the rail vehicle 2 .
  • GNSS data or data from an inertial measurement unit is captured for this purpose.
  • in step 1.III, information relating to the surroundings of the rail vehicle 2, in particular sensor data from landmarks LM and sensor data from objects O, is captured by sensors.
  • the surroundings or the environment comprises at least the monitoring region V U . It may however extend beyond the monitoring region V U , e.g. in order to detect moving objects O at an early stage.
  • Objects O are detected within the sensor data, which depicts the environment of the rail vehicle.
  • the objects O can be e.g. vehicles, people, or even trees, shopping carts or rocks that have fallen onto the track.
  • the digital map data LK, the sensor data SDL for auto-location-finding of the rail vehicle 2 , and the sensor data from the landmarks LM are processed in order to determine a position PSF and orientation OR of the rail vehicle 2 and, from this, the position PG of the current track section and possibly also the orientation thereof.
  • the true speed of the rail vehicle together with odometry data can be used for the purpose of position-finding.
  • the knowledge of the position PS and dimensions of a monitoring profile of the rail vehicle 2 is also used to determine suitable dimensions, e.g. the required width, height and extent along the line section, as well as a suitable positioning of the monitoring region V U .
  • the monitoring region V U is determined on the basis of the determined position PSF and orientation OR of the rail vehicle 2 , the position PG of the current track section, the knowledge of the position PS of the monitoring profile PR and the dimensions thereof, said monitoring region V U comprising the region which is captured in front of the rail vehicle 2 by the on-board sensors, and through which the rail vehicle 2 will next travel, and in which a collision could therefore occur. Adjacent tracks can also form part of the monitoring region V U .
  • In step 1.VI, provision is made for determining which objects O represent potential collision obstacles KH. This determination takes the form of checking whether and to what extent they overlap with the monitoring region V U . If such an overlap exists, a collision danger is present. If a detected object O is sufficiently far outside the monitoring region V U and its anticipated trajectory also does not cross the monitoring region V U , which moves dynamically in the direction of travel, the object O can be categorized as not dangerous. Otherwise, the object O must be categorized as a potential collision object or potential collision obstacle KH.
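  • The check in step 1.VI can be sketched as a simple point-in-region test. In the following minimal Python sketch, the class and function names and the rectangular model of the monitoring cross-section are illustrative assumptions, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class MonitoringSection:
    """Hypothetical rectangular model of the cross-section of the monitoring region."""
    x_min: float  # lateral extent (left) around the track center line, in meters
    x_max: float  # lateral extent (right)
    z_min: float  # lower vertical bound above the rail head
    z_max: float  # upper vertical bound

    def contains(self, x: float, z: float) -> bool:
        return self.x_min <= x <= self.x_max and self.z_min <= z <= self.z_max

def is_potential_obstacle(points_2d, section: MonitoringSection) -> bool:
    """An object counts as a potential collision obstacle KH if any of its
    projected points lies inside the monitoring cross-section."""
    return any(section.contains(x, z) for x, z in points_2d)
```

In practice the anticipated trajectory of a moving object would also be checked against the dynamically moving region, which this sketch omits.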
  • a reaction R of the rail vehicle 2 to the objects O categorized as potential collision obstacles KH is determined.
  • the type of the reaction R is determined as a function of object attributes such as e.g. the approximate dimensions of the respective object O. In the case of small non-human objects such as e.g. small animals, it is better simply to drive over them, and full braking would not be a proportionate response to such an obstacle. In the case of people or large obstacles, by contrast, it is essential to avoid a collision in order to prevent personal injury or damage to the rail vehicle 2 .
  • the reaction R is also determined in step 1.VII as a function of the volume and the extent of the overlap between the object O and the monitoring region V U .
  • If the potential collision object KH is situated in the peripheral region of the monitoring region V U , it might be possible to avoid a collision with the rail vehicle 2 even as it continues its run. However, if the potential collision object KH is in the center of the monitoring region V U , braking is essential when the potential collision object KH is above a certain size. Furthermore, the reaction R is also dependent on the interval from the rail vehicle 2 to the collision object KH. If the collision object KH is still very far away, relatively gentle braking and the emission of a warning signal might suffice if it is a living object or an object that is controlled by a person. In this way, the object O can be provoked into a reaction or induced to leave the danger region.
  • the instructions might prescribe reducing the speed of the rail vehicle in order at least to moderate the impact with the collision obstacle KH on the line section.
  • the reaction R is also determined as a function of the current speed of the rail vehicle 2 .
  • FIG. 2 shows a schematic representation 20 of a scenario for obstacle detection by a rail vehicle 2 , with calibration and adjustment of object location-finding by the rail vehicle 2 .
  • the scenario 20 comprises five partial scenarios 20 a , 20 b , 20 c , 20 d , 20 e , by means of which individual aspects of the object location-finding and the calibration or adjustment are shown in detail.
  • a first scenario 20 a on the far left in FIG. 2 illustrates how and at what positions the sensors of the rail vehicle 2 , which is located at the bottom edge of the first scenario 20 a , perceive the individual objects in its environment.
  • a motor vehicle O, two trees O 2 , O 3 and three landmarks LM 1 , LM 2 , LM 3 are captured by the sensors.
  • the cited objects are shown with hatching marks.
  • a second scenario 20 b shows a map extract in which are marked the captured landmarks LM 1 , LM 2 , LM 3 and the track GK on which the rail vehicle 2 is travelling.
  • the rail vehicle 2 is also marked at the bottom edge of the map in the second scenario 20 b.
  • In a fourth scenario 20 d , the first three scenarios 20 a , 20 b , 20 c are shown superimposed on top of each other.
  • the map positions of the landmarks LM 1 , LM 2 , LM 3 vary from the positions of the landmarks LM 1 , LM 2 , LM 3 as captured by the rail vehicle 2 .
  • the detected motor vehicle O is situated outside the monitoring region V U of the rail vehicle 2 .
  • one of the trees O 3 appears to project into the monitoring region V U of the rail vehicle 2 .
  • the actual orientation of the rail vehicle 2 also varies slightly from the self-determined orientation, this being evident in that the orientation of the rail vehicle 2 in the map and the self-determined orientation do not correspond exactly.
  • correction of the sensor measurement is now undertaken in an adjustment step so that the positions of the landmarks LM 1 , LM 2 , LM 3 as determined by the sensor measurement and the map data correspond.
  • The result can be seen in the fifth scenario 20 e . It can now be seen there that the motor vehicle O is situated in the monitoring region V U of the rail vehicle 2 , and is even already on the tracks GK, and therefore represents a potential collision obstacle. Conversely, it is clear from the fifth scenario 20 e that the tree O 3 is not situated in the monitoring region V U of the rail vehicle 2 and therefore does not represent a danger to the rail vehicle 2 .
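  • The adjustment step that brings the sensed landmark positions into agreement with the map can be illustrated with a closed-form 2D least-squares rigid alignment. This is a generic sketch of such a correction; the function name and the choice of estimator are assumptions, not taken from the patent:

```python
import math

def align_landmarks(sensed, mapped):
    """Least-squares 2D rigid transform (rotation theta + translation t)
    that maps sensor-frame landmark positions onto their map positions.
    Returns (theta, tx, ty)."""
    n = len(sensed)
    # Centroids of the two point sets.
    csx = sum(p[0] for p in sensed) / n; csy = sum(p[1] for p in sensed) / n
    cmx = sum(p[0] for p in mapped) / n; cmy = sum(p[1] for p in mapped) / n
    # Accumulate dot and cross products of the centered correspondences.
    dot = cross = 0.0
    for (sx, sy), (mx, my) in zip(sensed, mapped):
        ax, ay = sx - csx, sy - csy
        bx, by = mx - cmx, my - cmy
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    theta = math.atan2(cross, dot)            # heading correction
    c, s = math.cos(theta), math.sin(theta)
    tx = cmx - (c * csx - s * csy)            # translation correction
    ty = cmy - (s * csx + c * csy)
    return theta, tx, ty
```

Applying the recovered transform to the sensor measurements makes the sensed landmark positions coincide with their map positions, which is exactly the correction visible between the fourth and fifth scenarios.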
  • FIG. 3 illustrates a schematic representation 30 of a determination of a projection plane PRE of a point cloud CL of an object O according to an exemplary embodiment of the invention.
  • the representation 30 is shown as a plan view.
  • the right-hand side of FIG. 3 shows a track GK comprising two rails S with crossties SW, on which the rail vehicle 2 travels in an x-direction or in the direction of the arrow, i.e. from the bottom towards the top in FIG. 3 .
  • a position X N on the track GK is now determined, where the projection plane PRE cuts through the position X N and, extending orthogonally to the course of the track in the direction of travel, through that point OPN of the object O which is closest to the rail vehicle 2 as viewed in an x-direction.
  • this position X N is the point at which a center line of the track GK has the smallest interval to the closest point OPN of the object O.
  • the 2D positions of the individual object points OP of the point cloud CL that is assigned to the object O are then determined in the projection plane PRE.
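  • A minimal sketch of this projection step (the coordinate conventions and names here are assumptions): each 3D object point is reduced to a signed lateral offset from the track center line at the position X N plus its height:

```python
import math

def project_to_plane(points_3d, x_n, heading):
    """Project object points OP of a point cloud onto the projection plane PRE:
    a vertical plane through position x_n on the track, orthogonal to the
    track direction given by `heading` (radians, measured in the ground plane).
    Returns 2D (lateral offset, height) coordinates in the plane."""
    # Unit vector pointing to the left, orthogonal to the direction of travel.
    lx, ly = -math.sin(heading), math.cos(heading)
    projected = []
    for (px, py, pz) in points_3d:
        dx, dy = px - x_n[0], py - x_n[1]
        lateral = dx * lx + dy * ly   # signed offset from the track center line
        projected.append((lateral, pz))  # height is taken directly as z
    return projected
```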
  • FIG. 4 shows a schematic 2D representation 40 of a point cloud CL, some of the object points OP of the point cloud CL lying in a sectional area F U of the monitoring region V U and forming the so-called intersection points and some of the object points OP of the point cloud CL lying outside this sectional area F U .
  • the representation shown in FIG. 4 can therefore be considered as a cross-sectional representation at the x-position X N .
  • shown at the bottom edge of the sectional area F U in FIG. 4 are the cross-sectional areas of the rails S of the track GK on which the rail vehicle 2 is running.
  • the sectional area F U provides a model of the object O, in order to assess the relevance of a possible collision obstacle and plan suitable countermeasures.
  • a schematic 2D representation 50 of a bounding box BB is shown around that part of the point cloud CL which lies within the sectional area F U of the monitoring region V U .
  • the bounding box BB is represented as a square with broken-line edges.
  • a convex envelope CH can also be used instead of a rectangular bounding box BB to contain the object points OP.
  • FIG. 6 shows a schematic 2D representation 60 in which, for the purpose of determining the area of the intersection points, a polygonal convex envelope CH marked by broken lines is placed around that part of the projected point cloud CL which lies within the sectional area F U of the monitoring region V U .
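  • The convex envelope CH and its area can be computed with standard methods; the sketch below uses Andrew's monotone-chain hull followed by the shoelace formula. This is a generic implementation, not code from the patent:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull of the intersection points.
    Returns the hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:                      # build the lower hull chain
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build the upper hull chain
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_area(hull):
    """Shoelace area of the polygonal convex envelope CH."""
    n = len(hull)
    s = sum(hull[i][0]*hull[(i+1) % n][1] - hull[(i+1) % n][0]*hull[i][1]
            for i in range(n))
    return abs(s) / 2.0
```

The resulting area can then serve as the overlap measure against which a size threshold is applied.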
  • That part of the point cloud CL which lies within the sectional area F U of the monitoring region V U , or the monitoring area F U can also be included in a grid layout map.
  • FIG. 7 shows a schematic representation 70 of a grid layout map GBK comprising the sectional area F U of the monitoring region V U .
  • the grid layout map GBK comprises a grid matrix of square fields OG. Occupied fields OG of the grid layout map GBK are shaded gray in FIG. 7 . Subsequent evaluations can then be performed with ease on the basis of the grid fields OG that are occupied or unoccupied by digital data, thereby further reducing the computing effort.
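  • Building such a grid layout map amounts to a simple rasterization of the projected points. In this sketch the cell size and the extents of the grid are illustrative assumptions:

```python
def rasterize(points_2d, cell=0.25, x_range=(-2.0, 2.0), z_range=(0.0, 4.0)):
    """Mark grid fields OG of a grid layout map GBK as occupied when at
    least one projected object point falls into them. Returns the set of
    occupied (row, col) fields plus the grid dimensions."""
    cols = int((x_range[1] - x_range[0]) / cell)
    rows = int((z_range[1] - z_range[0]) / cell)
    occupied = set()
    for x, z in points_2d:
        if x_range[0] <= x < x_range[1] and z_range[0] <= z < z_range[1]:
            col = int((x - x_range[0]) / cell)   # lateral bin
            row = int((z - z_range[0]) / cell)   # height bin
            occupied.add((row, col))
    return occupied, rows, cols
```

Subsequent checks then reduce to cheap set operations on occupied fields, which is the computing-effort reduction mentioned above.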
  • FIG. 8 shows a schematic representation of an obstacle detection device 80 for a rail vehicle 2 according to an exemplary embodiment of the invention.
  • the obstacle detection device 80 comprises a plurality of sensor units 85 (shown on the left-hand side of the image) for auto-location-finding of the rail vehicle 2 and a number of further sensor units 82 for capturing the surroundings.
  • the sensor units 85 for auto-location-finding comprise sensors that are based on different physical principles, e.g. inertial sensors, satellite navigation systems, speed sensors or receive units for receiving radio signals from infrastructure equipment, by means of which the obstacle detection device 80 can determine the position PSF and orientation OR of the rail vehicle 2 .
  • the sensor units 82 for capturing the surroundings comprise sensors for depicting the surroundings of the rail vehicle, e.g. cameras, lidar systems or radar systems.
  • the obstacle detection device 80 also comprises a map and profile unit 87 by means of which the digital map data LK can be provided together with data relating to the dimensions of a monitoring profile PR and the position PS of the monitoring profile PR.
  • the map data LK and profile data PR can also be obtained from a plurality of separate and different sources.
  • the obstacle detection device 80 also comprises a monitoring region determination unit 88 which, on the basis of the monitoring profile PR, the position PS of the monitoring profile PR, the position PSF and orientation OR of the rail vehicle 2 , as well as a position PG and a course of that part of the rail string on which the rail vehicle 2 is currently running, determines a monitoring volume or the position and dimensions of a monitoring region V U .
  • the sensor data relating to the captured objects O or positions thereof is transmitted to an obstacle determination unit 83 which, on the basis of the positions of the objects O together with the knowledge of the position and extent of the monitoring region V U , determines which objects O are possible collision obstacles KH. Details of the procedure for this are explained with reference to FIG. 3 to FIG. 7 .
  • the captured data relating to the determined collision obstacles KH, e.g. their position, their approximate dimensions and the interval from the collision obstacles KH to the rail vehicle 2 , are transmitted to a response determination unit 84 .
  • the response determination unit 84 is configured to determine a reaction R to the objects O that are categorized as potential obstacles KH, as a function of the approximate dimensions G of the respective object O, the size of the collision volume with which it overlaps, the interval to the object O, and the current speed of the rail vehicle.
  • FIG. 9 shows a schematic representation 90 of a rail vehicle 2 according to an exemplary embodiment of the invention.
  • the rail vehicle 2 in FIG. 9 is travelling on a track GK from left to right, i.e. in the direction of the arrow.
  • the rail vehicle 2 comprises an obstacle detection device 80 with sensor units 82 for capturing the surroundings and the other units, these being constructed and functioning in the manner illustrated in FIG. 8 .
  • the obstacle detection device 80 determines the position PSF and orientation OR of the rail vehicle 2 as well as the position PG of the track GK and the position of the monitoring region V U in front of the rail vehicle 2 .
  • a reaction R determined by the obstacle detection device 80 is transmitted to a control device 86 of the rail vehicle 2 , which controls the rail vehicle 2 in accordance with the determined reaction R. For example, a warning signal is generated and a braking maneuver is initiated in order to avoid a collision with an object O determined to be a collision obstacle KH.

Abstract

A method for obstacle detection for a rail vehicle includes determining a monitoring region in front of the vehicle based on position and course of a track which the vehicle travels and position and dimensions of a monitoring profile of the vehicle. During the determination, sensor data for determining position and orientation of the vehicle are captured and position and course of the track which the vehicle travels are determined based on alignment of the determined position and orientation of the vehicle with digital map data. Objects in the vehicle environment are captured by sensors of the vehicle. Which objects are potential collision obstacles is determined by checking whether and to what extent they overlap with the monitoring region. A reaction to the objects categorized as potential obstacles is determined based on geometric and kinetic attributes of the objects. An obstacle detection device and a vehicle are also provided.

Description

  • The invention relates to a method for obstacle detection for a rail vehicle. The invention also relates to an obstacle detection device. In addition, the invention relates to a rail vehicle.
  • In the context of rail traffic, it occasionally occurs that objects such as e.g. people or road vehicles, shopping carts that have been thrown onto the rails, or even lumps of rock or fallen trees, are found on the track and therefore represent a threat to the safety of the rail traffic and, in the case of people and road vehicles, are themselves seriously endangered by the possibility of a collision with a moving rail vehicle. Furthermore, objects such as e.g. scree or mud, which are not so easily identifiable and in particular cannot easily be assigned to an object class, may be found on the rails or in a monitoring region of a rail vehicle. Some form of generic obstacle detection is therefore necessary in order to recognize such objects as obstacles in a timely manner, even if they cannot be directly identified, so that a braking operation can be initiated for an approaching rail vehicle and a collision between the rail vehicle and such an object can be prevented. One of the critical tasks of a locomotive engineer is consequently to observe the rail region continuously for possible obstacles or objects which could collide with the rail vehicle and to decide whether a reaction thereto, e.g. a braking maneuver, must be initiated. To this end, the locomotive engineer monitors not only the actual rail region on which the train is travelling, but also adjacent rail strings. If an obstacle is detected on an adjacent stretch of track, the driver notifies a monitoring center which forewarns other trains. This notification is particularly important in regions in which a view of the rails is impeded, e.g. due to curves, vegetation, buildings or walls, and the trains do not have enough braking distance to come to a standstill before the obstacle due to the inevitably late detection thereof.
  • Since the braking distance of a rail vehicle before coming to a standstill is in the kilometer range for high-speed trains, and a possible collision object must also be detected in darkness or under any sort of weather conditions, the sensory capabilities of the driver are often not sufficient for this task, even on straight sections. Provision must therefore additionally be made for a technical device by means of which an obstacle can also be detected under poor conditions of visibility and at a greater distance.
  • In the case of driverless vehicle control, it is particularly important to have a system which automatically detects possible obstacles correctly and can decide whether they could represent a threat to the vehicle itself or to other road and rail users, since it is by definition not possible for a driver to monitor the stretch of track.
  • Automatic detection systems for potential obstacles have already been studied and developed experimentally. Infrastructure-based solutions in which a multiplicity of sensors are installed in order to monitor a specific region of a stretch of track are sometimes used to monitor regions in which accidents frequently occur, e.g. railroad crossings. It is however not feasible to equip the entire rail network with infrastructure-based obstacle detection systems. Therefore the requirement exists for an on-board system to perform the task of a driver to observe and monitor the stretch of track.
  • Most on-board systems use cameras in the form of monoscopic cameras or stereoscopic cameras. On the basis of the recorded images, the tracks are identified without the additional use of a map and solely by means of image analysis, this being based either on a conventional algorithm which functions in a similar manner to road marking detection, or on an approach that is based on machine learning. However, most of these approaches suffer from limited reliability in certain situations in which the visibility ahead and particularly in a monitoring region of a rail vehicle is limited.
  • For the purpose of approaches based on monoscopic cameras, use is generally made of an algorithm based on machine learning, said algorithm being applied in order to detect the relevant obstacles in the surroundings of the rail vehicle. The lower edge of the bounding box of the detected object can then be compared with the detected course of the line, in order to determine whether the object represents a possible collision obstacle. The interval between the object and the camera is usually determined on the basis of a flat-earth model. However, this has a limitation in that the detection is limited to classes of objects for which the algorithm has been trained, e.g. for vehicles and people only, and other objects are not detected or are ignored by the algorithm. Therefore generic obstacle detection is not possible using an approach which is based on machine learning. Safety certification for approaches based solely on machine learning has not been conceivable until now, since said approaches rely on random statistical methods and therefore do not satisfy the requirements of a safety certification.
  • In the case of approaches based on stereoscopic cameras, the images are converted into a depth image or into a point cloud, allowing generic objects to be detected. The 3D position of the detected object is compared with a detected course of the line in three dimensions in order to determine whether the object represents a potential danger or a potential obstacle.
  • Camera-based approaches have the disadvantage that they require suitable light conditions in order to function. At night in particular, RGB cameras are unable to provide usable results. Therefore alternative camera technologies are usually applied, e.g. thermal imaging cameras, in order to improve the performance of the system under all possible light conditions.
  • Active sensors such as e.g. lidar or radar systems are already in use. Radar sensors function under most weather conditions, but suffer from limited accuracy and resolution in comparison with lidar systems. Lidar-based solutions are likewise already used. However, these solutions do not function for object detection if the tracks are not visible, whether this is due to the course of the line or due to external visibility restrictions such as e.g. weather conditions, snow coverage, etc. In both cases, the 3D information supplied by the sensors is compared with the course of the line that has been detected by means of camera images, in order to determine whether obstacles are present.
  • One of the challenges faced by existing systems is reliably to detect the position of the stretch of track under all conditions. Camera-based detection of the course of the line functions well for straight sections and gentle curves, but relies on the rails being visible. Lidar-based rail detection functions on the basis of the highly reflective metallic surface of the rails and only at a short distance from a rail vehicle. It is moreover likewise dependent on the sensor being able to capture the course of the line.
  • Since optical rail detection only functions reliably for line sections with visible rails, e.g. approximately straight stretches of track, it has been proposed to deploy the on-board system only for stretches of track where the rails can be detected to a satisfactory degree, and to deploy approaches which are infrastructure-based or airborne-based in order to cover the other regions of the rail network.
  • The position of the rail vehicle can be estimated by means of location-finding algorithms and on the basis of a combination of sensors, e.g. GNSS, IMU, radar, odometry, balises, etc.
  • In other existing location-finding approaches, use is made of landmarks which are detected by cameras, lidar systems or radar systems in order to improve the location-finding. This option is useful when the GNSS coverage is poor. Landmarks can take the form of e.g. parts of the fixed rail infrastructure such as e.g. signals, signs or sets of points.
  • The implementation of map-based approaches, by means of which it is theoretically possible to increase the accuracy of position-finding, has failed in the past because the determination of the position, and in particular the orientation, of the rail vehicle relative to the map must be extremely accurate in order to determine whether a detected object is located on the stretch of track. For example, a slight error of e.g. 0.1 percent in the determination of the orientation of the rail vehicle results in a positional error or a positional deviation of 1 m at a range of 600 m.
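  • The sensitivity quoted here is easy to verify: the lateral displacement at range r caused by a heading error φ is approximately r·sin(φ). Reading the 0.1 figure as an angle of 0.1 degrees (an assumption about the intended unit) reproduces a deviation of roughly 1 m at 600 m:

```python
import math

range_m = 600.0
heading_error = math.radians(0.1)                 # 0.1 degree orientation error
lateral_error = range_m * math.sin(heading_error) # small-angle: ~ r * phi
# yields roughly 1.05 m of lateral displacement at a range of 600 m
```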
  • The object is therefore to provide a method and an apparatus for obstacle detection for rail vehicles, offering greater reliability, resilience and precision than previous approaches to the solution.
  • This object is achieved by a method for obstacle detection for a rail vehicle according to claim 1, an obstacle detection device according to claim 10, and a rail vehicle according to claim 11.
  • According to the inventive method for obstacle detection for a rail vehicle, a monitoring region in front of the rail vehicle is determined as a function of a position and a course of a track on which the rail vehicle is travelling and a position and dimensions of a monitoring profile of the rail vehicle. The monitoring profile comprises a clearance gage of the rail vehicle. The clearance gage comprises a maximum vehicle cross section or a maximum vehicle gage as a function of the proper motion of the rail vehicle, which in turn depends on the curve radius and the true speed. The monitoring profile can also comprise an extended clearance gage, which includes e.g. a predefined safety zone around the actual core region or the actual clearance gage. The monitoring profile can also be divided into zones at different distances around the rail vehicle. For example, the clearance gage is surrounded by a so-called emergency profile and the emergency profile by a warning profile. According to the region into which an object intrudes, provision can be made for incremental countermeasures such as e.g. the output of warning signals, regular braking or emergency braking. The course of the track is understood to be the trajectory of the track and the spatial or geographical course thereof.
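  • The nested profiles just described (clearance gage, emergency profile, warning profile) lend themselves to a simple classification by lateral offset. The half-widths in this sketch are invented placeholder values, not figures from the patent:

```python
def intrusion_zone(lateral_offset_m,
                   clearance=1.6, emergency=2.1, warning=2.6):
    """Classify an object's lateral offset from the track center line into
    the nested monitoring profiles (half-widths are placeholder values,
    in meters)."""
    off = abs(lateral_offset_m)
    if off <= clearance:
        return "clearance_gage"      # strongest countermeasure warranted
    if off <= emergency:
        return "emergency_profile"
    if off <= warning:
        return "warning_profile"     # e.g. output of a warning signal
    return "outside"
```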
  • For the purpose of determining the monitoring region, sensor data for determining the position and orientation of the rail vehicle is captured and the position and spatial course of the track on which the rail vehicle is travelling are determined on the basis of an alignment of the determined position and orientation of the rail vehicle with digital map data. The captured sensor data is therefore used for auto-location-finding of the rail vehicle. If the position and orientation of the rail vehicle are known with sufficient precision, a position of the track section that the train will be travelling on and a position of the monitoring profile of the rail vehicle can also be determined therefrom. If e.g. the position or orientation of the rail vehicle captured by the sensor data is only imprecisely known due to possible variations in the sensor data, this imprecise data can be aligned with the digital map data. If e.g. a position of the rail vehicle does not lie precisely on a track, the position can be corrected to the effect that the rail vehicle has to be located on the track. The measured orientation of the rail vehicle can likewise vary from an orientation of a track section on which the rail vehicle, according to its sensor-based position data, must be located. In this case, as explained in detail below, information which is captured by on-board sensors, in particular positional information, relating to stationary objects in the environment of the rail vehicle, e.g. landmarks or the rail section in front of the rail vehicle, can be aligned with map data in order thus to reduce inaccuracies when determining the position and the orientation of the rail vehicle.
  • Furthermore, objects in an environment of the rail vehicle are captured by sensors of the rail vehicle. The sensors for capturing the environment are designed to provide images, i.e. they must depict the environment and offer sufficient resolution for obstacles to be captured and detected as such. The images can be realized as 2D images or 3D images. Specific embodiments are explained in detail below. Provision is also made for determining which objects are potential collision obstacles by checking whether and to what extent they overlap with the monitoring region.
  • Finally, a reaction or response to those objects categorized as potential obstacles is determined as a function of geometric and kinetic object attributes. The check whether and to what extent the captured objects overlap with the monitoring region, and the determination of the object attributes, can take place both as part of the object capture and after subsequent tracking of the detected objects. By means of said tracking, a situational evaluation can be e.g. confirmed or revised and the intersection of the tracked objects with the monitoring region can be determined more precisely or a change of the intersection can be captured on the basis of the dynamic proper motion of the objects.
  • The reaction or response is usually determined by specifications in the rail region, which specify e.g. minimum object sizes, overlap areas, etc. that must be reached in order to trigger a braking operation. The typical procedure is to calculate the service braking distance and the emergency braking distance of the rail vehicle as a function of the current speed of the rail vehicle and other information relating to the rail vehicle such as e.g. its weight and the incline or the gradient of the line section ahead. Objects of sufficient size and having a sufficiently large area of overlap with the monitoring region are then treated as collision obstacles and a response or reaction is determined on the basis of the interval to the collision obstacle and the braking distance of the rail vehicle. If e.g. the interval to a collision obstacle is less than the service braking distance, the emergency brake can be activated. In addition, an acoustic warning can be transmitted to the collision obstacle by means of the signal hooter of the rail vehicle. If the interval to the collision obstacle is in a range in which a standard braking maneuver is initiated, it is possible either to initiate such a braking maneuver or, if the interval to the collision obstacle is greater than the service braking distance by a predefined threshold value, to refrain from a braking maneuver initially and merely activate the signal hooter. The nature of the response can be modified on the basis of specific partial volumes with which the collision obstacle overlaps in the monitoring region, e.g. a warning volume or an emergency volume, and the detected properties of collision obstacles.
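  • The distance bookkeeping described above can be sketched as follows; the deceleration values and the 1.5x margin are illustrative assumptions, and the service braking distance follows the standard v²/(2a) relation:

```python
def braking_decision(speed_mps, distance_m,
                     decel_service=1.0, decel_emergency=2.5):
    """Sketch of the response logic: compare the interval to the collision
    obstacle with the service braking distance. Deceleration values (m/s^2)
    and the margin factor are illustrative, not values from the patent."""
    d_service = speed_mps ** 2 / (2.0 * decel_service)  # v^2 / (2a)
    margin = 1.5 * d_service       # hypothetical threshold for "far away"
    if distance_m < d_service:
        return "emergency_brake_and_horn"  # cannot stop with service braking
    if distance_m < margin:
        return "service_brake"
    return "warn_only"                     # activate the signal hooter first
```

A fuller implementation would also fold in the vehicle weight and the gradient of the line section ahead, as the text notes.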
  • In addition to generating a response of the rail vehicle, the obstacle information can also be transmitted to a remote operations control center in order to draw general attention to the obstacle. This is beneficial if e.g. the adjacent track is also being monitored and other rail vehicles can be warned of possible collision obstacles which are not currently in their monitoring region. The operations control center can also issue additional instructions in situations in which an automatic rail vehicle is not allowed to make a decision independently. For example, a rail vehicle can send sensor data to the operations control center after coming to a halt in front of a detected collision obstacle. A person assesses the collision obstacle properly there and decides whether the rail vehicle can run over the collision obstacle without danger, or whether the collision obstacle has been moved aside sufficiently that the rail vehicle can continue onwards. If the collision obstacle in front of the rail vehicle must be removed before the rail vehicle can continue its journey, the sensor data transmitted to the operations control center can enable a human operator to establish which device could be used to clear the collision obstacle. Such a device may comprise e.g. a cutting tool for a fallen tree or a piece of construction machinery for a rock.
  • If on-board obstacle detection and on-board handling of the collision obstacle are not sufficient, the inventive method can be supplemented by the use of infrastructure-based obstacle detection systems. Since these systems are static, they can easily monitor a section of a rail string for collision obstacles. Such systems can advantageously be installed in regions with poor visibility such as e.g. regions with blind spots and regions with a high risk of collision such as e.g. railway stations. They can also be used to monitor the entrances of tunnels. In this case, they can not only monitor whether a potential collision obstacle is present on the line section but also whether an object such as e.g. a person or an animal has entered a tunnel, and can therefore transmit a warning that a collision obstacle might be present in the tunnel. This information can then be used to influence the strategy of a rail vehicle, e.g. to trigger a braking maneuver or select a lower speed for the rail vehicle.
  • As a result of the combined use of sensor data and map data, a collision danger is advantageously determined with greater precision and greater reliability than is possible using conventional procedures, thereby improving the safety and the effectiveness of automatic rail traffic. In particular, by virtue of the map-based approach and the consequently increased precision, the range of the collision obstacle detection is increased, since this range depends in particular on the accuracy of the measurement of the orientation of the rail vehicle and the orientation of the sensors that are used to detect the collision obstacle. Furthermore, this way of determining the course of the line can also be realized for line sections which are not straight, or on which the course of the line is difficult to see, e.g. due to snow cover or poor visibility conditions. Furthermore, the inventive method allows the detection of generic objects on the basis of 3D data. This is advantageous in particular over methods which are based on machine learning and by means of which only specific classes of objects can therefore be detected and identified.
  • The inventive obstacle detection device has a monitoring region determination unit for determining a monitoring region in front of a rail vehicle. The determination of the monitoring region, in particular the position and extent thereof, is effected as a function of a position and a course of a track on which the rail vehicle is travelling, and a position and dimensions of a monitoring profile of the rail vehicle. As mentioned above, the determination of the monitoring region is effected by means of capturing sensor data, e.g. GNSS data, and determining the position and orientation of the rail vehicle on the basis of this sensor data. The position and the course of the track on which the rail vehicle is travelling are then determined on the basis of an alignment of the determined position and orientation of the rail vehicle with digital map data.
  • The inventive obstacle detection device also includes a sensor unit for the sensory capture of objects in the environment of the rail vehicle. Any suitable sensors for capturing, locating and possibly identifying objects can be used as sensor units, e.g. cameras, lidar systems or radar systems.
  • The inventive obstacle detection device further comprises an obstacle determination unit for determining which objects are possible collision obstacles, specifically by checking whether and to what extent these objects overlap with the monitoring region. In addition, the inventive obstacle detection device has a response determination unit for determining a reaction to the objects that are categorized as potential collision obstacles. The determination of a reaction is effected as a function of the previously cited geometric and kinetic attributes of the objects. The inventive obstacle detection device has the same advantages as the inventive method for obstacle detection for a rail vehicle.
  • The inventive rail vehicle has the inventive obstacle detection device and a control device for controlling a driving strategy of the rail vehicle as a function of a reaction which is determined by the obstacle detection device. The inventive rail vehicle has the same advantages as the inventive obstacle detection device.
  • Some components of the inventive obstacle detection device can, possibly after adding hardware systems such as e.g. a sensor unit, be embodied primarily in the form of software components. This relates in particular to the monitoring region determination unit, the obstacle determination unit and the response determination unit.
  • However, these components can in principle also be realized partly in the form of programmable hardware, e.g. FPGAs or similar, especially when particularly high-speed calculations are required. Likewise, the required interfaces can take the form of software interfaces, e.g. when only a transfer of data from other software components is required. However, they can also take the form of hardware-based interfaces which are activated by suitable software.
  • A largely software-based realization has the advantage that computing systems which are already present in a rail vehicle can also be upgraded easily by means of a software update after possibly adding further hardware elements, e.g. additional sensor units, in order to operate in the inventive manner. In this respect, the object of the invention is also achieved by a corresponding computer program product comprising a computer program which can be loaded directly into a storage device of such a computing system, said computer program having program sections for executing the inventive method steps that can be realized by means of software when the computer program is executed in the computing system.
  • Such a computer program product can, in addition to the computer program, optionally comprise additional elements such as e.g. documentation and/or additional components including hardware components such as e.g. hardware keys (dongles, etc.) for using the software.
  • For the purpose of transportation to the storage device of the computing system and/or for the purpose of storage on the computing system, it is possible to use a computer-readable medium, e.g. a memory stick, hard disc or other transportable or permanently installed data medium, on which are stored the program sections of the computer program that can be read in and executed by a computing unit. For this purpose, the computing unit can have e.g. one or more interworking microprocessors or similar.
  • The dependent claims and the following description each contain particularly advantageous embodiments and developments of the invention. In this case, the claims in one statutory class of claim can be developed in particular in a similar way to the dependent claims of another statutory class of claim and the specification parts thereof. Moreover, in the context of the invention, the various features of different exemplary embodiments and claims can also be combined to form further exemplary embodiments.
  • In the inventive method for obstacle detection for a rail vehicle, in order to achieve even greater precision when locating obstacles, provision is made for calibrating the position measurement and orientation measurement of the rail vehicle by determining positional data relating to landmarks or other distinctive features in an environment of the rail vehicle on the basis of sensor data from the environment. The positional data that is determined in relation to the landmarks is then compared with the corresponding positional data relating to the landmarks in the digital map data. If the values of the positional data based on the sensor data and on the digital map data differ from each other, this variation is used as a calibration value in order to make an adjustment to the position determination and orientation determination, i.e. the auto-location-finding of the rail vehicle and the orientation measurement thereof, with reference to the sensor data.
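The landmark-based calibration described above can be sketched in a few lines of code. The 2D coordinates, the pure-translation error model and the simple averaging are illustrative assumptions; the method does not prescribe a particular estimation technique:

```python
def landmark_calibration_offset(sensed, mapped):
    """Estimate a position correction from landmark observations.

    sensed: list of (x, y) landmark positions as measured by on-board sensors
    mapped: the same landmarks' (x, y) positions taken from the digital map
    Returns the mean offset to add to sensor-based position estimates.
    Assumes a pure translation error; a fuller model would also estimate
    a rotation to correct the orientation measurement.
    """
    n = len(sensed)
    dx = sum(m[0] - s[0] for s, m in zip(sensed, mapped)) / n
    dy = sum(m[1] - s[1] for s, m in zip(sensed, mapped)) / n
    return dx, dy

# Hypothetical example: three landmarks, all sensed 0.5 m west of their map position
sensed = [(10.0, 5.0), (20.0, 8.0), (15.0, 12.0)]
mapped = [(10.5, 5.0), (20.5, 8.0), (15.5, 12.0)]
offset = landmark_calibration_offset(sensed, mapped)  # (0.5, 0.0)
```

A sensor-based position estimate of the rail vehicle would then be corrected by this offset before the monitoring region is placed on the map.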
  • The precision of the position and orientation of the rail vehicle is advantageously improved further. The accuracy is particularly important when determining the position and orientation of the rail vehicle because, at greater distance, even a small inaccuracy when determining this data can cause a significant variation in the specification of a position of a possible collision obstacle.
  • The attributes of the objects preferably comprise at least one of the following attribute types: an object is characterized e.g. by its approximate dimensions and area, or by information indicating which partial volume it overlaps, or by the interval from the rail vehicle to the object, or by the current speed of the rail vehicle. The monitoring profile, in particular the extended clearance gage of the rail vehicle, can comprise e.g. a plurality of differently dimensioned partial profiles which are nested one within the other and are each assigned a different danger level. According to the partial profile, or the partial volume assigned to the respective partial profile, into which the detected object has already intruded, and as a function of the distance of the object from the rail vehicle, an incremental reaction of the rail vehicle can be triggered. For example, according to the speed of the rail vehicle and of the object, and as a function of the position of the object and the distance of the object from the rail vehicle, a warning signal can be emitted initially and, if the object intrudes into an inner volume or if the object moves closer to the rail vehicle, or in the case of a corresponding speed and direction of motion of the object and/or of the rail vehicle, a braking maneuver can be triggered.
  • As mentioned above, in a step of the inventive method, a monitoring region is determined on the basis of a digital map or land map, and the position and orientation of the rail vehicle are determined in the digital map, preferably on the basis of a combination of sensor data. Sensor data which can return information about the environment of the rail vehicle, as well as intervals and orientations of objects relative to the rail vehicle, is advantageously combined with map data in order thereby to determine the position and orientation of the rail vehicle. If the rail vehicle is situated at a specific position relative to a position indicated in the digital map, the position of the rail vehicle can then be specified exactly.
  • For example, an estimated position and estimated orientation for the rail vehicle can initially be determined with the aid of sensor data which includes a degree of inaccuracy. Such sensor data preferably comprises at least one of the following data types:
      • satellite navigation data,
      • IMU data (IMU=inertial measurement unit),
      • speed sensor data,
      • odometry data,
      • infrastructure-based positional data, preferably via radio signals,
      • feature data that is detected in the environment of the rail vehicle and can be aligned with the digital map.
  • In an embodiment of the inventive method for obstacle detection for a rail vehicle, the feature data comprises at least one of the following data types:
      • landmarks,
      • the track.
  • A known position of landmarks or the track can advantageously be used to determine a variation of a position measurement and/or orientation measurement by the sensors and to undertake a corresponding adjustment.
  • In the inventive method for obstacle detection for a rail vehicle, various sensor data or sensor data of various types is advantageously combined in order to determine the position and orientation of the rail vehicle in the digital map. A multisensor fusion method is therefore applied. By virtue of combining sensor data from different sensors, it is usually possible to reduce or compensate measurement inaccuracies of individual sensor types by means of averaging and/or a combination of sensor data which is suitably weighted for current environmental conditions, such that reliability is further improved during the obstacle detection.
  • In a variant of the inventive method for obstacle detection for a rail vehicle, said combination comprises filtering by means of a Kalman filter or particle filter and/or graph-based methods and/or sensor-specific filtering. In this case, the status of the rail vehicle is preferably restricted to its movement along the track. The accuracy of the position-finding and the determination of the orientation of the rail vehicle can thus be further improved. The position of the rail vehicle is restricted by the movement of the contact points, i.e. the wheels, along the track. The orientation of the fixed vehicle body can, due to the mounting on rotatable running gear and as a result of the spring suspension, vary to some extent from the orientation of the track at the contact points.
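Restricting the state to motion along the track reduces the filtering problem to one dimension. The following is a minimal sketch of such a Kalman filter with state (along-track position, speed); the noise values and the position-only measurement model are illustrative assumptions:

```python
class AlongTrackKalman:
    """Minimal Kalman filter restricted to motion along the track.

    State: (s, v) = along-track position [m] and speed [m/s].
    Only the position is measured (e.g. GNSS projected onto the track).
    Noise values q, r are illustrative assumptions.
    """

    def __init__(self, s0, v0, p0=1.0, q=0.1, r=1.0):
        self.s, self.v = s0, v0
        self.P = [[p0, 0.0], [0.0, p0]]  # state covariance
        self.q, self.r = q, r            # process / measurement noise

    def predict(self, dt):
        # Constant-velocity motion model: s' = s + v*dt
        self.s += self.v * dt
        (pss, psv), (_, pvv) = self.P
        self.P = [
            [pss + 2.0 * dt * psv + dt * dt * pvv + self.q, psv + dt * pvv],
            [psv + dt * pvv, pvv + self.q],
        ]

    def update(self, s_meas):
        # Position-only measurement: H = [1, 0]
        y = s_meas - self.s                           # innovation
        S = self.P[0][0] + self.r                     # innovation variance
        ks, kv = self.P[0][0] / S, self.P[1][0] / S   # Kalman gains
        self.s += ks * y
        self.v += kv * y
        (pss, psv), (_, pvv) = self.P
        self.P = [
            [(1.0 - ks) * pss, (1.0 - ks) * psv],
            [(1.0 - ks) * psv, pvv - kv * psv],
        ]
```

Fed with noiseless along-track position measurements of a vehicle running at constant speed, the filter converges toward the true position and speed after a handful of predict/update cycles.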
  • In the inventive method for obstacle detection for a rail vehicle, provision is preferably also made for individual sensors to be calibrated with the digital map in order to correct effects of vibration or drift in the extrinsic calibration. In this case, the initially estimated position and orientation of the rail vehicle or the sensor can be used as a starting point, and the position and orientation of the sensor in the map can then be made more precise with reference to landmarks and/or other features that have been detected in the environment of the rail vehicle. For example, landmarks can be used to specify more accurately the position and orientation of the sensor, while the detected rail string can be used to refine the determination of the orientation of the sensor.
  • In the inventive method for obstacle detection for a rail vehicle, the calibration therefore also preferably comprises the following steps:
      • using an estimated position and orientation of the rail vehicle as a starting point,
      • refining the position and orientation of the sensor in the map by means of alignment with landmarks and other features that have been detected in the environment of the rail vehicle.
  • In the inventive method for obstacle detection for a rail vehicle, the step of the sensory capture of objects in the environment of the rail vehicle most preferably comprises determining a position and a size of the objects in the environment on the basis of sensor data. This data can advantageously be used for the purpose of a danger assessment and for the purpose of determining an adequate reaction to the obstacle that is present.
  • In the inventive method for obstacle detection for a rail vehicle, as part of the step for determining which objects represent possible collision obstacles, provision is also preferably made for initially generating a point cloud with object points on the basis of the sensor data that has been captured from the object. A point cloud can be generated in particular from raster-type scanning of the environment. Such an option is applied e.g. when a lidar system is deployed for the purpose of object detection. The point cloud is then broken down into individual objects by means of a so-called clustering method or a cluster analysis. This is understood to mean a method for detecting similarity structures in usually relatively large pools of data. The groups of measured points found in this way, which have similar properties, are referred to as clusters, and the group assignment as clustering. Unlike automatic classification, the cluster analysis is intended to identify new groups in the data. Such a clustering method for clustering on the basis of the spatial distances of the points comprises e.g. a Euclidean distance clustering algorithm. The individual objects are then checked to determine whether they have an overlap region with the monitoring region of the rail vehicle. It is also advantageously possible to identify and separate generic objects that cannot be detected using a machine-learning based method for specific object classes.
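A Euclidean distance clustering step of the kind mentioned above can be sketched as follows; the naive O(n²) neighbor search is for illustration only, since a real lidar pipeline would use a k-d tree or voxel grid for the neighbor queries:

```python
from collections import deque

def euclidean_cluster(points, max_dist):
    """Break a point cloud into clusters by Euclidean distance.

    Two points belong to the same cluster if a chain of points with
    pairwise distance <= max_dist connects them. Works for 2D or 3D
    coordinate tuples.
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, members = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            # All still-unvisited points within max_dist of point i
            neighbors = [
                j for j in unvisited
                if sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
                <= max_dist ** 2
            ]
            for j in neighbors:
                unvisited.remove(j)
                queue.append(j)
                members.append(j)
        clusters.append([points[k] for k in members])
    return clusters
```

Two well-separated groups of lidar returns thus end up in two separate clusters, each of which can then be checked individually for overlap with the monitoring region.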
  • In a particularly preferred variant of the inventive method, a closest object point within the monitoring profile is determined, said object point lying closest to the rail vehicle as measured along the course of the line. Following thereupon, all object points are grouped together in a 2D plane by combining all projection planes, said projection planes running through each object point and being oriented orthogonally to the track. In this case, a projection plane is determined which runs through the closest object point relative to the vehicle along the rail string and is oriented orthogonally to the track on which the rail vehicle is travelling. All object points of the cluster are then projected onto the projection plane. In this way, all detectable points of a detected object lie on a plane which represents the shortest interval from the object to the rail vehicle. The projection of the object thus generated can then be compared with a sectional area of the monitoring region comprising the projection plane. It is moreover possible to determine geometric object attributes of the object or of the point cloud. Such object attributes comprise the area, the dimensions and the position of the object or of the entire point cloud relative to the rail vehicle, and the area of the subset of the point cloud within the monitoring region or the partial volumes. For the purpose of specifying these object attributes, it is possible e.g. to form a bounding box around this point cloud, to calculate a convex envelope, or also to create a grid layout map.
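The projection step can be sketched as follows under the simplifying assumption that the track runs locally along the x-axis, so that the plane orthogonal to the track at the closest object point is a y-z plane; curved track would first require transforming the points into track coordinates:

```python
def project_cluster(points):
    """Project a 3D object cluster onto the plane orthogonal to the track.

    Assumes the track runs locally along +x, so projecting onto the plane
    orthogonal to the track amounts to keeping the (y, z) coordinates.
    Returns (x_n, projected_points), where x_n is the along-track position
    of the object point closest to the rail vehicle.
    """
    x_n = min(p[0] for p in points)           # closest point along the track
    projected = [(p[1], p[2]) for p in points]
    return x_n, projected


def bounding_box(points_2d):
    """Axis-aligned 2D bounding box: ((min_y, min_z), (max_y, max_z))."""
    ys = [p[0] for p in points_2d]
    zs = [p[1] for p in points_2d]
    return (min(ys), min(zs)), (max(ys), max(zs))
```

The bounding box (or, alternatively, a convex envelope) of the projected points can then be intersected with the sectional area of the monitoring region at x_n.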
  • In the step for determining which objects represent potential collision obstacles, the dataset that must be processed can advantageously be limited to that part of an object which is situated in the vicinity of the rail region or the monitoring region. Furthermore, greater accuracy can be achieved by calculating an overlap volume or overlap area directly on the initial point cloud than by first performing a modeling step for the complete object, e.g. forming a bounding box, and performing the calculation on the basis of the model.
  • As mentioned above, the dynamic behavior of a possible collision object can also be captured by tracking this object. In this case, measured data relating to the object, e.g. its position, speed, size and sectional areas can be processed with the aid of a filter, e.g. a Kalman filter or a particle filter. It should be noted here that the dynamic attributes of an object such as e.g. speed or acceleration do not have to be measured directly, but can be derived from a plurality of positions at different time points. An intersection of the tracked object with the monitoring region can also be determined during the tracking. The methods mentioned above in respect of object detection and determination of a collision danger can also be applied in the case of this variant. Object attributes such as e.g. the distance, the height and width, the sectional area of the projected point cloud with the monitoring region, but also typical tracking attributes such as e.g. speed and confidence, are also used to assess the collision danger for this variant.
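As the text notes, dynamic attributes such as speed need not be measured directly but can be derived from positions at different time points. A minimal finite-difference sketch is shown below; a production tracker would smooth these estimates with the Kalman or particle filter mentioned above rather than use raw differences:

```python
def estimate_velocity(positions, dt):
    """Derive an object's velocity from its two most recent positions.

    positions: list of (x, y) object positions sampled every dt seconds.
    Returns (vx, vy, speed). Raw finite differences amplify measurement
    noise, which is why filtering is applied in practice.
    """
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return vx, vy, (vx ** 2 + vy ** 2) ** 0.5
```

The resulting velocity vector can be extrapolated to check whether the object's anticipated trajectory crosses the monitoring region.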
  • The invention is explained again in greater detail below on the basis of exemplary embodiments and with reference to the appended figures, in which:
  • FIG. 1 shows a flow diagram which illustrates a method for obstacle detection for a rail vehicle according to an exemplary embodiment of the invention,
  • FIG. 2 shows a schematic representation of a scenario for obstacle detection by a rail vehicle, where the actual orientation of a rail vehicle varies from the determined orientation,
  • FIG. 3 shows a schematic representation of determining a projection plane of the object points of a cluster according to an exemplary embodiment of the invention,
  • FIG. 4 shows a schematic representation of a projection of an object, wherein some of the object points lie in a sectional area of the monitoring region,
  • FIG. 5 shows a schematic 2D representation of a bounding box around that part of the object which lies within the sectional area of the monitoring region comprising the projection plane,
  • FIG. 6 shows a schematic 2D representation of a convex envelope around that part of the object which lies within the sectional area of the monitoring region comprising the projection plane,
  • FIG. 7 shows a schematic representation of a grid layout map which comprises the sectional area of the monitoring region comprising the projection plane,
  • FIG. 8 shows a schematic representation of an obstacle detection device according to an exemplary embodiment of the invention, and
  • FIG. 9 shows a schematic representation of a rail vehicle according to an exemplary embodiment of the invention.
  • FIG. 1 shows a flow diagram 100 which illustrates a method for obstacle detection for a rail vehicle 2 (see FIG. 2 ) according to an exemplary embodiment of the invention.
  • In the step 1.I, for a process which monitors the surroundings of the rail vehicle 2, information relating to a monitoring profile PR, e.g. a clearance gage, and in particular its position PS relative to the rail vehicle 2 is provided together with digital map data LK for obstacle detection, said information and data being required subsequently for the purpose of determining a position and dimensions of a monitoring region VU. The map data LK and the profile data PR can be stored in a database in combined form, but can also be obtained separately from different sources.
  • In the step 1.II, sensor data SDL is captured for the purpose of auto-location-finding of the rail vehicle 2. For example, GNSS data or data from an inertial measurement unit is captured for this purpose.
  • In the step 1.III, information relating to the surroundings of the rail vehicle 2, in particular sensor data from landmarks LM and sensor data from objects O, is captured by sensors. The surroundings or the environment comprises at least the monitoring region VU. It may however extend beyond the monitoring region VU, e.g. in order to detect moving objects O at an early stage. Objects O are detected within the sensor data, which depicts the environment of the rail vehicle. The objects O can be e.g. vehicles, people, or even trees, shopping carts or rocks that have fallen onto the track.
  • In the step 1.IV, the digital map data LK, the sensor data SDL for auto-location-finding of the rail vehicle 2, and the sensor data from the landmarks LM are processed in order to determine a position PSF and orientation OR of the rail vehicle 2 and, from this, the position PG of the current track section and possibly also the orientation thereof. The true speed of the rail vehicle together with odometry data can be used for the purpose of position-finding. In the step 1.IV, the knowledge of the position PS and dimensions of a monitoring profile of the rail vehicle 2 are also used to determine suitable dimensions, e.g. the required width, height and extent along the line section, as well as a suitable positioning of the monitoring region VU.
  • An alignment of the determined position PSF and orientation OR of the rail vehicle 2 with the digital map data LK also takes place in the step 1.IV. As mentioned above, this alignment increases the precision with which the values for the position PSF and orientation OR of the rail vehicle 2 are determined, making use of the constraint that the rail vehicle 2 can only travel along a track.
  • In the step 1.V, the monitoring region VU is determined on the basis of the determined position PSF and orientation OR of the rail vehicle 2, the position PG of the current track section, the knowledge of the position PS of the monitoring profile PR and the dimensions thereof, said monitoring region VU comprising the region which is captured in front of the rail vehicle 2 by the on-board sensors, and through which the rail vehicle 2 will next travel, and in which a collision could therefore occur. Adjacent tracks can also form part of the monitoring region VU.
  • In the step 1.VI, provision is made for determining which objects O represent potential collision obstacles KH. This determination takes the form of checking whether and to what extent they overlap with the monitoring region VU. If such an overlap exists, a collision danger is present. If a detected object O is sufficiently far outside the monitoring region VU and its anticipated trajectory also does not cross the monitoring region VU, which moves dynamically in the direction of travel, the object O can be categorized as not dangerous. Otherwise, the object O must be categorized as a potential collision object or potential collision obstacle KH.
  • In the step 1.VII, a reaction R of the rail vehicle 2 to the objects O categorized as potential collision obstacles KH is determined. The type of the reaction R is determined as a function of object attributes such as e.g. the approximate dimensions of the respective object O. In the case of small non-human objects such as e.g. small animals, for example, it is better to simply drive over them, and full braking would not be a proportionate response to such an obstacle. In the case of people or large obstacles, by contrast, it is essential to avoid a collision in order to avoid personal injury or damage to the rail vehicle 2. The reaction R is also determined in the step 1.VII as a function of the volume with which and the extent to which the object O overlaps. If the potential collision object KH is situated in the peripheral region of the monitoring region VU, it might be possible to avoid a collision with the rail vehicle 2 even as it continues its run. However, if the potential collision object KH is in the center of the monitoring region VU, braking is essential when the potential collision object KH is above a certain size. Furthermore, the reaction R is also dependent on the interval from the rail vehicle 2 to the collision object KH. If the collision object KH is still very far away, relatively gentle braking and the emission of a warning signal might suffice if it is a living object or an object that is controlled by a person. In this way, the object O can be provoked into a reaction or induced to leave the danger region. However, if the potential collision object KH is relatively close, braking must be correspondingly hard in order to avoid a collision. If the potential collision object KH is too close to the rail vehicle to allow braking at an early stage, the instructions might prescribe reducing the speed of the rail vehicle in order at least to moderate the impact with the collision obstacle KH on the line section.
The reaction R is also determined as a function of the current speed of the rail vehicle 2.
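The graded reaction logic of step 1.VII might look as follows in code. All thresholds, the assumed 1 m/s² braking deceleration, and the reaction names are hypothetical illustrations; the description deliberately leaves these values open:

```python
def determine_reaction(obj, ego_speed):
    """Graded reaction to a potential collision obstacle.

    obj: dict with 'overlap' (fraction of the object inside the
    monitoring region, 0..1), 'distance' [m] from the rail vehicle, and
    'is_small' (True for drive-over objects such as small animals).
    The 1 m/s^2 braking deceleration and all thresholds are hypothetical.
    """
    if obj["overlap"] == 0.0:
        return "continue"                 # no intersection with the region
    if obj["is_small"]:
        return "continue"                 # driving over is the lesser harm
    braking_distance = ego_speed ** 2 / (2.0 * 1.0)  # assumed 1 m/s^2
    if obj["distance"] > 2.0 * braking_distance:
        return "warn_and_gentle_braking"  # provoke the object to move away
    if obj["distance"] > braking_distance:
        return "service_braking"
    return "full_braking"                 # at least moderate the impact
```

At 10 m/s the assumed braking distance is 50 m, so an obstacle 1 km ahead triggers only a warning and gentle braking, while one 10 m ahead triggers full braking.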
  • FIG. 2 shows a schematic representation 20 of a scenario for obstacle detection by a rail vehicle 2, with calibration and adjustment of object location-finding by the rail vehicle 2. The scenario 20 comprises five partial scenarios 20 a, 20 b, 20 c, 20 d, 20 e, by means of which individual aspects of the object location-finding and the calibration or adjustment are shown in detail.
  • A first scenario 20 a on the far left in FIG. 2 illustrates how and at what positions the sensors of the rail vehicle 2, which is located at the bottom edge of the first scenario 20 a, perceive the individual objects in its environment. A motor vehicle O, two trees O2, O3 and three landmarks LM1, LM2, LM3 are captured by the sensors. The cited objects are shown with hatching marks.
  • A second scenario 20 b shows a map extract in which are marked the captured landmarks LM1, LM2, LM3 and the track GK on which the rail vehicle 2 is travelling. The rail vehicle 2 is also marked at the bottom edge of the map in the second scenario 20 b.
  • In a third scenario 20 c, only the monitoring region VU of the rail vehicle 2 is shown (dotted).
  • In a fourth scenario 20 d, the first three scenarios 20 a, 20 b, 20 c are shown superimposed on top of each other. In this case, it can be seen that the map positions of the landmarks LM1, LM2, LM3 vary from the positions of the landmarks LM1, LM2, LM3 as captured by the rail vehicle 2. In the fourth scenario 20 d, it also appears that the detected motor vehicle O is situated outside the monitoring region VU of the rail vehicle 2. Conversely, one of the trees O3 appears to project into the monitoring region VU of the rail vehicle 2. The actual orientation of the rail vehicle 2 also varies slightly from the self-determined orientation, this being evident in that the orientation of the rail vehicle 2 in the map and the self-determined orientation do not correspond exactly. On the basis of the determined variation between the map positions and the positions determined by sensor measurement of the landmarks LM1, LM2, LM3, correction of the sensor measurement is now undertaken in an adjustment step so that the positions of the landmarks LM1, LM2, LM3 as determined by the sensor measurement and the map data correspond.
  • The result can be seen in the fifth scenario 20 e. It can now be seen there that the motor vehicle O is situated in the monitoring region VU of the rail vehicle 2, and is even already on the tracks GK, and therefore represents a potential collision obstacle. Conversely, it is clear from the fifth scenario 20 e that the tree O3 is not situated in the monitoring region VU of the rail vehicle 2 and therefore does not represent a danger to the rail vehicle 2.
  • On the basis of the knowledge of the positions of the landmarks LM1, LM2 and LM3 and a current measurement of the position and orientation of the landmarks LM1, LM2 and LM3, it is therefore possible to detect a variation of the determined orientation of the rail vehicle 2 and a variation of the capture of objects in the environment of the rail vehicle (e.g. due to drift in the extrinsic calibration of sensors or vibrations), and to use said variation for the purpose of calibrating or adjusting the orientation determination.
  • FIG. 3 illustrates a schematic representation 30 of a determination of a projection plane PRE of a point cloud CL of an object O according to an exemplary embodiment of the invention. The representation 30 is shown as a plan view. The right-hand side of FIG. 3 shows a track GK comprising two rails S with crossties SW, on which the rail vehicle 2 travels in an x-direction or in the direction of the arrow, i.e. from the bottom towards the top in FIG. 3 . A position XN on the track GK is now determined, where the projection plane PRE cuts through the position XN and, extending orthogonally to the course of the track in the direction of travel, through that point OPN of the object O which is closest to the rail vehicle 2 as viewed in an x-direction. In other words, this position XN is the point at which a center line of the track GK has the smallest interval to the closest point OPN of the object O. The 2D positions of the individual object points OP of the point cloud CL that is assigned to the object O are then determined in the projection plane PRE.
  • FIG. 4 shows a schematic 2D representation 40 of a point cloud CL, some of the object points OP of the point cloud CL lying in a sectional area FU of the monitoring region VU and forming the so-called intersection points and some of the object points OP of the point cloud CL lying outside this sectional area FU. The representation shown in FIG. 4 can therefore be considered as a cross-sectional representation at the x-position XN. For better orientation, shown at the bottom edge of the sectional area FU in FIG. 4 are the cross-sectional areas of the rails S of the track GK on which the rail vehicle 2 is running. The sectional area FU provides a model of the object O, in order to assess the relevance of a possible collision obstacle and plan suitable countermeasures.
  • In FIG. 5 , for the purpose of determining an area of the intersection points OP, a schematic 2D representation 50 of a bounding box BB is shown around that part of the point cloud CL which lies within the sectional area FU of the monitoring region VU. The bounding box BB is represented as a square with broken-line edges. A convex envelope CH can also be used instead of a rectangular bounding box BB to contain the object points OP.
  • Concerning this, FIG. 6 shows a schematic 2D representation 60 in which, for the purpose of determining the area of the intersection points, a polygonal convex envelope CH marked by broken lines is placed around that part of the projected point cloud CL which lies within the sectional area FU of the monitoring region VU.
  • Alternatively, that part of the point cloud CL which lies within the sectional area FU of the monitoring region VU, or the monitoring area FU, can also be included in a grid layout map.
  • Concerning this, FIG. 7 shows a schematic representation 70 of a grid layout map GBK comprising the sectional area FU of the monitoring region VU. The grid layout map GBK comprises a grid matrix of square fields OG. Occupied fields OG of the grid layout map GBK are shaded gray in FIG. 7 . Subsequent evaluations can then be performed with ease on the basis of the grid fields OG that are occupied or unoccupied by digital data, thereby further reducing the computing effort.
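An occupancy-grid evaluation of the sectional area can be sketched as follows; the cell size and grid extent are illustrative, and the grid origin is assumed to sit at the lower-left corner of the sectional area FU:

```python
def occupancy_grid(points_2d, cell, width, height):
    """Rasterize projected object points into a boolean occupancy grid.

    The grid covers the sectional area of the monitoring region, origin
    assumed at the lower-left corner; cell is the edge length of one
    square grid field.
    """
    cols, rows = int(width / cell), int(height / cell)
    grid = [[False] * cols for _ in range(rows)]
    for y, z in points_2d:
        c, r = int(y / cell), int(z / cell)
        if 0 <= r < rows and 0 <= c < cols:
            grid[r][c] = True
    return grid


def occupied_area(grid, cell):
    """Approximate the object's cross-section: occupied cells x cell area."""
    return sum(cell * cell for row in grid for v in row if v)
```

Working on occupied/unoccupied cells instead of raw points is what keeps the subsequent evaluations cheap, at the cost of quantizing the object outline to the cell size.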
  • FIG. 8 shows a schematic representation of an obstacle detection device 80 for a rail vehicle 2 according to an exemplary embodiment of the invention. The obstacle detection device 80 comprises a plurality of sensor units 85 (shown on the left-hand side of the figure) for auto-location-finding of the rail vehicle 2 and a number of further sensor units 82 for capturing the surroundings. The sensor units 85 for auto-location-finding comprise sensors that are based on different physical principles, e.g. inertial sensors, satellite navigation systems, speed sensors or receive units for receiving radio signals from infrastructure equipment, by means of which the obstacle detection device 80 can determine the position PSF and orientation OR of the rail vehicle 2. The sensor units 82 for capturing the surroundings comprise sensors for depicting the surroundings of the rail vehicle, e.g. lidar systems or image recording units such as cameras. Using these sensors, sensor data is captured for the purpose of detecting objects O and landmarks LM. The obstacle detection device 80 also comprises a map and profile unit 87 by means of which the digital map data LK can be provided together with data relating to the dimensions of a monitoring profile PR and the position PS of the monitoring profile PR. The map data LK and profile data PR can also be obtained from a plurality of separate and different sources.
  • The obstacle detection device 80 also comprises a position determination unit 81 which, on the basis of known dimensions of the rail vehicle 2, the sensor data SDL for auto-location-finding, e.g. GNSS data or IMU data (IMU=inertial measurement unit), further feature data from the environment such as landmarks LM, and map data LK, determines a position PSF of the rail vehicle 2 and its orientation OR as well as a position PG and a course of that part of the rail string on which the rail vehicle 2 is currently running.
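The fusion performed by the position determination unit 81 can be illustrated by a deliberately minimal example. Restricting the state to movement along the known track (compare claim 20) reduces position estimation to a one-dimensional Kalman filter over the travelled arc length; all variable names and numeric values below are illustrative assumptions, not the patent's actual implementation:

```python
def fuse_along_track(s, var, ds_odo, var_odo, s_meas, var_meas):
    """One predict/update cycle of a 1D Kalman filter whose state is the
    arc length s travelled along the known track, reflecting the
    constraint that the rail vehicle can only move along the rails.

    s, var           : current arc-length estimate and its variance
    ds_odo, var_odo  : odometry increment and its variance (prediction)
    s_meas, var_meas : arc length implied by e.g. a GNSS fix projected
                       onto the mapped track geometry (update)
    """
    # Predict: advance along the track using the odometry increment.
    s_pred = s + ds_odo
    var_pred = var + var_odo
    # Update: blend with the map-aligned position measurement.
    k = var_pred / (var_pred + var_meas)   # Kalman gain
    s_new = s_pred + k * (s_meas - s_pred)
    var_new = (1.0 - k) * var_pred
    return s_new, var_new

# Odometry says we advanced 10 m from s = 100 m; the GNSS projection
# onto the map says s = 109 m. The fused estimate lies in between,
# weighted by the respective variances.
s, var = fuse_along_track(100.0, 1.0, 10.0, 0.5, 109.0, 1.5)
```

In practice the unit would combine several such sources (IMU, landmarks LM, infrastructure signals) and may use particle filters or graph-based methods instead, as the claims note; the 1D scalar filter merely shows the along-track restriction.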
  • The obstacle detection device 80 also comprises a monitoring region determination unit 88 which, on the basis of the monitoring profile PR, the position PS of the monitoring profile PR, the position PSF and orientation OR of the rail vehicle 2, as well as a position PG and a course of that part of the rail string on which the rail vehicle 2 is currently running, determines a monitoring volume or the position and dimensions of a monitoring region VU.
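One simple way to realize the determination performed by unit 88 is to sweep the monitoring profile PR laterally along the sampled track course. The corridor construction below is a sketch under the assumption of a rectangular profile of fixed half-width; function and parameter names are hypothetical:

```python
import numpy as np

def monitoring_corridor(track_xy, headings, profile_halfwidth):
    """Sweep a rectangular monitoring profile PR along the track course
    to obtain the lateral boundaries of the monitoring region VU.

    track_xy : (N, 2) sampled track centre-line positions ahead of the vehicle
    headings : (N,) track heading at each sample, in radians
    Returns the left and right boundary polylines of the corridor.
    """
    # Unit normal to the track direction at each sample point.
    normals = np.stack([-np.sin(headings), np.cos(headings)], axis=1)
    left = track_xy + profile_halfwidth * normals
    right = track_xy - profile_halfwidth * normals
    return left, right

# Straight track along x: the corridor boundaries lie at y = +/- 1.5 m.
xy = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])
hdg = np.zeros(3)
left, right = monitoring_corridor(xy, hdg, 1.5)
```

The vertical extent of the profile would be swept analogously to obtain the full monitoring volume.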
  • The sensor data relating to the captured objects O or positions thereof is transmitted to an obstacle determination unit 83 which, on the basis of the positions of the objects O together with the knowledge of the position and extent of the monitoring region VU, determines which objects O are possible collision obstacles KH. Details of the procedure for this are explained with reference to FIG. 3 to FIG. 7 .
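A minimal sketch of the overlap check performed by the obstacle determination unit 83 might look as follows; the clustering is assumed to have already happened (objects arrive pre-grouped), and the region predicate, point threshold, and object names are hypothetical:

```python
import numpy as np

def find_collision_obstacles(objects, in_region, min_points=5):
    """Categorize clustered objects O as possible collision obstacles KH
    when enough of their points lie within the monitoring region VU.

    objects   : dict mapping object id -> (N, 3) point array
    in_region : vectorized predicate, points -> boolean mask
    min_points: hypothetical threshold against spurious single-point hits
    """
    obstacles = {}
    for oid, pts in objects.items():
        inside = in_region(pts)
        if int(inside.sum()) >= min_points:
            obstacles[oid] = pts[inside]   # keep only the intersecting points
    return obstacles

# Toy monitoring region: |y| <= 2 m laterally, 0 <= z <= 4 m vertically.
region = lambda p: (np.abs(p[:, 1]) <= 2.0) & (p[:, 2] >= 0.0) & (p[:, 2] <= 4.0)
objs = {
    "fallen_tree": np.column_stack(
        [np.full(8, 15.0), np.linspace(-1.0, 1.0, 8), np.ones(8)]),
    "trackside_sign": np.array([[15.0, 3.5, 1.0], [15.0, 3.6, 1.2]]),
}
kh = find_collision_obstacles(objs, region)
# only "fallen_tree" qualifies; the sign lies outside the monitoring region
```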
  • The captured data relating to the determined collision obstacles KH, e.g. their position, their approximate dimensions and the interval from the collision obstacles KH to the rail vehicle 2, are transmitted to a response determination unit 84. The response determination unit 84 is configured to determine a reaction R to the objects O that are categorized as potential obstacles KH, as a function of the approximate dimensions G of the respective object O, the size of the collision volume with which it overlaps, the interval to the object O, and the current speed of the rail vehicle.
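The decision logic of the response determination unit 84 can be illustrated by a small example that compares the interval to the obstacle with the braking distance v²/(2a); the thresholds, deceleration value, and reaction names are illustrative assumptions only:

```python
def determine_reaction(distance_m, speed_ms, decel_ms2=1.0, margin_m=10.0):
    """Choose a reaction R from the interval to the collision obstacle KH
    and the current speed of the rail vehicle, by comparing the interval
    with the braking distance v^2 / (2a) plus a safety margin."""
    braking_distance = speed_ms ** 2 / (2.0 * decel_ms2)
    if distance_m <= braking_distance + margin_m:
        return "EMERGENCY_BRAKE"          # cannot stop comfortably in time
    if distance_m <= 2.0 * (braking_distance + margin_m):
        return "WARN_AND_SLOW"            # warn and reduce speed early
    return "MONITOR"                      # keep tracking the object

# At 20 m/s with 1 m/s^2 service braking the braking distance is 200 m:
# determine_reaction(150.0, 20.0) -> "EMERGENCY_BRAKE"
# determine_reaction(300.0, 20.0) -> "WARN_AND_SLOW"
# determine_reaction(500.0, 20.0) -> "MONITOR"
```

A real unit would additionally weigh the object's approximate dimensions G and the size of the overlapped collision volume, as the description states; this sketch covers only the interval/speed dependency.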
  • FIG. 9 shows a schematic representation 90 of a rail vehicle 2 according to an exemplary embodiment of the invention. The rail vehicle 2 in FIG. 9 is travelling on a track GK from left to right, i.e. in the direction of the arrow. For the purpose of obstacle detection, the rail vehicle 2 comprises an obstacle detection device 80 with sensor units 82 for capturing the surroundings and the other units, these being constructed and functioning in the manner illustrated in FIG. 8 . As explained above, on the basis of sensor data SDL and map data LK, the obstacle detection device 80 determines the position PSF and orientation OR of the rail vehicle 2 as well as the position PG of the track GK and the position of the monitoring region VU in front of the rail vehicle 2. A reaction R determined by the obstacle detection device 80 is transmitted to a control device 86 of the rail vehicle 2, which controls the rail vehicle 2 in accordance with the determined reaction R. For example, a warning signal is generated and a braking maneuver is initiated in order to avoid a collision with an object O determined to be a collision obstacle KH.
  • It is again noted that the methods and apparatuses described in the foregoing are merely preferred exemplary embodiments of the invention, and that the invention can be modified by a person skilled in the art without departing from the scope of the invention as it is specified in the claims. For the sake of completeness, it is also noted that use of the indefinite article “a” or “an” does not preclude multiple instances of the features concerned. Likewise, the term “unit” does not preclude such a unit consisting of a plurality of components, which can also be spatially distributed if applicable.

Claims (14)

1-13. (canceled)
14. A method for obstacle detection for a rail vehicle, the method comprising the following steps:
determining a monitoring region in front of the rail vehicle as a function of:
a position and a course of a track on which the rail vehicle is travelling, and
a position and dimensions of a monitoring profile of the rail vehicle;
capturing sensor data for determining at least one of a position or an orientation of the rail vehicle;
determining the position and the course of the track on which the rail vehicle is travelling based on at least one of an alignment of the determined position or an orientation of the rail vehicle with digital map data;
capturing objects in an environment of the rail vehicle by using sensors;
determining which of the objects are potential collision obstacles by checking whether and to what extent the objects overlap with the monitoring region; and
determining a reaction to the objects categorized as potential obstacles, as a function of at least one of geometric or kinetic attributes of the objects.
15. The method according to claim 14, which further comprises carrying out at least one of a calibration of the position or the orientation of the rail vehicle by:
determining positional data from landmarks in an environment of the rail vehicle; and
comparing the positional data relating to the landmarks with positional data in the digital map data.
16. The method according to claim 14, which further comprises providing the attributes of the objects as at least one attribute type including:
approximate dimensions and an area of a respective object; or
information indicating a volume overlapped by the object; or
an interval from the rail vehicle to the object; or
dynamic attributes of the objects.
17. The method according to claim 15, which further comprises providing the sensor data as:
satellite navigation data; or
IMU data; or
speed sensor data; or
odometry data; or
infrastructure-based positional data; or
feature data detected in the environment of the rail vehicle for auto-location-finding, the feature data configured to be aligned with the digital map; or
sensor data relating to the environment of the rail vehicle for a specification of object attributes.
18. The method according to claim 17, which further comprises providing the feature data for auto-location-finding as:
landmarks; or
the track.
19. The method according to claim 14, which further comprises combining sensor data from a plurality of sources to determine at least one of the position or the orientation of the rail vehicle in the digital map.
20. The method according to claim 15, which further comprises carrying out the method by:
using a Kalman filter or particle filter for filtering; or
using graph-based methods; or
using sensor-specific filtering; and
restricting a status of the rail vehicle to a movement of the rail vehicle along the track.
21. The method according to claim 14, which further comprises carrying out the step of determining which objects are possible collision obstacles by:
generating a point cloud of object points based on the sensor data relating to the environment from one or a plurality of sensors;
breaking the point cloud down into individual objects by clustering; and
determining an overlap region of individual objects with a sectional area of the monitoring region.
22. The method according to claim 21, which further comprises carrying out the determination of the overlap region of the individual objects with the sectional area of the monitoring region by:
determining a closest object point lying closest to the rail vehicle as measured along the track on which the rail vehicle is travelling;
grouping together all object points in a 2D plane by combining all projection planes running through each object point and being oriented orthogonally to the track, all of the object points of the point cloud being projected onto the projection plane of the closest object point; and
determining attributes of the individual objects based on a subset of the object points within the sectional area of the monitoring region.
23. An obstacle detection device, comprising:
a monitoring region determination unit for determining a monitoring region in front of a rail vehicle as a function of:
at least one of a position or a course of a track on which the rail vehicle is travelling, and
at least one of a position or dimensions of a monitoring profile of the rail vehicle;
sensor data for determining at least one of a position or an orientation of the rail vehicle being captured;
at least one of the position or the course of the track on which the rail vehicle is travelling being determined based on at least one of an alignment of the determined position or the orientation of the rail vehicle with digital map data;
a sensor unit for sensory capture of objects in an environment of the rail vehicle;
an obstacle determination unit for determining which objects are potential collision obstacles by checking whether and to what extent the objects overlap with the monitoring region; and
a response determination unit for determining a reaction to the objects categorized as potential collision obstacles, as a function of at least one of geometric or kinetic attributes of the objects.
24. A rail vehicle, comprising:
an obstacle detection device according to claim 23; and
a control device for controlling a driving strategy of the rail vehicle as a function of the reaction determined by the obstacle detection device.
25. A non-transitory computer program product, comprising a computer program configured to be loaded directly into a storage unit of a control device of a rail vehicle, with program sections for executing all steps of the method according to claim 14 when the computer program is executed in the control device.
26. A non-transitory computer-readable medium on which program sections configured to be executed by a computing unit are stored in order to execute all steps of the method according to claim 14 upon the program sections being executed by the computing unit.
US18/552,489 2021-03-26 2022-03-07 Obstacle detection for a rail vehicle Pending US20240174274A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102021203014.9 2021-03-26
DE102021203014.9A DE102021203014A1 (en) 2021-03-26 2021-03-26 Obstacle detection for a rail vehicle
PCT/EP2022/055657 WO2022200020A1 (en) 2021-03-26 2022-03-07 Obstacle detection for a rail vehicle

Publications (1)

Publication Number Publication Date
US20240174274A1 true US20240174274A1 (en) 2024-05-30

Family

ID=80979077

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/552,489 Pending US20240174274A1 (en) 2021-03-26 2022-03-07 Obstacle detection for a rail vehicle

Country Status (4)

Country Link
US (1) US20240174274A1 (en)
EP (1) EP4277826A1 (en)
DE (1) DE102021203014A1 (en)
WO (1) WO2022200020A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022210974A1 (en) 2022-10-18 2024-04-18 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining collision information regarding a collision risk between a rail vehicle and an object
DE102022213909A1 (en) 2022-12-19 2024-06-20 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining collision information regarding a collision risk between a rail vehicle and an object

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004050690A1 (en) 2004-10-18 2006-04-20 Siemens Ag Procedure, computer program with program code means, computer program product and device for modeling the environment of an autonomous mobile system
DE102006007788A1 (en) * 2006-02-20 2007-08-30 Siemens Ag Computer-assisted driverless railway train monitoring system, to show its travel behavior, has train-mounted sensors and track position markers for position data to be compared with a stored model
DE102014206473A1 (en) 2014-04-03 2015-10-08 Bombardier Transportation Gmbh Automatic assistance to a driver of a lane-bound vehicle, in particular a rail vehicle
DE102018201531A1 (en) 2018-02-01 2019-08-01 Knorr-Bremse Systeme für Schienenfahrzeuge GmbH Method for detecting a driving tube in rail-bound vehicles

Also Published As

Publication number Publication date
DE102021203014A1 (en) 2022-09-29
WO2022200020A1 (en) 2022-09-29
EP4277826A1 (en) 2023-11-22


Legal Events

Date Code Title Description
AS Assignment

Owner name: HAYS AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAIZENEGGER, WOLFGANG;REEL/FRAME:066256/0003

Effective date: 20230907

Owner name: SIEMENS MOBILITY GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PALMER, ANDREW;PILBOROUGH, JONATHAN;SCHNEE, JOST;AND OTHERS;SIGNING DATES FROM 20230825 TO 20230929;REEL/FRAME:066255/0543

Owner name: ABIDAT GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WENDT, CHRISTIAN;ADEMI, EGZONE;SIGNING DATES FROM 20240108 TO 20240112;REEL/FRAME:066255/0826

AS Assignment

Owner name: SIEMENS MOBILITY GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAYS AG;REEL/FRAME:066274/0452

Effective date: 20230831

Owner name: SIEMENS MOBILITY GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ABIDAT GMBH;REEL/FRAME:066274/0446

Effective date: 20240124

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION