EP3218244A1 - Operation of a rail vehicle provided with an image generation system - Google Patents

Operation of a rail vehicle provided with an image generation system

Info

Publication number
EP3218244A1
Authority
EP
European Patent Office
Prior art keywords
rail vehicle
image
stereo
imaging
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP15797627.5A
Other languages
German (de)
English (en)
Other versions
EP3218244B1 (fr)
Inventor
Michael Fischer
Gerald Newesely
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alstom Transportation Germany GmbH
Original Assignee
Bombardier Transportation GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bombardier Transportation GmbH filed Critical Bombardier Transportation GmbH
Priority to PL15797627T priority Critical patent/PL3218244T3/pl
Priority to EP18186200.4A priority patent/EP3431361A3/fr
Publication of EP3218244A1 publication Critical patent/EP3218244A1/fr
Application granted granted Critical
Publication of EP3218244B1 publication Critical patent/EP3218244B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B61 - RAILWAYS
    • B61L - GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00 - Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04 - Control, warning or like safety means for monitoring the mechanical state of the route
    • B61L23/041 - Obstacle detection
    • B61L15/00 - Indicators provided on the vehicle or train for signalling purposes
    • B61L15/0072 - On-board train data handling
    • B61L27/00 - Central railway traffic control systems; Trackside control; Communication systems specially adapted therefor
    • B61L27/04 - Automatic systems, e.g. controlled by train; Change-over to manual control
    • B61L27/40 - Handling position reports or trackside vehicle data
    • B61L27/70 - Details of trackside communication

Definitions

  • The invention relates to a rail vehicle with an imaging system for detecting a space outside the rail vehicle. The invention further relates to a system for operating a rail vehicle, and to a method for operating a rail vehicle.
  • It is known to operate rail vehicles without drivers on routes that are free of other traffic.
  • In terms of passenger transport, rail vehicles are designed to handle larger numbers of passengers than most types of road vehicles. Examples of driverless rail vehicles are so-called people movers, which operate between the different parts of airports. Rail vehicles have the advantage that they are guided on their lane by externally acting forces and cannot leave the route, although in many systems there is the possibility of choosing one of several possible routes at switches.
  • Because they are track-guided, rail vehicles do not necessarily require steering as road vehicles do. Rail vehicles are therefore well suited for autonomous, driverless operation. Driverless operation of rail vehicles can also be permissible in spaces in which persons and/or non-track-guided vehicles also move.
  • driver assistance systems can be used to assist the driver in making decisions to steer the vehicle.
  • collision warning systems are known which warn the driver of an impending, potential collision.
  • Radar sensors, ultrasonic sensors, laser triangulation systems and/or imaging devices such as digital cameras that produce two-dimensional images of the space outside the rail vehicle can be used.
  • By image evaluation, the depth of a possible collision object, i.e. its distance from the imaging device, can be determined.
  • A comparison of image objects in individual images with known depth positions is also possible; such depth positions can be determined, for example, for travel paths along which objects extend at constant intervals or with known lengths.
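The single-image depth estimate from objects of known length mentioned above follows from the pinhole camera model; a minimal sketch, with all names and values illustrative rather than taken from the patent:

```python
def depth_from_known_length(focal_px, real_length_m, image_length_px):
    """Monocular depth estimate: an object of known real-world length L
    (e.g. a marker repeated at constant intervals along the route) that
    appears l pixels long in the image lies at depth Z = f * L / l,
    with f the focal length in pixels (idealized pinhole model)."""
    return focal_px * real_length_m / image_length_px
```

With an assumed focal length of 1000 px, an object 2.6 m long imaged at 52 px would lie about 50 m away.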
  • The operation of rail vehicles also has the disadvantage that, in the event of an imminent collision, no evasion is possible, and even with timely braking the obstacle cannot always be avoided.
  • The rail vehicle, according to its envelope, which is determined by the maximum extent of the vehicle cross section, always requires sufficient space that extends immovably along the route.
  • The envelope is also determined by static effects, in particular kinematic effects, and by dynamic effects, in particular elastic deformations (for example spring travel) of the vehicle.
  • Measured in the direction of travel, rail vehicles often have greater vehicle lengths, which increases the space required in curves and makes it more difficult to monitor the parts of the vehicle exterior relevant for the operation of the vehicle.
  • lower acceleration and braking forces are transmitted to the track.
  • The reliability is increased both in the autonomous, driverless operation of a rail vehicle and in an operation with a driver in the rail vehicle.
  • These measures are preferably carried out or realized in combination with each other.
  • However, any of the measures can be carried out on its own, and the other two measures, individually or in combination with each other, can be regarded as refinements of that measure.
  • Each of the measures may be realized as a device or a system and additionally as an operating method for the device or system.
  • According to the first measure, a rail vehicle comprises an imaging system for detecting a space outside the rail vehicle, wherein a plurality of imaging devices is provided, forming a first stereo pair and a second stereo pair.
  • the image forming means of each of the stereo pairs detect a common part of the space from different angles, thereby enabling calculation of depth information.
  • Such a calculation is not mandatory, however. Rather, the images generated by the respective stereo pair can simply be displayed separately, in particular displayed so that a person perceives with the right eye the image of one imaging device of the stereo pair and with the left eye the image of the other imaging device. The result is the same or a similar spatial impression as if the person were looking directly at the space with his own eyes.
  • the distance of the image forming devices of the first stereo pair is in particular greater than the distance of the image forming devices of the second stereo pair. Therefore, at least three imaging devices are needed.
  • It is an underlying discovery of the invention that the imaging system could in principle manage with only three imaging devices, each two of which form a stereo pair.
  • However, if the imaging device that is involved in both stereo pairs fails or can no longer be used without defects (i.e. interference-free), stereoscopic image capture would not be possible. With at least four imaging devices, the failure of one imaging device does not cause the function of both stereo pairs to be disturbed: at least one stereo pair remains functional. Furthermore, if the images of at least three of the four imaging devices can still be used without interference, two stereo image pairs can be formed. The three imaging devices then form two stereo pairs of devices and provide two stereo image pairs; at least one image from one of the three imaging devices is therefore used for both stereo image pairs.
  • The statement "at least four imaging devices" expressly includes the case in which the imaging system has more than four imaging devices. This also applies to all embodiments of the invention described below.
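How two stereo pairs with different baselines can still be formed from the three remaining imaging devices after one failure might be sketched as follows; the selection rule, the device names and the positions are illustrative assumptions, not taken from the patent:

```python
from itertools import combinations

def stereo_pairs_after_failure(positions, failed):
    """positions: horizontal position (metres) of each imaging device by id;
    failed: id of the failed/disturbed device.  Forms two stereo pairs with
    different baselines from the remaining three devices."""
    remaining = {k: v for k, v in positions.items() if k != failed}
    pairs = sorted(
        combinations(remaining, 2),
        key=lambda p: abs(remaining[p[0]] - remaining[p[1]]),
    )
    # Take the widest and the narrowest of the three possible baselines;
    # the two pairs necessarily share one device, whose images then feed
    # both stereo image pairs.
    return pairs[-1], pairs[0]
```

For devices at 0.0, 0.3, 1.0 and 1.5 m, the failure of the second device leaves pairs with baselines 1.5 m and 0.5 m, sharing the device at 1.5 m.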
  • A failed image generation device is understood to mean that this device does not generate an image, that it does not generate an image that can be used for the evaluation, and/or that a transmission of an image or images of this device to an evaluation device is not possible.
  • A disturbed image generation device is understood to mean that this device generates at least one faulty image and/or that a transmission of an image or images of this device to the evaluation device is faulty.
  • The cause of a faulty image can, for example, also be an obstacle between an object to be observed outside the vehicle and the image generation device.
  • The faulty image then, for example, does not allow the object to be recognized, or shows the object only blurred. For example, a windshield wiper moving across the window in front of a vehicle imaging device can temporarily make its images unusable. One or more faulty images of a sequence of images can be tolerated if at least one error-free image is then generated again in the same image sequence and/or an object traced by evaluation of the images of the image sequence is recognized again in at least one later image of the sequence. Depending on the situation, it can be decided whether the images generated by the image generation device can still be used, in which case the formation of other stereo pairs can be dispensed with.
  • In particular, a rail vehicle is proposed having an imaging system for detecting a space outside the rail vehicle, wherein
  • the imaging system has four imaging devices,
  • each of the four imaging devices can generate two-dimensional images of the space,
  • a first and a second of the four imaging devices are arranged at a first distance from one another on the rail vehicle and form a first stereo pair which detects a first common part of the space from different angles of view,
  • a third and a fourth of the four imaging devices are arranged at a second distance from one another on the rail vehicle and form a second stereo pair which detects a second common part of the space from different angles of view,
  • the first distance is greater than the second distance,
  • an evaluation device is connected to the imaging devices and, during an operation of the imaging system, receives image data from the four imaging devices,
  • an evaluation of the image data of one of the four imaging devices is not possible or is defective during an operating phase of the imaging system because this imaging device has failed and/or is disturbed,
  • the failed and/or disturbed imaging device may be any of the four imaging devices,
  • the evaluation device uses, during the operating phase, the image data which it receives from the three other of the four imaging devices (other than the failed and/or disturbed imaging device) as image data comprising a first stereo image pair and a second stereo image pair, wherein the first stereo image pair corresponds to the image data of two of the three other imaging devices which are arranged at a third distance from each other on the rail vehicle, and the second stereo image pair corresponds to the image data of two of the three other imaging devices which are arranged at a fourth distance from each other on the rail vehicle, and wherein the third distance and the fourth distance are different in size.
  • Also proposed is a method for operating a rail vehicle, wherein an imaging system detects a space outside the rail vehicle,
  • each of the at least four imaging devices generates two-dimensional images of the space,
  • a first and a second of the at least four imaging devices are arranged at a first distance from each other on the rail vehicle and form a first stereo pair that captures a first common part of the space from different angles of view,
  • a third and a fourth of the at least four imaging devices are arranged at a second distance from one another on the rail vehicle and form a second stereo pair which detects a second common part of the space from different angles of view,
  • the first distance is greater than the second distance,
  • the first common part of the space and the second common part of the space have a common space area,
  • an evaluation device is connected to the imaging devices and, during an operation of the imaging system, receives image data from the four imaging devices,
  • an evaluation of the image data of one of the four imaging devices is not possible or is defective during an operating phase of the imaging system because this imaging device has failed and/or is disturbed,
  • the failed and/or disturbed imaging device may be any of the four imaging devices,
  • the evaluation device uses, during the operating phase, the image data which it receives from the three other of the four imaging devices (other than the failed and/or disturbed imaging device) as image data comprising a first stereo image pair and a second stereo image pair, wherein the first stereo image pair corresponds to the image data of two of the three other imaging devices which are arranged at a third distance from each other on the rail vehicle, and the second stereo image pair corresponds to the image data of two of the three other imaging devices which are arranged at a fourth distance from each other on the rail vehicle, and wherein the third distance and the fourth distance are different in size.
  • the third distance or the fourth distance may correspond to the first distance or the second distance, depending on which stereo pairs were formed before the failure or disturbance.
  • The rail vehicle is in particular a light rail vehicle, for example a tram or a light rail train.
  • The imaging system can be operated in this way.
  • Images of three of the four imaging devices are used for the two stereo image pairs; that is, at least one image of one of the three imaging devices is used for both stereo image pairs.
  • The first imaging device is then also the third or fourth imaging device, or the second imaging device is also the third or fourth imaging device.
  • The evaluation device and/or another device of the imaging system can recognize that an evaluation of the image data of one of the imaging devices is not possible or is defective.
  • Such another device may be, for example, a device which processes images generated by the imaging devices solely for the purpose of detecting the failure and/or the disturbance of an imaging device.
  • In this case, the additional device outputs a signal to the evaluation device, for example a signal which uniquely contains the information about which imaging device has failed and/or is disturbed.
  • the image generating devices continuously generate images over time and the corresponding sequence of images is also evaluated for the purpose of detecting the failure and / or the disturbance.
  • At least one object, for example another vehicle or a person, can be recognized in an image of the image sequence.
  • An attempt is then made to recognize this object in the following images of the same image sequence. If the object has disappeared implausibly in at least one of the following images and/or has moved in an implausible manner, it can be decided that the imaging device is disturbed, or at least that the transmission or evaluation of images of this imaging device is disturbed.
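Such a plausibility check on a tracked object might be sketched as follows; the track format and the jump threshold are illustrative assumptions, not from the patent:

```python
def device_disturbed(track, max_jump=50.0):
    """track: per-frame positions (x, y) in pixels of one tracked object,
    or None where the object was not found in that frame.  Flags the
    imaging device (or its transmission/evaluation) as disturbed if the
    object disappears or moves implausibly between consecutive frames."""
    last = None
    for pos in track:
        if pos is None:                 # object implausibly disappeared
            return True
        if last is not None:
            dx, dy = pos[0] - last[0], pos[1] - last[1]
            if (dx * dx + dy * dy) ** 0.5 > max_jump:   # implausible motion
                return True
        last = pos
    return False
```

A real system would, per the description above, tolerate isolated faulty frames if the object is recognized again later in the same sequence.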
  • So that, in the event of a failure or disturbance of any one of the four imaging devices, two stereo pairs can still be formed from the remaining devices and the corresponding stereo image pairs can be formed in the evaluation of the images, it can be stated generally that no group of three of the four imaging devices may be arranged as the vertices of an equilateral triangle.
  • In that case, for every group of three of the four imaging devices, two pairs with different distances can be formed.
  • Each of the stereo pairs can therefore be designed for the detection of the common space area, but at different depth of field.
  • Preferably, the imaging devices each have a constant focal length during an operating phase of the imaging system.
  • Image acquisition with constant focal lengths is particularly reliable and fast.
  • Moreover, the problem of having to decide, when multiple objects of interest are present in the detected space area, on which of the objects the image should be focused, is avoided.
  • the time for focusing (that is, adjusting the focal length) can be saved and more images can be generated per time interval in an image sequence.
  • This does not preclude a transition from a first operating phase to a second operating phase in which, for example because of a failure and/or a disturbance of one of the imaging devices, the focal length of at least one of the imaging devices is changed. Such a change is even preferred in order to adapt the imaging system to the second operating phase.
  • For example, the imaging device that provides images for both stereo image pairs can be set to a shorter focal length than before. This is based on the knowledge that a detection of objects, in particular a detection of the outline of the respective object, works well at distances that roughly correspond to the set focus distance, whereas a detection of objects at a distance that is significantly smaller than the focus distance is not possible or leads to significant errors in the evaluation.
  • Preferably, the first and second imaging devices and/or the third and fourth imaging devices are spaced apart in the horizontal direction, and the first distance and the second distance relate to the horizontal direction. This does not exclude (though it is not preferred) that the two imaging devices of the same stereo pair, i.e. the first and second imaging devices or the third and fourth imaging devices, are arranged at different heights in or on the rail vehicle, an arrangement at the same height being preferred, and/or at different longitudinal positions in the vehicle longitudinal direction, an arrangement at the same longitudinal position being preferred.
  • Preferably, the first and third imaging devices are arranged one above the other at the same horizontal position, or are arranged directly adjacent to each other in the horizontal direction at the smallest horizontal distance their design allows.
  • In this case, the stereo image pairs recorded by the first stereo pair and the second stereo pair can be evaluated jointly in a particularly simple manner, because the first common part of the space outside the rail vehicle detected by the first stereo pair and the second common part of the space detected by the second stereo pair each have a reference point defined by the first and third imaging devices, the two reference points having at least approximately the same horizontal position or, when arranged horizontally next to each other, the smallest possible horizontal distance from each other.
  • The distances of the four imaging devices to each other may all be different. Therefore, in the event of a failure or malfunction of any one of the four imaging devices, two stereo pairs can be formed whose stereo image pairs are well suited for detecting the common space area at different detection depths. This means, for example, that the first stereo image pair captures the common area well at greater distances from the vehicle, and the second stereo image pair captures the common area well at smaller distances from the vehicle.
  • Information about the depth of image objects detected in the images of a stereo image pair may be calculated according to the principle of triangulation. Due to the distance between the imaging devices of the stereo pair that has taken the stereo image pair, and due to the fact that the imaging devices view the same image object or the same part of the image object from different angles, a triangle results in the detected space outside the rail vehicle. For example, correspondences of pixels in the two images of the same stereo image pair are formed.
  • The calculation of depth information is known per se and will therefore not be described here in detail. In particular, it is possible, and preferably also carried out in embodiments of the present invention, that depth positions are calculated for a plurality of image objects which have been detected by the stereo image pairs.
  • Preferably, the depth position is related to a reference point of the stereo pair, e.g. the midpoint between the two imaging devices of the stereo pair.
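For a rectified stereo pair, the triangulation described above reduces to a simple relation between baseline, focal length and pixel disparity; a minimal sketch under an idealized pinhole model, with illustrative values:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Stereo triangulation for a rectified pair: a point whose image
    columns differ by `disparity_px` between the two images lies at depth
    Z = f * B / d in front of the pair's reference point (midpoint
    between the two imaging devices)."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: no valid correspondence")
    return baseline_m * focal_px / disparity_px
```

Note that at a given depth a wider baseline produces a larger disparity, which is why the first (wider) stereo pair resolves greater depths better.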
  • the first stereo pair is configured and / or used to capture image objects and optionally their depth positions, which have a greater depth than image objects captured by the second stereo pair.
  • The first stereo pair is better suited for capturing objects at greater depths because the distance between the imaging devices of the first stereo pair is greater than the distance between the imaging devices of the second stereo pair.
  • The imaging system can accordingly be designed such that the viewing directions of the imaging devices of the first stereo pair intersect at a common spatial point at a greater depth position than those of the second stereo pair.
  • Preferably, the first common part of the space detected by the first stereo pair is predominantly located at greater depth positions than the second common part of the space detected by the second stereo pair. This is already achieved, for example, in that the distance between the imaging devices of the first stereo pair is greater than that of the second stereo pair and, optionally, the viewing angle difference of the first stereo pair with respect to the image centers is equal to the viewing angle difference of the second stereo pair with respect to the image centers.
  • The viewing angle difference of an imaging device is the deviation of its viewing angle from the viewing angle of the other imaging device of the same stereo pair.
  • However, the different depth orientation can also be achieved with designs that deviate from equally large viewing angle differences.
  • For example, the viewing angle difference of the first stereo pair may be smaller than that of the second stereo pair.
  • In particular, the spatial areas detected by the imaging devices of the first stereo pair can be smaller than those detected by the imaging devices of the second stereo pair.
  • Preferably, the first stereo image pairs, i.e. the images generated by the imaging devices of the first stereo pair, and the second stereo image pairs, i.e. the images generated by the imaging devices of the second stereo pair, are first evaluated independently of one another (but in particular in the same processing unit), and depth information is obtained in this way.
  • The depth information consists in particular of the depth position of at least one object outside the vehicle.
  • the depth information obtained from the first stereo image pairs is compared with the depth information obtained from the second stereo image pairs.
  • depth positions are compared which were determined by evaluating the first stereo image pair as well as the second stereo image pair for the same object.
  • the object may be a road user, such as a road vehicle or a pedestrian.
  • Preferably, information about a movement of an object detected by both the first stereo pair and the second stereo pair is determined both from a temporal sequence of successively recorded first stereo image pairs and from a sequence of successively recorded second stereo image pairs, for example by repeated determination of the depth position of the object and preferably by additional determination of its position transverse to the depth direction. One result of such a determination of the movement of the object can be, for example, that the object will collide with the rail vehicle if its movement continues.
  • Another result may be that the object does not collide with the rail vehicle.
  • the movement determined from the respective sequence of stereo image pairs can be extrapolated, for example, into the future.
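A constant-velocity extrapolation of such a tracked movement into the future might look like this; a deliberately minimal sketch (a real system would filter measurement noise, e.g. with a Kalman filter):

```python
def extrapolate(track, steps):
    """track: successive positions (x, y, depth) in metres of one object,
    one entry per recorded stereo image pair; returns the position
    predicted `steps` frames ahead under a constant-velocity assumption."""
    (x0, y0, z0), (x1, y1, z1) = track[-2], track[-1]
    return (x1 + steps * (x1 - x0),
            y1 + steps * (y1 - y0),
            z1 + steps * (z1 - z0))
```

An object whose depth decreased from 60 m to 55 m between two frames would, two frames later, be predicted at 45 m; comparing such predictions with the vehicle's path allows a collision/no-collision decision.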
  • a tolerance in the depth direction is allowed by which the depth positions of the same object determined from the first and second stereo image pairs may deviate from one another.
  • If the depth positions deviate from each other by more than the predetermined tolerance, i.e. if the depth position from one of the stereo image pairs is outside the tolerance range of the depth position from the other stereo image pair, it is decided that the results do not coincide with each other.
  • When determining movements from sequences of the stereo image pairs, it is possible to proceed accordingly; e.g. a tolerance for the position of an object in the detected space outside the rail vehicle can be specified.
  • the position is determined in particular by the depth position and additionally by two position values transversely to one another and transversely to the depth direction.
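The tolerance comparison of positions determined independently from the two stereo image pairs can be sketched as follows; the per-axis tolerances are illustrative assumptions:

```python
def positions_agree(p1, p2, tol=(0.5, 0.5, 1.0)):
    """p1, p2: (x, y, depth) positions in metres of the same object,
    determined independently from the first and the second stereo image
    pair; tol: per-axis tolerances.  Returns False when the results do
    not coincide, i.e. at least one coordinate deviates by more than
    its predetermined tolerance."""
    return all(abs(a - b) <= t for a, b, t in zip(p1, p2, tol))
```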
  • Such a comparison becomes possible because the first common part of the space that the first stereo pair detects and the second common part of the space that the second stereo pair detects have a common space area. In other words, the first and second common parts of the space overlap, or in a special case they are identical.
  • the four imaging devices are arranged in a front region of the rail vehicle such that the common space region lies in front of the rail vehicle during travel of the rail vehicle in the direction of travel.
  • This also includes cases in which the common space area is adjacent to the route that the rail vehicle still has to travel.
  • These areas of space next to the route are also of interest for the operation of the rail vehicle.
  • It is not the purpose of the common space area that the first stereo pair and the second stereo pair capture as much of the exterior space of the rail vehicle as possible. Rather, it is an advantage of the common space area that the mentioned comparisons are possible. Also, in the event of a complete failure of one of the stereo pairs, i.e. if two of the four imaging devices have failed or are disturbed and either first or second stereo image pairs are unavailable, continued operation of the rail vehicle using the stereo image pairs of the still functional stereo pair is possible. In this case, the rail vehicle can be operated in particular in an operating mode in which the operation, and in particular the driving operation, is subject to restrictions. Even if two stereo image pairs are still available but the ratio of the distances of the imaging devices deviates from the original ratio, such a restricted operating mode can be appropriate.
  • The common space area is therefore chosen in particular so that, i.e. the imaging devices are designed and/or aligned such that, the parts of the exterior space required for the operation of the rail vehicle or of a driver assistance system lie in the common space area. In the case described below, this is e.g. the part of the exterior space which lies in front of the rail vehicle in the direction of travel, with the exception of a short, e.g. some 10 cm deep, part of the space which is directly at the front of the rail vehicle.
  • Preferably, the imaging devices are located directly on the front of the rail vehicle, inside or outside the envelope. "Inside" or "outside" in this case means that the surface of the imaging device lies inside or outside the envelope surface; a position of the surface exactly on the envelope surface is considered internal.
  • The image capture devices are preferably digital cameras, which in particular generate sequences of digital images. However, scanning recording methods are also possible, in which the picture elements of each of the two-dimensional pictures are sequentially detected in rapid succession, thereby generating the image.
  • When detecting the space, it is optionally possible to irradiate the space to be detected and to detect the radiation reflected back to the imaging device.
  • The detected radiation is not limited to radiation visible to humans. Rather, alternatively or additionally, radiation in other wavelength ranges can also be detected.
  • The detection of sound waves is also possible. However, it is preferred that at least visible radiation is also detected by the imaging devices.
  • The detection of the space, or of a part of the space in the direction of travel in front of the rail vehicle, using the imaging system can be used by a driver assistance system, in particular on board the rail vehicle.
  • Alternatively, the detection, as will be explained in more detail below for the third measure, enables remote monitoring and/or remote control of the rail vehicle.
  • The second measure, proposed below for increasing the reliability of the use of an imaging system, relates to the processing and/or transmission of the images generated by the imaging devices.
  • this second measure is also applicable if there are not four imaging devices in operation, of which two each form a stereo pair.
  • the second measure is based on the object of specifying a rail vehicle and / or a method for operating a rail vehicle, wherein the reliability of the use of an imaging system is increased in particular for autonomous, driverless operation.
  • the second measure can also be used if only at least one driver assistance system uses the imaging system.
  • According to the second measure, the image generation system generates image information using redundant computer units.
  • In particular, it is proposed that the imaging system have a first computer unit and a second computer unit, which are each connected via image signal connections to each of the four imaging devices, wherein the first computer unit and the second computer unit are configured, during operation of the imaging system, to calculate, independently of one another, depth information about a depth of image objects from the image signals received via the image signal connections, the image objects having been detected with the two-dimensional images from the first stereo pair and/or the second stereo pair, and the depth extending in a direction transverse to an image plane of the two-dimensional images.
  • Accordingly, in the corresponding method, the first computer unit and the second computer unit independently calculate, from the image signals, depth information on a depth of image objects that were detected with the two-dimensional images of the first stereo pair and/or the second stereo pair, wherein the depth extends in a direction transverse to an image plane of the two-dimensional images.
  • More generally, the image generating device(s) is/are connected via image signal connections to both a first computer unit of the rail vehicle and a second computer unit of the rail vehicle, and during operation transmit image signals to both the first and the second computer unit.
  • The two computer units process the image information thus obtained independently of each other. As a result, in the event of a failure of a signal connection or of one of the computer units, continued operation using the results of the other computer unit is possible.
  • The first and second computer units may be arranged in a common housing or at a distance from each other in the rail vehicle. It is advantageous in any case that the computer units independently evaluate the same image information. Preferably, during operation of the two computer units, a comparison of the results obtained by the two computer units from processing the image information is performed. If the results do not match, it can be decided that the function of at least one of the computer units, or the image information received by the computer units, is disturbed.
  • the computer units can alternatively or additionally be used to monitor the respective other computer unit and / or the individual image generation devices of the imaging system for correct function.
  • For example, plausibility checks may be carried out as to whether the function and/or the information satisfies plausibility criteria.
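The result comparison between the two redundant computer units might be sketched as follows; the object ids, the depth-only result format and the tolerance are illustrative assumptions, not from the patent:

```python
def cross_check(result_unit1, result_unit2, tol_m=1.0):
    """Each computer unit independently reports, per tracked object id,
    the depth (metres) it computed.  Returns the ids whose results
    disagree beyond tolerance, plus ids reported by only one unit;
    a non-empty result indicates that at least one computer unit or the
    image information it received is disturbed."""
    disputed = {
        k for k in result_unit1.keys() & result_unit2.keys()
        if abs(result_unit1[k] - result_unit2[k]) > tol_m
    }
    # Objects seen by only one unit are also treated as mismatches.
    return disputed | (result_unit1.keys() ^ result_unit2.keys())
```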
  • redundant computer units enables secure and reliable transmission of information from the rail vehicle to a remote facility, e.g. a vehicle control center.
  • a remote facility e.g. a vehicle control center.
  • Alternatively or additionally to a vehicle control center, the information can be transmitted from the rail vehicle, for example, to another rail vehicle, e.g. a rail vehicle operating in the same rail network and/or on the same track section, in particular a moving rail vehicle. These modes (e.g. control-center operation) will be discussed in more detail below. Regardless of whether redundant computer units are used, all functions and features of a control center described in this description can alternatively or additionally be realized by the further rail vehicle. For example, the unprocessed or further processed image information of the imaging system can be sent to the control center and/or to the other rail vehicle.
  • The further rail vehicle may be a following vehicle traveling on the same track. Especially when needed, for example when operation by means of the imaging system is not possible or only possible to a limited extent and/or is being monitored, the following vehicle can form with the preceding, first rail vehicle an actual train (that is, the rail vehicles are mechanically coupled to each other) or a virtual train (that is, the rail vehicles follow one another without mechanical coupling).
  • In the following vehicle, a driver can control the train, in particular control the driving operation; for this purpose, the driver can view an image display device having one or more screens, which displays the image information obtained from the first rail vehicle and, optionally, image information further processed in the following rail vehicle.
  • The rail vehicle may have a first computer unit and a second computer unit which are each connected via image signal connections to each of the four imaging devices, wherein the first computer unit is connected to a first transmitting device for transmitting image signals to a receiving device remote from the rail vehicle, and the second computer unit is connected to a second transmitting device for transmitting image signals to the receiving device remote from the rail vehicle.
  • Correspondingly, a first computer unit and a second computer unit of the rail vehicle each receive image signals from each of the four imaging devices via image signal connections, wherein the first computer unit transmits image signals via a first transmitting device to a receiving device remote from the rail vehicle, and the second computer unit transmits image signals via a second transmitting device to the receiving device remote from the rail vehicle.
  • the remote receiving device preferably has two receiving units, which are each connected to one of the transmitting devices of the rail vehicle.
  • The connections between the transmitting devices and the receiving device are, in particular, radio links, preferably broadband radio links, e.g. according to the LTE or UMTS mobile radio standards.
  • Transmitting image signals to the remote receiving device can also mean that processed image signals are transmitted, in particular image signals that have been processed by the computer units. Alternatively or additionally, however, image signals which have not been processed by the computer units can be sent to the remote receiving device, in particular those image signals which were received by the computer units directly from the imaging system.
  • The transmission connections operated via the first transmitting device and the second transmitting device may be radio links of the same radio network. Alternatively, however, different radio networks can be used for the transmission.
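The failover behavior implied by the two transmitting devices — the same information sent over two independent networks so that one surviving link suffices — could be sketched as follows. The callable-transmitter interface and the error handling are illustrative assumptions:

```python
def send_redundant(payload, transmitters):
    """Send payload over each of several independent radio links.

    Returns True if at least one link delivered the payload, so the
    connection to the remote receiving device survives the failure of
    one of the radio networks.
    """
    delivered = False
    for send in transmitters:
        try:
            send(payload)
            delivered = True
        except ConnectionError:
            continue  # this network failed; try the other link
    return delivered

received = []
def link_a(p):  # hypothetical working link (e.g. LTE)
    received.append(p)
def link_b(p):  # hypothetical failed link (e.g. UMTS)
    raise ConnectionError("network unavailable")

ok = send_redundant("image-frame", [link_a, link_b])  # True; link_a delivered
```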
  • The redundant signal connections enable reliable operation and/or reliable monitoring of the rail vehicle. In particular, an operation of the rail vehicle is possible which is controlled by a remote control center and/or by another rail vehicle. This will be discussed in more detail below.
  • The third measure is based on the object of enabling a rail vehicle to be operated as reliably as possible without a driver.
  • A driver is understood to mean a person who travels with the rail vehicle while it is moving and who controls the driving operation of the rail vehicle, in particular with regard to the traction and the braking of the rail vehicle.
  • In particular, it is possible for a control center arranged remotely from the rail vehicle to request the transmission of the image information from the rail vehicle and thereby trigger the transmission. This allows the control center, and in particular a person working there, to monitor the autonomous operation of the rail vehicle, in particular even if there is no fault and no indication of a fault.
  • The above-described first measure and/or second measure increases the reliability and safety of the autonomous operation, of the monitoring and, where applicable, of an operation of the rail vehicle remotely controlled from the control center. However, the third measure can also be realized without the first and second measures.
  • the system comprises the rail vehicle and a control center, which is remote from the rail vehicle.
  • By means of the control center, the already mentioned remote-controlled driving operation of the rail vehicle and/or a monitoring of the autonomous driving operation of the rail vehicle can be carried out.
  • The rail vehicle preferably has a first transmitting device via which, during operation of the rail vehicle, image signals from each of the four imaging devices, and/or image signals further processed from them by a computer unit of the rail vehicle, are transmitted to a first receiving device remote from the rail vehicle, wherein the control center is connected to the first receiving device and, during operation of the rail vehicle, receives the image signals received by the receiving device, wherein the control center has an image display device on which images generated from the received image signals are displayed during operation of the rail vehicle, wherein the control center comprises a control device by which control signals for controlling a driving operation of the rail vehicle are generated during operation of the rail vehicle, wherein the control center is connected to a second transmitting device via which the control signals are sent during operation to a second receiving device of the rail vehicle, and wherein the rail vehicle has a driving system which, during operation of the rail vehicle, receives and processes the control signals generated by the control device of the control center and performs the driving operation of the rail vehicle in accordance with the control signals.
  • A corresponding embodiment of the method of operation also relates to a system comprising the rail vehicle in one of the embodiments described herein and a control center remote from the rail vehicle, wherein, during operation of the rail vehicle, image signals from each of the four imaging devices, and/or image signals further processed from them by a computer unit of the rail vehicle, are sent from a first transmitting device of the rail vehicle to a first receiving device remote from the rail vehicle, wherein the control center receives the image signals received by the first receiving device, wherein the control center generates images from the received image signals by means of an image display device and displays them, wherein the control center generates control signals for controlling a driving operation of the rail vehicle by means of a control device, wherein the control center transmits the control signals via a second transmitting device to a second receiving device of the rail vehicle, and wherein a driving system of the rail vehicle receives and processes the control signals from the second receiving device and executes the driving operation of the rail vehicle in accordance with the control signals.
  • The imaging system of the rail vehicle may have a different number of imaging devices, whose image information is further processed by at least one computer unit of the rail vehicle and/or whose image information is sent without further processing by the first transmitting device to the first receiving device remote from the rail vehicle.
  • The third measure has the advantage that, in some cases, despite an obstacle which appears to block, or does block, the route, continued travel is possible by means of a driving operation remotely controlled by the control center and/or by the other rail vehicle. This is based on the recognition that there are obstacles which are erroneously classified as insurmountable by an automatic, autonomous driving system of the rail vehicle. Examples are lightweight but voluminous objects such as cover films, which are used e.g. on construction sites. It is also possible that an obstacle leaves the route voluntarily or automatically when the rail vehicle approaches slowly, e.g. an animal. In particular in these cases, a person working in the control center can, for example, perceive images displayed on the image display device that are based on the image information of the vehicle's imaging system. Furthermore, the person can control the driving operation of the rail vehicle via the control device of the control center. Even if the autonomous vehicle control on board the rail vehicle is disturbed, the driving operation can be controlled by the control center. With trouble-free operation of at least one stereo pair and with interference-free transmission of the generated image information, depth information about the area in the direction of travel in front of the rail vehicle may, under certain circumstances, also be received and evaluated in the control center. Optionally, the depth information is first generated in the control center from the respective stereo image pair. A person in the control center can therefore, similar to a driver of a conventional rail vehicle, base his control commands for controlling the driving operation not only on two-dimensional images but also on depth information.
  • The control center and/or the further rail vehicle has, in particular, an image display device for displaying image information obtained by using the imaging system.
  • the image display device may comprise, for example, a screen or an array of screens.
  • In particular, the image display device is combined with or has an optical device which, for example by means of suitable pinholes and/or lenses, enables viewing of the individual images separately or together, so that a viewer can realistically perceive the space captured by the imaging system with his eyes.
  • The at least one imaging device of the imaging system of the rail vehicle is in particular a device with an optical system (i.e. an optical device) by means of which the detected radiation incident on the device is directed to a sensor which generates the image information, e.g. digitally.
  • In particular, the information is used for the perception of the space outside the rail vehicle by a driver assistance system on board the rail vehicle, by an autonomous driving system of the vehicle and/or by a control center.
  • at least one further sensor is used, which detects the surroundings of the vehicle.
  • In addition to the at least one imaging device which detects the space in the direction of travel in front of the rail vehicle, at least one of said additional sensors and/or at least one further imaging device, in particular one having an optical system, can be used.
  • Imaging devices and/or other sensors of the rail vehicle for detecting the space outside the rail vehicle, and/or signal generators for outputting signals into the space outside the rail vehicle, may in particular be arranged at least partially in a projection on the outer surface of the rail vehicle which is bar-shaped. At least one sensor and/or signal generator can therefore be arranged at least partially in the bar-shaped projection.
  • An advantage of the beam-shaped projection is that the construction of the rail vehicle has to be changed only slightly compared to a design without the beam-shaped projection. All parts of the rail vehicle that lie inside the outer shell of an existing rail vehicle construction can remain as before.
  • In the case of a beam-shaped projection which is additionally provided on the outer surface of the rail vehicle, fastening areas for attaching the beam-shaped projection and for passing through at least one connecting line of the sensor and/or signal generator can easily be found.
  • the beam-shaped, elongated configuration of the projection makes it possible to freely position attachment points and passages within portions of the projection.
  • A bar-shaped projection also has the advantage that it offers space for the arrangement of the at least one sensor and/or signal generator without, or only slightly, taking up the space located inside the outer shell of the rail vehicle. Furthermore, from a protrusion on the outer surface of the rail vehicle, a greater part of the outer space can be detected without obstruction, or signals can be emitted into a greater part of the outer space without obstruction, than with an arrangement within planar or non-protruding surface regions of the rail vehicle.
  • The position of the sensor is therefore favorable for detecting the outer space, and the position of the signal generator is favorable for emitting signals into the outer space. For example, nothing prevents detection of the exterior space and/or emission of signals in the vertical or approximately vertical direction toward the ground immediately adjacent to the rail vehicle.
  • the bar-shaped projection also protects the sensor and / or signal transmitter against external influences.
  • It protects against externally applied forces, for example from trees standing next to the track.
  • the beam-shaped projection also protects against other external influences such as dirt, precipitation and moisture and / or sunlight.
  • The signal generator may in particular be an acoustic signal generator for outputting an acoustic signal (for example a warning) and/or an optical signal generator for outputting an optical signal.
  • An optical signal is also understood to mean light perceivable by persons which can, for example, impinge on a projection surface, such as a road surface, so that visually perceptible characters and/or images are projected onto the projection surface.
  • the optical signal transmitter can therefore be referred to as a projector.
  • The beam-shaped projection extends in a longitudinal direction, which is the direction of the largest outer dimension of the beam-shaped projection, the longitudinal direction running transverse to the vertical direction along the outer surface of the rail vehicle.
  • In particular, the longitudinal direction can follow the outer contour of the rail vehicle.
  • The longitudinal direction corresponding to the outer contour may have an angled course (for example at the transition between adjoining side walls of the rail vehicle) and/or a curved course (for example on curved side walls of the rail vehicle).
  • The bar-shaped projection can be designed in different ways.
  • the beam-shaped projection may be attached to the outer surface of a railcar body, e.g. by welding, gluing, riveting and / or screwing.
  • A positive (form-fit) connection is possible if the car body is designed accordingly at its outer surface, e.g. is provided with a profile extending in the longitudinal direction of the beam-shaped projection to be fastened, to which the beam-shaped projection is then attached.
  • The beam-shaped projection can be designed as an integral part of the car body or of the car roof of the rail vehicle.
  • The cross-sectional profile of the beam-shaped projection is constant in shape and size of the cross section, in particular with the exception of the end portions at the opposite ends of the projection in the longitudinal direction and/or with the exception of the area in which the sensor and/or signal generator is located. Also in areas in which the course of the bar-shaped projection is angled in its longitudinal direction, for example in order to adapt to the outer contour of the rail vehicle, the shape and/or size of the cross section may deviate from the otherwise constant cross section.
  • A preferred cross-sectional shape is trapezoidal, with the longer of the parallel sides of the trapezium lying inside and, for example, being connected to the outer surface of the car body, and the shorter of the parallel sides lying outside.
  • As a result, the projection tapers in cross section from inside to outside.
  • Suitable materials for the projection are, in particular, metal or plastic sheet profiles angled according to the cross-sectional shape, e.g. of polypropylene or other polymers. Fiber-reinforced plastics are also well suited due to their strength and their low weight.
  • The material of the bar-shaped projection forms at least one outer wall extending in its longitudinal direction, which encloses an interior of the projection. In this way an elongated housing is formed, with an interior or cavity of the bar-shaped projection extending in the longitudinal direction of the projection. It is preferred that the cavity passes, without closed partitions between the different longitudinal sections, from one end region of the beam-shaped projection to the opposite end region of the beam-shaped projection. However, this does not exclude that different bar-shaped projections abut one another at their end regions. In this way, long connection lines, e.g. extending over several meters in the longitudinal direction, can be laid in the form of cable harnesses as a bundle in the bar-shaped projection. Preferably, the bundle is led at a single transition point from the interior of the beam-shaped projection into the interior of the rail vehicle.
  • the beam-shaped projection may extend along an outer circumferential line which, viewed from above, extends around the rail vehicle.
  • In particular, the beam-shaped projection extends along a portion of this circumferential line.
  • a longer beam-shaped projection has the advantage of providing space for sensors and / or transducers in different areas of the outer surface and, unlike a plurality of beam-shaped protrusions spaced apart, has fewer end portions against which objects may abut. It also offers the possibility of receiving connection lines of the sensors and / or signal transmitters over its entire longitudinal extension or at least a part thereof.
  • Other devices of the rail vehicle, in particular guides for guiding the movement of doors, may be integrated in the projection.
  • The beam-shaped projection can be closed on itself, in the manner of a ring, around the rail vehicle. This makes it possible to arrange sensors and/or signal generators at arbitrary positions in the circumferential direction of the vehicle.
  • the beam-shaped projection extends above an outside window or above outside windows of the rail vehicle.
  • In the area above the windows, sensors have a good position for detecting the space outside the rail vehicle, and a signal generator has a good position for emitting signals. Moreover, persons, for example when entering and exiting, do not come into contact with the projection because of the great height of the area above the windows.
  • From the stereo image pairs generated by the imaging devices, not only can depth information be obtained about objects on or next to the route; the course of the travel path can also be determined. This makes it possible, for example, to control the operation of the rail vehicle with regard to at least one further function. Possible further functions are, for example, the alignment of wheels of the rail vehicle on which the rail vehicle is traveling (in particular corresponding to the curve radius of a curve of the travel path) and the alignment or activation (e.g. the switching on) of at least one headlight (in particular according to the course of a curve of the track and/or of a preceding or following straight track section or curve with a different curve radius).
  • The embodiments contain only sensors. However, it is possible to replace at least one of the sensors by a signal generator and/or to arrange, in addition to the sensors, at least one signal generator at least partially in the bar-shaped projection.
  • the individual figures of the drawing show:
  • Fig. 1 is a side view of a rail vehicle, e.g. a tram or
  • FIG. 2 is a schematic plan view of a front portion of a rail vehicle with an imaging system having two stereo pairs.
  • Fig. 3 is a block diagram with devices in a rail vehicle which are connected via radio links to a control center,
  • FIG. 4 is a simplified external view of a rail vehicle with a laterally encircling beam-shaped projection, which extends above the outer windows of the rail vehicle and in which a plurality of sensors for detecting the outer space of the rail vehicle are arranged,
  • Fig. 5 is a view similar to that of Fig. 4, e.g. of the same rail vehicle as in Fig. 4 but from the opposite side, or a representation of a similar rail vehicle,
  • Fig. 6 is a front view of a rail vehicle with one of the
  • Fig. 7 schematically shows a cross section through a car body of a rail vehicle which contains a guide for the movement of a sliding door,
  • FIG. 8 schematically shows an arrangement of four imaging devices similar to those in FIG. 2 or FIG. 6, wherein all four imaging devices are functional,
  • Fig. 9 shows the arrangement of Fig. 8, but with one of the four imaging devices failed or disturbed, while two stereo pairs of imaging devices are nevertheless formed, and
  • Fig. 10 shows the arrangement of Fig. 8, but with another of the four imaging devices failed or disturbed than in Fig. 9, and with two other stereo pairs of the imaging devices formed than in Fig. 9.
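The redundancy illustrated by Figs. 8 to 10 — stereo pairs can still be formed from the remaining imaging devices after one of the four fails — can be sketched as follows. Treating every pair of functional cameras as a candidate stereo pair is an illustrative assumption, not a detail taken from the patent:

```python
from itertools import combinations

def form_stereo_pairs(functional_cameras):
    """Form candidate stereo pairs from the functional imaging devices.

    Any two cameras with overlapping fields of view can act as a stereo
    pair; here every combination of two functional cameras is a candidate,
    so three remaining cameras still yield multiple usable pairs.
    """
    return list(combinations(sorted(functional_cameras), 2))

all_four = {2, 3, 4, 5}                      # device numbers as in Fig. 2
pairs_ok = form_stereo_pairs(all_four)       # 6 candidate pairs
pairs_degraded = form_stereo_pairs(all_four - {3})  # device 3 failed: 3 pairs remain
```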
  • The rail vehicle 1 shown in Fig. 1 has a front region on the left in the figure and a rear region on the right in the figure. However, it is also possible for the vehicle 1 to travel in the reverse direction during normal operation, e.g. if a driver's station is also located in the end region shown on the right, or if at least all facilities necessary for travel to the right, such as headlights, are available.
  • In each of the two end regions shown on the left and right in Fig. 1 there is an imaging system with at least one imaging device, and preferably with the at least four imaging devices mentioned above. Illustrated in the left end region is an imaging device 2a of a first imaging system, and in the right end region an imaging device 2b of a second imaging system. These two imaging systems each detect the outer space of the vehicle 1 located in front of or behind the respective end region.
  • the image generation devices 2a, 2b are digital cameras that continuously generate two-dimensional images of the exterior space.
  • The imaging devices 2 of the first and second imaging systems are each connected via separate image signal connections 10a, 10b; 11a, 11b to a first computer unit 20a and a second computer unit 20b.
  • The first computer unit 20a is arranged in the left end region or an adjacent central region of the vehicle 1. The second computer unit 20b is arranged in the right end region or an adjacent central region of the vehicle 1. Consequently, the image signal connections 10a, 10b extend through the vehicle 1, as shown in Fig. 1.
  • The computer units 20 are each combined with a transmitting device, which is not shown separately in Fig. 1. From the transmitting device, image signals are sent via a radio link 40a, 40b to a control center 60. The radio links are separate radio links, preferably via different mobile radio networks, so that if one of the networks fails, one of the radio links 40a, 40b can still be operated.
  • Via the radio links, the image information generated by the imaging system can be transmitted without further processing by the computer units 20a, 20b and/or in further processed form (for example with depth information). If the computer unit 20a or 20b is not merely a transmitting device for sending the image information not processed by the computer units 20a, 20b, the computer unit is at least part of an evaluation device. Unlike what is shown in the figures, only a single evaluation device may be present. In particular, this evaluation device receives images from four imaging devices, all of which have a common detection area (space region), i.e. at least part of all four coverage areas is the same.
  • Conversely, the control center 60 can transmit information to the vehicle 1 by sending signals via a radio link 50a and/or 50b. The vehicle has a receiving device for receiving the radio signals from the control center 60. A signal processing device, not illustrated in Fig. 1, is connected to the radio links 50a, 50b and can process the signals received from the control center 60 and, for example, control the driving operation of the vehicle 1.
  • The rail vehicle 1 shown schematically in Fig. 2, which may be the rail vehicle 1 from Fig. 1, has in its front region an imaging system with four imaging devices 2, 3, 4, 5.
  • The first imaging device 2 and the second imaging device 3 form a first stereo pair 2, 3, whose devices are at a greater horizontal distance from each other than the third imaging device 4 and the fourth imaging device 5, which form a second stereo pair 4, 5.
  • For example, the opening angles of the space regions detected by the individual imaging devices 2-5 in the direction of travel in front of the vehicle 1 are the same. Due to the larger distance between the imaging devices 2, 3, the common part 8a of the space detected by the first stereo pair 2, 3 is located at a greater distance in front of the rail vehicle 1 than the common part 8b of the space detected by the second stereo pair 4, 5.
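This geometric effect — a larger horizontal spacing moves the common (overlapping) detection region farther ahead of the vehicle — can be illustrated with idealized parallel, forward-facing cameras of equal opening angle. The formula and the numeric baselines and angle below are illustrative assumptions, not values from the patent:

```python
import math

def overlap_start_distance(baseline_m: float, opening_angle_deg: float) -> float:
    """Distance at which the fields of view of two parallel, forward-facing
    cameras separated by baseline_m begin to overlap.

    Each field of view spreads at half the opening angle to either side of
    its axis, so the inner edges meet at d = B / (2 * tan(opening_angle / 2)).
    """
    half_angle = math.radians(opening_angle_deg) / 2.0
    return baseline_m / (2.0 * math.tan(half_angle))

# Assumed 60-degree opening angle: the wider pair's common region starts farther away.
far_pair = overlap_start_distance(2.0, 60.0)   # wide baseline (like stereo pair 2, 3)
near_pair = overlap_start_distance(0.5, 60.0)  # narrow baseline (like stereo pair 4, 5)
```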
  • Reference numeral 9 designates an object lying in front of the vehicle 1 in the direction of travel, which is located completely in the common part 8b of the second stereo pair 4, 5, but only partially in the common part 8a of the first stereo pair 2, 3.
  • The first stereo pair 2, 3 serves to detect a space region arranged at a greater distance (i.e. farther in the left-to-right direction in Fig. 2) than the second stereo pair 4, 5. In this way, the size of the space detected in the direction of travel in front of the rail vehicle 1 can be increased compared to the use of a single stereo pair.
  • In addition, the opening angle of the first and second imaging devices 2, 3 may be smaller than the opening angle of the third and fourth imaging devices 4, 5, and/or optical devices (not shown) combined with the imaging devices 2-5 may be designed so that the space region captured sharply in the generated images is farther away from the rail vehicle 1 for the first stereo pair 2, 3 than for the second stereo pair 4, 5.
  • In Fig. 3, a rectangular frame indicated by the reference numeral 1 schematically represents the outline of a rail vehicle, e.g. of the rail vehicle 1 from Fig. 1 and/or Fig. 2.
  • a rectangular frame designated by the reference numeral 60 represents the outer contour of a control center for the operation of at least one rail vehicle.
  • The rail vehicle 1, as in Fig. 2, has two stereo pairs 2, 3; 4, 5, which together form an imaging system.
  • However, the imaging system may alternatively have a different number of imaging devices.
  • Each of the imaging devices 2-5 of the imaging system is connected via a first image signal connection 11 to a first computer unit 20a and via a separate, second image signal connection 10 to a second computer unit 20b. During operation of the imaging system, image signals are transmitted via these connections to both computer units.
  • The first and second receiving devices 63a, 63b are connected to an image display device 61 of the control center 60. Furthermore, the control center 60 has a control device 62 which is connected to the central vehicle controller 23 via a transmitting device (not shown in detail) and a radio signal connection 50. The corresponding receiving device of the signal connection 50 on the part of the rail vehicle 1 may, e.g., be a device combined with the first transmitting device 21a or the second transmitting device 21b, or it may, e.g., be realized as a separate receiving device or as one integrated into the central vehicle controller 23.
  • Unlike what is shown in Fig. 3, a second radio link redundant to the signal connection 50 may be provided for the transmission of signals from the control center 60 to the vehicle 1, as e.g. in Fig. 1.
  • During operation, the imaging system of the vehicle 1 detects the space in particular in the direction of travel in front of the vehicle 1 and generates corresponding two-dimensional images of the space. The image information thus produced is transmitted to the first and second computer units 20 via the first and second signal connections 10, 11.
  • Each of the computer units 20a, 20b calculates depth information of the objects detected in the images and optionally additionally calculates whether a collision of the vehicle 1 with an obstacle on the route is imminent. If an object is moving, it can also be calculated whether the object is likely to move onto the route.
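A minimal sketch of the kind of collision-imminence check the computer units might perform, using a simple braking-distance comparison. The kinematic model and the deceleration and margin values are assumptions for illustration and are not specified by the patent:

```python
def collision_imminent(obstacle_distance_m: float,
                       closing_speed_mps: float,
                       max_deceleration_mps2: float,
                       margin_m: float = 5.0) -> bool:
    """Return True if braking now would not stop the vehicle short of the obstacle.

    Braking distance for speed v and constant deceleration a is v**2 / (2*a);
    an assumed safety margin is added on top of it.
    """
    if closing_speed_mps <= 0:
        return False  # the obstacle is not being approached
    braking_distance = closing_speed_mps ** 2 / (2.0 * max_deceleration_mps2)
    return braking_distance + margin_m >= obstacle_distance_m

# Assumed values: 10 m/s closing speed, 1 m/s^2 braking -> 50 m + 5 m margin.
safe = collision_imminent(100.0, 10.0, 1.0)      # False: 55 m < 100 m
critical = collision_imminent(40.0, 10.0, 1.0)   # True: 55 m >= 40 m
```

The obstacle distance here would come from the stereo depth information; the closing speed could be estimated from successive depth measurements.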
  • The results of the calculations, and preferably at least portions of the unprocessed image information obtained by the imaging system, are transmitted from the computer units 20 to the central vehicle controller 23, which, using the information received from the computer units 20, controls the driving system 25, in particular a traction and braking system, of the rail vehicle 1. In this way, an autonomous, driverless operation of the vehicle 1 is possible.
  • The central vehicle controller 23 may also have redundant computing units which perform all of the data processing operations occurring in it redundantly, i.e. separately from each other in the same way. Alternatively or additionally, the central vehicle controller 23 can compare the information received from the two computer units 20a, 20b and check whether significant deviations exist.
  • In this way, the central vehicle controller 23 can detect a disturbance of at least the operation of one computer unit and/or of a part of the imaging system.
  • the central vehicle controller 23 generates signals which are the result of processing the signals received from the two computer units 20, and sends these signals to the control center 60 via the first and second radio signal links 40a, 40b.
  • Alternatively or additionally, the signals output by the computer units 20 are transmitted to the control center 60 via the first and second transmitting devices 21a, 21b and the first and second radio links 40a, 40b.
  • The image display device 61 may be combined with a computing device (not shown) which processes the images to be displayed in such a way that they are displayed on the image display device 61. In addition, it is possible for this computing device to check whether the signals received via the separate radio signal links 40a, 40b differ significantly from each other, indicating that the operation is partially disturbed. In particular, in response to a fault, measures can be taken automatically in that the control center 60 sends signals via the radio signal connection 50 to the central vehicle controller 23.
  • At least one person in the control center 60 views the images shown on the image display device 61. This can be limited to periods in which the central vehicle controller 23 cannot autonomously control the driving operation of the vehicle 1. By operating the control device 62, the person can generate control signals which are transmitted via the radio signal connection 50 to the central vehicle controller 23.
  • In this way, the person can remotely control the driving operation of the rail vehicle 1.
  • Alternatively, the person may generate only control signals for monitoring the operation of the rail vehicle 1, which are transmitted via the radio signal connection 50 to the central vehicle controller 23 and cause a transmission via the radio signal connections 40 of the signals required for the monitoring.
  • The rail vehicle 101 shown in Fig. 4 may be, e.g., the rail vehicle 1 of one of Figs. 1 to 3. It has a bar-shaped projection 80 extending above windows 121 in the side walls 113 of the vehicle 101 and also above windows 122 in the front region of the vehicle 101, in which a plurality of sensors 2, 105, 106, 107 are integrated, or are integrated with at least a part of their respective volumes. In the case of partial integration, a portion of the sensor may protrude outward and/or inward from the beam-shaped projection.
  • the bar-shaped projection 80 can be recessed on the underside of the respective sensor, or directly to the side of the respective sensor, in order to allow the sensor to detect areas outside of the rail vehicle 101.
  • the bar-shaped projection 80 extends, from the transition region (shown on the right in Fig. 4) to an adjacent car body of a vehicle or vehicle part coupled to the vehicle 101, along the longitudinal direction of the vehicle 101 on the side wall 113 in the foreground of the figure, and then around the front of the vehicle 101.
  • the bar-shaped projection 80 extends from the front region further, opposite to the longitudinal direction, along the opposite side wall 113, as shown in Fig. 5.
  • there, too, sensors 105, 107 and 108 for detecting the space outside the rail vehicle 101 are arranged.
  • in the front region shown in Fig. 5, looking forward in the direction of travel, there is a further image generating device 5 of the imaging system.
  • the sensors 105 may be, for example, radar or ultrasonic sensors.
  • the sensors 106, 107 and 108 arranged on the side walls 113 may, for example, be digital cameras which, during stops at stations, detect the area outside the vehicle, in particular around the vehicle doors 102, 103.
  • the image generating devices of the imaging system can be arranged side by side, preferably side by side in the horizontal direction. In the embodiment shown, the first sensor 2 and the third sensor 4 are arranged directly next to each other.
  • the sensors of the imaging system, however, do not all have to be arranged in one common projection.
  • a bar-shaped projection 80 can not only be used for the arrangement of sensors but can also contain a guide 117 for a vehicle door 102.
  • in the illustrated embodiment, the rail vehicle 101 has, at the cross-sectional position shown, a sliding door 102 on one side only.
  • however, the car body can also have a sliding door on the opposite side at the same cross-sectional position.
  • such sliding doors 102 are moved only in a straight direction in order to open and close.
  • a bar-shaped projection can, for example, be present when sliding doors are used which are not moved outward to open.
  • the bar-shaped projection may contain at least a part of the movement guide for moving the sliding door when opening and closing.
  • the bar-shaped projection may contain connecting lines, in particular power supply lines and signal connections, via which the sensors arranged at least partially in the bar-shaped projection are connected to other devices of the rail vehicle, such as transmission devices and computer units.
  • the arrangement of four image generating devices 2, 3, 4, 5 shown in Fig. 8 represents a concrete embodiment for the design of the distances between neighboring imaging devices.
  • the nearest neighboring imaging devices may all be equidistant, i.e. the distance to the respective next imaging device is the same for all imaging devices.
  • alternatively, the two middle imaging devices may be equidistant from each other; these imaging devices therefore each have a nearest neighboring imaging device in opposite directions. Also in this case, if one of the four image generating devices fails, a first stereo pair with a smaller spacing and a second stereo pair with a larger spacing can always be formed.
  • in Fig. 8, the distance between the image generating device 2 and the image generating device 3 is labeled A.
  • the distances between nearest neighboring imaging devices are designated B, C, D; these distances are all different. If all four imaging devices are able to provide trouble-free images of the vehicle environment to an evaluation device, for example the image generating devices 2, 5 (with a distance equal to the sum of the distances B and C) are operated as the first stereo pair and the image generating devices 2, 3 (at distance A) as the second stereo pair.
  • the image forming device 2 is available as a reserve.
  • alternatively, for example, the image generating devices 3, 5 (at distance D) can be operated as the first stereo pair and the image generating devices 2, 4 (at distance C) as the second stereo pair. The distances C and D also differ significantly from each other, so that the different stereo pairs are well suited for detecting different depth ranges (i.e. distance ranges from the vehicle).
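The remark that stereo pairs with clearly different spacings cover different depth ranges follows from the usual pinhole stereo relation Z = f·B/d: for the same minimum resolvable disparity d, a larger baseline B yields a larger maximum depth. A minimal sketch (the focal length and baseline values below are illustrative assumptions, not values from the patent):

```python
# Standard pinhole stereo relation: depth Z = f * B / d.
# All numeric values below are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in metres of a point observed with the given disparity."""
    return focal_px * baseline_m / disparity_px

f = 1400.0  # focal length in pixels (assumed)

# With the same minimum resolvable disparity of 1 px, the wide pair
# can range proportionally farther than the narrow pair:
narrow_max = depth_from_disparity(f, 0.3, 1.0)  # baseline 0.3 m -> 420 m
wide_max = depth_from_disparity(f, 1.2, 1.0)    # baseline 1.2 m -> 1680 m
```

Conversely, the narrow pair keeps both cameras' views overlapping at close range, which is why operating one wide and one narrow pair in parallel covers near and far zones around the vehicle.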
  • if the image generating device 2 has failed or is disturbed, a new operating phase begins after one of the operating phases mentioned with reference to Fig. 8, in which the image generating devices 3, 5 (at distance D) are operated as the first stereo pair and the image generating devices 4, 5 (at distance B) as the second stereo pair. The distances B and D also differ significantly from each other, so that the different stereo pairs are well suited for detecting different depth ranges.
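The fallback idea described above can be sketched as a small selection routine: from the cameras that are still healthy, pick a wide and a narrow stereo pair whose baselines clearly differ. The camera positions, the numbering along the vehicle front and the baseline-ratio threshold below are assumptions for illustration, not values from the patent:

```python
# Hedged sketch of the stereo-pair selection with failover. Positions
# (metres along the vehicle front) and min_ratio are assumed values.
from itertools import combinations


def choose_stereo_pairs(positions, failed=(), min_ratio=1.5):
    """Return (wide_pair, narrow_pair) of camera ids whose baselines
    differ by at least min_ratio, using only non-failed cameras."""
    ok = [c for c in positions if c not in failed]
    # All candidate pairs, widest baseline first.
    pairs = sorted(combinations(ok, 2),
                   key=lambda p: abs(positions[p[0]] - positions[p[1]]),
                   reverse=True)
    for wide in pairs:
        b_wide = abs(positions[wide[0]] - positions[wide[1]])
        for narrow in pairs:
            b_narrow = abs(positions[narrow[0]] - positions[narrow[1]])
            if b_narrow > 0 and b_wide / b_narrow >= min_ratio:
                return wide, narrow
    return None  # fewer than two usable pairs remain


# Four cameras along the vehicle front (assumed layout):
pos = {2: 0.0, 4: 0.25, 3: 0.65, 5: 1.2}
print(choose_stereo_pairs(pos))               # all cameras healthy
print(choose_stereo_pairs(pos, failed=(2,)))  # camera 2 has failed
```

With this assumed layout the healthy case yields the pairs (2, 5) and (2, 3), and losing camera 2 still yields two pairs with distinct baselines, which mirrors the behaviour the text describes.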

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to a rail vehicle (1) provided with an image generation system for detecting a space outside the rail vehicle (1). The image generation system has four image generating devices (2, 3, 4, 5) which generate two-dimensional images of the space. A first (2) and a second (3) of the four image generating devices (2, 3, 4, 5) are arranged at a first distance from each other and form a first stereo pair (2, 3). A third (4) and a fourth (5) of the four image generating devices (2, 3, 4, 5) are arranged at a second distance from each other and form a second stereo pair (4, 5), the first distance being greater than the second distance. To increase reliability, in particular during driverless operation of the vehicle, the image generation system has an evaluation device which receives image data from the four image generating devices (2, 3, 4, 5). If evaluation of the image data of a failed and/or disturbed image generating device (2; 5) is not possible or is faulty, the evaluation device is designed to use the image data of the three other (3, 4, 5; 2, 3, 4) of the four image generating devices (2, 3, 4, 5), which contain a first stereo image pair and a second stereo image pair, the first stereo image pair corresponding to the image data of two of the three other image generating devices (3, 4, 5; 2, 3, 4) which are arranged at a third distance from each other, and the second stereo image pair corresponding to the image data of two of the three other image generating devices (2, 3, 4, 5) which are arranged at a fourth distance from each other, the third distance and the fourth distance being different from each other.
EP15797627.5A 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire pourvu d'un système de génération d'images Active EP3218244B1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PL15797627T PL3218244T3 (pl) 2014-11-10 2015-11-10 Eksploatacja pojazdu szynowego zaopatrzonego w system obrazowania przestrzennego
EP18186200.4A EP3431361A3 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire doté d'un système d'imagerie

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014222900.6A DE102014222900A1 (de) 2014-11-10 2014-11-10 Betrieb eines Schienenfahrzeugs mit einem Bilderzeugungssystem
PCT/EP2015/076211 WO2016075138A1 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire pourvu d'un système de génération d'images

Related Child Applications (2)

Application Number Title Priority Date Filing Date
EP18186200.4A Division-Into EP3431361A3 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire doté d'un système d'imagerie
EP18186200.4A Division EP3431361A3 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire doté d'un système d'imagerie

Publications (2)

Publication Number Publication Date
EP3218244A1 true EP3218244A1 (fr) 2017-09-20
EP3218244B1 EP3218244B1 (fr) 2018-09-12

Family

ID=54608500

Family Applications (2)

Application Number Title Priority Date Filing Date
EP18186200.4A Withdrawn EP3431361A3 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire doté d'un système d'imagerie
EP15797627.5A Active EP3218244B1 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire pourvu d'un système de génération d'images

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP18186200.4A Withdrawn EP3431361A3 (fr) 2014-11-10 2015-11-10 Fonctionnement d'un véhicule ferroviaire doté d'un système d'imagerie

Country Status (7)

Country Link
US (1) US10144441B2 (fr)
EP (2) EP3431361A3 (fr)
CN (1) CN107107933B (fr)
DE (1) DE102014222900A1 (fr)
ES (1) ES2700830T3 (fr)
PL (1) PL3218244T3 (fr)
WO (1) WO2016075138A1 (fr)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017206123A1 (de) * 2017-04-10 2018-10-11 Robert Bosch Gmbh Verfahren und Vorrichtung zur Fusion von Daten verschiedener Sensoren eines Fahrzeugs im Rahmen einer Objekterkennung
EP3609766A1 (fr) * 2017-04-14 2020-02-19 Bayer CropScience LP Procédé et système de détection de végétation et d'alerte pour un véhicule ferroviaire
JP6816679B2 (ja) * 2017-09-05 2021-01-20 トヨタ自動車株式会社 車両の制御装置
CN107618535B (zh) * 2017-09-28 2018-11-20 建湖金洋科技有限公司 铁道安全维护平台
DE102017217408A1 (de) * 2017-09-29 2019-04-04 Siemens Mobility GmbH Schienenfahrzeug zur Personenbeförderung
CN108583622B (zh) * 2018-04-02 2020-12-25 交控科技股份有限公司 轨道交通状况的识别方法、装置、设备和介质
JP7181754B2 (ja) * 2018-10-15 2022-12-01 株式会社日立製作所 軌道走行車両の障害物検知システムおよび障害物検知方法
DE102018222169A1 (de) * 2018-12-18 2020-06-18 Eidgenössische Technische Hochschule Zürich Bordeigenes visuelles Ermitteln kinematischer Messgrößen eines Schienenfahrzeugs
US10899408B2 (en) * 2019-01-10 2021-01-26 Luxonis LLC Method and apparatus to reduce the probability of vehicle collision
EP3936408B1 (fr) * 2019-03-04 2023-09-27 Hitachi Kokusai Electric Inc. Système de surveillance de train
DE102020206549A1 (de) * 2020-05-26 2021-12-02 Siemens Mobility GmbH Sensormodul sowie Schienenfahrzeug mit einem solchen Sensormodul
CN111935451A (zh) * 2020-07-16 2020-11-13 中国铁道科学研究院集团有限公司电子计算技术研究所 铁路安全监测装置
FR3117981A1 (fr) * 2020-12-21 2022-06-24 Alstom Transport Technologies Véhicule ferroviaire comprenant un dispositif de surveillance et procédé de surveillance associé
DE102021200767A1 (de) 2021-01-28 2022-07-28 Siemens Mobility GmbH Selbstlernendes Warnsystem für Schienenfahrzeuge
CN113031602B (zh) * 2021-03-04 2022-08-02 上海申传电气股份有限公司 一种矿用轨道电机车动态包络线的构建方法
DE102021206475A1 (de) 2021-06-23 2022-12-29 Siemens Mobility GmbH Hindernisdetektion im Gleisbereich auf Basis von Tiefendaten
DE102022212227A1 (de) 2022-11-17 2024-05-23 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Ermitteln eines Betriebszustands eines ein erstes Umfeldsensorsystem und ein zweites Umfeldsensorsystem umfassenden Objekterkennungssystems eines Schienenfahrzeugs

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4855822A (en) 1988-01-26 1989-08-08 Honeywell, Inc. Human engineered remote driving system
US5877897A (en) 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
JP3468428B2 (ja) * 1993-03-24 2003-11-17 富士重工業株式会社 車輌用距離検出装置
JP3522317B2 (ja) * 1993-12-27 2004-04-26 富士重工業株式会社 車輌用走行案内装置
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
JP3364419B2 (ja) 1997-10-29 2003-01-08 新キャタピラー三菱株式会社 遠隔無線操縦システム並びに遠隔操縦装置,移動式中継局及び無線移動式作業機械
US6298286B1 (en) 1999-12-17 2001-10-02 Rockwell Collins Method of preventing potentially hazardously misleading attitude data
US20150235094A1 (en) * 2014-02-17 2015-08-20 General Electric Company Vehicle imaging system and method
DE10244127A1 (de) * 2002-09-27 2004-04-08 Siemens Ag Sensorsystem zur Fahrwegüberwachung für eine autonome mobile Einheit, Verfahren sowie Computerprogramm mit Programmcode-Mitteln und Computerprogramm-Produkt zur Überwachung eines Fahrwegs für eine autonome mobile Einheit
DE10341426A1 (de) * 2003-09-09 2005-04-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren zur Raumüberwachung und Raumüberwachungsanlage
DE10353212A1 (de) * 2003-11-13 2005-06-23 Db Netz Ag Verfahren und Vorrichtung zur Erkennung und Vermessung von Vegetation im Umfeld von Verkehrswegen
DE102005057273B4 (de) * 2005-11-25 2007-12-27 Siemens Ag Kommunikationssystem für Fahrzeuge und Streckenzentralen
US7551989B2 (en) 2006-06-21 2009-06-23 Calspan Corporation Autonomous outer loop control of man-rated fly-by-wire aircraft
DE102006056937B4 (de) 2006-11-30 2010-07-22 Deutsches Zentrum für Luft- und Raumfahrt e.V. Fernsteuerbares Fahrzeug und Verfahren zur Steuerung eines fernsteuerbaren Fahrzeuges
EP2115665A1 (fr) * 2007-02-06 2009-11-11 AAI Corporation Utilisation du procédé de différenciation de polarisation pour détecter, capter et éviter des systèmes
DE102007034283A1 (de) 2007-07-20 2009-01-22 Siemens Ag Kommunikationssystem mit schienenfahrzeugseitigen und streckenseitigen Kommunikationseinrichtungen sowie Verfahren zu deren Betrieb
US8255098B2 (en) 2007-10-17 2012-08-28 The Boeing Company Variably manned aircraft
JP4876080B2 (ja) 2008-01-25 2012-02-15 富士重工業株式会社 環境認識装置
US8184196B2 (en) 2008-08-05 2012-05-22 Qualcomm Incorporated System and method to generate depth data using edge detection
DE102008046963A1 (de) * 2008-09-12 2010-06-10 Siemens Aktiengesellschaft Bilderfassungseinheit zur Fusion von mit Sensoren unterschiedlicher Wellenlängenempfindlichkeit erzeugten Bildern
CA2743237C (fr) * 2008-10-22 2014-05-27 International Electronic Machines Corp. Analyse de vehicules a base d'imagerie thermique
DE102009040221A1 (de) 2009-09-07 2011-03-10 Deutsche Telekom Ag System und Verfahren zur sicheren Fernsteuerung von Fahrzeugen
DE102010004653A1 (de) * 2010-01-14 2011-07-21 Siemens Aktiengesellschaft, 80333 Steuerungsverfahren und -anordnung für ein Schienenfahrzeug
CN102085873A (zh) 2011-01-04 2011-06-08 北京清网华科技有限公司 列车途中故障远程诊断系统及方法
DE102011004576A1 (de) * 2011-02-23 2012-08-23 Siemens Aktiengesellschaft Verfahren sowie Einrichtung zum Betreiben eines spurgebundenen Fahrzeugs
JP5276140B2 (ja) * 2011-05-19 2013-08-28 富士重工業株式会社 環境認識装置および環境認識方法
JP5639024B2 (ja) * 2011-09-27 2014-12-10 富士重工業株式会社 画像処理装置
US20140218482A1 (en) * 2013-02-05 2014-08-07 John H. Prince Positive Train Control Using Autonomous Systems

Also Published As

Publication number Publication date
EP3218244B1 (fr) 2018-09-12
DE102014222900A1 (de) 2016-05-12
CN107107933A (zh) 2017-08-29
EP3431361A3 (fr) 2019-06-12
ES2700830T3 (es) 2019-02-19
EP3431361A2 (fr) 2019-01-23
US20180257684A1 (en) 2018-09-13
WO2016075138A1 (fr) 2016-05-19
PL3218244T3 (pl) 2019-01-31
US10144441B2 (en) 2018-12-04
CN107107933B (zh) 2019-03-26

Similar Documents

Publication Publication Date Title
EP3218244B1 (fr) Fonctionnement d'un véhicule ferroviaire pourvu d'un système de génération d'images
DE102017203838B4 (de) Verfahren und System zur Umfelderfassung
EP3497476B1 (fr) Véhicule à moteur et procédé de perception de l'environnement à 360°
EP3181421B1 (fr) Procédé et système de commande automatique d'un véhicule suiveur à l'aide d'un véhicule meneur
EP2586020B1 (fr) Procédé et système pour la validation d'information
EP2722687B1 (fr) Dispositif de sécurité pour un véhicule
DE102016219455B4 (de) Verfahren und aktive Infrastruktur zur Überprüfung einer Umfelderfassung eines Kraftfahrzeugs
DE102011014699B4 (de) Verfahren zum Betrieb eines Fahrerassistenzsystems zum Schutz eines Kraftfahrzeuges vor Beschädigungen und Kraftfahrzeug
DE102017003067A1 (de) Kollisionsverhinderungsvorrichtung und kollisionsverhinderungsverfahren
DE102014206473A1 (de) Automatische Assistenz eines Fahrers eines fahrspurgebundenen Fahrzeugs, insbesondere eines Schienenfahrzeugs
EP2766237A1 (fr) Dispositif d'aide à la conduite d'un véhicule ou de conduite autonome d'un véhicule
EP0913751A1 (fr) Véhicule autonome et méthode de guidage d'un véhicule autonome
DE102012006738A1 (de) Verfahren zur Kontrolle einer Gruppe von Objekten
EP3882733B1 (fr) Procédé et système de conduite autonome d'un véhicule
DE102019202025B4 (de) System und Verfahren zum sicheren Betreiben eines automatisierten Fahrzeugs
DE102014222906B4 (de) Schienenfahrzeug mit einem Sensor zur Erfassung eines Raumes außerhalb des Schienenfahrzeugsund/oder mit einem Signalgeber zum Ausgeben von Signalen in den Raum außerhalb desSchienenfahrzeugs
DE202013010696U1 (de) Eine zur Kontrolle einer Gruppe von Objekten bestimmte Messvorrichtung
DE102017218932B4 (de) Verfahren zur Bewertung einer Trajektorie eines Fortbewegungsmittels
DE102020005763A1 (de) Verfahren zur Steuerung eines Fahrzeugs
DE102017121021A1 (de) Verfahren zum Betreiben einer Detektionsvorrichtung eines Fahrzeugs zur Erfassung von Objekten und Detektionsvorrichtung
DE102022111280A1 (de) Verfahren, Computerprogrammprodukt, Parkassistenzsystem und Parkanlage
DE102022131470A1 (de) Verfahren zur Kalibration wenigstens einer zur Erfassung wenigstens eine Überwachungsbereichs ausgestalteten Detektionsrichtung eines Fahrzeugs, Detektionsvorrichtungen, Fahrerassistenzsystem und Fahrzeug
EP4279955A1 (fr) Dispositif de détection, agencement, robot, structure stationnaire et procédé
DE102022129176A1 (de) Aufbau für ein fahrerloses Transportsystem
WO2023012096A1 (fr) Procédé et dispositif de génération de carte pour un véhicule permettant de produire une carte haute résolution d'une surface de sol dans un environnement de véhicule

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170502

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20180321

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: BOMBARDIER TRANSPORTATION GMBH

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502015005894

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1040191

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181015

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20180912

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181212

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181212

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181213

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2700830

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20190219

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20181102

Year of fee payment: 8

REG Reference to a national code

Ref country code: CH

Ref legal event code: NV

Representative=s name: PATENTANWALT DIPL.-ING. (UNI.) WOLFGANG HEISEL, CH

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190112

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190112

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502015005894

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181110

26N No opposition filed

Effective date: 20190613

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181110

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20151110

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180912

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180912

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20191130

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20191110

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191110

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230822

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231120

Year of fee payment: 9

Ref country code: DE

Payment date: 20231121

Year of fee payment: 9

Ref country code: CH

Payment date: 20231201

Year of fee payment: 9

Ref country code: AT

Payment date: 20231121

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: PL

Payment date: 20231103

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20240129

Year of fee payment: 9