US20210354737A1 - Monitoring device and method - Google Patents


Info

Publication number
US20210354737A1
Authority
US
United States
Prior art keywords
data
around
monitoring device
mobile object
area
Prior art date
Legal status
Pending
Application number
US17/388,353
Inventor
Yoshie IMAI
Shu Murayama
Current Assignee
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Assigned to MITSUBISHI ELECTRIC CORPORATION reassignment MITSUBISHI ELECTRIC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MURAYAMA, SHU, IMAI, YOSHIE
Publication of US20210354737A1 publication Critical patent/US20210354737A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B61: RAILWAYS
    • B61L: GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00: Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04: Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
    • B61L23/041: Obstacle detection
    • B61L15/00: Indicators provided on the vehicle or train for signalling purposes
    • B61L15/0072: On-board train data handling
    • B61L25/00: Recording or indicating positions or identities of vehicles or trains or setting of track apparatus
    • B61L25/02: Indicating or recording positions or identities of vehicles or trains
    • B61L25/021: Measuring and recording of train speed
    • B61L25/023: Determination of driving direction of vehicle or train
    • B61L25/025: Absolute localisation, e.g. providing geodetic coordinates

Definitions

  • the present disclosure relates to a device for monitoring anomalies in railroad systems.
  • Patent Document 1 describes a technique for automatically detecting an obstacle to railroad systems, such as an overhead wire or a rail, by comparing image data obtained by photographing the area ahead of a traveling railroad vehicle with previously captured background image data.
  • the technique described in Patent Document 1 aims to efficiently monitor anomalies in railroad systems.
  • the present invention has been made to solve the above-mentioned problem, and aims to enable a railroad vehicle to detect an object dropped off from itself.
  • the monitoring device includes: a front data acquisition unit to acquire, via a sensor installed in the front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in the rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
  • the present invention makes it possible for a railroad vehicle to detect an object dropped off while the railroad vehicle is traveling.
  • FIG. 1 is a hardware configuration diagram showing a mobile object according to Embodiment 1.
  • FIG. 2 is a functional configuration diagram showing a monitoring device according to Embodiment 1.
  • FIG. 3 is a flowchart showing operation of the monitoring device according to Embodiment 1.
  • FIG. 4 is a hardware configuration diagram of a mobile object according to Modified Example 1 of Embodiment 1.
  • FIG. 5 is a hardware configuration diagram of a mobile object according to Modified Example 2 of Embodiment 1.
  • FIG. 6 is a functional configuration diagram showing a monitoring device according to Embodiment 2.
  • FIG. 7 is a flowchart showing operation of the monitoring device according to Embodiment 2.
  • FIG. 1 is a hardware configuration diagram of a mobile object using a monitoring device according to the present embodiment.
  • the numeral 100 denotes the mobile object
  • the numeral 10 denotes the monitoring device
  • the numeral 101 denotes a vehicle control unit.
  • the monitoring device 10 is a computer provided in the mobile object 100 .
  • the monitoring device 10 may be implemented in an integrated (inseparable) form or in a removable (separable) form with respect to the mobile object 100 or another component illustrated herein. Further, although a railroad vehicle is used as an example of the mobile object 100 in the present embodiment, the mobile object 100 is not limited to the one described in the present embodiment.
  • the monitoring device 10 includes hardware such as a processor 11 , a storage device 12 , a communication interface 13 , and an on-board interface 14 .
  • the processor 11 is connected to other hardware devices via a system bus to control them.
  • the processor 11 is an integrated circuit (IC) that performs processing.
  • the processor 11 is a central processing unit (CPU), a digital signal processor (DSP), or a graphics processing unit (GPU).
  • the storage device 12 includes a memory 121 and a storage 122 .
  • the memory 121 is a random-access memory (RAM).
  • the storage 122 is a hard disk drive (HDD).
  • the storage 122 may be a portable storage medium such as a secure digital (SD) memory card, a CompactFlash (CF) card, a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD.
  • the communication interface 13 is a device for communicating with a communication device around the mobile object 100 .
  • the communication interface 13 is an Ethernet (registered trademark) terminal or a universal serial bus (USB) terminal.
  • the on-board interface 14 is a device for connecting to the vehicle control unit 101 installed on the mobile object 100 .
  • the on-board interface 14 is a USB terminal, an IEEE1394 terminal, or an HDMI (registered trademark) terminal.
  • the vehicle control unit 101 includes sensing devices such as a camera, a light detection and ranging device (LiDAR), a radar, a sonar, and a positioning device, and also includes devices to control the mobile object 100 such as a steering, a brake, and an accelerator.
  • FIG. 2 shows a functional configuration diagram of the monitoring device 10 .
  • the monitoring device 10 includes, as functional components, a front data acquisition unit 21 , a rear data acquisition unit 22 , a position identification unit 23 , an object detection unit 24 , and a history storage unit 25 .
  • the numeral 31 denotes front data; the numeral 32 denotes rear data; the numeral 41 denotes locating data; and the numeral 42 denotes exclusion data.
  • the front data acquisition unit 21 collects information obtained by a first sensor installed in the front of the mobile object 100 via the vehicle control unit 101 .
  • the first sensor is exemplified as a camera (front camera) installed in the front of the mobile object 100 , and the information obtained from the front camera is used as data around its front area.
  • the data may be information obtained from a LiDAR, a radar, or a sonar.
  • the information to be collected by the front camera is the information obtained in the direction of travel of the mobile object 100 .
  • the first sensor only needs to be installed in the front of the mobile object 100 , and the information to be collected by the first sensor may be information in any direction.
  • the data around its front area that is obtained is recorded in the storage 122 .
  • the front data acquisition unit 21 also records, in the storage 122 , status information of the mobile object 100 at the time of acquiring the data around its front area, such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), and the lighting color of the mobile object.
  • the position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101 .
  • let the data recorded by the front data acquisition unit 21 in the storage 122 be the front data 31 .
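As an illustration only (the field names below are hypothetical, not taken from the patent), a front data 31 record combining the sensor data with the status information listed above might be sketched as:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FrontData:
    """One front data record: sensor data plus the status information
    captured at acquisition time (position, speed, attitude, lighting)."""
    sensor_data: bytes                 # e.g. one image frame from the front camera
    latitude: float                    # position of the mobile object
    longitude: float
    speed: float                       # speed at acquisition time
    roll: float = 0.0                  # attitude angles at acquisition time
    pitch: float = 0.0
    yaw: float = 0.0
    lighting_color: str = "white"      # lighting color at acquisition time
    # filled in later by the position identification unit:
    identified_position: Optional[Tuple[float, float]] = None

rec = FrontData(sensor_data=b"frame", latitude=35.68, longitude=139.77, speed=22.2)
```

The rear data 32 would carry the same fields; the identified_position field is filled in afterwards by the position identification unit 23.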
  • the rear data acquisition unit 22 collects information obtained by a second sensor installed in the rear of the mobile object 100 via the vehicle control unit 101 .
  • the second sensor is exemplified as a camera (rear camera) installed in the rear of the mobile object 100 , and the image obtained from the rear camera is used as data around its rear area.
  • the data may be information obtained from a LiDAR, a radar, or a sonar.
  • the information to be collected by the rear camera is the information obtained in the direction opposite to the direction of travel of the mobile object 100 .
  • the second sensor only needs to be installed in the rear of the mobile object 100 , and the information to be collected by the second sensor may be information in any direction.
  • the data obtained from around the rear area is recorded in the storage 122 .
  • the rear data acquisition unit 22 also records, in the storage 122 , status information of the mobile object 100 at the time of acquiring the data around its rear area, such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), and the lighting color of the mobile object.
  • the position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101 .
  • let the data recorded by the rear data acquisition unit 22 in the storage 122 be the rear data 32 .
  • the data around its front area and the data around its rear area are recorded in the storage 122 .
  • they may be recorded in the memory 121 , another area prepared in the storage device 12 , or an external device (not shown) connected via the communication I/F 13 .
  • the position identification unit 23 is called from the front data acquisition unit 21 and the rear data acquisition unit 22 .
  • the position identification unit 23 identifies the position of the front data 31 when called from the front data acquisition unit 21 and identifies the position of the rear data 32 when called from the rear data acquisition unit 22 .
  • the identified position of the data around its front area and the identified position of the data around its rear area are each the position corresponding to the information (identified position information) obtained from the first and second sensors, but not the position of the mobile object 100 .
  • the identified position information identified by the position identification unit 23 is recorded together with the front data 31 and the rear data 32 , but they may be recorded in separate areas if it is known which data the identified position information is linked with. Further, the position identification unit 23 may identify the positions of the data around the front area and the data around the rear area by using locating data 41 .
  • the locating data 41 may be data of any object that can uniquely identify the data obtained by the first and second sensors, such as a building, a pillar, a signboard, and a characteristic landscape that exist along the railroad track.
  • the image data of the objects that can be uniquely identified as described above and their position information are associated with the locating data 41 in advance.
  • the position identification unit 23 will be able to identify the position without using the status information of the mobile object 100 , making it possible to shorten the processing time.
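As a sketch of this idea (the landmark signatures and coordinates below are invented for illustration), the locating data 41 can be treated as a lookup table from recognizable landmarks to known positions, so that a position is obtained without consulting the status information of the mobile object 100:

```python
# Hypothetical locating-data lookup: landmark signatures -> known positions.
# In a real system the signatures would come from matching sensor data
# against the pre-registered images of buildings, pillars, signboards, etc.
LOCATING_DATA = {
    "signboard_A": (35.6812, 139.7671),
    "pillar_17":   (35.6820, 139.7685),
}

def identify_position(landmark_signatures):
    """Return the position of the first landmark recognized in the sensor
    data, or None if no landmark matches the locating data."""
    for sig in landmark_signatures:
        if sig in LOCATING_DATA:
            return LOCATING_DATA[sig]
    return None
```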
  • in the above example, a camera is used as the sensor.
  • if a radar or a sonar is used as the sensor, a combination of an object made of a material that generates characteristic reflected waves and its positional information may be used.
  • the object detection unit 24 compares the identified position information recorded in the front data 31 with the identified position information recorded in the rear data 32 to detect a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference, it is determined that the difference indicates a fallen object dropped off during the passage of the mobile object 100 and an anomaly warning is issued.
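The comparison described above can be sketched as follows (positions and data values are simplified placeholders; real records would be sensor frames keyed by identified position information):

```python
# Hypothetical sketch of the front/rear comparison: each record pairs an
# identified position with the sensor data taken there. A difference at a
# matching position suggests an object appeared between the two passes,
# i.e. something dropped off while the vehicle passed over that spot.

def detect_dropped_objects(front_records, rear_records):
    """Return the positions where rear data differs from front data
    taken at the same identified position."""
    front_by_pos = {pos: data for pos, data in front_records}
    anomalies = []
    for pos, rear in rear_records:
        front = front_by_pos.get(pos)
        if front is not None and front != rear:
            anomalies.append(pos)  # warn of an anomaly at this position
    return anomalies

front = [((35.0, 139.0), "clear"), ((35.1, 139.1), "clear")]
rear = [((35.0, 139.0), "clear"), ((35.1, 139.1), "object")]
```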
  • the object detection unit 24 can determine whether the object determined to be a fallen object is really the fallen object dropped off during the passage of the mobile object 100 by using the exclusion data 42 .
  • the items to be excluded include animals such as crows and cats, gravel and stones, something blown by the wind such as paper wastes like newspapers and magazines, and vinyl sheets, and the image data of these items may be held as the exclusion data 42 .
  • the object detection unit 24 compares the image determined to be a fallen object with the images of the exclusion data 42 . If they match, the object detection unit 24 determines that the object found is not the one that was dropped off during the passage of the mobile object 100 , so that a false alarm will not be issued to the mobile object 100 regarding the occurrence of a fallen object. At this time, a warning may be issued that it is not a fallen object from the mobile object 100 but something that has been blown from the outside during the passage of the mobile object 100 .
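A minimal sketch of this exclusion check, assuming the detected object has already been classified into a label (the labels and categories below are illustrative, not from the patent):

```python
# Hypothetical exclusion-data check: suppress a fallen-object alarm when
# the detected object matches a known harmless category such as an animal,
# gravel, or something blown in by the wind.
EXCLUSION_DATA = {"crow", "cat", "gravel", "newspaper", "vinyl_sheet"}

def classify_detection(object_label):
    """Return the message to issue for a detected object.

    A match against the exclusion data means the object was likely blown
    in from outside rather than dropped from the vehicle, so only a
    milder notice is issued instead of a fallen-object warning."""
    if object_label in EXCLUSION_DATA:
        return "notice: external object, not dropped from the vehicle"
    return "warning: possible fallen object from the vehicle"
```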
  • the locating data 41 and the exclusion data 42 may be recorded in the storage 122 or in the memory 121 . Also, these data may be recorded in another area prepared in the storage device 12 , or an external device (not shown) connected via the communication I/F 13 .
  • each functional component of the monitoring device 10 is implemented by software.
  • the storage 122 of the storage device 12 stores a program that implements the function of each functional component implemented by the software. This program is loaded into the memory 121 by the processor 11 and executed by the processor 11 .
  • the storage 122 implements the function of the history storage unit 25 .
  • the history storage unit 25 stores information about the fallen objects that the object detection unit 24 detected in the past. Examples of such information to be stored include position, time, and number of times of detection of each fallen object detected.
  • the front data acquisition unit 21 and the rear data acquisition unit 22 may, for example, shorten the interval for collecting data in the vicinity of locations where fallen objects are frequent and, conversely, lengthen the interval in the vicinity of locations where fallen objects are less frequent. This makes it possible to obtain the data around its front area and the data around its rear area efficiently.
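One possible way to realize this interval adjustment, using the per-location detection counts kept by the history storage unit 25 (the thresholds and interval values are illustrative, not from the patent):

```python
# Hypothetical adaptive sampling: shorten the acquisition interval near
# locations where fallen objects have been detected often, lengthen it
# where they have not. Thresholds and scale factors are illustrative.

def acquisition_interval(detection_count, base_interval=1.0):
    """Return the data-collection interval (seconds) for a location,
    given how many fallen objects were detected there in the past."""
    if detection_count >= 5:
        return base_interval / 4   # hotspot: sample much more often
    if detection_count >= 1:
        return base_interval / 2   # some history: sample more often
    return base_interval * 2       # quiet stretch: sample less often
```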
  • only one processor 11 is provided here. Instead, however, multiple processors 11 may be provided. In that case, the multiple processors 11 cooperate to execute the program that implements each function of the monitoring device 10 .
  • FIG. 3 is a flowchart showing processing of the monitoring device according to the present embodiment.
  • the operation of the monitoring device 10 according to Embodiment 1 will be described with reference to FIG. 3 .
  • the processes of the front data acquisition unit 21 , the rear data acquisition unit 22 , the position identification unit 23 , and the object detection unit 24 are described in a way that they are executed sequentially as shown in the flowchart.
  • the three units, namely the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 (in other words, the units other than the position identification unit 23 , which is called from the front data acquisition unit 21 and the rear data acquisition unit 22 ), can be executed in parallel.
  • Step S11: Processing of front data acquisition
  • the front data acquisition unit 21 acquires the data around its front area by the first sensor installed in the front of the mobile object 100 and the status information and writes the data in the front data 31 .
  • the front data acquisition unit 21 calls the position identification unit 23 .
  • Step S12: Calculation and identification of front data position
  • the position identification unit 23 identifies the position of the data around its front area on the basis of the status information written in the front data 31 and writes the identified position in the front data 31 .
  • Step S13: Processing of rear data acquisition
  • the rear data acquisition unit 22 acquires the data around its rear area by the second sensor installed in the rear of the mobile object 100 and the status information and writes the data in the rear data 32 .
  • the rear data acquisition unit 22 calls the position identification unit 23 .
  • Step S14: Calculation and identification of rear data position
  • the position identification unit 23 identifies the position of the data around its rear area on the basis of the status information written in the rear data 32 and writes the identified position in the rear data 32 .
  • Step S15: Detection of object
  • the object detection unit 24 compares the identified position information of the front data 31 and the identified position information of the rear data 32 , both stored in the storage 122 , and detects a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference in the combination, it is determined that the difference indicates the existence of a fallen object dropped off during the passage of the mobile object 100 , and an anomaly warning is issued.
  • the object detection unit 24 converts the pixel signals of either the front data 31 or the rear data 32 by using the status information included in the front data 31 and the rear data 32 as well as the characteristics of the acquired data.
  • the status information will be explained in detail. Specifically, the speed information, the attitude angles (roll angle, pitch angle, yaw angle) of the mobile object 100 , and the characteristics of sensing devices provided in the vehicle control unit 101 for the acquisition of data are used.
  • the lens and the size of the image sensor are each a characteristic, and the focal length and the angle of view are determined from these characteristics.
  • the distances corresponding to the pixels in the camera image can be roughly known from the shooting position, the focal length, and the angle of view, and as a result, the positions corresponding to the pixel signals can be known.
  • the object detection unit 24 obtains the positions of the pixel signals for each of the front data 31 and the rear data 32 , and finds a pair of the front data 31 and the rear data 32 having the same position of the pixel signals to perform the comparison.
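Under a flat-ground, pinhole-camera assumption, the per-pixel ground distance mentioned above can be sketched as follows (the camera height, focal length, and pitch values used here are illustrative; a real system would calibrate them from the lens and image-sensor characteristics):

```python
import math

# Hypothetical pinhole-camera sketch: estimate the ground distance imaged
# by a pixel row from the camera height, the mounting pitch, and the focal
# length in pixels (which together fix the angle of view).

def ground_distance(row, focal_px, cam_height, pitch_deg):
    """Distance (m) to the ground point imaged at a given pixel row.

    row is measured from the image center (positive = below center).
    Assumes a flat ground plane and a camera pitched pitch_deg below the
    horizon; returns None for rays at or above the horizon."""
    angle = math.radians(pitch_deg) + math.atan2(row, focal_px)
    if angle <= 0:
        return None  # ray never hits the ground plane
    return cam_height / math.tan(angle)
```

Rows lower in the image (larger row values) look at nearer ground, so their estimated distance is shorter; this is what lets pixel signals from the front data and the rear data be paired by position.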
  • when the pixel signals are compared, it is possible to use the markers and the characteristics of the landmarks and the like recorded in the data. In doing so, size scaling and angle adjustment may be performed in order to match the size of the pixel area and the obstacle.
  • the pixel signals of the same position may be found in multiple images such as a near view and a distant view, etc.
  • the identification accuracy is improved by giving priority to the image with higher resolution without using the image with lower resolution.
  • the pixel signals of the same position may be found in multiple images with different subject depths.
  • the identification accuracy is improved by giving priority to the image of higher resolution with no blurring without using the image with lower resolution due to blurring.
  • the shadow of the mobile object may be cast on the image.
  • the cast shadow may be corrected so that the images match each other, or the orientation of the sensor may be changed to take a shadow-free image for priority use.
  • the shadow can be detected from the image signal, but it can also be predicted from the position of the sun (calculated from the photographing time and place) and the size of the mobile object. Alternatively, an image may be excluded from use when it is determined in advance to be unsuitable for the identification processing.
  • the object detection unit 24 searches for the object to be detected on the basis of the pixel signals at the identified data position. In the description of the present embodiment, this object is assumed to be a part that has fallen from the mobile object 100 .
  • the object detection unit 24 can create a stereoscopic image from the front data 31 by performing viewpoint conversion.
  • the stereoscopic image may be a bird's-eye view image or a three-dimensionally reconstructed image.
  • the object detection unit 24 performs similar processing on the rear data 32 as well.
  • the front data 31 and the rear data 32 are compared to determine whether the difference therebetween indicates an object to be detected.
  • stones moved onto the railroad track, for example, may be excluded from the objects to be detected so that they are not determined to be fallen objects.
  • however, since foreign objects such as stones that move onto a railroad track can hinder safe railroad operation, it is also possible to include such objects among the targets to be detected.
  • a camera recording visible light is used as a sensor to acquire images.
  • when the mobile object 100 is a railroad vehicle, it is required that, during operation, the color of the front light and the color of the rear light be different from each other. Therefore, if the camera recording visible light is used as the sensor, an object may be wrongly determined to be a fallen object due to the difference in color.
  • the object detection unit 24 cancels the color difference between the front light and the rear light by using the setting values such as the colors of the lights in the status information.
  • a straightforward way to correct the color difference of the lights is to use the von Kries color conversion formula.
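A simplified per-RGB-channel version of the von Kries idea (true von Kries adaptation operates in a cone-response space; the light colors below are illustrative values that in this system would come from the status information):

```python
# Von Kries-style diagonal correction sketch: rescale each channel of a
# rear-image pixel so that the rear light color maps onto the front light
# color, cancelling the color difference between the two lights.

def von_kries_correct(pixel, rear_light, front_light):
    """Map an RGB pixel taken under the rear light to how it would
    appear under the front light, channel by channel."""
    return tuple(
        min(255, round(p * f / r)) if r else p
        for p, r, f in zip(pixel, rear_light, front_light)
    )

front_light = (255, 255, 255)   # white front light (illustrative)
rear_light = (255, 64, 64)      # reddish rear light (illustrative)
```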
  • in the monitoring device 10 , the front data acquired by the front data acquisition unit 21 together with its status information, and the rear data acquired by the rear data acquisition unit 22 together with its status information, are used; the position identification unit 23 uses the characteristics of this status information to identify positions, and the front data and the rear data having the same identified position information are compared to detect a fallen object.
  • this makes it possible to detect a fallen object immediately after it occurs during the travel of the mobile object 100 .
  • this configuration, which only uses the sensors attached to the front and the rear of the railroad vehicle, contributes to reducing the number of monitoring cameras to be installed along the railroad track.
  • the front data acquisition unit 21 acquires the data around its front area in the direction of travel of the mobile object 100 , but the data around its rear area may be acquired as long as the first sensor is installed in the front of the mobile object 100 .
  • the rear data acquisition unit 22 acquires the data around its rear area in the direction of travel of the mobile object 100 , but it may acquire the data around its front area as long as the second sensor is installed in the rear of the mobile object 100 .
  • the front data acquisition unit 21 and the rear data acquisition unit 22 may be provided in the front and in the rear of each of the cars constituting the mobile object 100 , respectively. In that case, the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car are used while traveling. With this configuration, even if the mobile object 100 is separated into a plurality of mobile objects, it is possible to use the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car in each separated mobile object. Conversely, when a plurality of mobile objects 100 are connected into one mobile object, the front data acquisition unit 21 of the frontmost mobile object 100 and the rear data acquisition unit 22 of the rearmost mobile object 100 can be used.
  • Embodiment 1 describes the case where each functional component is implemented by software. However, each of these functional components may be implemented by hardware.
  • FIG. 4 shows a hardware configuration of the monitoring device 10 according to Modified Example 1. If each functional component is implemented by hardware, the monitoring device 10 includes an electronic circuit 15 in place of the processor 11 and the storage device 12 .
  • the electronic circuit 15 is a dedicated circuit that implements the functions of each functional component and the storage device 12 .
  • FIG. 4 shows, as FIG. 1 does, a configuration in which the communication interface 13 , the on-board interface 14 , and the electronic circuit 15 are connected via a bus.
  • the electronic circuit 15 may be configured as a single circuit that also implements the functions of the communication interface 13 and the on-board interface 14 .
  • Examples of the electronic circuit 15 include a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a gate array (GA), an application specific integrated circuit (ASIC), and a field-programmable gate array (FPGA).
  • each functional component of the monitoring device 10 described in Embodiment 1 may be integrated into one electronic circuit 15 , or they may be allocated to and implemented by a plurality of the electronic circuits 15 .
  • Modified Example 2 describes the case in which some of each of the functional components are implemented by hardware and each of the remaining functional components is implemented by software.
  • FIG. 5 shows a configuration of the monitoring device 10 according to Modified Example 2.
  • the processor 11 , the storage device 12 , and the electronic circuit 15 are collectively called processing circuits.
  • each functional component is implemented by a processing circuit.
  • Embodiment 1 is for detecting a fallen object from a railroad vehicle.
  • a configuration for detecting an obstacle at a station which is provided with platform doors will be described.
  • the present embodiment differs from Embodiment 1 in that the mobile object 100 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at a station.
  • the different points will be explained, and the same points will be omitted.
  • FIG. 6 shows a functional configuration diagram of the monitoring device 10 according to the present embodiment.
  • the numeral 26 denotes a control unit with which the monitoring device 10 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at the station.
  • the control unit 26 stores an image of the platform door in advance and determines whether the image of the platform door is included in the data around its front area acquired by the front data acquisition unit 21 . If the platform door image is included, the control unit 26 calls the front data acquisition unit 21 each time the frontmost of the mobile object 100 reaches each platform door at the station and calls the rear data acquisition unit 22 each time the rearmost of the mobile object 100 reaches each platform door at the station.
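A rough sketch of this triggering logic (door and track positions are simplified to scalar coordinates for illustration; a real system would recognize each platform door in the sensor data):

```python
# Hypothetical sketch of the control unit 26 triggering: acquire front
# data each time the frontmost of the vehicle reaches a platform door,
# and rear data each time the rearmost reaches one.

def door_events(door_positions, front_track, rear_track):
    """Pair each platform-door position with the acquisitions it triggers.

    front_track / rear_track are the sequences of positions that the
    front and rear of the vehicle pass through while entering and
    leaving the station."""
    events = []
    for door in door_positions:
        if door in front_track:
            events.append(("front_acquire", door))
        if door in rear_track:
            events.append(("rear_acquire", door))
    return events

doors = [10, 20, 30]                 # three platform doors (illustrative)
front_track = [5, 10, 15, 20, 25, 30]
rear_track = [10, 20, 30, 35]
```

Each door then yields one front-data record on arrival and one rear-data record on departure, giving the object detection unit 24 a matched pair per door position to compare.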
  • the control unit 26 outputs an anomaly warning when the object detection unit 24 detects an obstacle.
  • the same reference numerals as those in FIG. 2 denote the same or corresponding components or units, each of which, except for the control unit 26 , performs the same operation as those described in FIG. 2 shown in Embodiment 1.
  • FIG. 7 is a flowchart showing operation of the monitoring device 10 according to the present embodiment.
  • the operation of the monitoring device 10 according to the present embodiment will be described with reference to FIG. 7 .
  • the control unit 26 calls the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 , and when an obstacle is detected as a result, an anomaly warning is issued.
  • the three units, namely the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 (in other words, the units other than the position identification unit 23 , which is called from the front data acquisition unit 21 and the rear data acquisition unit 22 ), can be executed in parallel.
  • the object detection unit 24 outputs an anomaly warning when the obstacle is detected.
  • Step S 21 Processing of Front Data Acquisition
  • When the railroad vehicle enters the station, the control unit 26 repeatedly calls the front data acquisition unit 21 until the railroad vehicle stops at the station.
  • the front data acquisition unit 21 acquires the data around its front area using a first sensor installed in the front of the mobile object 100 .
  • the front data acquisition unit 21 writes the acquired data around its front area in the memory 121 .
  • the control unit 26 calls the front data acquisition unit 21 each time the railroad vehicle approaches each of the three platform doors.
  • the front data acquisition unit 21 records the data around its front area and the status information as the front data 31 .
  • the front data taken at the three locations is written in the memory 121 .
  • Step S 22 Position Identification Processing of Data from Around Front Area
  • the position identification unit 23 identifies the position of the data around its front area and writes the position in the front data 31 .
  • Step S 23 Waiting for Completion of Passengers Getting on and Off
  • the control unit 26 waits while the doors of the railroad vehicle and the platform doors open, the passengers get on and off, and the platform doors and the doors of the railroad vehicle close.
  • Step S 24 Processing of Rear Data Acquisition
  • the control unit 26 calls the rear data acquisition unit 22 until the railroad vehicle leaves the station.
  • the rear data acquisition unit 22 acquires the data around its rear area using a second sensor installed in the rear of the mobile object 100 .
  • the rear data acquisition unit 22 collects the information obtained by the second sensor via the vehicle control unit 101.
  • the rear data acquisition unit 22 writes the acquired rear data 32 in the memory 121 .
  • the rear data acquisition unit 22 acquires the data around its rear area and the status information. In this way, the data around the three rear areas each corresponding to the data around the three front areas described above are written in the memory 121 .
  • Step S 25 Position Identification Processing of Data from Around Rear Area
  • the position identification unit 23 identifies the position of the data around its rear area and writes the position in the rear data 32 .
  • Step S 26 Processing of Obstacle Detection
  • the control unit 26 calls the object detection unit 24 .
  • the object detection unit 24 compares the data around its front area with the data around its rear area whose identified positions match, and determines a difference, if any, to be a fallen object which occurred during the passage of the mobile object 100 , and issues an anomaly warning.
  • the position identification unit 23 can identify the position of an obstacle from the numbers or the codes written on the platform doors recorded in the front data and the rear data stored.
  • Step S 27 Anomaly Warning
  • the control unit 26 issues an anomaly warning when the object detection unit 24 detects an obstacle (YES in Step S26). For example, by transmitting this anomaly warning to the station (management center), the station staff can take prompt action. It is also possible to take measures such as stopping the following train if the situation is urgent, or, if it is not urgent, allowing the next train to enter and the doors to open at the platform while the problem is handled.
  • the monitoring device 10 can detect an anomaly which occurs when the mobile object 100 enters the platform of a station at the time when the mobile object 100 leaves the platform of the station.
  • With the first and second sensors installed at the front and the rear of the mobile object 100, the monitoring device 10 according to the present embodiment obtains the data around its front area viewed from the mobile object 100 and the data around its rear area viewed from the mobile object 100. Then, by comparing them, it is further made possible to immediately detect a fallen object lying outside the platform doors and on the near side of the railroad track, which is normally difficult to detect only with monitoring cameras installed on the platform.
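As a rough illustration of the flow above, the sketch below matches front and rear snapshots by platform-door code and flags any difference as a possible fallen object. The door codes, the record layout, and the `compare()` stub are assumptions for illustration only; they are not taken from the patent.

```python
# Illustrative sketch: pairing front/rear snapshots per platform door and
# reporting doors where the two views differ.

def compare(front_image, rear_image):
    """Placeholder comparison: a real system would align and diff images."""
    return front_image != rear_image  # True means a difference was found

def detect_station_anomalies(front_records, rear_records):
    """front_records / rear_records: dicts keyed by platform-door code."""
    warnings = []
    for door_code, front_image in front_records.items():
        rear_image = rear_records.get(door_code)
        if rear_image is None:
            continue  # no rear snapshot for this door yet
        if compare(front_image, rear_image):
            warnings.append(door_code)  # difference -> possible fallen object
    return warnings

front = {"D1": "clear", "D2": "clear", "D3": "clear"}
rear = {"D1": "clear", "D2": "bag", "D3": "clear"}
print(detect_station_anomalies(front, rear))  # -> ['D2']
```

Keying the records by door code mirrors the idea that the platform doors themselves serve as the position reference at a station.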


Abstract

The present disclosure provides a monitoring device including: a front data acquisition unit to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.

Description

    TECHNICAL FIELD
  • The present disclosure relates to a device for monitoring anomalies in railroad systems.
  • BACKGROUND ART
  • Patent Document 1 describes a technique for automatically detecting an obstacle to railroad equipment, such as an overhead wire or a rail, by comparing image data obtained by photographing the view ahead from a traveling railroad vehicle with background image data taken previously. The technique described in Patent Document 1 aims to efficiently monitor anomalies in railroad systems.
  • CITATION LIST
  • Patent Document
    • Patent Document 1: Japanese Patent Application Laid-Open No. 2016-52849
    SUMMARY OF INVENTION Problems to be Solved by Invention
  • One of the factors that cause anomalies in railroad systems is an object dropped off from railroad vehicles. The problem of the technique described in Patent Document 1 is that a railroad vehicle can detect objects dropped off from its preceding railroad vehicles but cannot detect objects dropped off from itself while traveling.
  • The present invention has been made to solve the above-mentioned problem, and aims to enable a railroad vehicle to detect an object dropped off from itself.
  • Means for Solving the Problems
  • The monitoring device according to the present disclosure includes: a front data acquisition unit to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
  • Effect of Invention
  • The present invention makes it possible for a railroad vehicle to detect an object dropped off while the railroad vehicle is traveling.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a hardware configuration diagram showing a mobile object according to Embodiment 1.
  • FIG. 2 is a functional configuration diagram showing a monitoring device according to Embodiment 1.
  • FIG. 3 is a flowchart showing operation of the monitoring device according to Embodiment 1.
  • FIG. 4 is a hardware configuration diagram of a mobile object according to Modified Example 1 of Embodiment 1.
  • FIG. 5 is a hardware configuration diagram of a mobile object according to Modified Example 2 of Embodiment 1.
  • FIG. 6 is a functional configuration diagram showing a monitoring device according to Embodiment 2.
  • FIG. 7 is a flowchart showing operation of the monitoring device according to Embodiment 2.
  • MODES FOR CARRYING OUT INVENTION Embodiment 1
  • FIG. 1 is a hardware configuration diagram of a mobile object using a monitoring device according to the present embodiment.
  • In FIG. 1, the numeral 100 denotes the mobile object, the numeral 10 denotes the monitoring device, and the numeral 101 denotes a vehicle control unit. The monitoring device 10 is a computer provided in the mobile object 100.
  • Note that the monitoring device 10 may be implemented in an integrated (or inseparable) form or in a removable (or separable) form, with the mobile object 100 or another component illustrated herein. Further, although a railroad vehicle is used as an example of the mobile object 100 in the present embodiment, the mobile object 100 is not limited to the one described herein.
  • The monitoring device 10 includes hardware such as a processor 11, a storage device 12, a communication interface 13, and an on-board interface 14. The processor 11 is connected to other hardware devices via a system bus to control them.
  • The processor 11 is an integrated circuit (IC) that performs processing. For specific examples, the processor 11 is a central processing unit (CPU), a digital signal processor (DSP), or a graphics processing unit (GPU).
  • The storage device 12 includes a memory 121 and a storage 122. For a specific example, the memory 121 is a random-access memory (RAM). For a specific example, the storage 122 is a hard disk drive (HDD). Further, the storage 122 may be a portable storage medium such as a secure digital (SD) memory card, a compact flash (CF), a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD.
  • The communication interface 13 is a device for communicating with a communication device around the mobile object 100. For a specific example, the communication interface 13 is an Ethernet (registered trademark) terminal or a universal serial bus (USB) terminal.
  • The on-board interface 14 is a device for connecting to the vehicle control unit 101 installed on the mobile object 100. For a specific example, the on-board interface 14 is a USB terminal, an IEEE1394 terminal, or an HDMI (registered trademark) terminal.
  • The vehicle control unit 101 includes sensing devices such as a camera, a light detection and ranging device (LiDAR), a radar, a sonar, and a positioning device and also includes devices such as a steering, a brake, and an accelerator to control the mobile object 100.
  • FIG. 2 shows a functional configuration diagram of the monitoring device 10. The monitoring device 10 includes, as functional components, a front data acquisition unit 21, a rear data acquisition unit 22, a position identification unit 23, an object detection unit 24, and a history storage unit 25. The numeral 31 denotes front data; the numeral 32 denotes rear data; the numeral 41 denotes locating data; and the numeral 42 denotes exclusion data.
  • The front data acquisition unit 21 collects information obtained by a first sensor installed in the front of the mobile object 100 via the vehicle control unit 101. In the present embodiment, the first sensor is exemplified as a camera (front camera) installed in the front of the mobile object 100, and the information obtained from the front camera is used as data around its front area. However, the data may be information obtained from a LiDAR, a radar, or a sonar. Note that, in the present embodiment, the information to be collected by the front camera is the information obtained in the direction of travel of the mobile object 100. However, the first sensor only needs to be installed in the front of the mobile object 100, and the information to be collected by the first sensor may be information in any direction.
  • The data around its front area that is obtained is recorded in the storage 122. In addition, the front data acquisition unit 21 also records, in the storage 122, the status information of the mobile object 100 at the time when the data around its front area is acquired, such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object, and the like.
  • The position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101. Here, let the data recorded by the front data acquisition unit 21 in the storage 122 be front data 31.
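One possible shape for such a front-data record is sketched below. The field names are illustrative assumptions; the patent does not prescribe a record format.

```python
# Hypothetical layout of one front-data record: sensor data plus the
# status information captured at acquisition time.
import time

def make_front_record(image, lat, lon, speed_kmh, roll, pitch, yaw, light_color):
    return {
        "image": image,                # data around the front area
        "timestamp": time.time(),
        "position": (lat, lon),        # from the positioning device
        "speed_kmh": speed_kmh,
        "attitude": {"roll": roll, "pitch": pitch, "yaw": yaw},
        "light_color": light_color,    # used later for color correction
        "identified_position": None,   # filled in by the position identification unit
    }

record = make_front_record(b"...", 35.68, 139.77, 40.0, 0.1, -0.2, 0.0, "white")
```

A rear-data record (rear data 32) would carry the same status fields, which is what later allows position-matched comparison.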
  • The rear data acquisition unit 22 collects information obtained by a second sensor installed in the rear of the mobile object 100 via the vehicle control unit 101. In the present embodiment, the second sensor is exemplified as a camera (rear camera) installed in the rear of the mobile object 100, and the image obtained from the rear camera is used as data around its rear area. However, the data may be information obtained from a LiDAR, a radar, or a sonar. In the present embodiment, the information to be collected by the rear camera is the information obtained in the direction opposite to the direction of travel of the mobile object 100. However, the second sensor only needs to be installed in the rear of the mobile object 100, and the information to be collected by the second sensor may be information in any direction.
  • The data obtained from around the rear area is recorded in the storage 122. In addition, the rear data acquisition unit 22 also records, in the storage 122, the status information of the mobile object 100 at the time when the data from around the rear area is obtained, such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object, and the like. The position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101. Here, let the data recorded by the rear data acquisition unit 22 in the storage 122 be rear data 32.
  • In the present embodiment, it is described that the data around its front area and the data around its rear area are recorded in the storage 122. However, they may be recorded in the memory 121, another area prepared in the storage device 12, or an external device (not shown) connected via the communication I/F 13.
  • The position identification unit 23 is called from the front data acquisition unit 21 and the rear data acquisition unit 22. The position identification unit 23 identifies the position of the front data 31 when called from the front data acquisition unit 21 and identifies the position of the rear data 32 when called from the rear data acquisition unit 22. Note that the identified position of the data around its front area and the identified position of the data around its rear area are each the position corresponding to the information (identified position information) obtained from the first and second sensors, but not the position of the mobile object 100.
  • In the present embodiment, the identified position information identified by the position identification unit 23 is recorded together with the front data 31 and the rear data 32, but they may be recorded in separate areas if it is known which data the identified position information is linked with. Further, the position identification unit 23 may identify the positions of the data around the front area and the data around the rear area by using locating data 41. The locating data 41 may be data of any object that can uniquely identify the data obtained by the first and second sensors, such as a building, a pillar, a signboard, and a characteristic landscape that exist along the railroad track.
  • Then, the image data of the objects that can be uniquely identified as described above and their position information are associated with the locating data 41 in advance. By doing so, if the data around its front area and the data around its rear area contain information that matches the locating data 41, the position identification unit 23 will be able to identify the position without using the status information of the mobile object 100, making it possible to shorten the processing time. In the present embodiment, a camera is used as the sensor. However, when a radar or a sonar is used as a sensor, a combination of an object made of a material that generates characteristic reflected waves, and positional information thereof may be used.
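A minimal sketch of this fast path is shown below, assuming landmark features have already been extracted from the sensor data; the landmark names and coordinates are invented for illustration.

```python
# Illustrative locating-data lookup: if the acquired data contains a
# registered landmark, return its stored position directly, skipping the
# status-information calculation; otherwise fall back to status info.

LOCATING_DATA = {
    "signboard_A": (35.6812, 139.7671),
    "pillar_17":   (35.6820, 139.7690),
}

def identify_position(detected_features, status_position=None):
    for feature in detected_features:
        if feature in LOCATING_DATA:
            return LOCATING_DATA[feature]   # fast path: landmark match
    return status_position                  # fall back to status information

print(identify_position(["pillar_17"]))        # landmark hit
print(identify_position([], (35.0, 139.0)))    # fallback to status info
```

The fast path avoids touching the status information at all, which is the processing-time saving the text describes.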
  • The object detection unit 24 compares the identified position information recorded in the front data 31 with the identified position information recorded in the rear data 32 to detect a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference, it is determined that the difference indicates a fallen object dropped off during the passage of the mobile object 100 and an anomaly warning is issued.
  • Further, the object detection unit 24 can determine whether the object determined to be a fallen object is really the fallen object dropped off during the passage of the mobile object 100 by using the exclusion data 42. Examples of the items to be excluded include animals such as crows and cats, gravel and stones, something blown by the wind such as paper wastes like newspapers and magazines, and vinyl sheets, and the image data of these items may be held as the exclusion data 42.
  • The object detection unit 24 compares the image determined to be a fallen object with the images of the exclusion data 42. If they match, the object detection unit 24 determines that the object found is not the one that was dropped off during the passage of the mobile object 100, so that a false alarm will not be issued to the mobile object 100 regarding the occurrence of a fallen object. At this time, a warning may be issued that it is not a fallen object from the mobile object 100 but something that has been blown from the outside during the passage of the mobile object 100. Note that the locating data 41 and the exclusion data 42 may be recorded in the storage 122 or in the memory 121. Also, these data may be recorded in another area prepared in the storage device 12, or an external device (not shown) connected via the communication I/F 13.
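The exclusion check might be sketched as follows; the `classify()` stub and the category labels are assumptions standing in for real image matching against the exclusion data 42.

```python
# Illustrative exclusion filtering: a detected difference is reported as a
# fallen object only if it does not match an excluded category.

EXCLUSION_DATA = {"crow", "cat", "gravel", "newspaper", "vinyl_sheet"}

def classify(candidate_image):
    """Placeholder classifier: here the image *is* its label."""
    return candidate_image

def judge_candidate(candidate_image):
    label = classify(candidate_image)
    if label in EXCLUSION_DATA:
        # blown-in item: suppress the fallen-object alarm
        return "blown-in object (no fallen-object alarm)"
    return "fallen object (anomaly warning)"

print(judge_candidate("newspaper"))
print(judge_candidate("metal_part"))
```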
  • The function of each functional component of the monitoring device 10 is implemented by software. The storage 122 of the storage device 12 stores a program that implements the function of each functional component implemented by the software. This program is loaded into the memory 121 by the processor 11 and executed by the processor 11.
  • In addition, the storage 122 implements the function of the history storage unit 25. The history storage unit 25 stores information about the fallen objects that the object detection unit 24 detected in the past. Examples of such information to be stored include the position, time, and number of times of detection of each fallen object detected. By using such information held by the history storage unit 25, the front data acquisition unit 21 and the rear data acquisition unit 22 may, for example, shorten the interval for collecting data in the vicinity of locations where fallen objects are frequent, and may, conversely, lengthen the interval in the vicinity of locations where fallen objects are less frequent. This makes it possible to efficiently obtain the data around its front area and the data around its rear area.
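A toy version of this history-driven interval adjustment is given below; the thresholds and interval values are arbitrary assumptions.

```python
# Illustrative sampling policy: sample densely near locations with many
# past detections, sparsely where none have occurred.

def acquisition_interval_s(history_count, base=1.0, fast=0.2, slow=2.0):
    """history_count: number of past detections near the current location."""
    if history_count >= 5:
        return fast   # hot spot: shorten the collection interval
    if history_count == 0:
        return slow   # quiet stretch: lengthen the collection interval
    return base       # otherwise keep the default interval

print(acquisition_interval_s(7))  # hot spot
print(acquisition_interval_s(0))  # quiet stretch
```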
  • Note that, in the present embodiment, as shown in FIG. 1, only one processor 11 is provided. Instead, however, multiple processors 11 may be provided. In that case, the multiple processors 11 cooperate to execute the program that implements each function of the monitoring device 10.
  • FIG. 3 is a flowchart showing processing of the monitoring device according to the present embodiment. The operation of the monitoring device 10 according to Embodiment 1 will be described with reference to FIG. 3. In the present embodiment, for ease of explanation, the processes of the front data acquisition unit 21, the rear data acquisition unit 22, the position identification unit 23, and the object detection unit 24 are described in a way that they are executed sequentially as shown in the flowchart. However, instead, the three units, namely, the front data acquisition unit 21, the rear data acquisition unit 22, and the object detection unit 24, in other words, the units other than the position identification unit 23 which is called from the front data acquisition unit 21 and the rear data acquisition unit 22, can be executed in parallel.
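The parallel execution mentioned above could be sketched with threads. The unit names come from the text, but the worker bodies are placeholders; a real implementation would loop over sensor data.

```python
# Illustrative sketch: running the two acquisition units and the object
# detection unit as parallel threads instead of the sequential flowchart.
import threading

results = []
lock = threading.Lock()

def run_unit(name):
    # stand-in for the actual work of each unit
    with lock:
        results.append(name)

threads = [
    threading.Thread(target=run_unit, args=(n,))
    for n in ("front_data_acquisition", "rear_data_acquisition", "object_detection")
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))
```

The position identification unit is omitted here because, as the text notes, it is called from within the two acquisition units rather than run as an independent task.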
  • (Step S11: Processing of front data acquisition)
  • The front data acquisition unit 21 acquires the data around its front area by the first sensor installed in the front of the mobile object 100 and the status information and writes the data in the front data 31. The front data acquisition unit 21 calls the position identification unit 23.
  • (Step S12: Calculation and identification of front data position)
  • The position identification unit 23 identifies the position of the data around its front area on the basis of the status information written in the front data 31 and writes the identified position in the front data 31.
  • (Step S13: Processing of rear data acquisition)
  • The rear data acquisition unit 22 acquires the data around its rear area by the second sensor installed in the rear of the mobile object 100 and the status information and writes the data in the rear data 32. The rear data acquisition unit 22 calls the position identification unit 23.
  • (Step S14: Calculation and identification of rear data position)
  • The position identification unit 23 identifies the position of the data around its rear area on the basis of the status information written in the rear data 32 and writes the identified position in the rear data 32.
  • (Step S15: Detection of object)
  • The object detection unit 24 compares the identified position information of the front data 31 and the identified position information of the rear data 32, both stored in the storage 122, and detects a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference in the combination, it is determined that the difference indicates the existence of a fallen object dropped off during the passage of the mobile object 100 and an anomaly warning is issued.
  • However, since it should be considered that the acquisition directions are different by about 180 degrees between the front data 31 and the rear data 32 that are identical in the identified position information, it is not possible to simply compare their data. Therefore, the object detection unit 24 converts the pixel signals of either the front data 31 or the rear data 32 by using the status information included in the front data 31 and the rear data 32 as well as the characteristics of the acquired data.
  • The status information will be explained in detail. Specifically, the speed information, the attitude angles (roll angle, pitch angle, yaw angle) of the mobile object 100, and the characteristics of the sensing devices provided in the vehicle control unit 101 for the acquisition of data are used. For example, in the case of a camera, the lens and the size of the image sensor are such characteristics, and the focal length and the angle of view are determined from them. In the image taken by the camera, the distances corresponding to the pixels in the camera image can be roughly known from the shooting position, the focal length, and the angle of view, and as a result, the positions corresponding to the pixel signals can be known.
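Under a flat-ground assumption, the pixel-to-distance relation described here can be approximated as below. The camera parameters are invented for illustration, and this simple geometry is only a sketch of the principle, not the patent's exact method.

```python
# Illustrative pinhole-camera geometry: map an image row to a rough ground
# distance using the vertical angle of view derived from focal length and
# sensor size, assuming flat ground and a downward-pitched camera.
import math

def vertical_fov(focal_length_mm, sensor_height_mm):
    """Vertical angle of view (radians) of a pinhole camera."""
    return 2 * math.atan(sensor_height_mm / (2 * focal_length_mm))

def row_to_ground_distance(row, image_height_px, camera_height_m,
                           pitch_down_rad, fov_rad):
    """row 0 = top of image; returns None if the row looks at/above the horizon."""
    angle = pitch_down_rad + (row - image_height_px / 2) * (fov_rad / image_height_px)
    if angle <= 0:
        return None
    return camera_height_m / math.tan(angle)

fov = vertical_fov(focal_length_mm=8.0, sensor_height_mm=4.8)
d = row_to_ground_distance(600, 720, 2.5, math.radians(10), fov)
print(round(d, 1))  # a few meters ahead of the camera
```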
  • In this way, the object detection unit 24 obtains the positions of the pixel signals for each of the front data 31 and the rear data 32, and finds a pair of the front data 31 and the rear data 32 having the same position of the pixel signals to perform the comparison. When the pixel signals are compared, it is possible to use the markers and the characteristics of the landmarks and the like recorded in the data. In doing so, size scaling and angle adjustment may be performed in order to match the size of the pixel area and the obstacle.
  • Note that, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images such as a near view and a distant view, etc. When there are multiple choices for the images to be used as described above, the identification accuracy is improved by giving priority to the image with higher resolution without using the image with lower resolution. Similarly, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images with different subject depths. When there are multiple choices for the images to be used as described above, the identification accuracy is improved by giving priority to the image of higher resolution with no blurring without using the image with lower resolution due to blurring.
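Selecting among multiple candidate images covering the same position might look like this minimal sketch; the record fields are assumptions.

```python
# Illustrative priority rule: prefer the highest-resolution, unblurred
# image among candidates covering the same identified position.

def pick_best(candidates):
    """candidates: list of dicts with 'resolution' (pixels) and 'blurred' (bool)."""
    usable = [c for c in candidates if not c["blurred"]]
    pool = usable or candidates              # fall back if every image is blurred
    return max(pool, key=lambda c: c["resolution"])

images = [
    {"name": "distant", "resolution": 120_000, "blurred": False},
    {"name": "near",    "resolution": 900_000, "blurred": False},
    {"name": "nearest", "resolution": 1_200_000, "blurred": True},
]
print(pick_best(images)["name"])
```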
  • Also, depending on the direction of the sun, the shadow of the mobile object may be cast on the image. In such a case, the cast shadow may be corrected so that the images match each other, or the orientation of the sensor may be changed to take a shadow-free image for priority use. The shadow can be detected from the image signal, but it can also be predicted from the position of the sun calculated from the photographing time and place and from the size of the mobile object; alternatively, an image may simply not be used when it is determined in advance to be unsuitable for the identification processing.
  • The object detection unit 24 searches for the object to be detected on the basis of the pixel signals at the identified data position. In the description of the present embodiment, this object is assumed to be a part that has fallen from the mobile object 100. The object detection unit 24 can create a stereoscopic image from the front data 31 by performing viewpoint conversion. The stereoscopic image may be a bird's-eye view image or a three-dimensionally reconstructed image. The object detection unit 24 performs similar processing on the rear data 32 as well.
  • Then, the front data 31 and the rear data 32 are compared to determine whether the difference therebetween indicates an object to be detected. Moved stones on the railroad track, for example, are excluded from the objects to be detected so that they are not determined as fallen objects. However, in consideration of the possibility that foreign objects such as stones consequently move on a railroad track, hindering the safe railroad operation, it is also possible to include the objects as the targets to be detected.
  • In bright daytime conditions there is no particular problem, but when lighting is required, care must be taken if a camera recording visible light is used as the sensor to acquire images. For example, if the mobile object 100 is a railroad vehicle, it is required that, during operation, the color of the front light and the color of the rear light be different from each other. Therefore, if a camera recording visible light is used as the sensor, an object may be wrongly determined to be a fallen object due to the difference in color.
  • In such a case, the object detection unit 24 cancels the color difference between the front light and the rear light by using the setting values such as the colors of the lights in the status information. A straightforward way to correct the color difference of the lights is to use the von Kries color conversion formula. Thus, even if a color difference occurs between the camera image acquired at the front of the mobile object 100 and the camera image acquired at the rear, it is possible to prevent erroneous detection of an obstacle owing to the color difference.
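A simplified von Kries-style correction is sketched below, applied directly in RGB rather than in a cone-response space as the full von Kries transform would be; the front- and rear-light colors are assumed values.

```python
# Illustrative von Kries-style (diagonal) correction: scale each channel
# by the ratio between the target light color and the source light color,
# so an image lit by the rear lamp can be compared against a front-lit one.

def von_kries_rgb(pixel, src_white, dst_white):
    """pixel, src_white, dst_white: (R, G, B) tuples on a 0-255 scale."""
    return tuple(
        min(255.0, p * (d / s)) for p, s, d in zip(pixel, src_white, dst_white)
    )

front_white = (255, 255, 255)   # assumed white headlight
rear_white = (255, 60, 60)      # assumed red tail light
corrected = von_kries_rgb((200, 40, 40), rear_white, front_white)
print(corrected)  # -> (200.0, 170.0, 170.0)
```

The diagonal scaling is what makes the correction cheap: each channel is adjusted independently, with no matrix inversion required.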
  • As described above, in the monitoring device 10 according to the present embodiment, by using the front data acquired by the front data acquisition unit 21 as well as the status information acquired when the front data is acquired by the front data acquisition unit 21, and the rear data acquired by the rear data acquisition unit 22 as well as the status information acquired when the rear data is acquired by the rear data acquisition unit 22, and further by using, in the position identification unit 23, the characteristics of the status information acquired when the front data is acquired as well as the status information acquired when the rear data is acquired, the front data and the rear data having the same identified position information are compared to detect a fallen object. Thus, this makes it possible to detect a fallen object immediately after it occurs during the travel of the mobile object 100.
  • As a result, if a fallen object occurs which may interfere with the railroad operation, an immediate action can be taken for it to improve the safe operation of the railroad. In addition, this configuration, which only uses the sensors attached to the front and the rear of the railroad vehicle, contributes to reducing the number of monitoring cameras to be installed along the railroad track. In the present embodiment, a case is described in which the front data acquisition unit 21 acquires the data around its front area in the direction of travel of the mobile object 100, but the data around its rear area may be acquired as long as the first sensor is installed in the front of the mobile object 100. Similarly, in the present embodiment, a case is described in which the rear data acquisition unit 22 acquires the data around its rear area in the direction of travel of the mobile object 100, but it may acquire the data around its front area as long as the second sensor is installed in the rear of the mobile object 100.
  • When the mobile object 100 includes a plurality of railroad cars, the front data acquisition unit 21 and the rear data acquisition unit 22 may be provided in the front and in the rear of each of the cars constituting the mobile object 100, respectively. In that case, the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car are used while traveling. With this configuration, even if the mobile object 100 is separated into a plurality of mobile objects, it is possible to use the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car in each separated mobile object. To the contrary, when a plurality of mobile objects 100 are connected into one mobile object, the front data acquisition unit 21 of the frontmost mobile object 100 and the rear data acquisition unit 22 of the rearmost mobile object 100 can be used.
  • Modified Example 1
  • Embodiment 1 describes the case where each functional component is implemented by software. However, each of these functional components may be implemented by hardware.
  • FIG. 4 shows a hardware configuration of the monitoring device 10 according to Modified Example 1. If each functional component is implemented by hardware, the monitoring device 10 includes an electronic circuit 15 in place of the processor 11 and the storage device 12. The electronic circuit 15 is a dedicated circuit that implements the functions of each functional component and the storage device 12.
  • FIG. 4 shows, as FIG. 1 does, a configuration in which the communication interface 13, the on-board interface 14, and the electronic circuit 15 are connected via a bus. However, the electronic circuit 15 may be configured as a single circuit that also implements the functions of the communication interface 13 and the on-board interface 14.
  • Examples of the electronic circuit 15 include a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a gate array (GA), an application specific integrated circuit (ASIC), and a field-programmable gate array (FPGA).
  • Further, each functional component of the monitoring device 10 described in Embodiment 1 may be integrated into one electronic circuit 15, or they may be allocated to and implemented by a plurality of the electronic circuits 15.
  • Modified Example 2
  • Modified Example 2 describes the case in which some of the functional components are implemented by hardware and the remaining functional components are implemented by software. FIG. 5 shows a configuration of the monitoring device 10 according to Modified Example 2.
  • In FIG. 5, the processor 11, the storage device 12, and the electronic circuit 15 are collectively referred to as processing circuits.
  • In other words, the function of each functional component is implemented by a processing circuit.
  • Embodiment 2
  • The configuration of Embodiment 1 detects an object fallen from a railroad vehicle. The present embodiment describes a configuration for detecting an obstacle at a station equipped with platform doors. It differs from Embodiment 1 in that the mobile object 100 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at a station. Only the differences from Embodiment 1 are explained below; descriptions of the common points are omitted.
  • FIG. 6 shows a functional configuration diagram of the monitoring device 10 according to the present embodiment. In FIG. 6, the numeral 26 denotes a control unit with which the monitoring device 10 identifies the position of the data around the front area and the position of the data around the rear area on the basis of the positions of the platform doors at the station. For example, the control unit 26 stores an image of a platform door in advance and determines whether the image of the platform door is included in the data around the front area acquired by the front data acquisition unit 21. If the platform door image is included, the control unit 26 calls the front data acquisition unit 21 each time the front end of the mobile object 100 reaches each platform door at the station, and calls the rear data acquisition unit 22 each time the rear end of the mobile object 100 reaches each platform door at the station.
  • The control unit 26 outputs an anomaly warning when the object detection unit 24 detects an obstacle. In FIG. 6, the same reference numerals as those in FIG. 2 denote the same or corresponding components or units, each of which, except for the control unit 26, performs the same operation as those described in FIG. 2 shown in Embodiment 1.
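  • The door-triggered acquisition described above can be sketched as follows, with a stand-in membership test in place of real image matching. All names are illustrative assumptions, not part of the disclosure:

```python
def door_in_frame(frame, template):
    # Hypothetical stand-in for image matching against the stored
    # platform-door template: frames here are plain sets of labels.
    return template in frame

def control_step(frames, template):
    """Record the index of every frame in which the platform-door
    template is found; at each such frame the control unit would call
    the front (or rear) data acquisition unit."""
    captures = []
    for i, frame in enumerate(frames):
        if door_in_frame(frame, template):
            captures.append(i)  # acquisition unit would run here
    return captures
```

In a real system the membership test would be replaced by template matching or a detector over camera images; the triggering structure is the same.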
  • FIG. 7 is a flowchart showing the operation of the monitoring device 10 according to the present embodiment. The operation will be described with reference to FIG. 7. In the present embodiment, for ease of explanation, the control unit 26 is described as calling the front data acquisition unit 21, the rear data acquisition unit 22, and the object detection unit 24, and issuing an anomaly warning when an obstacle is detected as a result. Alternatively, as in Embodiment 1, these three units (that is, the units other than the position identification unit 23, which is called from the front data acquisition unit 21 and the rear data acquisition unit 22) may be executed in parallel. In that case, the object detection unit 24 itself outputs the anomaly warning when the obstacle is detected.
  • (Step S21: Processing of Front Data Acquisition)
  • When the railroad vehicle enters the station, the control unit 26 repeatedly calls the front data acquisition unit 21 until the railroad vehicle stops at the station. The front data acquisition unit 21 acquires the data around its front area using a first sensor installed in the front of the mobile object 100. The front data acquisition unit 21 writes the acquired data around its front area in the memory 121.
  • For example, if there are three platform doors corresponding to the doors of the railroad vehicle, the control unit 26 calls the front data acquisition unit 21 each time the railroad vehicle approaches each of the three platform doors. When the mobile object 100 approaches a predetermined position with respect to each of the platform doors or each of the vehicle doors (for example, a position from which the entire door is visible), the front data acquisition unit 21 records the data around the front area and the status information as the front data 31. In this example, the front data taken at the three locations is written in the memory 121.
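  • The capture of one front-data record per platform door at a predetermined position can be sketched as follows, treating positions as one-dimensional track coordinates. The tolerance value and record format are assumptions made for illustration:

```python
def sweep_and_capture(positions, door_positions, tol=0.5):
    """As the front of the vehicle moves through `positions`, capture one
    record per platform door, the first time the front comes within `tol`
    of that door (the assumed 'predetermined position')."""
    captured = {}
    for t, p in enumerate(positions):
        for i, door in enumerate(door_positions):
            if i not in captured and abs(p - door) <= tol:
                # Here the front data 31 and status information
                # would be written to the memory 121.
                captured[i] = {"time": t, "position": p}
    return captured
```

With three doors, the sweep yields exactly three records, matching the three front-data captures in the example above.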
  • (Step S22: Position Identification Processing of Data from Around Front Area)
  • As in Embodiment 1, when called by the front data acquisition unit 21, the position identification unit 23 identifies the position of the data around its front area and writes the position in the front data 31.
  • (Step S23: Waiting for Completion of Passengers Getting on and Off)
  • The control unit 26 waits while the doors of the railroad vehicle and the platform doors open, the passengers get on and off, and the platform doors and the doors of the railroad vehicle close.
  • (Step S24: Processing of Rear Data Acquisition)
  • The control unit 26 calls the rear data acquisition unit 22 until the railroad vehicle leaves the station. The rear data acquisition unit 22 acquires the data around the rear area using a second sensor installed in the rear of the mobile object 100. Specifically, as in Step S21, the rear data acquisition unit 22 collects the information obtained by the second sensor via the vehicle control unit 101 and writes the acquired rear data 32 in the memory 121. Following the example given in Step S21, when the rear of the mobile object 100 approaches a predetermined position with respect to each of the platform doors or each of the vehicle doors (for example, a position from which the entire door is visible), the rear data acquisition unit 22 acquires the data around the rear area and the status information. In this way, rear-area data corresponding to each of the three sets of front-area data described above is written in the memory 121.
  • (Step S25: Position Identification Processing of Data from Around Rear Area)
  • As in Embodiment 1, when called by the rear data acquisition unit 22, the position identification unit 23 identifies the position of the data around its rear area and writes the position in the rear data 32.
  • (Step S26: Processing of Obstacle Detection)
  • Next, the control unit 26 calls the object detection unit 24. As in Embodiment 1, the object detection unit 24 compares the data around its front area with the data around its rear area whose identified positions match, and determines a difference, if any, to be a fallen object which occurred during the passage of the mobile object 100, and issues an anomaly warning.
  • In the present embodiment, an example is described in which the position of each platform door or each vehicle door is recognized in advance, and the front data and the rear data are then taken at each position. However, when the front data and the rear data are taken continuously at regular time intervals, it is also possible to detect and count the doors in the images by considering the temporal order of the images taken. Alternatively, the position identification unit 23 can identify the position of an obstacle from the numbers or codes written on the platform doors, as recorded in the stored front data and rear data.
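  • The comparison in Step S26 can be sketched as a per-position set difference, with the exclusion data of Embodiment 1 suppressing warnings for known harmless objects. The data layout below is an assumption made for illustration:

```python
def detect_obstacles(front_data, rear_data, exclusion=frozenset()):
    """Compare front and rear records whose identified positions match.
    Any object present only in the rear view is treated as a candidate
    fallen object, unless it matches the exclusion data."""
    warnings = []
    for pos, front_objs in front_data.items():
        rear_objs = rear_data.get(pos)
        if rear_objs is None:
            continue  # no rear record identified at this position
        for obj in rear_objs - front_objs:
            if obj not in exclusion:
                warnings.append((pos, obj))  # anomaly warning issued here
    return warnings
```

For instance, an object such as a leaf registered in the exclusion data produces no warning, while an unregistered object appearing only in the rear data does.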
  • (Step S27: Anomaly Warning)
  • The control unit 26 issues an anomaly warning when the object detection unit 24 detects an obstacle (YES in Step S26). For example, by transmitting this anomaly warning to the station (management center), station staff can respond promptly. It is also possible to take measures such as stopping the following train if the situation is urgent or, if it is not urgent, allowing the next train to enter and the doors to open at the platform while the problem is handled.
  • As described above, the monitoring device 10 according to the present embodiment can detect, at the time the mobile object 100 leaves the platform of a station, an anomaly that occurred after the mobile object 100 entered the platform. With the first and second sensors installed at the front and the rear of the mobile object 100, the monitoring device 10 obtains the data around the front area and the data around the rear area as viewed from the mobile object 100. By comparing the two, it can immediately detect a fallen object lying outside the platform doors on the near side of the railroad track, which is normally difficult to detect with monitoring cameras installed on the platform alone.
  • With the above effects, when a detected obstacle may interfere with the railroad service, immediate action can be taken, which contributes to the safe and on-time operation of the railroad service. Further, since the sensors attached to the railroad vehicle are used, it is not necessary to equip each platform door with its own image monitoring device, and the cost can be greatly reduced.
  • DESCRIPTION OF REFERENCE NUMERALS AND SIGNS
  • 10 monitoring device, 11 processor, 12 storage device, 13 communication interface, 14 on-board interface, 15 electronic circuit, 21 front data acquisition unit, 22 rear data acquisition unit, 23 position identification unit, 24 object detection unit, 25 history storage unit, 26 control unit, 31 front data, 32 rear data, 41 locating data, 42 exclusion data, 100 mobile object, 101 vehicle control unit, 121 memory, 122 storage

Claims (17)

1. A monitoring device comprising:
front data acquisition circuitry to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object;
rear data acquisition circuitry to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object;
position identification circuitry to identify a position of the data around the front area and a position of the data around the rear area; and
object detection circuitry to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
2. The monitoring device according to claim 1, wherein the front data acquisition circuitry and the rear data acquisition circuitry acquire at least one of speed of the mobile object and a yaw angle of the mobile object as the status information.
3. The monitoring device according to claim 1, wherein
the mobile object is provided with a light in each of its front and rear,
the front data acquisition circuitry and the rear data acquisition circuitry detect and record colors of their respective lights as the status information, and
the object detection circuitry corrects colors of the data around the front area and the data around the rear area by using the colors of the lights.
4. The monitoring device according to claim 2, wherein
the mobile object is provided with a light in each of its front and rear,
the front data acquisition circuitry and the rear data acquisition circuitry detect and record colors of their respective lights as the status information, and
the object detection circuitry corrects colors of the data around the front area and the data around the rear area by using the colors of the lights.
5. The monitoring device according to claim 1, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
6. The monitoring device according to claim 2, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
7. The monitoring device according to claim 3, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
8. The monitoring device according to claim 4, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
9. The monitoring device according to claim 1, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
10. The monitoring device according to claim 2, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
11. The monitoring device according to claim 3, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
12. The monitoring device according to claim 4, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
13. The monitoring device according to claim 5, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
14. The monitoring device according to claim 6, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
15. The monitoring device according to claim 7, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
16. The monitoring device according to claim 8, wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
17. A monitoring method comprising:
acquiring data around a front area with respect to a traveling direction of a mobile object and status information of the mobile object;
acquiring data around a rear area with respect to the traveling direction of the mobile object and status information of the mobile object;
identifying a position of the data around the front area and a position of the data around the rear area; and
comparing the data around the front area with the data around the rear area of the same position as the data around the front area, detecting an object, and warning of an anomaly.
US17/388,353 2019-02-06 2021-07-29 Monitoring device and method Pending US20210354737A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/004203 WO2020161818A1 (en) 2019-02-06 2019-02-06 Monitoring device and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/004203 Continuation WO2020161818A1 (en) 2019-02-06 2019-02-06 Monitoring device and method

Publications (1)

Publication Number Publication Date
US20210354737A1 true US20210354737A1 (en) 2021-11-18

Family

ID=71947675

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/388,353 Pending US20210354737A1 (en) 2019-02-06 2021-07-29 Monitoring device and method

Country Status (5)

Country Link
US (1) US20210354737A1 (en)
EP (1) EP3907121B1 (en)
JP (1) JP6914461B2 (en)
ES (1) ES2945839T3 (en)
WO (1) WO2020161818A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59156089A (en) * 1983-10-11 1984-09-05 Hitachi Ltd Obstacle detecting method for vehicle
JP3448088B2 (en) * 1993-12-24 2003-09-16 東日本旅客鉄道株式会社 Obstacle detection system
JP5161673B2 (en) * 2008-06-30 2013-03-13 株式会社神戸製鋼所 Falling object detection device and falling object detection method
US8712610B2 (en) * 2008-09-18 2014-04-29 General Electric Company System and method for determining a characterisitic of an object adjacent to a route
JP6209141B2 (en) 2014-09-04 2017-10-04 公益財団法人鉄道総合技術研究所 Obstacle detection apparatus and method

Also Published As

Publication number Publication date
EP3907121B1 (en) 2023-04-26
ES2945839T3 (en) 2023-07-07
JP6914461B2 (en) 2021-08-04
EP3907121A4 (en) 2022-02-23
WO2020161818A1 (en) 2020-08-13
JPWO2020161818A1 (en) 2021-05-20
EP3907121A1 (en) 2021-11-10


Legal Events

Date Code Title Description
AS Assignment

Owner name: MITSUBISHI ELECTRIC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IMAI, YOSHIE;MURAYAMA, SHU;SIGNING DATES FROM 20210705 TO 20210708;REEL/FRAME:057037/0586

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION