US20210354737A1 - Monitoring device and method - Google Patents
Monitoring device and method
- Publication number
- US20210354737A1 (application US 17/388,353)
- Authority
- US
- United States
- Prior art keywords
- data
- around
- monitoring device
- mobile object
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61L—GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
- B61L23/00—Control, warning or like safety means along the route or between vehicles or trains
- B61L23/04—Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
- B61L23/041—Obstacle detection
- B61L15/00—Indicators provided on the vehicle or train for signalling purposes
- B61L15/0072—On-board train data handling
- B61L25/00—Recording or indicating positions or identities of vehicles or trains or setting of track apparatus
- B61L25/02—Indicating or recording positions or identities of vehicles or trains
- B61L25/021—Measuring and recording of train speed
- B61L25/023—Determination of driving direction of vehicle or train
- B61L25/025—Absolute localisation, e.g. providing geodetic coordinates
Definitions
- the present disclosure relates to a device for monitoring anomalies in railroad systems.
- Patent Document 1 describes a technique for automatically detecting an obstacle to railroad systems, such as an overhead wire and a rail, by comparing the image data obtained by photographing the front from a traveling railroad vehicle with the background image data previously taken.
- the technique described in Patent Document 1 aims to efficiently monitor anomalies in railroad systems. However, while it allows a railroad vehicle to detect objects dropped off from preceding railroad vehicles, it cannot detect objects dropped off from the vehicle itself while traveling.
- the present invention has been made to solve the above-mentioned problem, and aims to enable a railroad vehicle to detect an object dropped off from itself.
- the monitoring device includes: a front data acquisition unit to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
- the present invention makes it possible for a railroad vehicle to detect an object dropped off while the railroad vehicle is traveling.
- FIG. 1 is a hardware configuration diagram showing a mobile object according to Embodiment 1.
- FIG. 2 is a functional configuration diagram showing a monitoring device according to Embodiment 1.
- FIG. 3 is a flowchart showing operation of the monitoring device according to Embodiment 1.
- FIG. 4 is a hardware configuration diagram of a mobile object according to Modified Example 1 of Embodiment 1.
- FIG. 5 is a hardware configuration diagram of a mobile object according to Modified Example 2 of Embodiment 1.
- FIG. 6 is a functional configuration diagram showing a monitoring device according to Embodiment 2.
- FIG. 7 is a flowchart showing operation of the monitoring device according to Embodiment 2.
- FIG. 1 is a hardware configuration diagram of a mobile object using a monitoring device according to the present embodiment.
- the numeral 100 denotes the mobile object
- the numeral 10 denotes the monitoring device
- the numeral 101 denotes a vehicle control unit.
- the monitoring device 10 is a computer provided in the mobile object 100 .
- the monitoring device 10 may be implemented in an integrated (or inseparable) form or in a removable (or separable) form with the mobile object 100 or another component illustrated herein. Further, although a railroad vehicle is used as an example of the mobile object 100 in the present embodiment, the mobile object 100 is not limited to the one described in the present embodiment.
- the monitoring device 10 includes hardware such as a processor 11 , a storage device 12 , a communication interface 13 , and an on-board interface 14 .
- the processor 11 is connected to other hardware devices via a system bus to control them.
- the processor 11 is an integrated circuit (IC) that performs processing.
- the processor 11 is a central processing unit (CPU), a digital signal processor (DSP), or a graphics processing unit (GPU).
- CPU central processing unit
- DSP digital signal processor
- GPU graphics processing unit
- the storage device 12 includes a memory 121 and a storage 122 .
- the memory 121 is a random-access memory (RAM).
- the storage 122 is a hard disk drive (HDD).
- the storage 122 may be a portable storage medium such as a secure digital (SD) memory card, a compact flash (CF), a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD.
- the communication interface 13 is a device for communicating with a communication device around the mobile object 100 .
- the communication interface 13 is an Ethernet (registered trademark) terminal or a universal serial bus (USB) terminal.
- the on-board interface 14 is a device for connecting to the vehicle control unit 101 installed on the mobile object 100 .
- the on-board interface 14 is a USB terminal, an IEEE1394 terminal, or an HDMI (registered trademark) terminal.
- the vehicle control unit 101 includes sensing devices such as a camera, a light detection and ranging device (LiDAR), a radar, a sonar, and a positioning device and also includes devices such as a steering, a brake, and an accelerator to control the mobile object 100 .
- FIG. 2 shows a functional configuration diagram of the monitoring device 10 .
- the monitoring device 10 includes, as functional components, a front data acquisition unit 21 , a rear data acquisition unit 22 , a position identification unit 23 , an object detection unit 24 , and a history storage unit 25 .
- the numeral 31 denotes front data; the numeral 32 denotes rear data; the numeral 41 denotes locating data; and the numeral 42 denotes exclusion data.
- the front data acquisition unit 21 collects information obtained by a first sensor installed in the front of the mobile object 100 via the vehicle control unit 101 .
- the first sensor is exemplified as a camera (front camera) installed in the front of the mobile object 100 , and the information obtained from the front camera is used as data around its front area.
- the data may be information obtained from a LiDAR, a radar, or a sonar.
- the information to be collected by the front camera is the information obtained in the direction of travel of the mobile object 100 .
- the first sensor only needs to be installed in the front of the mobile object 100 , and the information to be collected by the first sensor may be information in any direction.
- the data around its front area that is obtained is recorded in the storage 122 .
- the front data acquisition unit 21 also records, in the storage 122 , the status information of the mobile object 100 such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object at the time of the data acquisition and the like at the time when acquiring data around its front area.
- the position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101 .
- let the data recorded by the front data acquisition unit 21 in the storage 122 be front data 31 .
- the rear data acquisition unit 22 collects information obtained by a second sensor installed in the rear of the mobile object 100 via the vehicle control unit 101 .
- the second sensor is exemplified as a camera (rear camera) installed in the rear of the mobile object 100 , and the image obtained from the rear camera is used as data around its rear area.
- the data may be information obtained from a LiDAR, a radar, or a sonar.
- the information to be collected by the rear camera is the information obtained in the direction opposite to the direction of travel of the mobile object 100 .
- the second sensor only needs to be installed in the rear of the mobile object 100 , and the information to be collected by the second sensor may be information in any direction.
- the data obtained from around the rear area is recorded in the storage 122 .
- the rear data acquisition unit 22 also records, in the storage 122 , the status information of the mobile object 100 such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object at the time of the data acquisition and the like at the time when obtaining data from around the rear area.
- the position information of the mobile object 100 may be, for example, the latitude and longitude of the mobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via the vehicle control unit 101 .
- let the data recorded by the rear data acquisition unit 22 in the storage 122 be rear data 32 .
- the data around its front area and the data around its rear area are recorded in the storage 122 .
- they may be recorded in the memory 121 , another area prepared in the storage device 12 , or an external device (not shown) connected via the communication I/F 13 .
- the position identification unit 23 is called from the front data acquisition unit 21 and the rear data acquisition unit 22 .
- the position identification unit 23 identifies the position of the front data 31 when called from the front data acquisition unit 21 and identifies the position of the rear data 32 when called from the rear data acquisition unit 22 .
- the identified position of the data around its front area and the identified position of the data around its rear area are each the position corresponding to the information (identified position information) obtained from the first and second sensors, but not the position of the mobile object 100 .
- the identified position information identified by the position identification unit 23 is recorded together with the front data 31 and the rear data 32 , but they may be recorded in separate areas if it is known which data the identified position information is linked with. Further, the position identification unit 23 may identify the positions of the data around the front area and the data around the rear area by using locating data 41 .
- the locating data 41 may be data of any object that can uniquely identify the data obtained by the first and second sensors, such as a building, a pillar, a signboard, and a characteristic landscape that exist along the railroad track.
- the image data of the objects that can be uniquely identified as described above and their position information are associated with the locating data 41 in advance.
- by doing so, if the data around its front area and the data around its rear area contain information that matches the locating data 41 , the position identification unit 23 will be able to identify the position without using the status information of the mobile object 100 , making it possible to shorten the processing time (a sketch of this matching appears below).
- in the present embodiment, a camera is used as the sensor.
- however, when a radar or a sonar is used as the sensor, a combination of an object made of a material that generates characteristic reflected waves and its positional information may be used.
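- to make the camera-based landmark matching concrete, the following Python sketch (using OpenCV) pairs an acquired frame with the pre-registered landmark image it matches best and returns the associated position; the data layout, thresholds, and function names are illustrative assumptions, not taken from the patent.

```python
import cv2

# Hedged sketch of landmark-based position identification (the locating data 41):
# the acquired frame is matched against pre-registered landmark images, each
# associated with a known position. Thresholds and data layout are assumptions.

def identify_position(frame_gray, locating_data, min_matches=25, max_dist=40):
    """locating_data: list of (landmark_gray_image, (latitude, longitude))."""
    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    _, des_frame = orb.detectAndCompute(frame_gray, None)
    if des_frame is None:
        return None
    best_score, best_position = 0, None
    for landmark_img, position in locating_data:
        _, des_lm = orb.detectAndCompute(landmark_img, None)
        if des_lm is None:
            continue
        # Count strong ORB descriptor matches between frame and landmark.
        good = [m for m in matcher.match(des_frame, des_lm)
                if m.distance < max_dist]
        if len(good) >= min_matches and len(good) > best_score:
            best_score, best_position = len(good), position
    return best_position  # None when no registered landmark is recognized
```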
- the object detection unit 24 compares the identified position information recorded in the front data 31 with the identified position information recorded in the rear data 32 to detect a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference, it is determined that the difference indicates a fallen object dropped off during the passage of the mobile object 100 and an anomaly warning is issued.
- the object detection unit 24 can determine whether the object determined to be a fallen object is really the fallen object dropped off during the passage of the mobile object 100 by using the exclusion data 42 .
- the items to be excluded include animals such as crows and cats, gravel and stones, something blown by the wind such as paper wastes like newspapers and magazines, and vinyl sheets, and the image data of these items may be held as the exclusion data 42 .
- the object detection unit 24 compares the image determined to be a fallen object with the images of the exclusion data 42 . If they match, the object detection unit 24 determines that the object found is not the one that was dropped off during the passage of the mobile object 100 , so that a false alarm will not be issued to the mobile object 100 regarding the occurrence of a fallen object. At this time, a warning may be issued that it is not a fallen object from the mobile object 100 but something that has been blown from the outside during the passage of the mobile object 100 .
- the locating data 41 and the exclusion data 42 may be recorded in the storage 122 or in the memory 121 . Also, these data may be recorded in another area prepared in the storage device 12 , or an external device (not shown) connected via the communication I/F 13 .
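- a minimal sketch of the comparison and exclusion steps described above, assuming records that carry an identified position and a grayscale image; thresholds and helper names are illustrative, not the patent's API.

```python
import cv2
import numpy as np

# Pair front/rear records whose identified positions match, treat a large
# pixel difference as a candidate fallen object, then suppress candidates that
# resemble the exclusion data 42. All thresholds are assumed values.

def matches_exclusion(image_gray, exclusion_gray, corr_threshold=0.8):
    # Normalized cross-correlation template matching against one exclusion item.
    res = cv2.matchTemplate(image_gray, exclusion_gray, cv2.TM_CCOEFF_NORMED)
    return res.max() >= corr_threshold

def detect_fallen_objects(front_records, rear_records, exclusion_images,
                          diff_threshold=30, area_threshold=500):
    warnings = []
    for front in front_records:
        rear = next((r for r in rear_records
                     if r["position"] == front["position"]), None)
        if rear is None:
            continue  # no rear data for this identified position yet
        diff = np.abs(front["image"].astype(np.int16)
                      - rear["image"].astype(np.int16))
        if (diff > diff_threshold).sum() < area_threshold:
            continue  # no meaningful difference at this position
        if any(matches_exclusion(rear["image"], ex) for ex in exclusion_images):
            continue  # e.g. a crow, a stone, or wind-blown paper: no alarm
        warnings.append(front["position"])  # candidate fallen object
    return warnings
```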
- each functional component of the monitoring device 10 is implemented by software.
- the storage 122 of the storage device 12 stores a program that implements the function of each functional component implemented by the software. This program is loaded into the memory 121 by the processor 11 and executed by the processor 11 .
- the storage 122 implements the function of the history storage unit 25 .
- the history storage unit 25 stores information about the fallen objects that the object detection unit 24 detected in the past. Examples of such information to be stored include position, time, and number of times of detection of each fallen object detected.
- by using such information held by the history storage unit 25 , the front data acquisition unit 21 and the rear data acquisition unit 22 may, for example, shorten the interval for collecting data in the vicinity of locations where fallen objects are frequent and, conversely, lengthen the interval in the vicinity of locations where fallen objects are less frequent (see the sketch below). This makes it possible to efficiently obtain the data around its front area and the data around its rear area.
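- the following sketch illustrates such history-driven sampling under assumed interval values, search radius, and history layout.

```python
import math

# Shorten the acquisition interval near locations with frequent past
# detections. Radius, intervals, and the record layout are assumptions.

def acquisition_interval_s(current_xy, history, radius_m=200.0,
                           base_interval=1.0, busy_interval=0.2, busy_count=3):
    """history: list of dicts with 'xy' (metres) and 'count' of past detections."""
    nearby = sum(h["count"] for h in history
                 if math.dist(h["xy"], current_xy) <= radius_m)
    return busy_interval if nearby >= busy_count else base_interval
```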
- note that, in the present embodiment, as shown in FIG. 1 , only one processor 11 is provided. Instead, however, multiple processors 11 may be provided. In that case, the multiple processors 11 cooperate to execute the program that implements each function of the monitoring device 10 .
- FIG. 3 is a flowchart showing processing of the monitoring device according to the present embodiment.
- the operation of the monitoring device 10 according to Embodiment 1 will be described with reference to FIG. 3 .
- for ease of explanation, the processes of the front data acquisition unit 21 , the rear data acquisition unit 22 , the position identification unit 23 , and the object detection unit 24 are described as being executed sequentially as shown in the flowchart.
- however, the three units, namely, the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 , in other words, the units other than the position identification unit 23 which is called from the front data acquisition unit 21 and the rear data acquisition unit 22 , can be executed in parallel, as the sketch below illustrates.
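- a sketch of that parallel arrangement, with the three loop bodies left as placeholder callables; the period and duration are illustrative assumptions.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# The two acquisition loops and the detection loop run concurrently; position
# identification is invoked inline by each acquisition step. Placeholders only.

def run_monitoring(front_step, rear_step, detect_step,
                   period_s=0.5, duration_s=60.0):
    def loop(step):
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            step()                # acquire data + identify position, or detect
            time.sleep(period_s)
    with ThreadPoolExecutor(max_workers=3) as pool:
        for step in (front_step, rear_step, detect_step):
            pool.submit(loop, step)
```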
- Step S11: Processing of front data acquisition
- the front data acquisition unit 21 acquires the data around its front area by the first sensor installed in the front of the mobile object 100 and the status information and writes the data in the front data 31 .
- the front data acquisition unit 21 calls the position identification unit 23 .
- Step S12: Calculation and identification of front data position
- the position identification unit 23 identifies the position of the data around its front area on the basis of the status information written in the front data 31 and writes the identified position in the front data 31 .
- Step S13: Processing of rear data acquisition
- the rear data acquisition unit 22 acquires the data around its rear area by the second sensor installed in the rear of the mobile object 100 and the status information and writes the data in the rear data 32 .
- the rear data acquisition unit 22 calls the position identification unit 23 .
- Step S14: Calculation and identification of rear data position
- the position identification unit 23 identifies the position of the data around its rear area on the basis of the status information written in the rear data 32 and writes the identified position in the rear data 32 .
- Step S15: Detection of object
- the object detection unit 24 compares the identified position information of the front data 31 and the identified position information of the rear data 32 , both stored in the storage 122 , and detects a matching combination. If there is a matching combination in the identified position information, the object detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference in the combination, it is determined that the difference indicates the existence of a fallen object dropped off during the passage of the mobile object 100 and an anomaly warning is issued.
- however, since the acquisition directions differ by about 180 degrees between the front data 31 and the rear data 32 that are identical in the identified position information, it is not possible to simply compare their data. Therefore, the object detection unit 24 converts the pixel signals of either the front data 31 or the rear data 32 by using the status information included in the front data 31 and the rear data 32 as well as the characteristics of the acquired data.
- the status information will be explained in detail. Specifically, the speed information, the attitude angles (roll angle, pitch angle, yaw angle) of the mobile object 100 , and the characteristics of sensing devices provided in the vehicle control unit 101 for the acquisition of data are used.
- for example, in the case of a camera, the lens and the size of the image sensor are each a characteristic, and the focal length and the angle of view are determined from these characteristics.
- the distances corresponding to the pixels in the camera image can be roughly known from the shooting position, the focal length, and the angle of view, and as a result, the positions corresponding to the pixel signals can be known.
- the object detection unit 24 obtains the positions of the pixel signals for each of the front data 31 and the rear data 32 , and finds a pair of the front data 31 and the rear data 32 having the same position of the pixel signals to perform the comparison.
- when the pixel signals are compared, it is possible to use the markers and the characteristics of the landmarks and the like recorded in the data. In doing so, size scaling and angle adjustment may be performed in order to match the size of the pixel area and the obstacle. The sketch below makes the pixel-to-distance relation concrete.
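- a rough sketch of the pixel-to-position relation for a pinhole camera on flat ground; the mounting height and pitch are assumptions introduced only for the illustration.

```python
import math

# Map an image row to a ground distance ahead of the vehicle, using the sensor
# size, focal length, and mounting geometry. Flat-ground assumption throughout.

def row_to_ground_distance_m(row, image_height_px, sensor_height_mm,
                             focal_length_mm, camera_height_m,
                             camera_pitch_rad=0.0):
    pixel_mm = sensor_height_mm / image_height_px
    offset_mm = (row - image_height_px / 2.0) * pixel_mm  # below optical axis
    ray_down = math.atan2(offset_mm, focal_length_mm) + camera_pitch_rad
    if ray_down <= 0:
        return math.inf  # ray at or above the horizon never meets the ground
    return camera_height_m / math.tan(ray_down)

# Example (assumed values): a row 300 px below the centre of a 1080 px image,
# a sensor ~5.3 mm high, an 8 mm lens, camera mounted 2.5 m above the rails.
d = row_to_ground_distance_m(840, 1080, 5.3, 8.0, 2.5)  # roughly 14 m ahead
```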
- note that, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images, such as a near view and a distant view.
- when there are multiple choices for the images to be used as described above, the identification accuracy is improved by giving priority to the image with higher resolution and not using the image with lower resolution.
- similarly, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images with different subject depths.
- in that case, the identification accuracy is improved by giving priority to the sharp image with higher resolution and not using the image whose resolution is lowered by blurring.
- also, depending on the direction of the sun, the shadow of the mobile object may be cast on the image.
- in such a case, the cast shadow may be corrected so that the images match each other, or the orientation of the sensor may be changed to take a shadow-free image for priority use.
- the shadow can be detected from the image signal, but it can also be predicted from the position of the sun, calculated from the photographing time and place, together with the size of the mobile object; alternatively, an image may simply not be used when it is determined in advance to be unsuitable for the identification processing. A sketch of such a prediction follows.
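- a hedged sketch of that prediction: a coarse textbook approximation of the sun's elevation from time and place, then the shadow length for a vehicle of known height; the accuracy is only enough to pre-judge whether a frame is likely to contain the vehicle's own shadow, and none of the constants come from the patent.

```python
import math
from datetime import datetime, timezone

# Coarse solar-elevation approximation (declination from day of year, hour
# angle from solar time). Adequate only for shadow pre-screening.

def sun_elevation_deg(lat_deg, lon_deg, when_utc):
    day = when_utc.timetuple().tm_yday
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    solar_hour = (when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0) % 24
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat, dec = math.radians(lat_deg), math.radians(decl)
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(dec)
        + math.cos(lat) * math.cos(dec) * math.cos(hour_angle)))

def shadow_length_m(vehicle_height_m, elevation_deg):
    if elevation_deg <= 0:
        return math.inf  # sun below the horizon: no cast shadow to predict
    return vehicle_height_m / math.tan(math.radians(elevation_deg))

# Example: around Tokyo at local noon (03:00 UTC) in mid June.
elev = sun_elevation_deg(35.68, 139.77,
                         datetime(2021, 6, 15, 3, 0, tzinfo=timezone.utc))
```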
- the object detection unit 24 searches for the object to be detected on the basis of the pixel signals at the identified data position. In the description of the present embodiment, this object is assumed to be a part that has fallen from the mobile object 100 .
- the object detection unit 24 can create the stereoscopic image from the front data 31 by performing viewpoint conversion.
- the stereoscopic image may be a bird's-eye view image or a three-dimensionally reconstructed image.
- the object detection unit 24 performs similar processing on the rear data 32 as well (a homography-based sketch follows).
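- the following sketch shows one common way to realize the viewpoint conversion, warping each image to a bird's-eye view with an OpenCV homography; the four source points are placeholders that would come from camera calibration (e.g., corners of a known rectangle on the track bed).

```python
import cv2
import numpy as np

# Warp a camera image to a bird's-eye view so that front and rear views of the
# same spot can be compared in a common ground plane. Calibration assumed.

def to_birds_eye(image, src_pts, dst_size=(400, 600)):
    w, h = dst_size
    dst_pts = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(np.float32(src_pts), dst_pts)
    return cv2.warpPerspective(image, homography, dst_size)

# Usage sketch: both views are warped, then compared pixel by pixel.
# front_bev = to_birds_eye(front_img, front_calibration_pts)
# rear_bev = to_birds_eye(rear_img, rear_calibration_pts)
```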
- the front data 31 and the rear data 32 are compared to determine whether the difference therebetween indicates an object to be detected.
- moved stones on the railroad track, for example, are excluded from the objects to be detected so that they are not determined as fallen objects.
- however, in consideration of the possibility that foreign objects such as stones may move on a railroad track and hinder safe railroad operation, it is also possible to include such objects among the targets to be detected.
- while it is bright in the daytime, there is no problem in particular; however, when lighting is required, care must be taken if a camera recording visible light is used as the sensor to acquire images.
- for example, if the mobile object 100 is a railroad vehicle, it is required that, during operation, the color of the front light and the color of the rear light be different from each other. Therefore, if a camera recording visible light is used as the sensor, an object may be wrongly determined to be a fallen object due to the difference in color.
- the object detection unit 24 cancels the color difference between the front light and the rear light by using the setting values such as the colors of the lights in the status information.
- a straightforward way to correct the color difference of the lights is to use the von Kries color conversion formula, as sketched below. Thus, even if a color difference occurs between the camera image acquired in front of the mobile object 100 and the camera image acquired in the rear, it is possible to prevent erroneous detection of an obstacle owing to the color difference.
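- a minimal sketch of a von Kries style correction, assuming the lamp colors are available from the status information; a full von Kries transform adapts in an LMS cone space, and the per-channel RGB gain below is the common simplification, not necessarily the patent's exact formula.

```python
import numpy as np

# Scale each channel so an image lit by the rear (e.g., red) light is brought
# to the front (white) light reference. Diagonal RGB gain: a simplification.

def correct_light_color(image_rgb, light_rgb, reference_rgb):
    """image_rgb: HxWx3 uint8; light_rgb/reference_rgb: lamp colors from the
    status information, e.g. (255, 64, 64) and (255, 255, 255)."""
    gains = np.asarray(reference_rgb, float) / np.asarray(light_rgb, float)
    corrected = image_rgb.astype(float) * gains  # per-channel diagonal gain
    return np.clip(corrected, 0, 255).astype(np.uint8)
```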
- as described above, the monitoring device 10 according to the present embodiment uses the front data acquired by the front data acquisition unit 21 and the status information acquired at that time, as well as the rear data acquired by the rear data acquisition unit 22 and the status information acquired at that time; based on these, the position identification unit 23 identifies positions, and the front data and the rear data having the same identified position information are compared to detect a fallen object.
- this makes it possible to detect a fallen object immediately after it occurs during the travel of the mobile object 100 .
- as a result, if a fallen object occurs which may interfere with the railroad operation, an immediate action can be taken for it to improve the safe operation of the railroad. In addition, this configuration, which only uses the sensors attached to the front and the rear of the railroad vehicle, contributes to reducing the number of monitoring cameras to be installed along the railroad track.
- the front data acquisition unit 21 acquires the data around its front area in the direction of travel of the mobile object 100 , but the data around its rear area may be acquired as long as the first sensor is installed in the front of the mobile object 100 .
- the rear data acquisition unit 22 acquires the data around its rear area in the direction of travel of the mobile object 100 , but it may acquire the data around its front area as long as the second sensor is installed in the rear of the mobile object 100 .
- the front data acquisition unit 21 and the rear data acquisition unit 22 may be provided in the front and in the rear of each of the cars constituting the mobile object 100 , respectively. In that case, the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car are used while traveling. With this configuration, even if the mobile object 100 is separated into a plurality of mobile objects, it is possible to use the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car in each separated mobile object. Conversely, when a plurality of mobile objects 100 are connected into one mobile object, the front data acquisition unit 21 of the frontmost mobile object 100 and the rear data acquisition unit 22 of the rearmost mobile object 100 can be used.
- Embodiment 1 describes the case where each functional component is implemented by software. However, each of these functional components may be implemented by hardware.
- FIG. 4 shows a hardware configuration of the monitoring device 10 according to Modified Example 1. If each functional component is implemented by hardware, the monitoring device 10 includes an electronic circuit 15 in place of the processor 11 and the storage device 12 .
- the electronic circuit 15 is a dedicated circuit that implements the functions of each functional component and the storage device 12 .
- FIG. 4 shows, as FIG. 1 does, a configuration in which the communication interface 13 , the on-board interface 14 , and the electronic circuit 15 are connected via a bus.
- the electronic circuit 15 may be configured as a single circuit that also implements the functions of the communication interface 13 and the on-board interface 14 .
- Examples of the electronic circuit 15 include a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a gate array (GA), an application specific integrated circuit (ASIC), and a field-programmable gate array (FPGA).
- each functional component of the monitoring device 10 described in Embodiment 1 may be integrated into one electronic circuit 15 , or they may be allocated to and implemented by a plurality of the electronic circuits 15 .
- Modified Example 2 describes the case in which some of each of the functional components are implemented by hardware and each of the remaining functional components is implemented by software.
- FIG. 5 shows a configuration of the monitoring device 10 according to Modified Example 2.
- the processor 11 , the storage device 12 , and the electronic circuit 15 are called processing circuits.
- each functional component is implemented by a processing circuit.
- Embodiment 1 is for detecting a fallen object from a railroad vehicle.
- a configuration for detecting an obstacle at a station which is provided with platform doors will be described.
- the present embodiment differs from Embodiment 1 in that the mobile object 100 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at a station.
- the different points will be explained, and the same points will be omitted.
- FIG. 6 shows a functional configuration diagram of the monitoring device 10 according to the present embodiment.
- the numeral 26 denotes a control unit with which the monitoring device 10 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at the station.
- the control unit 26 stores an image of the platform door in advance and determines whether the image of the platform door is included in the data around its front area acquired by the front data acquisition unit 21 . If the platform door image is included, the control unit 26 calls the front data acquisition unit 21 each time the frontmost part of the mobile object 100 reaches each platform door at the station and calls the rear data acquisition unit 22 each time the rearmost part of the mobile object 100 reaches each platform door at the station.
- the control unit 26 outputs an anomaly warning when the object detection unit 24 detects an obstacle.
- the same reference numerals as those in FIG. 2 denote the same or corresponding components or units, each of which, except for the control unit 26 , performs the same operation as those described in FIG. 2 shown in Embodiment 1.
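- a minimal sketch of the sequence the control unit 26 drives at a station, with the capture, position identification, comparison, and warning steps left as assumed interfaces rather than the patent's API.

```python
# Capture front data at each platform door while arriving, rear data at each
# door while leaving, then compare the pairs. Record layout is an assumption.

def monitor_station_stop(platform_doors, capture_front, capture_rear,
                         identify_position, differs, warn):
    front_records = []
    for door in platform_doors:          # called as each door is reached
        record = capture_front(door)
        record["position"] = identify_position(record)
        front_records.append(record)
    # ... passengers get on and off; platform and vehicle doors close ...
    for door in platform_doors:          # called again while departing
        rear = capture_rear(door)
        rear["position"] = identify_position(rear)
        front = next((f for f in front_records
                      if f["position"] == rear["position"]), None)
        if front is not None and differs(front, rear):
            warn(rear["position"])       # anomaly warning to the station
```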
- FIG. 7 is a flowchart showing operation of the monitoring device 10 according to the present embodiment.
- the operation of the monitoring device 10 according to the present embodiment will be described with reference to FIG. 7 .
- in the present embodiment, the control unit 26 calls the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 , and when an obstacle is detected as a result, an anomaly warning is issued.
- the three units, namely, the front data acquisition unit 21 , the rear data acquisition unit 22 , and the object detection unit 24 , in other words, the units other than the position identification unit 23 which is called from the front data acquisition unit 21 and the rear data acquisition unit 22 , can be executed in parallel.
- alternatively, the object detection unit 24 itself can output the anomaly warning when an obstacle is detected.
- Step S21: Processing of Front Data Acquisition
- when the railroad vehicle enters the station, the control unit 26 repeatedly calls the front data acquisition unit 21 until the railroad vehicle stops at the station.
- the front data acquisition unit 21 acquires the data around its front area using a first sensor installed in the front of the mobile object 100 .
- the front data acquisition unit 21 writes the acquired data around its front area in the memory 121 .
- for example, if the station has three platform doors, the control unit 26 calls the front data acquisition unit 21 each time the railroad vehicle approaches each of the three platform doors.
- the front data acquisition unit 21 records the data around its front area and the status information as the front data 31 .
- the front data taken at the three locations is written in the memory 121 .
- Step S22: Position Identification Processing of Data from Around Front Area
- the position identification unit 23 identifies the position of the data around its front area and writes the position in the front data 31 .
- Step S23: Waiting for Completion of Passengers Getting on and Off
- the control unit 26 waits while the doors of the railroad vehicle and the platform doors open, the passengers get on and off, and the platform doors and the doors of the railroad vehicle close.
- Step S24: Processing of Rear Data Acquisition
- the control unit 26 calls the rear data acquisition unit 22 until the railroad vehicle leaves the station.
- the rear data acquisition unit 22 acquires the data around its rear area using a second sensor installed in the rear of the mobile object 100 .
- the rear data acquisition unit 22 collects the information obtained by the second sensor via the vehicle control unit 101 .
- the rear data acquisition unit 22 writes the acquired rear data 32 in the memory 121 .
- the rear data acquisition unit 22 acquires the data around its rear area and the status information. In this way, the data around the three rear areas each corresponding to the data around the three front areas described above are written in the memory 121 .
- Step S25: Position Identification Processing of Data from Around Rear Area
- the position identification unit 23 identifies the position of the data around its rear area and writes the position in the rear data 32 .
- Step S26: Processing of Obstacle Detection
- the control unit 26 calls the object detection unit 24 .
- the object detection unit 24 compares the data around its front area with the data around its rear area whose identified positions match, and determines a difference, if any, to be a fallen object which occurred during the passage of the mobile object 100 , and issues an anomaly warning.
- note that the position identification unit 23 can identify the position of an obstacle from the numbers or codes written on the platform doors, as recorded in the stored front data and rear data.
- Step S27: Anomaly Warning
- the control unit 26 issues an anomaly warning when the object detection unit 24 detects an obstacle (YES in Step S26). For example, by transmitting this anomaly warning to the station (management center), the station staff can take action promptly. It is also possible to take measures such as stopping the following train if the situation is urgent or, if it is not urgent, allowing the next train to enter the platform and open its doors while the problem is handled.
- as described above, the monitoring device 10 can detect, at the time when the mobile object 100 leaves the platform of a station, an anomaly which has occurred since the mobile object 100 entered the platform.
- with the first and second sensors installed at the front and the rear of the mobile object 100 , the monitoring device 10 according to the present embodiment obtains the data around its front area viewed from the mobile object 100 and the data around its rear area viewed from the mobile object 100 . Then, by comparing them, it is further made possible to immediately detect a fallen object existing on the outside of the platform doors and on the near side of the railroad track, which is normally difficult to detect only with monitoring cameras installed on the platform.
Abstract
The present disclosure provides a monitoring device including: a front data acquisition unit to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
Description
- The present disclosure relates to a device for monitoring anomalies in railroad systems.
- The patent document 1 describes a technique for automatically detecting an obstacle to railroad systems, such as an overhead wire and a rail, by comparing the image data obtained by photographing the front from a traveling railroad vehicle with the background image data previously taken. The technique described in Patent Document 1 aims to efficiently monitor anomalies in railroad systems.
- Patent Document
- Patent Document 1: Japanese Patent Application Laid-Open No. 2016-52849
- One of the factors that cause anomalies in railroad systems is an object dropped off from railroad vehicles. The problem of the technique described in Patent Document 1 is that a railroad vehicle can detect objects dropped off from its preceding railroad vehicles but cannot detect objects dropped off from itself while traveling.
- The present invention has been made to solve the above-mentioned problem, and aims to enable a railroad vehicle to detect an object dropped off from itself.
- The monitoring device according to the present disclosure includes: a front data acquisition unit to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object; a rear data acquisition unit to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object; a position identification unit to identify a position of the data around the front area and a position of the data around the rear area; and an object detection unit to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
- The present invention makes it possible for a railroad vehicle to detect an object dropped off while the railroad vehicle is traveling.
-
FIG. 1 is a hardware configuration diagram showing a mobile object according to Embodiment 1. -
FIG. 2 is a functional configuration diagram showing a monitoring device according to Embodiment 1. -
FIG. 3 is a flowchart showing operation of the monitoring device according to Embodiment 1. -
FIG. 4 is a hardware configuration diagram of a mobile object according to Modified Example 1 of Embodiment 1. -
FIG. 5 is a hardware configuration diagram of a mobile object according to Modified Example 2 of Embodiment 1. -
FIG. 6 is a functional configuration diagram showing a monitoring device according to Embodiment 2. -
FIG. 7 is a flowchart showing operation of the monitoring device according to Embodiment 2. -
FIG. 1 is a hardware configuration diagram of a mobile object using a monitoring device according to the present embodiment. - In
FIG. 1 , thenumeral 100 denotes the mobile object, thenumeral 10 denotes the monitoring device, and thenumeral 101 denote a vehicle control unit. Themonitoring device 10 is a computer provided in themobile object 100. - Note that the
monitoring device 10 may be implemented in an integrated (or inseparable) form or in a removable (or separable) form, with themobile object 100 or another component illustrated herein. Further, themonitoring device 10 is not limited to the one described in the present embodiment, although a railroad vehicle is used as an example of themobile object 100 in the present embodiment. - The
monitoring device 10 includes hardware such as aprocessor 11, astorage device 12, acommunication interface 13, and an on-board interface 14. Theprocessor 11 is connected to other hardware devices via a system bus to control them. - The
processor 11 is an integrated circuit (IC) that performs processing. For specific examples, theprocessor 11 is a central processing unit (CPU), a digital signal processor (DSP), or a graphics processing unit (GPU). - The
storage device 12 includes amemory 121 and astorage 122. For a specific example, thememory 121 is a random-access memory (RAM). For a specific example, thestorage 122 is a hard disk drive (HDD). Further, thestorage 122 may be a portable storage medium such as a secure digital (SD) memory card, a compact flash (CF), a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD. - The
communication interface 13 is a device for communicating with a communication device around themobile object 100. For a specific example, thecommunication interface 13 is an Ethernet (registered trademark) terminal or a universal serial bus (USE) terminal. - The on-
board interface 14 is a device for connecting to thevehicle control unit 101 installed on themobile object 100. For a specific example, the on-board interface 14 is a USB terminal, an IEEE1394 terminal, or an HDMI (registered trademark) terminal. - The
vehicle control unit 101 includes sensing devices such as a camera, a light detection and ranging device (LiDAR), a radar, a sonar, and a positioning device and also includes devices such as a steering, a brake, and an accelerator to control themobile object 100. -
FIG. 2 shows a functional configuration diagram of themonitoring device 10. Themonitoring device 10 includes, as functional components, a frontdata acquisition unit 21, a reardata acquisition unit 22; aposition identification unit 23, anobject detection unit 24, and ahistory storage unit 25. Thenumeral 31 denotes front data; thenumeral 32 denotes rear data; thenumeral 41 denotes locating data; and thenumeral 42 denotes exclusion data. - The front
data acquisition unit 21 collects information obtained by a first sensor installed in the front of themobile object 100 via thevehicle control unit 101. In the present embodiment, the first sensor is exemplified as a camera (front camera) installed in the front of themobile object 100, and the information obtained from the front camera is used as data around its front area. However, the data may be information obtained from a LiDAR, a radar, or a sonar. Note that, in the present embodiment, the information to be collected by the front camera is the information obtained in the direction of travel of themobile object 100. However, the first sensor only needs to be installed in the front of themobile object 100, and the information to be collected by the first sensor may be information in any direction. - The data around its front area that is obtained is recorded in the
storage 122. In addition, the frontdata acquisition unit 21 also records, in thestorage 122, the status information of themobile object 100 such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object at the time of the data acquisition and the like at the time when acquiring data around its front area. - The position information of the
mobile object 100 may be, for example, the latitude and longitude of themobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via thevehicle control unit 101. Here, let the data recorded by the frontdata acquisition unit 21 in thestorage 122 befront data 31. - The rear
data acquisition unit 22 collects information obtained by a second sensor installed in the rear of themobile object 100 via thevehicle control unit 101. In the present embodiment, the second sensor is exemplified as a camera (rear camera) installed in the rear of themobile object 100, and the image obtained from the rear camera is used as data around its rear area. However, the data may be information obtained from a LiDAR, a radar, and a sonar. In the present embodiment, the information to be collected by the rear camera is the information obtained in the direction opposite to the direction of travel of themobile object 100. However, the second sensor only needs to be installed in the rear of themobile object 100, and the information to be collected by the second sensor is information in any direction. - The data obtained from around the rear area is recorded in the
storage 122. In addition, the reardata acquisition unit 22 also records, in thestorage 122, the status information of themobile object 100 such as position information, speed, attitude angles (roll angle, pitch angle, yaw angle), lighting color of the mobile object at the time of the data acquisition and the like at the time when obtaining data from around the rear area. The position information of themobile object 100 may be, for example, the latitude and longitude of themobile object 100 obtained from the output value of the positioning device which is connected to the mobile object via thevehicle control unit 101. Here, let the data recorded by the reardata acquisition unit 22 in thestorage 122 berear data 32. - In the present embodiment, it is described that the data around its front area and the data around its rear area are recorded in the
storage 122. However, they may be recorded in thememory 121, another area prepared in thestorage device 12, or an external device (not shown) connected via the communication I/F 13. - The
position identification unit 23 is called from the frontdata acquisition unit 21 and the reardata acquisition unit 22. Theposition identification unit 23 identifies the position of thefront data 31 when called from the frontdata acquisition unit 21 and identifies the position of therear data 32 when called from the reardata acquisition unit 22. Note that the identified position of the data around its front area and the identified position of the data around its rear area are each the position corresponding to the information (identified position information) obtained from the first and second sensors, but not the position of themobile object 100. - In the present embodiment, the identified position information identified by the
position identification unit 23 is recorded together with thefront data 31 and therear data 32, but they may be recorded in separate areas if it is known which data the identified position information is linked with. Further, theposition identification unit 23 may identify the positions of the data around the front area and the data around the rear area by using locatingdata 41. The locatingdata 41 may be data of any object that can uniquely identify the data obtained by the first and second sensors, such as a building, a pillar, a signboard, and a characteristic landscape that exist along the railroad track. - Then, the image data of the objects that can be uniquely identified as described above and their position information are associated with the locating
data 41 in advance. By doing so, if the data around its front area and the data around its rear area contain information that matches the locatingdata 41, theposition identification unit 23 will be able to identify the position without using the status information of themobile object 100, making it possible to shorten the processing time. In the present embodiment, a camera is used as the sensor. However, when a radar or a sonar is used as a sensor, a combination of an object made of a material that generates characteristic reflected waves, and positional information thereof may be used. - The
object detection unit 24 compares the identified position information recorded in thefront data 31 with the identified position information recorded in therear data 32 to detect a matching combination. If there is a matching combination in the identified position information, theobject detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference, it is determined that the difference indicates a fallen object dropped off during the passage of themobile object 100 and an anomaly warning is issued. - Further, the
object detection unit 24 can determine whether the object determined to be a fallen object is really the fallen object dropped off during the passage of themobile object 100 by using theexclusion data 42. Examples of the items to be excluded include animals such as crows and cats, gravel and stones, something blown by the wind such as paper wastes like newspapers and magazines, and vinyl sheets, and the image data of these items may be held as theexclusion data 42. - The
object detection unit 24 compares the image determined to be a fallen object with the images of theexclusion data 42. If they match, theobject detection unit 24 determines that the object found is not the one that was dropped off during the passage of themobile object 100, so that a false alarm will not be issued to themobile object 100 regarding the occurrence of a fallen object. At this time, a warning may be issued that it is not a fallen object from themobile object 100 but something that has been blown from the outside during the passage of themobile object 100. Note that the locatingdata 41 and theexclusion data 42 may be recorded in thestorage 122 or in thememory 121. Also, these data may be recorded in another area prepared in thestorage device 12, or an external device (not shown) connected via the communication I/F 13. - The function of each functional component of the
monitoring device 10 is implemented by software. Thestorage 122 of thestorage device 12 stores a program that implements the function of each functional component implemented by the software. This program is loaded into thememory 121 by theprocessor 11 and executed by theprocessor 11. - In addition, the
storage 122 implements the function of thehistory storage unit 25. Thehistory storage unit 25 stores information about the fallen objects that theobject detection unit 24 detected in the past. Examples of such information to be stored include position, time, and number of times of detection of each fallen object detected. By using such information held by thehistory storage unit 25, the frontdata acquisition unit 21 and the reardata acquisition unit 22 may for example, shorten the interval for collecting data in the vicinity of the location where fallen objects are frequent, and may, conversely; lengthen the interval for collecting data in the vicinity of the location where fallen objects are less frequent. This makes it possible to efficiently obtain the data around its front area and the data around its rear area. - Note that, in the present embodiment, as shown in
FIG. 1 , only oneprocessor 11 is provided. Instead, however,multiple processors 11 may be provided. In that case, themultiple processors 11 cooperate to execute the program that implements each function of themonitoring device 10. -
FIG. 3 is a flowchart showing processing of the monitoring device according to the present embodiment. The operation of themonitoring device 10 according to Embodiment 1 will be described with reference toFIG. 3 . In the present embodiment, for ease of explanation, the processes of the frontdata acquisition unit 21, the reardata acquisition unit 22, theposition identification unit 23, and theobject detection unit 24 are described in a way that they are executed sequentially as shown in the flowchart. However, instead, the three units, namely, the frontdata acquisition unit 21, the reardata acquisition unit 22, and theobject detection unit 24, in other words, the units other than theposition identification unit 23 which is called from the frontdata acquisition unit 21 and the reardata acquisition unit 22, can be executed in parallel. - (Step S11: Processing of front data acquisition)
- The front
data acquisition unit 21 acquires the data around its front area by the first sensor installed in the front of themobile object 100 and the status information and writes the data in thefront data 31. The frontdata acquisition unit 21 calls theposition identification unit 23. - (Step S12: Calculation and identification of front data position)
- The
position identification unit 23 identifies the position of the data around its front area on the basis of the status information written in thefront data 31 and writes the identified position in thefront data 31. - (Step S13: Processing of rear data acquisition)
- The rear
data acquisition unit 22 acquires the data around its rear area by the second sensor installed in the rear of themobile object 100 and the status information and writes the data in therear data 32. The reardata acquisition unit 22 calls theposition identification unit 23. - (Step S14: Calculation and identification of rear data position)
- The
position identification unit 23 identifies the position of the data around its rear area on the basis of the status information written in therear data 32 and writes the identified position in therear data 32. - (Step S15: Detection of object)
- The object,
detection unit 24 compares the identified position information of thefront data 31 and the identified position information of therear data 32, both stored in thestorage 122, and detects a matching combination. If there is a matching combination in the identified position information, theobject detection unit 24 compares the data around its front area with the data around its rear area. Then, if there is a difference in the combination, it is determined that the difference indicates the existence of a fallen object dropped off during the passage of themobile object 100 and an anomaly warning is issued. - However, since it should be considered that the acquisition directions are different by about 180 degrees between the
front data 31 and therear data 32 that are identical in the identified position information, it is not possible to simply compare their data. Therefore, theobject detection unit 24 converts the pixel signals of either thefront data 31 or therear data 32 by using the status information included in thefront data 31 and therear data 32 as well as the characteristics of the acquired data. - The status information will be explained in detail. Specifically, the speed information, the attitude angles (roll angle, pitch angle, yaw angle) of the
mobile object 100, and the characteristics of sensing devices provided in thevehicle control unit 101 for the acquisition of data are used. For example, in a case of a camera, the lens and the size of image sensor each are a characteristic, and the focal length and the angle of view are determined from these characteristics. In the image taken by the camera, the distances corresponding to the pixels in the camera image can be roughly known from the shooting position, the focal length, and the angle of view, and as a result, the positions corresponding to the pixel signals can be known. - In this way, the
object detection unit 24 obtains the positions of the pixel signals for each of thefront data 31 and therear data 32, and finds a pair of thefront data 31 and therear data 32 having the same position of the pixel signals to perform the comparison. When the pixel signals are compared, it is possible to use the markers and the characteristics of the landmarks and the like recorded in the data. In doing so, size scaling and angle adjustment may be performed in order to match the size of the pixel area and the obstacle. - Note that, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images such as a near view and a distant view, etc. When there are multiple choices for the images to be used as described above, the identification accuracy is improved by giving priority to the image with higher resolution without using the image with lower resolution. Similarly, in the acquired data, depending on the imaging interval, the pixel signals of the same position may be found in multiple images with different subject depths. When there are multiple choices for the images to be used as described above, the identification accuracy is improved by giving priority to the image of higher resolution with no blurring without using the image with lower resolution due to blurring.
- Also, depending on the direction of the sun, the shadow of the mobile object may be cast onto the image. In that case, the cast shadow may be corrected so that the images match each other, or the orientation of the sensor may be changed to capture a shadow-free image for priority use. A shadow can be detected from the image signal itself, but it can also be predicted from the sun's position, calculated from the shooting time and place, together with the size of the mobile object; an image determined in advance to be unsuitable for the identification processing can simply be left unused.
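- The prediction step reduces to simple trigonometry once the sun's position is known (in practice it would be computed from the shooting time and place, for example with an ephemeris library). The sketch below, with invented names and values, estimates the extent of a cast shadow on flat ground from a given solar elevation and azimuth.

```python
import math

def shadow_extent(object_height_m, sun_elevation_deg, sun_azimuth_deg):
    """Predict a shadow cast on flat ground: its length follows from the
    solar elevation, its direction is opposite the sun's azimuth."""
    if sun_elevation_deg <= 0:
        return None  # sun below the horizon: no cast shadow to correct
    length = object_height_m / math.tan(math.radians(sun_elevation_deg))
    direction = (sun_azimuth_deg + 180.0) % 360.0
    return length, direction

# A 4 m tall car body with the sun 30 degrees up casts a shadow of ~6.9 m
print(shadow_extent(4.0, 30.0, 220.0))
```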
- The object detection unit 24 searches for the object to be detected on the basis of the pixel signals at the identified data position. In this embodiment, the object is assumed to be a part that has fallen from the mobile object 100. The object detection unit 24 can create a stereoscopic image from the front data 31 by performing viewpoint conversion; the stereoscopic image may be a bird's-eye view or a three-dimensionally reconstructed image. The object detection unit 24 performs similar processing on the rear data 32.
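- The text does not spell out the viewpoint conversion. A common way to obtain a bird's-eye view from a single camera is a ground-plane homography, sketched below with OpenCV; the corner coordinates and the output scale are invented for illustration, and a real system would derive them from the camera characteristics in the status information.

```python
import cv2
import numpy as np

# Four pixel corners of a known flat patch of track bed (hypothetical values)
src = np.float32([[420, 700], [860, 700], [1100, 1020], [180, 1020]])
# Where those corners should land in a metric top-down view (1 px = 1 cm)
dst = np.float32([[0, 0], [300, 0], [300, 500], [0, 500]])

H = cv2.getPerspectiveTransform(src, dst)

def to_birds_eye(frame):
    """Warp a forward- or rearward-facing frame into the common top-down
    view so front and rear data can be compared pixel by pixel."""
    return cv2.warpPerspective(frame, H, (300, 500))
```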
- Then, the front data 31 and the rear data 32 are compared to determine whether any difference between them indicates an object to be detected. Stones that have merely shifted on the railroad track, for example, are excluded from the objects to be detected so that they are not classified as fallen objects. However, considering that foreign objects such as stones may end up on the track and hinder safe railroad operation, such objects can also be included among the detection targets.
- In bright daytime conditions there is no particular problem, but when lighting is required, care must be taken if a visible-light camera is used as the image sensor. For example, if the mobile object 100 is a railroad vehicle, the front light and the rear light are required to differ in color during operation. With a visible-light camera, that color difference alone could be mistaken for a fallen object.
- In such a case, the object detection unit 24 cancels the color difference between the front light and the rear light by using setting values such as the light colors held in the status information. A straightforward way to correct the color difference is a von Kries-type color conversion. Thus, even if a color difference occurs between the camera image acquired in front of the mobile object 100 and the one acquired in the rear, erroneous detection of an obstacle due to the color difference can be prevented.
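- A minimal sketch of such a correction follows, assuming the lamp colors are available as RGB triples in the status information. A von Kries-style adaptation strictly operates in a cone-response space; the per-channel gain used here is its common RGB approximation. All names and sample values are illustrative.

```python
import numpy as np

def match_light_color(img, light_rgb, ref_rgb):
    """Von Kries-style diagonal correction: scale each channel so the known
    lamp color (from the status information) maps onto a shared reference."""
    gains = np.asarray(ref_rgb, np.float32) / np.asarray(light_rgb, np.float32)
    out = img.astype(np.float32) * gains        # broadcast over H x W x 3
    return np.clip(out, 0, 255).astype(np.uint8)

# Front image lit by a warm white headlight, rear by a red tail light, both
# mapped to a neutral reference before differencing (illustrative values):
# front_n = match_light_color(front_img, (255, 244, 229), (255, 255, 255))
# rear_n  = match_light_color(rear_img,  (255,  64,  64), (255, 255, 255))
```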
- As described above, the monitoring device 10 according to the present embodiment uses the front data acquired by the front data acquisition unit 21, the rear data acquired by the rear data acquisition unit 22, and the status information captured with each of them; the position identification unit 23 uses the characteristics in that status information to identify positions, and front data and rear data having the same identified position information are then compared to detect a fallen object. This makes it possible to detect a fallen object immediately after it occurs during the travel of the mobile object 100.
- As a result, if a fallen object occurs that may interfere with railroad operation, immediate action can be taken, improving the safety of railroad operation. In addition, this configuration, which uses only the sensors attached to the front and the rear of the railroad vehicle, reduces the number of monitoring cameras that must be installed along the track.
- In the present embodiment, the front data acquisition unit 21 is described as acquiring the data around the front area in the direction of travel of the mobile object 100, but it may acquire the data around the rear area as long as the first sensor is installed in the front of the mobile object 100. Similarly, the rear data acquisition unit 22 is described as acquiring the data around the rear area in the direction of travel, but it may acquire the data around the front area as long as the second sensor is installed in the rear of the mobile object 100.
- When the mobile object 100 includes a plurality of railroad cars, the front data acquisition unit 21 and the rear data acquisition unit 22 may be provided in the front and the rear of each car constituting the mobile object 100. In that case, the front data acquisition unit 21 of the frontmost car and the rear data acquisition unit 22 of the rearmost car are used while traveling. With this configuration, even if the mobile object 100 is split into several mobile objects, the front unit of the frontmost car and the rear unit of the rearmost car can be used in each of them; conversely, when several mobile objects 100 are coupled into one, the front data acquisition unit 21 of the frontmost mobile object 100 and the rear data acquisition unit 22 of the rearmost mobile object 100 can be used, as sketched below.
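- The selection rule is simple enough to state in a few lines. The data structure below (a list of car objects with front and rear units) is an assumption for illustration, not the patent's implementation.

```python
def active_sensor_units(consist):
    """Pick the units to run: the front unit of the frontmost car and the
    rear unit of the rearmost car of one (possibly split) consist.

    consist: list of car objects ordered front to rear, each carrying
             .front_unit and .rear_unit attributes (illustrative structure).
    """
    return consist[0].front_unit, consist[-1].rear_unit

# After a split, call once per resulting consist; after coupling,
# call once on the combined, re-ordered list of cars.
```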
- Embodiment 1 describes the case where each functional component is implemented by software. However, each of these functional components may instead be implemented by hardware.
- FIG. 4 shows a hardware configuration of the monitoring device 10 according to Modified Example 1. If each functional component is implemented by hardware, the monitoring device 10 includes an electronic circuit 15 in place of the processor 11 and the storage device 12. The electronic circuit 15 is a dedicated circuit that implements the functions of each functional component and of the storage device 12.
- FIG. 4 shows, as FIG. 1 does, a configuration in which the communication interface 13, the on-board interface 14, and the electronic circuit 15 are connected via a bus. However, the electronic circuit 15 may be configured as a single circuit that also implements the functions of the communication interface 13 and the on-board interface 14.
- Examples of the electronic circuit 15 include a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, a logic IC, a gate array (GA), an application-specific integrated circuit (ASIC), and a field-programmable gate array (FPGA).
- Further, the functional components of the monitoring device 10 described in Embodiment 1 may be integrated into one electronic circuit 15, or distributed across and implemented by a plurality of electronic circuits 15.
- Modified Example 2 describes the case in which some of the functional components are implemented by hardware and the remaining ones by software.
- FIG. 5 shows a configuration of the monitoring device 10 according to Modified Example 2. In FIG. 5, the processor 11, the storage device 12, and the electronic circuit 15 are collectively called processing circuits; in other words, the function of each functional component is implemented by a processing circuit.
- The configuration of Embodiment 1 detects a fallen object from a railroad vehicle. The present embodiment describes a configuration for detecting an obstacle at a station equipped with platform doors. It differs from Embodiment 1 in that the mobile object 100 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at the station. Only the differences are explained below; points in common are omitted.
- FIG. 6 shows a functional configuration diagram of the monitoring device 10 according to the present embodiment. In FIG. 6, numeral 26 denotes a control unit with which the monitoring device 10 identifies the position of the data around its front area and the position of the data around its rear area on the basis of the positions of the platform doors at the station. For example, the control unit 26 stores an image of a platform door in advance and determines whether that image appears in the data around the front area acquired by the front data acquisition unit 21 (one possible realization is sketched below). If the platform door image is found, the control unit 26 calls the front data acquisition unit 21 each time the front of the mobile object 100 reaches a platform door and calls the rear data acquisition unit 22 each time the rear of the mobile object 100 reaches a platform door.
- The control unit 26 outputs an anomaly warning when the object detection unit 24 detects an obstacle. In FIG. 6, the same reference numerals as in FIG. 2 denote the same or corresponding components, each of which, except for the control unit 26, operates as described for FIG. 2 in Embodiment 1.
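- The stored-image check described above maps naturally onto template matching. The sketch below uses OpenCV's normalized cross-correlation; the file name and the 0.8 threshold are assumed tuning values, and the template must of course be smaller than the frame.

```python
import cv2

# Platform-door image stored in advance (hypothetical file name)
DOOR_TEMPLATE = cv2.imread("platform_door.png", cv2.IMREAD_GRAYSCALE)

def contains_platform_door(frame_gray, threshold=0.8):
    """Return True when the stored platform-door image is found in the
    frame, using normalized cross-correlation."""
    result = cv2.matchTemplate(frame_gray, DOOR_TEMPLATE, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)   # best match score in the frame
    return max_val >= threshold
```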
- FIG. 7 is a flowchart showing the operation of the monitoring device 10 according to the present embodiment, and the operation is described below with reference to it. For ease of explanation, the description assumes that the control unit 26 calls the front data acquisition unit 21, the rear data acquisition unit 22, and the object detection unit 24, and that an anomaly warning is issued when an obstacle is detected as the result. Instead, as in Embodiment 1, these three units, that is, the units other than the position identification unit 23, which is called from the front and rear data acquisition units, can be executed in parallel; in that case, the object detection unit 24 itself outputs the anomaly warning when an obstacle is detected.
- (Step S21: Processing of Front Data Acquisition)
- When the railroad vehicle enters the station, the control unit 26 repeatedly calls the front data acquisition unit 21 until the vehicle stops at the station. The front data acquisition unit 21 acquires the data around the front area using the first sensor installed in the front of the mobile object 100 and writes the acquired data in the memory 121.
- For example, if there are three platform doors corresponding to the doors of the railroad vehicle, the control unit 26 calls the front data acquisition unit 21 each time the railroad vehicle approaches one of the three platform doors. When the mobile object 100 reaches a predetermined position with respect to each platform door or vehicle door (for example, a position from which the entire door can be seen), the front data acquisition unit 21 records the data around the front area and the status information as the front data 31. In this example, front data taken at the three locations is written in the memory 121; a minimal sketch of this door-by-door triggering follows.
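- The sketch below assumes a polling interface to the vehicle control unit and a 5 m stand-off as the vantage point; both are illustrative choices, as are all names, not the patent's API.

```python
import time

def collect_front_data(control, door_positions_m, acquire_front):
    """Record front data once per platform door while entering the station.

    control:          exposes front_position_m(), the position of the
                      vehicle front along the platform (assumed interface)
    door_positions_m: door positions in the order they are reached
    acquire_front:    callable recording front data plus status information
    """
    records = []
    for door in door_positions_m:
        # Wait until the vehicle front reaches the assumed vantage point
        # from which the whole door can be seen (per the Step S21 example).
        while control.front_position_m() < door - 5.0:
            time.sleep(0.01)  # poll the vehicle control unit
        records.append(acquire_front())
    return records  # one front data record per door, kept in memory 121
```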
- (Step S22: Position Identification Processing of Data from Around Front Area)
data acquisition unit 21, theposition identification unit 23 identifies the position of the data around its front area and writes the position in thefront data 31. - (Step S23: Waiting for Completion of Passengers Getting on and Off)
control unit 26 waits while the doors of the railroad vehicle and the platform doors open, the passengers get on and off, and the platform doors and the doors of the railroad vehicle close. - (Step S24: Processing of Rear Data Acquisition)
control unit 26 calls the reardata acquisition unit 22 until the railroad vehicle leaves the station. The reardata acquisition unit 22 acquires the data around its rear area using a second sensor installed in the rear of themobile object 100. Specifically, as in Step S21, the rear areadata acquisition unit 22 collects the information obtained by the second sensor via thevehicle control unit 101. The reardata acquisition unit 22 writes the acquiredrear data 32 in thememory 121. To explain in accordance with the example given in Step S21, when the rear of themobile object 100 approaches a predetermined position with respect to each of the platform doors or each of the vehicle doors (for example, the position from which the entire door can be overlooked), the reardata acquisition unit 22 acquires the data around its rear area and the status information. In this way, the data around the three rear areas each corresponding to the data around the three front areas described above are written in thememory 121. - (Step S25: Position Identification Processing of Data from Around Rear Area)
data acquisition unit 22, theposition identification unit 23 identifies the position of the data around its rear area and writes the position in therear data 32. - (Step S26: Processing of Obstacle Detection)
control unit 26 calls theobject detection unit 24. As in Embodiment 1, theobject detection unit 24 compares the data around its front area with the data around its rear area whose identified positions match, and determines a difference, if any, to be a fallen object which occurred during the passage of themobile object 100, and issues an anomaly warning. - In the present embodiment, an example is described in which the position of each platform door or each vehicle door is recognized in advance, and then the front data and the rear data are taken at each position. However, in a case where the front data and the rear data are continuously taken at regular time intervals, it is also possible to detect and count doors from images by considering the front-rear relationship between images taken in time-series. Alternatively, the
position identification unit 23 can identify the position of an obstacle from the numbers or the codes written on the platform doors recorded in the front data and the rear data stored. - (Step S27: Anomaly Warning)
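- A minimal sketch of the per-door comparison, including the exclusion data named in claims 9 to 16: aligned front/rear images are differenced, and a warning is raised only when the changed region is significant and does not match the exclusion data. The thresholds and the exclusion-matching callable are assumed tuning points, not from the patent.

```python
import numpy as np

def detect_obstacles(front_records, rear_records, matches_exclusion, warn,
                     diff_threshold=25, min_area_px=50):
    """Compare the aligned front/rear image taken at each platform door and
    warn of an anomaly unless the difference matches the exclusion data."""
    for door_idx, (front, rear) in enumerate(zip(front_records, rear_records)):
        diff = np.abs(front.astype(np.int16) - rear.astype(np.int16))
        changed = diff.max(axis=-1) > diff_threshold  # per-pixel change mask
        if changed.sum() < min_area_px:
            continue                     # nothing significant at this door
        if matches_exclusion(changed):   # e.g. waste paper, known artifacts
            continue
        warn(f"obstacle detected at platform door {door_idx}")
```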
- (Step S27: Anomaly Warning)
control unit 26 issues an anomaly warning when theobject detection unit 24 detects an obstacle (YES in Step S26). For example, by transmitting this anomaly warning to the station (management center), the station staffs can make actions in a prompt manner. Also, it is possible to take measures such as stopping the following train if it is urgent or allowing the next train to enter and the doors to open at the platform to handle the problem occurred if it is not urgent. - As so far described, the
monitoring device 10 according to the present embodiment can detect an anomaly which occurs when themobile object 100 enters the platform of a station at the time when themobile object 100 leaves the platform of the station. With the first and second sensors installed at the front and the rear of themobile object 100, themonitoring device 10 according to the present embodiment obtains the data around its front area viewed from themobile object 100 and the data around its rear area viewed from themobile object 100. Then, by comparing them, it is further made possible to immediately detect a fallen object existing on the outside of the platform doors and on the nearside of the railroad track, which is normally difficult to detect only with monitoring cameras installed on the platform. - With the above effects, when an obstacle that is detected may interfere with the railroad service, it is possible to take immediate actions, which contributes to improvement of the safe and on-time operation of the railroad service. Further, since the sensors attached to the railroad vehicle is used, it is not necessary to equip the platform doors with their respective image monitoring devices, and the cost, can be greatly reduced.
communication IT 14 in-vehicle 15 electronic circuit, 21 front data acquisition unit, 22 rear data acquisition unit, 28 position identification unit, 24 object detection unit, 25 history storage unit, 26 control unit, 31 front data, 32 rear data, 41 locating data, 42 exclusion data, 100 mobile object, 101 vehicle control unit, 121memory 122 storage
Claims (17)
1. A monitoring device comprising:
front data acquisition circuitry to acquire, via a sensor installed in a front of a mobile object, data around its front area with respect to its traveling direction and status information of the mobile object;
rear data acquisition circuitry to acquire, via a sensor installed in a rear of the mobile object, data around its rear area with respect to the traveling direction and status information of the mobile object;
position identification circuitry to identify a position of the data around the front area and a position of the data around the rear area; and
object detection circuitry to compare the data around the front area with the data around the rear area of the same position as the data around the front area, detect an object, and warn of an anomaly.
2. The monitoring device according to claim 1 , wherein the front data acquisition circuitry and the rear data acquisition circuitry acquire at least one of speed of the mobile object and a yaw angle of the mobile object as the status information.
3. The monitoring device according to claim 1 , wherein
the mobile object is provided with a light in each of its front and rear,
the front data acquisition circuitry and the rear data acquisition circuitry detect and record colors of their respective lights as the status information, and
the object detection circuitry corrects colors of the data around the front area and the data around the rear area by using the colors of the lights.
4. The monitoring device according to claim 2 , wherein
the mobile object is provided with a light in each of its front and rear,
the front data acquisition circuitry and the rear data acquisition circuitry detect and record colors of their respective lights as the status information, and
the object detection circuitry corrects colors of the data around the front area and the data around the rear area by using the colors of the lights.
5. The monitoring device according to claim 1, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, and compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
6. The monitoring device according to claim 2, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, and compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
7. The monitoring device according to claim 3, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, and compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
8. The monitoring device according to claim 4, wherein the position identification circuitry includes locating data in which uniquely identifiable data and position information thereof are recorded, and compares the data around the front area acquired by the front data acquisition circuitry and the data around the rear area acquired by the rear data acquisition circuitry with the locating data.
9. The monitoring device according to claim 1 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
10. The monitoring device according to claim 2 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
11. The monitoring device according to claim 3 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
12. The monitoring device according to claim 4 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
13. The monitoring device according to claim 5 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
14. The monitoring device according to claim 6 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
15. The monitoring device according to claim 7 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
16. The monitoring device according to claim 8 , wherein the object detection circuitry includes exclusion data, and does not warn of an anomaly when the object matches the exclusion data.
17. A monitoring method comprising:
acquiring data around a front area with respect to a traveling direction of a mobile object and status information of the mobile object;
acquiring data around a rear area with respect to the traveling direction of the mobile object and status information of the mobile object;
identifying a position of the data around the front area and a position of the data around the rear area; and
comparing the data around the front area with the data around the rear area of the same position as the data around the front area, detecting an object, and warning of an anomaly.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/004203 WO2020161818A1 (en) | 2019-02-06 | 2019-02-06 | Monitoring device and method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/004203 Continuation WO2020161818A1 (en) | 2019-02-06 | 2019-02-06 | Monitoring device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210354737A1 true US20210354737A1 (en) | 2021-11-18 |
Family
ID=71947675
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/388,353 Pending US20210354737A1 (en) | 2019-02-06 | 2021-07-29 | Monitoring device and method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20210354737A1 (en) |
EP (1) | EP3907121B1 (en) |
JP (1) | JP6914461B2 (en) |
ES (1) | ES2945839T3 (en) |
WO (1) | WO2020161818A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060098843A1 (en) * | 2004-06-11 | 2006-05-11 | Stratech Systems Limited | Method and system for rail track scanning and foreign object detection |
JP5161673B2 (en) * | 2008-06-30 | 2013-03-13 | 株式会社神戸製鋼所 | Falling object detection device and falling object detection method |
US10081376B2 (en) * | 2015-09-03 | 2018-09-25 | Sameer Singh | Rail track asset survey system |
US20190039633A1 (en) * | 2017-08-02 | 2019-02-07 | Panton, Inc. | Railroad track anomaly detection |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59156089A (en) * | 1983-10-11 | 1984-09-05 | Hitachi Ltd | Obstacle detecting method for vehicle |
JP3448088B2 (en) * | 1993-12-24 | 2003-09-16 | 東日本旅客鉄道株式会社 | Obstacle detection system |
US8712610B2 (en) * | 2008-09-18 | 2014-04-29 | General Electric Company | System and method for determining a characterisitic of an object adjacent to a route |
JP6209141B2 (en) | 2014-09-04 | 2017-10-04 | 公益財団法人鉄道総合技術研究所 | Obstacle detection apparatus and method |
-
2019
- 2019-02-06 ES ES19914410T patent/ES2945839T3/en active Active
- 2019-02-06 EP EP19914410.6A patent/EP3907121B1/en active Active
- 2019-02-06 JP JP2020570257A patent/JP6914461B2/en active Active
- 2019-02-06 WO PCT/JP2019/004203 patent/WO2020161818A1/en unknown
-
2021
- 2021-07-29 US US17/388,353 patent/US20210354737A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP3907121A1 (en) | 2021-11-10 |
EP3907121B1 (en) | 2023-04-26 |
EP3907121A4 (en) | 2022-02-23 |
WO2020161818A1 (en) | 2020-08-13 |
JPWO2020161818A1 (en) | 2021-05-20 |
JP6914461B2 (en) | 2021-08-04 |
ES2945839T3 (en) | 2023-07-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10599931B2 (en) | Automated driving system that merges heterogenous sensor data | |
US10713490B2 (en) | Traffic monitoring and reporting system and method | |
CN113492851B (en) | Vehicle control device, vehicle control method, and computer program for vehicle control | |
CN111382768A (en) | Multi-sensor data fusion method and device | |
CN111932901B (en) | Road vehicle tracking detection apparatus, method and storage medium | |
CN102576495B (en) | Collision monitor for a motor vehicle | |
US20190050652A1 (en) | Obstacle analyzer, vehicle control system, and methods thereof | |
EP3140777B1 (en) | Method for performing diagnosis of a camera system of a motor vehicle, camera system and motor vehicle | |
JP4858761B2 (en) | Collision risk determination system and warning system | |
AU2020366769B2 (en) | Sensor performance evaluation system and method, and automatic driving system | |
EP3439920A1 (en) | Determining mounting positions and/or orientations of multiple cameras of a camera system of a vehicle | |
US11726176B2 (en) | Annotation of radar-profiles of objects | |
EP3035315A1 (en) | Information retrieval arrangement | |
US20210354737A1 (en) | Monitoring device and method | |
EP3287940A1 (en) | Intersection detection system for a vehicle | |
US20220101025A1 (en) | Temporary stop detection device, temporary stop detection system, and recording medium | |
JP5957182B2 (en) | Road surface pattern recognition method and vehicle information recording apparatus | |
JP6220322B2 (en) | HOME DETECTING METHOD, HOME DETECTING DEVICE, AND TRAIN OPENING SUPPORT DEVICE | |
US20240010242A1 (en) | Signal processing device and signal processing method | |
GB2623497A (en) | Automated safety management in environment | |
WO2020071133A1 (en) | Sign recognition device | |
CN114268788A (en) | Monitoring of functional compliance of vehicle-mounted vehicle image acquisition device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MITSUBISHI ELECTRIC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IMAI, YOSHIE;MURAYAMA, SHU;SIGNING DATES FROM 20210705 TO 20210708;REEL/FRAME:057037/0586 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |