WO2021044707A1 - Surroundings observation system, surroundings observation program, and surroundings observation method - Google Patents

Surroundings observation system, surroundings observation program, and surroundings observation method Download PDF

Info

Publication number
WO2021044707A1
WO2021044707A1 PCT/JP2020/024144 JP2020024144W WO2021044707A1 WO 2021044707 A1 WO2021044707 A1 WO 2021044707A1 JP 2020024144 W JP2020024144 W JP 2020024144W WO 2021044707 A1 WO2021044707 A1 WO 2021044707A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
observation
unit
peripheral
boundary
Prior art date
Application number
PCT/JP2020/024144
Other languages
French (fr)
Japanese (ja)
Inventor
幸彦 小野
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Publication of WO2021044707A1 publication Critical patent/WO2021044707A1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L3/00Electric devices on electrically-propelled vehicles for safety purposes; Monitoring operating variables, e.g. speed, deceleration or energy consumption
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B61RAILWAYS
    • B61LGUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L25/00Recording or indicating positions or identities of vehicles or trains or setting of track apparatus
    • B61L25/02Indicating or recording positions or identities of vehicles or trains
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a peripheral observation system, a peripheral observation program, and a peripheral observation method.
  • Patent Document 1 An obstacle is detected by using a short-distance photographing unit for photographing a short distance provided in a vehicle, a long-distance photographing unit for photographing a long distance, and a LIDAR for irradiating a short distance.
  • the technology to the effect of "doing” is disclosed.
  • Patent Document 2 states that "the vehicle position of a train is determined based on the correlation between a base video image of the front of a train taken in advance and a reference image of the front of a running train. Detect. A virtual building limit frame is preset for the base video image. The difference between the base video image and the reference video image taken at the same vehicle position at different dates and times is detected only within the virtual building limit frame. Detects only obstacles within the virtual building limit frame. "
  • an object of the present invention is to provide a technique for peripheral observation that enhances the detection accuracy of the vehicle position.
  • one of the representative peripheral observation systems of the present invention includes a peripheral environment observation unit and a position estimation unit.
  • the surrounding environment observation unit observes the position of a group of surrounding objects from the position of the vehicle traveling on the track and outputs the observation result.
  • the position estimation unit stores map data created by observing the position of the object group in advance, and the observation result is in a state where the position of the vehicle in the observation result is limited to the track of the map data. The correlation between the vehicle and the map data is calculated, and the vehicle position is estimated based on the correlation.
  • the present invention provides a technique for peripheral observation that enhances the accuracy of detecting the vehicle position.
  • FIG. 1 is a diagram showing a configuration of an orbital transportation system (including a peripheral observation system).
  • FIG. 2 is a diagram showing a configuration of a self-position estimation system.
  • FIG. 3 is a flowchart showing the overall operation of the peripheral observation system.
  • FIG. 4 is a diagram showing an example of a sensor for observing the surrounding environment.
  • FIG. 5 is a diagram showing a lateral boundary of the monitoring area when traveling on a turning track.
  • FIG. 6 is a flowchart showing the operation of the self-position estimation system.
  • FIG. 7 is a diagram showing an example of the surrounding environment of the traveling vehicle.
  • FIG. 8 is a diagram illustrating the detection of rail trajectories by the self-position estimation system.
  • FIG. 9 is a diagram showing an example of surrounding environment observation data.
  • FIG. 10 is a diagram for explaining the posture estimation of the vehicle by the self-position estimation system.
  • FIG. 11 is a flowchart showing the operation of the vehicle driving control unit.
  • FIG. 12 is a diagram showing an example of a monitoring area in a station section.
  • FIG. 13 is a diagram showing an example of a monitoring area in the maintenance work section.
  • FIG. 14 is a diagram showing scan matching (during scanning) in the case of an automobile.
  • FIG. 15 is a diagram showing scan matching (scan completion) in the case of an automobile.
  • FIG. 16 is a diagram showing scan matching (during scanning) of the embodiment.
  • FIG. 17 is a diagram showing scan matching (scan completion) of the embodiment.
  • FIG. 1 is a diagram showing a configuration of a railway transportation system 100 (including a peripheral observation system 100A) of the first embodiment.
  • the railway transportation system 100 includes a vehicle 102 and a peripheral observation system 100A.
  • Vehicle 102 is a vehicle that transports passengers and freight that travel along the track.
  • the vehicle 102 is composed of a vehicle driving control unit 105 and a vehicle drive unit 106.
  • Obstacle information 161 is given to the vehicle 102 from the peripheral observation system 100A.
  • the vehicle driving control unit 105 has an internal function of detecting the position and speed of the vehicle 102.
  • the vehicle driving control unit 105 generates a drive command 142 so that the position and speed follow the target traveling pattern.
  • the target traveling pattern is based on a pattern based on the acceleration / deceleration of the vehicle 102 and the speed limit of the traveling section, which are known in advance.
  • the vehicle driving control unit 105 calculates the allowable maximum speed of the vehicle 102 from the position of the vehicle 102 and the maximum deceleration of the vehicle 102, and reflects it in the basic target traveling pattern.
  • an ATO device automated train operation device
  • the vehicle drive unit 106 drives the vehicle 102 based on the given drive command 142.
  • Examples of the specific device of the vehicle drive unit 106 include an inverter, a motor, a friction brake, and the like.
  • the peripheral observation system 100A includes a peripheral environment observation unit 107, a self-position estimation system 101, and an obstacle detection system 103.
  • the surrounding environment observation unit 107 is a sensor installed in front of the vehicle 102 to acquire the position, shape, color, reflection intensity, etc. of an object around the vehicle 102.
  • the surrounding environment observation unit 107 is a sensor such as a camera, a LIDAR (Laser Imaging Detection and Ringing), a laser radar, or a millimeter wave radar.
  • LIDAR Laser Imaging Detection and Ringing
  • laser radar or a millimeter wave radar.
  • the self-position estimation system 101 acquires the surrounding environment observation data 151 from the surrounding environment observation unit 107, estimates the vehicle position and attitude of the vehicle 102, and outputs the position / attitude information 153.
  • the obstacle detection system 103 includes a detection range setting database 108, a monitoring area setting processing unit 109, a detection target information database 110, a side boundary monitoring unit 111, a front boundary monitoring unit 112, and an obstacle detection unit 113.
  • the detection range setting database 108 stores the detection range 154 of the monitoring area in association with the position / posture information 153 of the vehicle 102.
  • the detection range 154 includes information on undetected areas such as the vicinity of the station platform and the area where maintenance work is performed, in addition to the range data based on the building limit near the track.
  • the monitoring area setting processing unit 109 acquires the detection range 154 corresponding to the position / attitude information 153 by inquiring the position / attitude information 153 to the detection range setting database 108.
  • the monitoring area setting processing unit 109 sequentially determines the side boundary 155 of the monitoring area based on the latest detection range 154, and sets the side boundary monitoring unit 111. Further, the monitoring area setting processing unit 109 sequentially determines the front boundary 156 of the monitoring area based on the latest detection range 154, and sets the front boundary monitoring unit 112.
  • the detection target information database 110 records information such as the position and reflectance of the detection point (detection target other than obstacles) as the background observed in the monitoring area in association with the vehicle position and the monitoring area.
  • the side boundary monitoring unit 111 and the front boundary monitoring unit 112 refer to the detection target information database 110 for the vehicle position and the monitoring area, and acquire information 157 and 158 regarding the background detection points, respectively.
  • the side boundary monitoring unit 111 and the front boundary monitoring unit 112 detect obstacles in the area set on the side boundary and the front boundary of the monitoring area by using sensors such as a camera, LIDAR, laser radar, or millimeter wave radar. Has a function to do.
  • the side boundary monitoring unit 111 and the front boundary monitoring unit 112 may share the sensor of the surrounding environment observation unit 107.
  • the monitoring result 159 of the side boundary monitoring unit 111 is transmitted to the obstacle detection unit 113. Further, the monitoring result 160 of the front boundary monitoring unit 112 is transmitted to the obstacle detection unit 113.
  • the obstacle information 161 by the obstacle detection unit 113 is transmitted to the vehicle 102 as the output of the peripheral observation system 100A.
  • FIG. 2 is a diagram showing the configuration of the self-position estimation system 101.
  • the self-position estimation system 101 includes an observation data selection processing unit 114, a vehicle attitude estimation processing unit 115, a peripheral environment data coordinate conversion processing unit 116, a peripheral environment map database 117, a 3D rail track database 118, and a scan matching self.
  • the position estimation unit 119 is provided.
  • Such a peripheral observation system 100A is composed of one or more computer systems equipped with, for example, a CPU (Central Processing Unit) and a memory as hardware. When this hardware executes the peripheral observation program, various functions of the peripheral observation system 100A are realized. For some or all of this hardware, dedicated equipment, general-purpose machine learning machines, DSP (Digital Signal Processor), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), PLD (programmable logic device) You may substitute with. Further, by arranging a part or all of the hardware in a cloud in a centralized or distributed manner on a server on an external network, a plurality of peripheral observation systems 100A may be jointly used via the network.
  • a CPU Central Processing Unit
  • FIG. 3 is a flowchart illustrating the overall operation of the peripheral observation system 100A. In steps S201 to 205 shown in the figure, whether or not to give a stop instruction to the vehicle 102 is determined for each measurement cycle of the obstacle detection system 103.
  • Step S201 The monitoring area setting processing unit 109 in the obstacle detection system 103 acquires the position / attitude information 153 calculated by the self-position estimation system 101.
  • Step S202 The monitoring area setting processing unit 109 queries the detection range setting database 108 for the position / attitude information 153 to obtain the side boundary 155 and the front boundary 156 as the boundaries of the monitoring area for monitoring obstacles. For example, the building limits on the left and right of the track are set as the side boundary 155 of the obstacle monitoring area, and the stoptable distance of the vehicle is set as the front boundary 156 of the obstacle monitoring area.
  • the side boundaries 155a and 155b and the front boundary 156 shown in FIG. 4 have boundary lines having a predetermined width.
  • a widened boundary detection area is set. This predetermined width takes into consideration the size and maximum moving speed of the obstacle expected to exceed the boundary and the sensing cycle of the side boundary monitoring unit 111 and the front boundary monitoring unit 112 (obstacle detection sensor). It is set to ensure a width that can be detected at least once when an object enters the boundary.
  • the predetermined width is set narrowly to several cm to several tens of cm (more specifically, 10 cm) assuming the size of passengers waiting at the platform and the maximum moving speed.
  • a predetermined width is set wide (for example, 1 m) assuming the size and maximum moving speed when a car or the like crosses. In this way, the predetermined width of the boundary detection region is sequentially changed according to the vehicle position of the vehicle 102.
  • Step S203 The side boundary monitoring unit 111 and the front boundary monitoring unit 112 monitor whether or not an obstacle has entered the orbit beyond the side boundary 155 and the front boundary 156, and monitor the monitoring results 159 and 160 as obstacles. It is given to the detection unit 113.
  • the boundary detection regions of the left and right lateral boundaries 155a and 155b can be rectangular regions having a width of several tens of centimeters and a depth of more than 100 m.
  • the detectors 201 and 202 of the lateral boundary monitoring unit 111 two multilayer LIDARs or the like installed at high positions on the front left and right of the vehicle 102 so as to face downward are used.
  • the detectors 201 and 201 one or more stereo cameras, millimeter-wave radars, and laser rangefinders may be used.
  • the boundary detection region of the lateral boundary 155 may be scanned by attaching the sensor to the automatic pan head.
  • Millimeter wave radar, multilayer LIDAR, laser rangefinder, etc. are used as the detector 203 of the front boundary monitoring unit 112 as the detector 203 of the front boundary monitoring unit 112 as the detector 203 of the front boundary monitoring unit 112 as the detector 203 of the front boundary monitoring unit 112
  • Millimeter wave radar, multilayer LIDAR, laser rangefinder, etc. are used as the detector 203 of the front boundary monitoring unit 112
  • the detection results of these detectors 201 to 203 are compared with the background detection points (detection targets other than obstacles) recorded in the detection target information database 110. Based on this comparison, when any of the following conditions 1 to 3 is satisfied, it is determined that an obstacle exists in the boundary detection region.
  • Countermeasure 2 A marker having a reflectance of a certain value or higher is set in advance as a detection point in the boundary detection area.
  • the position of the detection target and its reflectance are recorded in the detection target information database 110 in advance, and the detection point is regarded as an obstacle only when the position of the detection point is included in the boundary detection area at the current vehicle position. Used to determine intrusion.
  • FIG. 5 shows a case where the multilayer LIDAR is used as the surrounding environment observation unit 107 and the detectors 201 to 203 as shown by a plurality of straight lines.
  • the road surface detection points by each detection layer irradiated from the multilayer LIDAR are shown by dotted lines.
  • the detection layers that pass over the lateral boundaries 155a and 155b depend on the distance from the vehicle, and the detection points of those multiple layers are monitored to determine the intrusion of obstacles.
  • the probability of detecting an obstacle may be increased and the detection rate of the obstacle may be increased. Further, the false detection rate may be lowered by using the logical product (AND) of the detected detection results.
  • the obstacle detection unit 113 shifts the operation to step S204. In other cases, the obstacle detection unit 113 shifts the operation to step S205.
  • Step S204 When it is determined in step S203 that an obstacle exists, the obstacle detection unit 113 creates obstacle information 161 in order to stop the vehicle 102.
  • Step S205 The obstacle detection unit 113 transmits the obstacle information 161 to the vehicle 102.
  • FIG. 6 is a flowchart illustrating the operation of the self-position estimation system 101.
  • Steps S401 to 408 shown in the figure are operations for estimating the vehicle position of the vehicle 102, and are repeatedly executed every measurement cycle of the obstacle detection system 103.
  • Step S401 The observation data selection processing unit 114 acquires the peripheral environment observation data 151 observed by the peripheral environment observation unit 107. For example, when the surrounding environment of the vehicle 102 shown in FIG. 7 is observed by the multilayer LIDAR, rail observation data 162a to 162af and the like are acquired as three-dimensional point group data as shown in FIG.
  • Step S402 The observation data selection processing unit 114 sorts the acquired surrounding environment observation data into the rail observation data 162a to 162 on the orbit shown in FIG. 9 and the peripheral observation data 163 other than the orbit.
  • Such rail observation data 162a to 162 are selected by utilizing the shape and reflectance of the rail observation data and the fact that the rail detection data forms one raceway surface (plane or curved surface).
  • Step S403 The vehicle attitude estimation processing unit 115 determines the temporary position of the vehicle 102 on the rail track in the map data of the surrounding environment map database 117 and the 3D rail track database 118. This temporary position is determined based on the communication between the rail ground element and the vehicle 102, GPS information, the cumulative speed of the vehicle 102, and the like.
  • Step S404 The vehicle attitude estimation processing unit 115 acquires rail position information (three-dimensional point cloud data defining the rail surface) at the temporary position by inquiring the temporary position on the rail track to the 3D rail track database 118. ..
  • the vehicle attitude estimation processing unit 115 geometrically calculates the surface R formed by the rail surface obtained from the rail observation data 162 shown in FIG. 10 and the rail position information acquired from the 3D rail track database 118 to obtain the attitude of the vehicle 102. Estimate the information.
  • Step S405 Since the peripheral environment observation data of the peripheral environment observation unit 107 is observed by the sensor mounted on the vehicle 102, it becomes an observed value in the vehicle coordinate system ⁇ T fixed to the vehicle 102.
  • map data and the 3D rail track data in the external coordinate system ⁇ O are recorded in the surrounding environment map database 117 and the 3D rail track database 118.
  • the surrounding environment data coordinate conversion processing unit 116 obtains the surrounding environment observation data in the vehicle coordinate system fixed to the vehicle 102 based on the temporary position of the vehicle 102 on the rail track and the attitude information of the vehicle 102 estimated in step S404.
  • the coordinates are converted from ⁇ T to the external coordinate system ⁇ O, and output as the surrounding environment data after the coordinate conversion.
  • Step S406 The scan matching self-position estimation unit 119 scan-matches the surrounding environment data after coordinate conversion with the map data recorded in the surrounding environment map database 117. This scan matching scans with the sensor position (vehicle position) in the surrounding environment data constrained on the rail track of the map data.
  • the scan range in this case is a width that can maintain the validity of the posture estimation in step S404 and the coordinate transformation in step S405. In this scan range, the scan position where the residual between the surrounding environment data and the map data (such as the sum of the absolute values of the differences between the two data) is the minimum residual is searched.
  • Step S407 The scan matching self-position estimation unit 119 determines whether or not the minimum residual obtained in scan matching is smaller than the permissible value. This permissible value is appropriately set based on the accuracy required for the vehicle position.
  • the scan matching self-position estimation unit 119 shifts the operation to step S408.
  • the scan matching self-position estimation unit 119 shifts the operation to step S409.
  • Step S408 The scan matching self-position estimation unit 119 displaces the temporary position along the rail track by the distance. This step distance is a distance that exceeds the range scanned in step S406. Further, the displacement direction of the temporary position (positive or negative of the step distance) is determined in the direction in which the matching residual becomes smaller based on the change tendency of the residual during the scan matching in step S406.
  • the scan matching self-position estimation unit 119 After the temporary position is displaced in this way, the scan matching self-position estimation unit 119 returns the operation to step S404.
  • Step S409 By repeating steps S404 to S408, the minimum residual of scan matching is reduced to be equal to or less than the allowable value.
  • the scan matching self-position estimation unit 119 estimates the scan position on the rail track that has obtained the minimum residual of the permissible value or less as the vehicle position of the vehicle 102.
  • FIG. 11 is a flowchart illustrating the operation of the vehicle driving control unit 105. The operation shown in the figure is repeatedly executed at regular intervals.
  • Step S500 The vehicle operation control unit 105 acquires information on the location of the vehicle 102 based on the operation schedule of the vehicle 102 and the like.
  • Step S5011 The vehicle driving control unit 105 determines whether or not the vehicle 102 is stopped at the station. The determination is made based on the location and speed of the vehicle 102. For example, if the vehicle 102 is near the station platform and the speed is zero, it is determined that the vehicle is stopped at the station. When it is determined that the vehicle is stopped at the station, the vehicle operation control unit 105 shifts the operation to step S502. In other cases, the vehicle driving control unit 105 shifts the operation to step S511.
  • Step S502 The vehicle operation control unit 105 acquires information on the scheduled departure time of the vehicle 102 based on the operation schedule of the vehicle 102.
  • Step S503 The vehicle operation control unit 105 determines whether or not the current time has reached the scheduled departure time. If the current time has not reached the scheduled departure time, the vehicle operation control unit 105 exits this processing flow. When the current time reaches the scheduled departure time, the vehicle operation control unit 105 proceeds to step S504.
  • Step S504 The vehicle driving control unit 105 determines whether or not the vehicle 102 has completed the departure preparation.
  • An example of preparation for departure is confirmation of the closed state of the vehicle door. If it is not completed, exit this processing flow.
  • the vehicle operation control unit 105 proceeds to step S505.
  • Step S505 The vehicle driving control unit 105 acquires obstacle information 161 from the obstacle detection system 103.
  • the obstacle monitoring area 500 is set to exclude the range of the station platform based on the highly accurate vehicle position by the self-position estimation system 101.
  • a highly accurate vehicle position a highly accurate setting that includes the inside of the white line of the station platform and the vicinity of the platform door (the area where the vehicle 102 is not safe because it passes nearby) in the obstacle monitoring area 500. Is fully possible.
  • Step S506 The vehicle driving control unit 105 determines whether or not an obstacle exists from the obstacle information 161. If there are no obstacles, the vehicle driving control unit 105 proceeds to step S507. When an obstacle is present, the vehicle driving control unit 105 exits the main processing flow while generating or maintaining a stop command.
  • Step S507 The vehicle driving control unit 105 generates a drive command 142 and transmits it to the vehicle drive unit 106. Specifically, here, a powering command is transmitted to depart the station.
  • Step S508 The vehicle operation control unit 105 calculates the estimated time of arrival of the next station based on the timing when the vehicle 102 departs and the estimated time of travel between the stations to be traveled, and the operation management center of the vehicle 102 ( Send to the operation management system).
  • step S501 processing when it is determined in step S501 that the vehicle 102 is not stopped at the station (steps S511 to S515) will be described.
  • Step S311 The vehicle driving control unit 105 acquires the obstacle information 161 in the monitoring area from the obstacle detection system 103.
  • the obstacle monitoring area 600 is widely set because it is traveling between stations. However, if there is a section under maintenance work between stations, the range where maintenance personnel can work safely should be excluded from the obstacle monitoring area 600 based on the highly accurate vehicle position by the self-position estimation system 101. Is set to.
  • Step S512 The vehicle driving control unit 105 determines whether or not braking of the vehicle 102 is necessary based on the obstacle information 161 in the monitoring area. If there are no obstacles and braking is not required, the vehicle driving control unit 105 proceeds to step S513. If there is an obstacle and braking is required, the vehicle driving control unit 105 proceeds to step S514.
  • Step S513 The vehicle driving control unit 105 generates a drive command 142 and transmits it to the vehicle drive unit 106. Specifically, here, a drive command such as proportional control is transmitted so that the speed of the vehicle 102 becomes a predetermined target speed. After transmission, the vehicle driving control unit 105 advances the operation to step S515.
  • Step S514 The vehicle driving control unit 105 transmits a braking command of the vehicle 102 to the vehicle driving unit 106 in response to the detection of an obstacle. Specifically, a braking command is generated to decelerate and stop the vehicle 102 at the maximum deceleration. After transmitting the braking command, the vehicle driving control unit 105 advances the operation to step S515.
  • Step S515 The vehicle operation control unit 105 estimates the time when the vehicle 102 arrives at the next station from the position, speed, and operation status at that time, and transmits the time to the operation management center (operation management system) of the vehicle 102.
  • the matching (correlation) between the observation result of the surrounding environment observation unit 107 and the map data in the surrounding environment map database is scanned in a range along the orbit.
  • the surrounding environment data 168 and the map data 166 are correlated while repeating displacements in a plurality of directions.
  • the position with the highest correlation value (FIG. 15) is obtained as the vehicle position.
  • the degree of freedom of scanning is high, it is easy to erroneously determine the position where the matching residual is the minimum as the vehicle position, and there is a problem that it is difficult to improve the detection accuracy of the vehicle position.
  • the matching (correlation) between the surrounding environment data 168 and the map data 166 is the highest while scanning one-dimensionally while being constrained to the track L as shown in FIG.
  • the scan position (see FIG. 17) is estimated as the vehicle position. Therefore, since the scanning range is limited to the track L, it is possible to improve the detection accuracy of the vehicle position without erroneously determining the position deviating from the track L as the vehicle position.
  • the processing load of scan matching is lightened, and the time required for processing can be shortened.
  • the posture of the vehicle 102 with respect to the track is obtained based on the observation result of the track by the surrounding environment observation unit 107. Even for the vehicle 102 traveling on the track, the posture of the vehicle 102 with respect to the track fluctuates from moment to moment due to the influence of acceleration / deceleration, curve traveling, and the like. In the first embodiment, it is possible to acquire information on the posture of the vehicle 102 that fluctuates in this way.
  • the observation result of the surrounding environment observation unit 107 is coordinate-transformed based on the posture of the vehicle 102.
  • the tilt fluctuation of the observation result of the surrounding environment observation unit 107 due to the posture of the vehicle 102 is correctly calibrated. Therefore, the matching accuracy between the observation result after the coordinate conversion and the map data becomes high, and it becomes possible to further improve the detection accuracy of the vehicle position.
  • the monitoring area of the obstacle detection system 103 can be changed at an appropriate timing according to the vehicle position. Therefore, it is possible to reliably detect only obstacles in the monitoring area to be monitored. In addition, an object located outside the monitoring area that should not be monitored will not be erroneously detected as an obstacle.
  • a boundary detection area having a predetermined width is provided at the boundary of the monitoring area (left and right lateral boundary, front boundary, etc.) for the purpose of quickly and surely detecting an obstacle.
  • the predetermined width of the boundary detection area since the detection accuracy of the vehicle position is high, it is possible to change the predetermined width of the boundary detection area at an appropriate timing according to the vehicle position.
  • the station section by narrowing the predetermined width of the boundary detection area, it becomes possible to intensively detect passengers invading on the railroad track or in the white line of the station platform.
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • the above-described embodiment has been described in detail in order to explain the present invention in an easy-to-understand manner, and is not necessarily limited to the one including all the described configurations.
  • the operation of intensively detecting an obstacle in the boundary detection area of the monitoring area is described, but the present invention is not limited to this.
  • obstacles may be detected in the entire area or subarea within the monitoring area.
  • 3D rail track database 119 ... Scan matching self-position estimation unit, 142 ... Drive command, 151 ... Surrounding environment observation data, 153 ... position / attitude information, 154 ... detection range, 155 ... side boundary, 156 ... front boundary, 159 ... monitoring result, 160 ... monitoring result, 161 ... obstacle information

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Power Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Sustainable Energy (AREA)
  • Electromagnetism (AREA)
  • Transportation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The purpose of the present invention is to provide a technique for increasing the precision of detecting a vehicle position. For this purpose, one typical surroundings observation system of the present invention comprises a surrounding environment observation unit and a position estimation unit. The surrounding environment observation unit observes a physical object group in the surroundings from a vehicle traveling on a railroad. The position estimation unit stores map data created by observing the physical object group in advance. Furthermore, the position estimation unit acquires, at a position limited to the railroad, a correlation of the physical object group between the result of the observation made by the surrounding environment observation unit and the map data, and estimates the vehicle position on the basis of the correlation.

Description

周辺観測システム、周辺観測プログラムおよび周辺観測方法Peripheral observation system, peripheral observation program and peripheral observation method
 本発明は、周辺観測システム、周辺観測プログラムおよび周辺観測方法に関する。 The present invention relates to a peripheral observation system, a peripheral observation program, and a peripheral observation method.
 鉄道の技術分野では、安全性を高めるために、車両前方の障害物を発見する技術が知られている。例えば、特許文献1には、「車両に設けた近距離を撮影する近距離撮影部と、遠距離を撮影する遠距離撮影部、更に近距離内を照射するLIDARとを用いて障害物を検知する」旨の技術が開示される。 In the technical field of railways, technology for finding obstacles in front of vehicles is known in order to improve safety. For example, in Patent Document 1, "an obstacle is detected by using a short-distance photographing unit for photographing a short distance provided in a vehicle, a long-distance photographing unit for photographing a long distance, and a LIDAR for irradiating a short distance. The technology to the effect of "doing" is disclosed.
 この種の障害物を検知する技術では、駅ホーム上で安全に待機する乗客や、軌道近くで安全に保守作業を行う保守員を障害物と区別することが技術的に困難であった。
 そのため、障害物ではないものを誤検出するたびに、車両を緊急停止させるなど、鉄道運行に支障が生じる場合があった。
With this type of obstacle detection technology, it was technically difficult to distinguish passengers who are safely waiting on the station platform and maintenance personnel who perform maintenance work safely near the track from obstacles.
Therefore, every time a non-obstacle is erroneously detected, the vehicle may be stopped in an emergency, which may hinder railway operation.
 この種の誤検知を防ぐ技術として、例えば、特許文献2には、「列車前方を事前に撮影したベースビデオ画像と、走行中の列車前方のリファレンス画像との相関に基づいて列車の車両位置を検出する。ベースビデオ画像には仮想建築限界枠が予め設定される。同じ車両位置で別日時に撮影されたベースビデオ画像とリファレンスビデオ画像との差異を仮想建築限界枠内に限って検出することにより、仮想建築限界枠内の障害物のみを検知する。」旨の技術が開示される。 As a technique for preventing this kind of false detection, for example, Patent Document 2 states that "the vehicle position of a train is determined based on the correlation between a base video image of the front of a train taken in advance and a reference image of the front of a running train. Detect. A virtual building limit frame is preset for the base video image. The difference between the base video image and the reference video image taken at the same vehicle position at different dates and times is detected only within the virtual building limit frame. Detects only obstacles within the virtual building limit frame. "
特開2016-088183号公報Japanese Unexamined Patent Publication No. 2016-08183 特開2017-001638号公報JP-A-2017-001638
 特許文献2のような列車の車両位置を検知するシステムでは、車両位置の検出精度をさらに高めることが望まれる。 In a system that detects the vehicle position of a train as in Patent Document 2, it is desired to further improve the detection accuracy of the vehicle position.
 そこで、本発明は、車両位置の検出精度を高める周辺観測の技術を提供することを目的とする。 Therefore, an object of the present invention is to provide a technique for peripheral observation that enhances the detection accuracy of the vehicle position.
 上記課題を解決するために、代表的な本発明の周辺観測システムの一つは、周辺環境観測部、および位置推定部を備える。前記周辺環境観測部は、軌道を走行する車両の位置から周辺の物体群の位置を観測し、観測結果として出力する。前記位置推定部は、事前に前記物体群の位置を観測して作成された地図データを記憶し、前記観測結果における前記車両の位置を前記地図データの軌道上に限定した状態で、前記観測結果と前記地図データとの相関を算出し、前記相関に基づいて車両位置を推定する。 In order to solve the above problems, one of the representative peripheral observation systems of the present invention includes a peripheral environment observation unit and a position estimation unit. The surrounding environment observation unit observes the position of a group of surrounding objects from the position of the vehicle traveling on the track and outputs the observation result. The position estimation unit stores map data created by observing the position of the object group in advance, and the observation result is in a state where the position of the vehicle in the observation result is limited to the track of the map data. The correlation between the vehicle and the map data is calculated, and the vehicle position is estimated based on the correlation.
 本発明により、車両位置の検出精度を高める周辺観測の技術が提供される。 The present invention provides a technique for peripheral observation that enhances the accuracy of detecting the vehicle position.
 上記した以外の課題、構成および効果は、以下の実施形態の説明により明らかにされる。 Issues, configurations and effects other than those described above will be clarified by the explanation of the following embodiments.
図1は、軌道輸送システム(周辺観測システムを含む)の構成を示す図である。FIG. 1 is a diagram showing a configuration of an orbital transportation system (including a peripheral observation system). 図2は、自己位置推定システムの構成を示す図である。FIG. 2 is a diagram showing a configuration of a self-position estimation system. 図3は、周辺観測システムの全体動作を示すフローチャートである。FIG. 3 is a flowchart showing the overall operation of the peripheral observation system. 図4は、周辺環境観測用のセンサーの一例を示す図である。FIG. 4 is a diagram showing an example of a sensor for observing the surrounding environment. 図5は、旋回軌道走行時の監視エリアの側方境界を示す図である。FIG. 5 is a diagram showing a lateral boundary of the monitoring area when traveling on a turning track. 図6は、自己位置推定システムの動作を示すフローチャートである。FIG. 6 is a flowchart showing the operation of the self-position estimation system. 図7は、走行する車両の周辺環境の一例を示す図である。FIG. 7 is a diagram showing an example of the surrounding environment of the traveling vehicle. 図8は、自己位置推定システムによるレール軌道の検知を説明する図である。FIG. 8 is a diagram illustrating the detection of rail trajectories by the self-position estimation system. 図9は、周辺環境観測データの一例を示す図である。FIG. 9 is a diagram showing an example of surrounding environment observation data. 図10は、自己位置推定システムによる車両の姿勢推定を説明する図である。FIG. 10 is a diagram for explaining the posture estimation of the vehicle by the self-position estimation system. 図11は、車両運転制御部の動作を示すフローチャートである。FIG. 11 is a flowchart showing the operation of the vehicle driving control unit. 図12は、駅区間における監視エリアの一例を示す図である。FIG. 12 is a diagram showing an example of a monitoring area in a station section. 図13は、保守作業区間における監視エリアの一例を示す図である。FIG. 13 is a diagram showing an example of a monitoring area in the maintenance work section. 図14は、自動車の場合のスキャンマッチング(スキャン中)を示す図である。FIG. 14 is a diagram showing scan matching (during scanning) in the case of an automobile. 図15は、自動車の場合のスキャンマッチング(スキャン完了)を示す図である。FIG. 15 is a diagram showing scan matching (scan completion) in the case of an automobile. 図16は、実施例のスキャンマッチング(スキャン中)を示す図である。FIG. 16 is a diagram showing scan matching (during scanning) of the embodiment. 図17は、実施例のスキャンマッチング(スキャン完了)を示す図である。FIG. 17 is a diagram showing scan matching (scan completion) of the embodiment.
 以下、実施の形態について図面を参照して説明する。 Hereinafter, embodiments will be described with reference to the drawings.
 《実施例1の構成》
 図1は、実施例1の鉄道輸送システム100(周辺観測システム100Aを含む)の構成を示す図である。
<< Configuration of Example 1 >>
FIG. 1 is a diagram showing a configuration of a railway transportation system 100 (including a peripheral observation system 100A) of the first embodiment.
 鉄道輸送システム100は、車両102、および周辺観測システム100Aを備える。 The railway transportation system 100 includes a vehicle 102 and a peripheral observation system 100A.
 車両102は、軌道に沿って走行する旅客や貨物を輸送する車両である。車両102は、車両運転制御部105と車両駆動部106とから構成される。この車両102には、周辺観測システム100Aから障害物情報161が与えられる。 Vehicle 102 is a vehicle that transports passengers and freight that travel along the track. The vehicle 102 is composed of a vehicle driving control unit 105 and a vehicle drive unit 106. Obstacle information 161 is given to the vehicle 102 from the peripheral observation system 100A.
 車両運転制御部105は、車両102の位置と速度を検知する機能を内部に有する。車両運転制御部105は、この位置および速度が、目標走行パターンに沿うように、駆動指令142を生成する。目標走行パターンは、予め分かっている車両102の加減速度と走行区間の制限速度に基づくパターンを基本とする。そのうえで、車両運転制御部105は、車両102の位置と、車両102の最大減速度とから、車両102の許容最高速度を算出し、基本の目標走行パターンに反映させる。このような車両運転制御部105としては、ATO装置(自動列車運転装置)が例として挙げられる。 The vehicle driving control unit 105 has an internal function of detecting the position and speed of the vehicle 102. The vehicle driving control unit 105 generates a drive command 142 so that the position and speed follow the target traveling pattern. The target traveling pattern is based on a pattern based on the acceleration / deceleration of the vehicle 102 and the speed limit of the traveling section, which are known in advance. Then, the vehicle driving control unit 105 calculates the allowable maximum speed of the vehicle 102 from the position of the vehicle 102 and the maximum deceleration of the vehicle 102, and reflects it in the basic target traveling pattern. As such a vehicle operation control unit 105, an ATO device (automatic train operation device) can be mentioned as an example.
 車両駆動部106は、与えられる駆動指令142に基づき、車両102を駆動する。車両駆動部106の具体的装置の例としては、インバータ、モータ、摩擦ブレーキなどが挙げられる。 The vehicle drive unit 106 drives the vehicle 102 based on the given drive command 142. Examples of the specific device of the vehicle drive unit 106 include an inverter, a motor, a friction brake, and the like.
 一方、周辺観測システム100Aは、周辺環境観測部107、自己位置推定システム101、および障害物検知システム103を備える。 On the other hand, the peripheral observation system 100A includes a peripheral environment observation unit 107, a self-position estimation system 101, and an obstacle detection system 103.
 周辺環境観測部107は、車両102の前方に設置され、車両102の周辺にある物体の位置や形状、色や反射強度などを取得するセンサーである。例えば、周辺環境観測部107は、カメラ、LIDAR(Laser Imaging Detection and Ranging)、レーザレーダ、あるいはミリ波レーダーなどのセンサーである。 The surrounding environment observation unit 107 is a sensor installed in front of the vehicle 102 to acquire the position, shape, color, reflection intensity, etc. of an object around the vehicle 102. For example, the surrounding environment observation unit 107 is a sensor such as a camera, a LIDAR (Laser Imaging Detection and Ringing), a laser radar, or a millimeter wave radar.
 自己位置推定システム101は、周辺環境観測部107から周辺環境観測データ151を取得し、車両102の車両位置および姿勢を推定して位置・姿勢情報153を出力する。 The self-position estimation system 101 acquires the surrounding environment observation data 151 from the surrounding environment observation unit 107, estimates the vehicle position and attitude of the vehicle 102, and outputs the position / attitude information 153.
 障害物検知システム103は、検知範囲設定データベース108、監視エリア設定処理部109、検知対象情報データベース110、側方境界監視部111、前方境界監視部112、および障害物検知部113を備える。 The obstacle detection system 103 includes a detection range setting database 108, a monitoring area setting processing unit 109, a detection target information database 110, a side boundary monitoring unit 111, a front boundary monitoring unit 112, and an obstacle detection unit 113.
 検知範囲設定データベース108には、車両102の位置・姿勢情報153に対応付けて、監視エリアの検知範囲154が記憶される。なお、検知範囲154には、軌道付近の建築限界をベースとした範囲データに加えて、駅ホーム付近や保守作業を行うエリアなどの検知しないエリアの情報も含まれる。 The detection range setting database 108 stores the detection range 154 of the monitoring area in association with the position / posture information 153 of the vehicle 102. The detection range 154 includes information on undetected areas such as the vicinity of the station platform and the area where maintenance work is performed, in addition to the range data based on the building limit near the track.
 監視エリア設定処理部109は、位置・姿勢情報153を検知範囲設定データベース108に照会することにより、位置・姿勢情報153に対応する検知範囲154を取得する。監視エリア設定処理部109は、最新の検知範囲154に基づいて、監視エリアの側方境界155を逐次に決定し、側方境界監視部111に設定する。また、監視エリア設定処理部109は、最新の検知範囲154に基づいて、監視エリアの前方境界156を逐次に決定し、前方境界監視部112に設定する。 The monitoring area setting processing unit 109 acquires the detection range 154 corresponding to the position / attitude information 153 by inquiring the position / attitude information 153 to the detection range setting database 108. The monitoring area setting processing unit 109 sequentially determines the side boundary 155 of the monitoring area based on the latest detection range 154, and sets the side boundary monitoring unit 111. Further, the monitoring area setting processing unit 109 sequentially determines the front boundary 156 of the monitoring area based on the latest detection range 154, and sets the front boundary monitoring unit 112.
 検知対象情報データベース110は、車両位置や監視エリアに対応付けて、監視エリアで観測される背景としての検出点(障害物以外の検出対象)について、位置と反射率などの情報が記録される。側方境界監視部111および前方境界監視部112は、検知対象情報データベース110に車両位置や監視エリアを照会して、背景の検出点に関する情報157,158をそれぞれ取得する。 The detection target information database 110 records information such as the position and reflectance of the detection point (detection target other than obstacles) as the background observed in the monitoring area in association with the vehicle position and the monitoring area. The side boundary monitoring unit 111 and the front boundary monitoring unit 112 refer to the detection target information database 110 for the vehicle position and the monitoring area, and acquire information 157 and 158 regarding the background detection points, respectively.
 側方境界監視部111および前方境界監視部112は、カメラ、LIDAR、レーザレーダ、あるいはミリ波レーダーなどのセンサーを用いて、監視エリアの側方境界および前方境界に設定した領域の障害物を検知する機能を持つ。ここで、側方境界監視部111および前方境界監視部112は、周辺環境観測部107のセンサーを共用してもよい。 The side boundary monitoring unit 111 and the front boundary monitoring unit 112 detect obstacles in the area set on the side boundary and the front boundary of the monitoring area by using sensors such as a camera, LIDAR, laser radar, or millimeter wave radar. Has a function to do. Here, the side boundary monitoring unit 111 and the front boundary monitoring unit 112 may share the sensor of the surrounding environment observation unit 107.
 側方境界監視部111の監視結果159は、障害物検知部113に伝達される。また、前方境界監視部112の監視結果160は、障害物検知部113に伝達される。 The monitoring result 159 of the side boundary monitoring unit 111 is transmitted to the obstacle detection unit 113. Further, the monitoring result 160 of the front boundary monitoring unit 112 is transmitted to the obstacle detection unit 113.
 障害物検知部113による障害物情報161は、周辺観測システム100Aの出力として、車両102に伝達される。 The obstacle information 161 by the obstacle detection unit 113 is transmitted to the vehicle 102 as the output of the peripheral observation system 100A.
 図2は、自己位置推定システム101の構成を示す図である。 FIG. 2 is a diagram showing the configuration of the self-position estimation system 101.
 同図において、自己位置推定システム101は、観測データ選別処理部114、車両姿勢推定処理部115、周辺環境データ座標変換処理部116、周辺環境地図データベース117、3Dレール軌道データベース118、およびスキャンマッチング自己位置推定部119を備える。 In the figure, the self-position estimation system 101 includes an observation data selection processing unit 114, a vehicle attitude estimation processing unit 115, a peripheral environment data coordinate conversion processing unit 116, a peripheral environment map database 117, a 3D rail track database 118, and a scan matching self. The position estimation unit 119 is provided.
 このような周辺観測システム100Aは、例えばハードウェアとしてCPU(Central Processing Unit)やメモリなどを備えた1つ以上のコンピュータシステムにより構成される。このハードウェアが周辺観測プログラムを実行することにより、周辺観測システム100Aの各種機能が実現する。このハードウェアの一部または全部については、専用の装置、汎用の機械学習マシン、DSP(Digital Signal Processor)、FPGA(Field-Programmable Gate Array)、GPU(Graphics Processing Unit)、PLD(programmable logic device)などで代替してもよい。また、ハードウェアの一部または全部を外部のネットワーク上のサーバに集中または分散してクラウド配置することにより、複数の周辺観測システム100Aがネットワークを介して共同使用するようにしてもよい。 Such a peripheral observation system 100A is composed of one or more computer systems equipped with, for example, a CPU (Central Processing Unit) and a memory as hardware. When this hardware executes the peripheral observation program, various functions of the peripheral observation system 100A are realized. For some or all of this hardware, dedicated equipment, general-purpose machine learning machines, DSP (Digital Signal Processor), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), PLD (programmable logic device) You may substitute with. Further, by arranging a part or all of the hardware in a cloud in a centralized or distributed manner on a server on an external network, a plurality of peripheral observation systems 100A may be jointly used via the network.
 《周辺観測システム100Aの全体動作》
 図3は、周辺観測システム100Aの全体動作を説明するフローチャートである。
 同図に示すステップS201~205では、障害物検知システム103の計測周期ごとに、車両102に対する停止指示の可否が決定される。
<< Overall operation of peripheral observation system 100A >>
FIG. 3 is a flowchart illustrating the overall operation of the peripheral observation system 100A.
In steps S201 to 205 shown in the figure, whether or not to give a stop instruction to the vehicle 102 is determined for each measurement cycle of the obstacle detection system 103.
 以下、図3に示すステップ番号に沿って説明する。 Hereinafter, description will be given according to the step numbers shown in FIG.
ステップS201: 障害物検知システム103内の監視エリア設定処理部109は、自己位置推定システム101で算出される位置・姿勢情報153を取得する。 Step S201: The monitoring area setting processing unit 109 in the obstacle detection system 103 acquires the position / attitude information 153 calculated by the self-position estimation system 101.
ステップS202: 監視エリア設定処理部109は、位置・姿勢情報153を検知範囲設定データベース108に照会することにより、障害物を監視する監視エリアの境界として側方境界155および前方境界156を求める。例えば、軌道の左右の建築限界などを障害物監視エリアの側方境界155として設定し、車両の停止可能距離を障害物監視エリアの前方境界156として設定する。 Step S202: The monitoring area setting processing unit 109 queries the detection range setting database 108 for the position / attitude information 153 to obtain the side boundary 155 and the front boundary 156 as the boundaries of the monitoring area for monitoring obstacles. For example, the building limits on the left and right of the track are set as the side boundary 155 of the obstacle monitoring area, and the stoptable distance of the vehicle is set as the front boundary 156 of the obstacle monitoring area.
 ここで、側方境界155および前方境界156を超えて侵入する障害物を境界上で逸早く検出するため、図4に示す側方境界155a,155bおよび前方境界156には、境界線を所定幅に拡幅した境界検出領域が設定される。この所定幅は、その境界を超えると予想される障害物の大きさと最大移動速度と、側方境界監視部111や前方境界監視部112(障害物検知センサー)のセンシング周期とを考慮し、障害物が境界内に入る際に少なくとも1回以上検出できる幅を確保するように設定される。 Here, in order to quickly detect an obstacle that invades beyond the lateral boundary 155 and the front boundary 156 on the boundary, the side boundaries 155a and 155b and the front boundary 156 shown in FIG. 4 have boundary lines having a predetermined width. A widened boundary detection area is set. This predetermined width takes into consideration the size and maximum moving speed of the obstacle expected to exceed the boundary and the sensing cycle of the side boundary monitoring unit 111 and the front boundary monitoring unit 112 (obstacle detection sensor). It is set to ensure a width that can be detected at least once when an object enters the boundary.
 例えば、駅区間では、ホームで待つ乗客の大きさと最大移動速度を想定して所定幅は数cm~数十cm(さらに具体的には10cm)に狭く設定される。
 また例えば、駅間の走行区間(特に踏み切り付近)では自動車等が横切る際の大きさと最大移動速度を想定して所定幅は広く(例えば1mに)設定される。このように、車両102の車両位置に応じて、境界検出領域の所定幅は逐次に変更される。
For example, in the station section, the predetermined width is set narrowly to several cm to several tens of cm (more specifically, 10 cm) assuming the size of passengers waiting at the platform and the maximum moving speed.
Further, for example, in a traveling section between stations (particularly near a railroad crossing), a predetermined width is set wide (for example, 1 m) assuming the size and maximum moving speed when a car or the like crosses. In this way, the predetermined width of the boundary detection region is sequentially changed according to the vehicle position of the vehicle 102.
ステップS203: 側方境界監視部111および前方境界監視部112は、側方境界155および前方境界156を超えて障害物が軌道に侵入したか否かを監視し、監視結果159,160を障害物検知部113に与える。 Step S203: The side boundary monitoring unit 111 and the front boundary monitoring unit 112 monitor whether or not an obstacle has entered the orbit beyond the side boundary 155 and the front boundary 156, and monitor the monitoring results 159 and 160 as obstacles. It is given to the detection unit 113.
 図4に示すように、左右の側方境界155a,155bの境界検出領域は、幅数十cm、奥行き百m超の長方形領域になり得る。その長方形領域を観測範囲に収めるため、側方境界監視部111の検出器201,202としては、車両102の前方左右の高い位置に前方下向きに設置した2台のマルチレイヤLIDARなどを使用する。なお、検出器201,201としては、ステレオカメラ、ミリ波レーダー、レーザー距離計を1つまたは複数使用してもよい。また、センサーを自動雲台に取り付けることにより、側方境界155の境界検出領域を走査してもよい。 As shown in FIG. 4, the boundary detection regions of the left and right lateral boundaries 155a and 155b can be rectangular regions having a width of several tens of centimeters and a depth of more than 100 m. In order to keep the rectangular area within the observation range, as the detectors 201 and 202 of the lateral boundary monitoring unit 111, two multilayer LIDARs or the like installed at high positions on the front left and right of the vehicle 102 so as to face downward are used. As the detectors 201 and 201, one or more stereo cameras, millimeter-wave radars, and laser rangefinders may be used. Further, the boundary detection region of the lateral boundary 155 may be scanned by attaching the sensor to the automatic pan head.
 また例えば、前方境界監視部112の検出器203としては、前方境界156の境界検出領域が遠方になることを考慮して、画角の狭い単眼カメラや、暗視野用の赤外線カメラや、ステレオカメラ、ミリ波レーダー、マルチレイヤLIDAR、レーザー距離計などを使用する。 Further, for example, as the detector 203 of the front boundary monitoring unit 112, a monocular camera having a narrow angle of view, an infrared camera for a dark field, or a stereo camera in consideration of the fact that the boundary detection area of the front boundary 156 is far away. , Millimeter wave radar, multilayer LIDAR, laser rangefinder, etc. are used.
 これら検出器201~203の検出結果は、検知対象情報データベース110に記録される背景の検出点(障害物以外の検出対象)と比較される。この比較により、以下の条件1~3のいずれかを満たすとき、境界検出領域に障害物が存在すると判断する。 The detection results of these detectors 201 to 203 are compared with the background detection points (detection targets other than obstacles) recorded in the detection target information database 110. Based on this comparison, when any of the following conditions 1 to 3 is satisfied, it is determined that an obstacle exists in the boundary detection region.
 (条件1)境界検出領域の検出点が検出されない。 (Condition 1) The detection point in the boundary detection area is not detected.
 (条件2)境界検出領域の検出点の位置が異なる。 (Condition 2) The position of the detection point in the boundary detection area is different.
 (条件3)境界検出領域の検出点の反射強度が異なる。 (Condition 3) The reflection intensity of the detection point in the boundary detection area is different.
 ここで、車両の速度が速くなるにつれて輸送車両が停止するまでに必要な距離が延び、前方および側方の境界検出領域が遠方に拡大する。このとき、遠方の検出点(路面や設置物など)のレーザーの反射率が極端に小さくなる場合には、条件1によって障害物が侵入したと誤って判断される。このような誤判断を避けるため、車両の走行速度を抑えなくてはならない。 Here, as the speed of the vehicle increases, the distance required for the transport vehicle to stop increases, and the front and side boundary detection areas expand far away. At this time, if the reflectance of the laser at a distant detection point (road surface, installed object, etc.) becomes extremely small, it is erroneously determined that an obstacle has invaded under Condition 1. In order to avoid such a misjudgment, the traveling speed of the vehicle must be suppressed.
Therefore, to avoid misjudgments without reducing the traveling speed, the following countermeasures 1 and 2 are added.
(Countermeasure 1) Only the positions of existing objects (rails, signs, etc.) in the boundary detection region that have a reflectance above a certain value are used as detection points.
(Countermeasure 2) Markers having a reflectance above a certain value are installed in advance in the boundary detection region as detection points.
In either case, the position of each detection target and its reflectance are recorded in the detection target information database 110 in advance, and a detection point is used for the obstacle intrusion decision only when its position falls within the boundary detection region at the current vehicle position.
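A minimal sketch of countermeasures 1 and 2, assuming a reflectance threshold and a matching radius that are not specified in this document:

```python
import math

MIN_REFLECTANCE = 0.6  # "a certain value" of reflectance (assumed)
MATCH_RADIUS_M = 0.2   # distance to a registered object position (assumed)

def usable_detection_points(points, registered, in_region):
    """points: (x, y, z, reflectance) tuples from the detector.
    registered: (x, y, z) positions of high-reflectance objects or markers
    recorded in the detection target information database.
    in_region: callable (x, y, z) -> bool for the boundary detection region
    at the current vehicle position."""
    usable = []
    for x, y, z, refl in points:
        if refl < MIN_REFLECTANCE or not in_region(x, y, z):
            continue  # too weak a return, or outside the current region
        if any(math.dist((x, y, z), r) <= MATCH_RADIUS_M for r in registered):
            usable.append((x, y, z, refl))
    return usable
```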
FIG. 5 shows a case where multilayer LIDAR is used as the surrounding environment observation unit 107 and as the detectors 201 to 203, as indicated by the plural straight lines. By using detection points on a boundary detection region that spans multiple layers, the intrusion of an obstacle can be judged even when the boundary is curved.
In FIG. 5, the road surface detection points of the individual detection layers emitted by the multilayer LIDAR are shown by dotted lines. Which detection layers pass over the side boundaries 155a and 155b depends on the distance from the vehicle, so the detection points of these multiple layers are monitored to judge the intrusion of an obstacle.
Even when a LIDAR detection point lies on the boundary detection region, it is not used for the obstacle intrusion decision if the straight line connecting the detection point and the LIDAR (the laser's optical path) passes outside the boundary detection region. This prevents false detections caused by objects outside the boundary detection region.
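A minimal sketch of this light-path test, assuming the boundary detection region is checked as a ground-plane footprint and sampling the projected laser path at a fixed step:

```python
import math

def ray_inside_footprint(sensor_xy, point_xy, in_footprint, step_m=0.5):
    """sensor_xy, point_xy: ground-plane (x, y) projections of the LIDAR and
    of the detection point; in_footprint: (x, y) -> bool membership test for
    the boundary detection region. True only if every sampled position on the
    projected laser path lies inside the region."""
    n = max(1, int(math.dist(sensor_xy, point_xy) / step_m))
    for i in range(n + 1):
        t = i / n
        x = sensor_xy[0] + t * (point_xy[0] - sensor_xy[0])
        y = sensor_xy[1] + t * (point_xy[1] - sensor_xy[1])
        if not in_footprint(x, y):
            return False  # path crosses outside: do not use this point
    return True
```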
By using a plurality of sensors of different types as the detectors 201 to 203, the probability of detecting an obstacle, and hence the detection rate, may be increased. Conversely, the false detection rate may be reduced by taking the logical product (AND) of the individual detection results.
When an obstacle has crossed the side boundary 155 or the front boundary 156 and entered the track, the obstacle detection unit 113 proceeds to step S204. Otherwise, the obstacle detection unit 113 proceeds to step S205.
Step S204: When it is determined in step S203 that an obstacle is present, the obstacle detection unit 113 creates obstacle information 161 in order to stop the vehicle 102.
Step S205: The obstacle detection unit 113 transmits the obstacle information 161 to the vehicle 102.
The above is a description of the overall operation of the surroundings observation system 100A.
Next, the operation of the self-position estimation system 101 will be described in detail.
<< Operation of the self-position estimation system 101 >>
FIG. 6 is a flowchart illustrating the operation of the self-position estimation system 101.
Steps S401 to S408 shown in the figure estimate the vehicle position of the vehicle 102 and are executed repeatedly every measurement cycle of the obstacle detection system 103.
Step S401: The observation data selection processing unit 114 acquires the surrounding environment observation data 151 observed by the surrounding environment observation unit 107. For example, when the surrounding environment of the vehicle 102 shown in FIG. 7 is observed with a multilayer LIDAR, rail observation data 162a to 162f and the like are acquired as three-dimensional point cloud data, as shown in FIG. 8.
Step S402: The observation data selection processing unit 114 sorts the acquired surrounding environment observation data into the on-track rail observation data 162a to 162f shown in FIG. 9 and the peripheral observation data 163 other than the track.
Such rail observation data 162a to 162f are selected using their shape and reflectance, and also the fact that the rail detection data lie on a single raceway surface (a plane or curved surface).
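As an illustration of this selection, the following sketch gates candidate points by reflectance and then keeps only points close to one common fitted surface; a least-squares plane is used here as a local stand-in for the raceway surface, and both thresholds are assumed values.

```python
import numpy as np

RAIL_REFLECTANCE_MIN = 0.5  # rails return strongly (assumed threshold)
PLANE_TOL_M = 0.05          # max distance from the fitted surface (assumed)

def select_rail_points(points):
    """points: (N, 4) array-like of x, y, z, reflectance. Returns the subset
    judged to be rail observation data."""
    pts = np.asarray(points, dtype=float).reshape(-1, 4)
    cand = pts[pts[:, 3] >= RAIL_REFLECTANCE_MIN]
    if len(cand) < 3:
        return cand
    # least-squares plane fit z ~ a*x + b*y + c as a local approximation of
    # the raceway surface
    A = np.c_[cand[:, 0], cand[:, 1], np.ones(len(cand))]
    coef, *_ = np.linalg.lstsq(A, cand[:, 2], rcond=None)
    resid = np.abs(A @ coef - cand[:, 2])
    return cand[resid <= PLANE_TOL_M]
```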
Step S403: The vehicle attitude estimation processing unit 115 sets a tentative position of the vehicle 102 on the rail track in the map data of the surrounding environment map database 117 and the 3D rail track database 118. This tentative position is determined based on, for example, communication between trackside balises and the vehicle 102, GPS information, or the integrated speed (accumulated travel distance) of the vehicle 102.
Step S404: The vehicle attitude estimation processing unit 115 queries the 3D rail track database 118 with the tentative position on the rail track, thereby obtaining the rail position information at that tentative position (three-dimensional point cloud data defining the rail surfaces).
The vehicle attitude estimation processing unit 115 estimates the attitude information of the vehicle 102 by a geometric computation between the surface R formed by the rail surfaces obtained from the rail observation data 162 shown in FIG. 10 and the rail position information obtained from the 3D rail track database 118.
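One way to realize part of this geometric computation is to estimate the rotation that aligns the normal of the observed rail surface with the normal of the database rail surface; the sketch below uses Rodrigues' formula and is a simplified stand-in for the computation actually used.

```python
import numpy as np

def rotation_between(n_obs, n_map):
    """Rotation matrix taking the observed rail-surface normal n_obs onto the
    database rail-surface normal n_map (Rodrigues' formula). The antiparallel
    case is left unhandled in this sketch."""
    a = np.asarray(n_obs, dtype=float); a /= np.linalg.norm(a)
    b = np.asarray(n_map, dtype=float); b /= np.linalg.norm(b)
    c = float(np.dot(a, b))
    if np.isclose(c, 1.0):
        return np.eye(3)  # normals already aligned
    if np.isclose(c, -1.0):
        raise ValueError("opposite normals: rotation is ambiguous")
    v = np.cross(a, b)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)
```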
Step S405: Since the surrounding environment observation data of the surrounding environment observation unit 107 are observed by sensors mounted on the vehicle 102, they are observation values in the vehicle coordinate system Σ_T fixed to the vehicle 102.
On the other hand, the surrounding environment map database 117 and the 3D rail track database 118 hold the map data and the 3D rail track data in the external coordinate system Σ_O.
Based on the tentative position of the vehicle 102 on the rail track and the attitude information of the vehicle 102 estimated in step S404, the surrounding environment data coordinate conversion processing unit 116 transforms the surrounding environment observation data from the vehicle coordinate system Σ_T into the external coordinate system Σ_O and outputs them as coordinate-transformed surrounding environment data.
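A minimal sketch of this coordinate transformation, with the rotation taken from the attitude estimate and the translation from the tentative position (both assumed inputs):

```python
import numpy as np

def to_external_frame(points_T, R_OT, t_O):
    """points_T: (N, 3) points in the vehicle frame Σ_T.
    R_OT: 3x3 rotation of Σ_T expressed in Σ_O (from the attitude estimate).
    t_O: (3,) vehicle origin in Σ_O (the tentative position on the track).
    Returns the points expressed in Σ_O."""
    pts = np.asarray(points_T, dtype=float)
    return pts @ np.asarray(R_OT, dtype=float).T + np.asarray(t_O, dtype=float)
```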
Step S406: The scan matching self-position estimation unit 119 performs scan matching between the coordinate-transformed surrounding environment data and the map data recorded in the surrounding environment map database 117. This scan matching is performed with the sensor position (vehicle position) of the surrounding environment data constrained onto the rail track of the map data. The scan range in this case is a width over which the attitude estimate of step S404 and the coordinate transformation of step S405 remain valid.
Within this scan range, the unit searches for the scan position at which the residual between the surrounding environment data and the map data (for example, the sum of the absolute differences between the two data sets) is minimized.
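The following sketch illustrates this one-dimensional, track-constrained scan; the mean nearest-neighbor distance is used as a stand-in residual, and the track_pose interface to the 3D rail track database is an assumed abstraction.

```python
import numpy as np

def residual(obs_O, map_pts):
    """Mean distance from each observed point to its nearest map point
    (brute force; a stand-in for the residual of absolute differences)."""
    d = np.linalg.norm(obs_O[:, None, :] - map_pts[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def scan_along_track(obs_T, map_pts, track_pose, offsets):
    """obs_T: (N, 3) observation points in the vehicle frame.
    map_pts: (M, 3) map points in the external frame.
    track_pose: callable s -> (R, t) giving the vehicle pose constrained to
    the rail track at arc length s (an assumed interface to the track DB).
    offsets: iterable of arc lengths to test around the tentative position.
    Returns (best_offset, best_residual)."""
    obs_T = np.asarray(obs_T, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    best_s, best_r = None, float("inf")
    for s in offsets:
        R, t = track_pose(s)
        obs_O = obs_T @ np.asarray(R).T + np.asarray(t)  # vehicle -> external
        r = residual(obs_O, map_pts)
        if r < best_r:
            best_s, best_r = s, r
    return best_s, best_r
```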
Step S407: The scan matching self-position estimation unit 119 determines whether the minimum residual found by the scan matching is smaller than a permissible value. This permissible value is set appropriately according to the accuracy required of the vehicle position.
If the minimum residual exceeds the permissible value, the scan matching self-position estimation unit 119 proceeds to step S408.
If the minimum residual is within the permissible value, the scan matching self-position estimation unit 119 proceeds to step S409.
Step S408: The scan matching self-position estimation unit 119 displaces the tentative position along the rail track by a step distance. This step distance is set to exceed the range scanned in step S406. The direction of the displacement (the sign of the step distance) is chosen so that the matching residual decreases, based on how the residual changed during the scan matching of step S406.
After displacing the tentative position in this way, the scan matching self-position estimation unit 119 returns to step S404.
Step S409: By repeating steps S404 to S408, the minimum residual of the scan matching decreases until it falls to or below the permissible value. The scan matching self-position estimation unit 119 takes the scan position on the rail track that yielded this minimum residual at or below the permissible value as the estimated vehicle position of the vehicle 102.
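The outer loop of steps S404 to S409 can be sketched as follows; the scan_fn interface, its residual-trend output, and all numeric parameters are assumptions made for illustration.

```python
import numpy as np

def estimate_position(scan_fn, s0, scan_half_width=5.0, allowable=0.3,
                      max_iters=20):
    """scan_fn(offsets) -> (best_s, best_residual, trend), where trend < 0
    means the residual was decreasing toward larger arc lengths during the
    scan. All numeric parameters here are assumed example values."""
    s = s0
    step = 2.5 * scan_half_width  # step distance exceeding the scanned range
    for _ in range(max_iters):
        offsets = np.linspace(s - scan_half_width, s + scan_half_width, 21)
        best_s, best_r, trend = scan_fn(offsets)   # steps S404-S406
        if best_r <= allowable:                    # step S407
            return best_s                          # step S409: vehicle position
        s += step if trend < 0 else -step          # step S408: shift and retry
    raise RuntimeError("residual never fell below the allowable value")
```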
<< Operation of the vehicle driving control unit 105 >>
Next, the operation of the vehicle driving control unit 105 will be described.
FIG. 11 is a flowchart illustrating the operation of the vehicle driving control unit 105.
The operation shown in the figure is executed repeatedly at regular intervals.
Step S500: The vehicle driving control unit 105 acquires the current on-track position of the vehicle 102 based on, for example, the operation schedule of the vehicle 102.
Step S501: The vehicle driving control unit 105 determines whether the vehicle 102 is stopped at a station. This determination is made from the on-track position and speed of the vehicle 102; for example, if the vehicle 102 is near a station platform and its speed is zero, it is judged to be stopped at the station. If the vehicle is judged to be stopped at a station, the vehicle driving control unit 105 proceeds to step S502. Otherwise, it proceeds to step S511.
Step S502: The vehicle driving control unit 105 obtains the scheduled departure time of the vehicle 102 based on the operation schedule of the vehicle 102.
Step S503: The vehicle driving control unit 105 determines whether the current time has reached the scheduled departure time. If not, the vehicle driving control unit 105 exits this processing flow. If the current time has reached the scheduled departure time, it proceeds to step S504.
Step S504: The vehicle driving control unit 105 determines whether the vehicle 102 has completed its departure preparations; one example of a departure preparation is confirming that the vehicle doors are closed. If the preparations are incomplete, the vehicle driving control unit 105 exits this processing flow. If they are complete, it proceeds to step S505.
Step S505: The vehicle driving control unit 105 acquires the obstacle information 161 from the obstacle detection system 103. In this case, as shown in FIG. 12, the obstacle monitoring area 500 is set so as to exclude the station platform, based on the highly accurate vehicle position provided by the self-position estimation system 101. Because the vehicle position is highly accurate, a fine-grained setting also becomes feasible in which the area inside the platform's white line and the vicinity of the platform doors (regions that are unsafe because the vehicle 102 passes close by) are included in the obstacle monitoring area 500. Furthermore, based on the highly accurate vehicle position within the station section, the monitoring area 500 can be narrowed so that a vehicle on an adjacent track (a vehicle at another platform of the station) is not detected as an obstacle.
Step S506: The vehicle driving control unit 105 judges from the obstacle information 161 whether an obstacle is present. If no obstacle is present, the vehicle driving control unit 105 proceeds to step S507. If an obstacle is present, it exits this processing flow while generating or maintaining a stop command.
Step S507: The vehicle driving control unit 105 generates a drive command 142 and transmits it to the vehicle drive unit 106; specifically, a powering command is transmitted here so that the vehicle departs the station.
Step S508: The vehicle driving control unit 105 calculates the estimated arrival time at the next station from the time at which the vehicle 102 departed and the scheduled running time between the stations, and transmits it to the operation management center (operation management system) for the vehicle 102.
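The station-stop branch (steps S503 to S508) reduces to a short guard sequence; in the sketch below every interface (door check, obstacle query, command transmission, ETA report) is an assumed placeholder for the units described above.

```python
def station_cycle(now, departure_time, doors_closed, obstacle_in_area,
                  send_drive_command, report_eta):
    """One control period while stopped at a station (steps S503-S508).
    now / departure_time: timestamps; the remaining arguments are callables
    standing in for the units described in the text."""
    if now < departure_time:          # S503: departure time not yet reached
        return
    if not doors_closed():            # S504: departure preparation incomplete
        return
    if obstacle_in_area():            # S505/S506: obstacle in monitoring area
        return                        # the stop command stays in effect
    send_drive_command("powering")    # S507: powering command to depart
    report_eta()                      # S508: report next-station arrival time

# Example: departs only when the time has come, the doors are closed, and
# the monitoring area is clear.
station_cycle(now=100.0, departure_time=90.0,
              doors_closed=lambda: True,
              obstacle_in_area=lambda: False,
              send_drive_command=print,
              report_eta=lambda: print("ETA sent"))
```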
Next, the processing performed when it is determined in step S501 that the vehicle 102 is not stopped at a station (steps S511 to S515) will be described.
Step S511: The vehicle driving control unit 105 acquires the obstacle information 161 for the monitoring area from the obstacle detection system 103. As shown in FIG. 13, the obstacle monitoring area 600 is set wide because the vehicle is running between stations. However, if a section under maintenance work exists between the stations, the range in which maintenance staff can work safely is excluded from the obstacle monitoring area 600, based on the highly accurate vehicle position provided by the self-position estimation system 101.
Step S512: The vehicle driving control unit 105 determines from the obstacle information 161 for the monitoring area whether braking of the vehicle 102 is necessary. If no obstacle is present and braking is unnecessary, the vehicle driving control unit 105 proceeds to step S513. If an obstacle is present and braking is necessary, it proceeds to step S514.
Step S513: The vehicle driving control unit 105 generates a drive command 142 and transmits it to the vehicle drive unit 106; specifically, a drive command such as proportional control is transmitted here so that the speed of the vehicle 102 reaches a predetermined target speed. After the transmission, the vehicle driving control unit 105 proceeds to step S515.
Step S514: In response to the obstacle detection, the vehicle driving control unit 105 transmits a braking command for the vehicle 102 to the vehicle drive unit 106; specifically, the braking command is generated so as to decelerate the vehicle 102 at maximum deceleration and bring it to a stop. After transmitting the braking command, the vehicle driving control unit 105 proceeds to step S515.
Step S515: The vehicle driving control unit 105 estimates the arrival time of the vehicle 102 at the next station from its current position, speed, and operation status, and transmits it to the operation management center (operation management system) for the vehicle 102.
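The running branch (steps S512 to S514) likewise reduces to a braking/speed-control decision; the gain and braking values in this sketch are assumed examples.

```python
KP = 0.5          # proportional gain (assumed example value)
MAX_BRAKE = -1.0  # maximum-deceleration command (assumed units)

def running_cycle(obstacle_detected, speed, target_speed):
    """Return the drive/brake command for one control period between
    stations (steps S512-S514)."""
    if obstacle_detected:
        return MAX_BRAKE                # S514: brake to a stop
    return KP * (target_speed - speed)  # S513: proportional speed control

print(running_cycle(False, speed=18.0, target_speed=20.0))  # mild powering
print(running_cycle(True, speed=18.0, target_speed=20.0))   # full braking
```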
This concludes the description of the operation of the vehicle driving control unit 105.
<< Effects of the first embodiment >>
In the first embodiment, the matching (correlation) between the observation result of the surrounding environment observation unit 107 and the map data in the surrounding environment map database is scanned over a range along the track.
Normally, for a vehicle that does not travel along a fixed track, such as an automobile, the surrounding environment data 168 are correlated with the map data 166 while the candidate position is displaced repeatedly in multiple directions, as shown in FIGS. 14 and 15, and the position with the highest correlation value (FIG. 15) is taken as the vehicle position. In this case, because the scan has many degrees of freedom, a position where the matching residual happens to be a local minimum is easily misjudged as the vehicle position, making it difficult to improve the detection accuracy of the vehicle position.
Furthermore, because scanning in multiple directions is required, the processing load of the scan matching is heavy and the processing takes time.
In the vehicle position estimation of the first embodiment, however, the scan is one-dimensional and constrained to the track L as shown in FIG. 16, and the scan position at which the matching (correlation) between the surrounding environment data 168 and the map data 166 is highest (see FIG. 17) is estimated as the vehicle position. Since the scan range is thus limited to the track L, a position off the track L is never misjudged as the vehicle position, and the detection accuracy of the vehicle position can be improved.
Also, in the first embodiment, because the scan range is limited to the track L, the processing load of the scan matching is lightened and the processing time can be shortened.
Furthermore, in the first embodiment, the attitude of the vehicle 102 relative to the track is obtained from the observation of the track by the surrounding environment observation unit 107. Even for the vehicle 102 traveling on the track, its attitude relative to the track fluctuates from moment to moment under the influence of acceleration, deceleration, curve running, and the like. The first embodiment makes it possible to obtain this fluctuating attitude of the vehicle 102.
Also, in the first embodiment, the observation result of the surrounding environment observation unit 107 is coordinate-transformed based on the attitude of the vehicle 102. This coordinate transformation correctly compensates the tilt fluctuations of the observation result caused by the attitude of the vehicle 102. The matching accuracy between the coordinate-transformed observation result and the map data therefore increases, and the detection accuracy of the vehicle position can be raised further.
Furthermore, in the first embodiment, because the detection accuracy of the vehicle position is high, the monitoring area of the obstacle detection system 103 can be changed at appropriate timings according to the vehicle position. Only obstacles inside the monitoring area that should be monitored are therefore detected reliably, and objects located outside the monitoring area are no longer falsely detected as obstacles.
For example, in a station section, narrowing the monitoring area prevents passengers waiting at safe positions on the station platform (for example, outside the platform's white line) from being falsely detected as obstacles.
Also, for example, in a maintenance work section, narrowing the monitoring area prevents maintenance staff working safely from being falsely detected as obstacles.
Also, in the first embodiment, boundary detection regions of a predetermined width are provided at the boundaries of the monitoring area (the left and right side boundaries, the front boundary, and so on) for purposes such as detecting obstacles promptly and reliably. Because the detection accuracy of the vehicle position is high, the predetermined width of these boundary detection regions can be changed at appropriate timings according to the vehicle position.
For example, in a station section, narrowing the predetermined width of the boundary detection region makes it possible to concentrate detection on passengers stepping onto the track or inside the platform's white line.
Also, for example, in a running section between stations, widening the predetermined width of the boundary detection region makes it possible to reliably detect, over a relatively wide field of view, the fast movement of relatively large moving objects (such as automobiles) intruding onto the track.
The present invention is not limited to the embodiment described above and includes various modifications. For example, the above embodiment has been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements.
It is also possible to add, delete, or replace part of the configuration of the embodiment with other configurations.
For example, the embodiment describes an operation in which obstacles are detected intensively in the boundary detection regions of the monitoring area, but the present invention is not limited to this; obstacles may instead be detected over the whole monitoring area or over partial areas within it.
100 ... railway transport system, 100A ... surroundings observation system, 101 ... self-position estimation system, 102 ... vehicle, 103 ... obstacle detection system, 105 ... vehicle driving control unit, 106 ... vehicle drive unit, 107 ... surrounding environment observation unit, 108 ... detection range setting database, 109 ... monitoring area setting processing unit, 110 ... detection target information database, 111 ... side boundary monitoring unit, 112 ... front boundary monitoring unit, 113 ... obstacle detection unit, 114 ... observation data selection processing unit, 115 ... vehicle attitude estimation processing unit, 116 ... surrounding environment data coordinate conversion processing unit, 117 ... surrounding environment map database, 118 ... 3D rail track database, 119 ... scan matching self-position estimation unit, 142 ... drive command, 151 ... surrounding environment observation data, 153 ... position/attitude information, 154 ... detection range, 155 ... side boundary, 156 ... front boundary, 159 ... monitoring result, 160 ... monitoring result, 161 ... obstacle information

Claims (10)

1. A surroundings observation system comprising: a surrounding environment observation unit that observes, from the position of a vehicle traveling on a track, the positions of a group of surrounding objects and outputs the result as an observation result; and a position estimation unit that stores map data created by observing the positions of the object group in advance, calculates a correlation between the observation result and the map data with the position of the vehicle in the observation result constrained onto the track in the map data, and estimates a vehicle position based on the correlation.

2. The surroundings observation system according to claim 1, wherein the surrounding environment observation unit observes the track, and the position estimation unit obtains an attitude of the vehicle with respect to the track based on the observation result of the track.

3. The surroundings observation system according to claim 2, wherein the position estimation unit coordinate-transforms the observation result of the surrounding environment observation unit based on the attitude of the vehicle, takes the correlation of the object group between the coordinate-transformed observation result and the map data at positions limited to the track, and estimates the vehicle position based on the correlation.

4. The surroundings observation system according to any one of claims 1 to 3, further comprising: an obstacle detection unit that detects an obstacle present in a monitoring area set for the track; and a boundary setting unit that changes the monitoring area based on the vehicle position estimated by the position estimation unit.

5. The surroundings observation system according to claim 4, wherein the monitoring area includes a boundary detection region of a predetermined width for detecting an obstacle at a boundary of the monitoring area, and the boundary setting unit changes the predetermined width of the boundary detection region based on the vehicle position estimated by the position estimation unit.

6. The surroundings observation system according to claim 4 or 5, wherein the monitoring area is defined by side boundaries set on the left and right of the track and a front boundary set ahead in the traveling direction of the track.

7. The surroundings observation system according to any one of claims 4 to 6, wherein the boundary setting unit changes the monitoring area in a station section based on a positional relationship between the vehicle position estimated by the position estimation unit and the station section.

8. The surroundings observation system according to any one of claims 4 to 7, wherein the boundary setting unit changes the monitoring area in a maintenance work section based on a positional relationship between the vehicle position estimated by the position estimation unit and the maintenance work section of the track.

9. A surroundings observation program that causes a computer system to function as the surroundings observation system according to any one of claims 1 to 8.

10. A surroundings observation method comprising: a surrounding environment observation step of observing a group of surrounding objects from a vehicle traveling on a track; and a position estimation step of storing map data created by observing the object group in advance, taking a correlation of the object group between an observation result of the surrounding environment observation step and the map data at positions limited to the track, and estimating a vehicle position based on the correlation.
PCT/JP2020/024144 2019-09-02 2020-06-19 Surroundings observation system, surroundings observation program, and surroundings observation method WO2021044707A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019159519A JP7227879B2 (en) 2019-09-02 2019-09-02 Surrounding Observation System, Surrounding Observation Program and Surrounding Observation Method
JP2019-159519 2019-09-02

Publications (1)

Publication Number Publication Date
WO2021044707A1 (en) 2021-03-11

Family

ID=74849151

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/024144 WO2021044707A1 (en) 2019-09-02 2020-06-19 Surroundings observation system, surroundings observation program, and surroundings observation method

Country Status (2)

Country Link
JP (1) JP7227879B2 (en)
WO (1) WO2021044707A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102579743B1 (en) * 2021-10-12 2023-09-19 한국해양과학기술원 System and method for emergency braking of the towing carriage using laser scanner

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002145072A (en) * 2000-11-10 2002-05-22 East Japan Railway Co Railroad crossing obstacle detecting device
JP2007240208A (en) * 2006-03-06 2007-09-20 Toyota Central Res & Dev Lab Inc Environment recognition device
JP2016162013A (en) * 2015-02-27 2016-09-05 株式会社日立製作所 Self position estimation device and mobile entity
JP2017001638A (en) * 2015-06-16 2017-01-05 西日本旅客鉄道株式会社 Train position detection system using image processing, and train position and environmental change detection system using image processing
JP2017083245A (en) * 2015-10-27 2017-05-18 株式会社明電舎 Clearance limit determination device
WO2019003436A1 (en) * 2017-06-30 2019-01-03 川崎重工業株式会社 Traveling-position identifying system, traveling-position identifying apparatus, and traveling-position identifying method for railroad car

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6354556B2 (en) 2014-12-10 2018-07-11 株式会社デンソー POSITION ESTIMATION DEVICE, POSITION ESTIMATION METHOD, POSITION ESTIMATION PROGRAM
JP6986936B2 (en) 2017-11-21 2021-12-22 株式会社日立製作所 Vehicle control system


Also Published As

Publication number Publication date
JP2021037811A (en) 2021-03-11
JP7227879B2 (en) 2023-02-22

Similar Documents

Publication Publication Date Title
CN109933064B (en) Multi-sensor safety path system for autonomous vehicles
US10222808B2 (en) Inspection system and method for performing inspections in a storage facility
US10006772B2 (en) Map production method, mobile robot, and map production system
KR102022773B1 (en) Apparatus for sensing location of autonomic vehicle and system for stopping right location using thereof
US10239692B2 (en) Article transport facility
JP6447863B2 (en) Moving body
CN109910955B (en) Rail transit tunnel barrier detection system and method based on transponder information transmission
CN109375629A (en) A kind of cruiser and its barrier-avoiding method that navigates
EP3763597B1 (en) Train position estimation device
US10388164B2 (en) Method and system for detecting an unoccupied region within a parking facility
WO2021044707A1 (en) Surroundings observation system, surroundings observation program, and surroundings observation method
JP7217094B2 (en) monitoring device
KR20230031344A (en) System and Method for Detecting Obstacles in Area Surrounding Vehicle
CN111487984B (en) Equipment control method and device and electronic equipment
JP7181754B2 (en) Obstacle detection system for track traveling vehicle and obstacle detection method
US20230202540A1 (en) Obstacle detection system, obstacle detection method, and self-location estimation system
WO2022268752A1 (en) Method for monitoring backward movement of an aircraft at an airport stand
AU2021371394B2 (en) Rail transportation system, method for controlling rail transportation system, and trackside facility shape measurement system
KR20220064111A (en) Ship block transportation equipment based on spatial information and method for detecting obstacle using the same
JP2015056123A (en) Environmental map generation control device of moving body, moving body, and environmental map generation method of moving body
US20230010630A1 (en) Anti-collision system for an aircraft and aircraft including the anti-collision system
KR20220064112A (en) Ship block transportation equipment using augmented reality image based on multi information and method for visualization using the same
CN115675492A (en) Satellite data processing method, control device, computer program, and storage medium for expanding or refining measurement data
KR20160112672A (en) Apparatus for processing Radar signal and the method thereof

Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20861640; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122 Ep: PCT application non-entry in European phase (Ref document number: 20861640; Country of ref document: EP; Kind code of ref document: A1)
Kind code of ref document: A1