WO2013058372A1 - Safety device for platform door - Google Patents

Safety device for platform door

Info

Publication number
WO2013058372A1
PCT/JP2012/077136
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional sensor
platform door
detection
dimensional
safety device
Prior art date
Application number
PCT/JP2012/077136
Other languages
English (en)
Japanese (ja)
Inventor
論平 大町
Original Assignee
ナブテスコ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ナブテスコ株式会社
Publication of WO2013058372A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61B RAILWAY SYSTEMS; EQUIPMENT THEREFOR NOT OTHERWISE PROVIDED FOR
    • B61B1/00 General arrangement of stations, platforms, or sidings; Railway networks; Rail vehicle marshalling systems
    • B61B1/02 General arrangement of stations and platforms including protection devices for the passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61L GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00 Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04 Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
    • B61L23/041 Obstacle detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • the present invention relates to a platform door safety device for detecting an object existing in a predetermined area near the platform door.
  • conventionally, there is known a safety device that includes a projector, which is a light emitting unit that emits light in the vicinity of the platform door, and an image sensor, which is a light receiving element that receives the light emitted from the projector and reflected by an object,
  • a distance image sensor, which is a time-of-flight type three-dimensional sensor that generates, based on the output of the image sensor, a distance image, i.e., an image representing the distance of the object for each pixel, and
  • a calculation unit, which is in-region object detection means for detecting, based on the pixels of the distance image generated by the distance image sensor, an object existing in a predetermined region near the platform door (see Patent Document 1).
  • the distance image sensor is a time-of-flight type three-dimensional sensor that acquires the distance of the object to itself based on the timing of the output peak of the imaging device
  • when an object moves at high speed outside the area in which objects are to be detected (hereinafter referred to as the "detection area"), a phenomenon may occur in which the calculation unit erroneously detects that the object exists in the detection area.
  • the inventor of the present application has recognized this problem, which is specifically described below.
  • FIG. 12 is a side view of a platform 910 provided with a platform door safety device 930, which is an example of a conventional platform door safety device.
  • a platform door 920 is installed on the platform 910.
  • the platform door 920 includes door main bodies 921 and 922 and door pockets 923 and 924 for storing the door main bodies 921 and 922, respectively.
  • drive units (not shown) for driving the door main bodies 921 and 922 are housed.
  • the platform door safety device 930 includes a time-of-flight (TOF) type three-dimensional sensor 940 that generates a distance image, i.e., an image representing for each pixel the distance of an object relative to the sensor, and a computer 950 that detects, based on the distance image generated by the three-dimensional sensor 940, an object existing in the detection area 930a indicated by diagonal lines.
  • each pixel of the distance image is a point having information on the distance of the object with respect to the three-dimensional sensor 940.
  • the distance image is a two-dimensional set of these pixels and is three-dimensional data indicating the position of the object with respect to the three-dimensional sensor 940.
  • the three-dimensional sensor 940 is fixed to the track side of the door pocket 923 of the platform door 920.
  • the three-dimensional sensor 940 includes a light emitting unit (not shown) that emits light in the vicinity of the platform door 920 and a light receiving element (not shown) that receives light emitted from the light emitting unit and reflected by an object.
  • the three-dimensional sensor 940 can measure the distance of an object existing in a predetermined angle range 940a.
  • the three-dimensional sensor 940 incorporates an IC (Integrated Circuit), and acquires the distance of the object relative to itself based on the peak timing of the output of the light receiving element.
  • Positions P1, P2, and P3 are positions at distances L, 2L, and 3L from the three-dimensional sensor 940, respectively.
  • FIGS. 13A to 13D are graphs showing the light emission pattern of the light emitting unit of the three-dimensional sensor 940 and the light reception patterns of the light receiving element that receives the light emitted from the light emitting unit and reflected by the object 990 (hereinafter, the "target light receiving element") when the object 990 is present at the positions P1, P2, and P3. The light emission pattern is shown in FIG. 13A, and the light reception patterns of the target light receiving element when the object 990 is present at the positions P1, P2, and P3 are shown in FIGS. 13B, 13C, and 13D, respectively.
  • the light emitting unit of the three-dimensional sensor 940 emits light at a time period of 4T.
  • the target light receiving element of the three-dimensional sensor 940 receives light with a time period of 4T, delayed by T from the light emission time of the light emitting unit, when the object 990 is present at the position P1; delayed by 2T when the object 990 is present at the position P2; and delayed by 3T when the object 990 is present at the position P3.
  • the light emitted from the light emitting unit of the three-dimensional sensor 940 attenuates as the traveling distance increases. Therefore, the output of the target light receiving element when the object 990 is present at each of the positions P1, P2, and P3 decreases in the order of the positions P1, P2, and P3.
  • the three-dimensional sensor 940 generates a one-frame distance image by accumulating, for each time within the period, the outputs of the light receiving element over a predetermined plurality of periods.
  • FIGS. 14 (A) to 14 (C) are graphs showing the output of the target light receiving element when the outputs over the predetermined plurality of cycles are accumulated for each time within the cycle, with the object 990 present at each of the positions P1, P2, and P3.
  • FIGS. 14A, 14B, and 14C are graphs showing the output of the target light receiving element when the object 990 is present at the positions P1, P2, and P3, respectively.
  • the three-dimensional sensor 940 can calculate L, which is the distance of the object 990 relative to itself, as T ⁇ (1/2) ⁇ c.
  • c is the speed of light.
  • the three-dimensional sensor 940 calculates the distance of the object 990 relative to itself as 2T ⁇ (1/2) ⁇ c, that is, 2L.
  • the three-dimensional sensor 940 calculates the distance of the object 990 relative to itself as 3T ⁇ (1/2) ⁇ c, that is, 3L.
  • the three-dimensional sensor 940 calculates the distance of the object 990 as described above for each light receiving element, and generates a one-frame distance image representing the distance of the object 990 relative to itself for each pixel.
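  • as a brief illustration of the relation used above (a sketch for this description only, not part of the patent), the following Python snippet converts a measured peak delay of the light receiving element into a distance using distance = delay × (1/2) × c; delays of T, 2T, and 3T then map to L, 2L, and 3L.

```python
# Illustrative sketch only: convert the peak delay of a light receiving
# element into a distance, as in distance = delay * (1/2) * c.
C = 299_792_458.0  # speed of light in m/s


def distance_from_peak_delay(delay_s: float) -> float:
    """Distance of the object from the sensor for a round-trip delay."""
    return delay_s * 0.5 * C


# Example: a round-trip peak delay of 20 ns corresponds to about 3 m.
print(distance_from_peak_delay(20e-9))  # ~2.998 m
```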
  • FIG. 15 is a side view of the platform 910 when the object 990 moves at high speed in the order of positions P4, P5, P6, and P7.
  • the object 990 passes a large number of positions other than the positions P5 and P6 between the position P4 and the position P7. However, for ease of understanding, only the positions P4, P5, P6, and P7 will be described below.
  • Positions P4, P5, P6, and P7 are positions at distances of 3.01L, 3L, 2.99L, and 2.98L from the three-dimensional sensor 940, respectively.
  • FIGS. 16A to 16D are graphs showing the light receiving patterns of the target light receiving elements when the object 990 is present at each of the positions P4, P5, P6, and P7.
  • Graphs showing the light receiving patterns of the target light receiving element when the object 990 is present at the positions P4, P5, P6, and P7 are shown in FIGS. 16 (A), 16 (B), 16 (C), and 16 (D), respectively. Since the directions of the positions P4, P5, P6, and P7 with respect to the three-dimensional sensor 940 differ from one another, the target light receiving element is a different light receiving element for each of the positions P4, P5, P6, and P7.
  • the target light receiving element of the three-dimensional sensor 940 receives light with a time period of 4T, delayed by 3.01T from the light emission time of the light emitting unit, when the object 990 is present at the position P4; delayed by 3T when the object 990 is present at the position P5; delayed by 2.99T when the object 990 is present at the position P6; and delayed by 2.98T when the object 990 is present at the position P7.
  • the light emitted from the light emitting unit of the three-dimensional sensor 940 attenuates as the traveling distance increases. Therefore, the output of the target light receiving element when the object 990 is present at each of the positions P4, P5, P6, and P7 increases in the order of the positions P4, P5, P6, and P7.
  • the light receiving pattern of the target light receiving element when the object 990 is present at each of the positions P4, P5, P6, and P7 is indicated by a broken line.
  • the time during which the object 990 exists at each of the positions P4, P5, and P6 is very short. Accordingly, the light actually received by the target light receiving element when the object 990 moves at high speed in the order of the positions P4, P5, P6, and P7 is limited to, for example, the ranges indicated by hatching in FIGS. 16 (A) to 16 (D).
  • the light received by the target light receiving element while the object 990 is at the position P4 falls only in the time range from 2.01T to 2.21T.
  • the light received by the target light receiving element while the object 990 is at the position P5 falls only in the time range from 6.4T to 6.6T.
  • the light received by the target light receiving element while the object 990 is at the position P6 falls only in the time range from 11.39T to 11.59T.
  • the light received by the target light receiving element after the object 990 reaches the position P7 falls only in a range after a certain time later than 12T.
  • FIGS. 17 (A) to 17 (D) are graphs showing the output of the target light receiving element when the outputs over the predetermined plurality of cycles are accumulated for each time within the cycle while the object 990 moves at high speed in the order of the positions P4, P5, P6, and P7. The outputs of the target light receiving elements corresponding to the positions P4, P5, and P6 are shown in FIGS. 17 (A), 17 (B), and 17 (C), respectively.
  • FIG. 17D is a graph showing the output of the target light receiving element when the object 990 is present at the position P7.
  • the peak timing of the output of the target light receiving element with respect to the position P4 is 2.21T. Accordingly, the three-dimensional sensor 940 calculates the distance of the object 990 relative to the position P4 as 2.21T ⁇ (1/2) ⁇ c, that is, 2.21L. Similarly, the three-dimensional sensor 940 calculates the distance of the object 990 relative to the position P5 as 2.6T ⁇ (1/2) ⁇ c, that is, 2.6L. The three-dimensional sensor 940 calculates the distance of the object 990 relative to the position P6 as 3.39T ⁇ (1/2) ⁇ c, that is, 3.39L. The three-dimensional sensor 940 calculates the distance of the object 990 relative to the position P7 as 2.98T ⁇ (1/2) ⁇ c, that is, 2.98L.
  • the three-dimensional sensor 940 calculates the distance of the object 990 as described above for each light receiving element, and generates a one-frame distance image representing the distance of the object 990 relative to itself for each pixel. That is, due to the influence of the object 990 moving at high speed through the positions P4, P5, and P6, the three-dimensional sensor 940 generates a one-frame distance image representing the object 990 as existing not only at the position P7 but also at the positions P4′, P5′, and P6′ shown in FIG. 18.
  • FIG. 18 is a diagram illustrating an example of a distance image generated by the three-dimensional sensor 940.
  • FIG. 19 is a side view of the platform 910 for explaining the positions P4 ′, P5 ′, and P6 ′ shown in FIG.
  • information on the distance of each pixel is drawn according to the difference in hatching.
  • information on the distance that each pixel has is represented by, for example, a color difference.
  • the positions P4′, P5′, and P6′ are the positions at which the object 990, moving at high speed through the positions P4, P5, and P6, is erroneously measured by the three-dimensional sensor 940. Therefore, when the three-dimensional sensor 940 generates the distance image shown in FIG. 18, the computer 950 erroneously detects, at the position P4′, that the object 990 moving at high speed outside the detection area 930a exists in the detection area 930a.
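  • to make the mechanism of this erroneous measurement easier to follow, the following toy Python sketch (all numbers invented, and the model greatly simplified compared with the sensor described above) folds the reception times of one light receiving element into the emission period and accumulates them over a frame; when the element catches the reflection only during a brief window, the peak of the accumulated output, and hence the computed distance, no longer matches the object's actual position.

```python
# Toy sketch (assumed model, not the patent's implementation): a TOF
# sensor folds the receiving element's output into its emission period,
# accumulates it over one frame, and reads the distance from the peak bin.
def folded_peak(reception_bins, period_bins=40):
    """Fold absolute reception times (in bins) into the emission period,
    accumulate over the frame, and return the bin with the largest count."""
    hist = [0] * period_bins
    for t in reception_bins:
        hist[t % period_bins] += 1
    return hist.index(max(hist))


PERIOD = 40
TRUE_DELAY = 30  # a stationary object always reflects 30 bins late

# Stationary object: every cycle contributes at the same folded delay.
stationary = [cycle * PERIOD + TRUE_DELAY for cycle in range(1000)]
print(folded_peak(stationary))  # 30 -> consistent with the true distance

# Fast-moving object: this element catches light only during a brief
# absolute-time window, so the folded peak (here 20) no longer matches
# the delay that corresponds to the object's actual position.
brief_window = list(range(5 * PERIOD + 20, 5 * PERIOD + 25))
print(folded_peak(brief_window))  # 20 -> a spurious, closer distance
```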
  • an object of the present invention is to provide a platform door safety device that can detect an object existing in a predetermined region near the platform door more accurately than in the past.
  • a light emitting portion configured to be disposed near the platform door and configured to emit light;
  • a light receiving element configured to receive light emitted from the light emitting unit and reflected by an object;
  • a time-of-flight type three-dimensional sensor configured to acquire the distance of the object with respect to the three-dimensional sensor based on a peak timing of an output of the light receiving element and to generate three-dimensional data, which is a two-dimensional set of points each having the distance information and which indicates the position of the object relative to the three-dimensional sensor; detection means configured to detect the object present in a predetermined region near the platform door based on the points of the three-dimensional data generated by the three-dimensional sensor; and
  • selection means configured to select detection points, which are the points used for detection of the object by the detection means, from among the points of the three-dimensional data generated by the three-dimensional sensor,
  • wherein the selection means selects as the detection points only those points of the three-dimensional data generated by the three-dimensional sensor for which the amount of change in the distance from the three-dimensional data previously generated by the three-dimensional sensor satisfies a predetermined criterion.
  • the predetermined criterion may be a criterion that the amount of change is larger than a predetermined negative value.
  • the platform door safety device may include an operation execution unit configured to execute a predetermined operation for safety when the detection unit detects the object in the area.
  • the predetermined operation may be an operation of opening the platform door.
  • a light emitting portion configured to be disposed near the platform door and configured to emit light;
  • a light receiving element configured to receive light emitted from the light emitting unit and reflected by an object;
  • a time-of-flight type three-dimensional sensor configured to acquire the distance of the object with respect to the three-dimensional sensor based on a peak timing of an output of the light receiving element and to generate three-dimensional data, which is a two-dimensional set of points each having the distance information and which indicates the position of the object relative to the three-dimensional sensor; and a computer.
  • the control method comprises: a detection step of detecting, in the computer, the object present in a predetermined region near the platform door based on the points of the three-dimensional data generated by the three-dimensional sensor; and a selection step of selecting detection points, which are the points used for detection of the object by the detection step, from among the points of the three-dimensional data generated by the three-dimensional sensor. In the selection step, among the points of the three-dimensional data generated by the three-dimensional sensor, only the points for which the amount of change in the distance from the previously generated three-dimensional data satisfies a predetermined criterion are selected as the detection points.
  • the safety device for platform doors of the present invention does not select a point where the distance of the object with respect to the three-dimensional sensor changes abruptly among the points of the three-dimensional data generated by the three-dimensional sensor as the detection point. Thereby, when an object moves at high speed outside a predetermined area near the platform door, the possibility of erroneous detection that the object exists in that area can be reduced. Therefore, the platform door safety device of the present invention can detect an object existing in a predetermined area near the platform door more accurately than in the past.
  • in addition, the platform door safety device of the present invention may exclude from the detection points only the points that represent an object that has rapidly approached the three-dimensional sensor, i.e., the points, among those whose distance to the three-dimensional sensor has changed abruptly, that are most likely to cause an erroneous detection that an object is present in the predetermined region near the platform door. Compared with a configuration that excludes from the detection points not only the points representing an object that has rapidly approached the three-dimensional sensor but also the points representing an object that has rapidly moved away from it, the processing time for selecting the detection points can thereby be shortened.
  • the platform door safety device of the present invention can accurately detect an object present in a predetermined area near the platform door. As a result, it is possible to prevent troubles in the operation of the vehicle due to erroneous detection that an object is present in a predetermined area near the platform door.
  • the point where the distance of the object with respect to the three-dimensional sensor changes abruptly among the points of the three-dimensional data generated by the three-dimensional sensor is not selected as the detection point.
  • according to the platform door safety device and the control method therefor of the present invention, it is possible to detect an object existing in a predetermined area near the platform door more accurately than in the past.
  • FIG. 1 is a side view of a platform on which a platform door safety device according to an embodiment of the present invention is installed.
  • FIG. 2 is a plan view of a portion of the platform shown in FIG.
  • FIG. 3 is a front view of the three-dimensional sensor shown in FIG. 1. FIG. 4 is a side view of the platform shown in FIG. 1, showing the detection area of the platform door safety device.
  • FIG. 5 is a plan view of a part of the platform shown in FIG. 1 showing the detection area of the platform door safety device.
  • FIG. 6 is a block diagram of a hardware configuration of the computer shown in FIG.
  • FIG. 7 is a block diagram of functions of the computer shown in FIG.
  • FIG. 8 is a flowchart of the operation of the detection pixel selecting means shown in FIG. 7.
  • FIG. 9 is a flowchart of the operation of the detection pixel selecting unit shown in FIG. 7, and is a flowchart of an operation different from the operation shown in FIG.
  • FIG. 10 is a flowchart of the operation of the detection pixel selecting means of the platform door safety device according to the second embodiment.
  • FIG. 11 is an explanatory diagram of one process in the operation shown in FIG.
  • FIG. 12 is a side view of a platform on which a conventional platform door safety device is installed.
  • FIG. 13A is a graph showing the light emission pattern of the light emitting unit of the three-dimensional sensor shown in FIG. 12.
  • FIG. 13B is a graph showing the light receiving pattern of the target light receiving element of the three-dimensional sensor shown in FIG. 12 when an object is present at the position P1.
  • FIG. 13C is a graph showing the light receiving pattern of the target light receiving element when an object is present at the position P2, and FIG. 13D is a graph showing the light receiving pattern of the target light receiving element when an object is present at the position P3.
  • FIG. 14A is a graph showing the output of the target light receiving element of the three-dimensional sensor shown in FIG. 12 when the outputs over a predetermined plurality of cycles are accumulated for each time within the cycle while an object is present at the position P1.
  • FIG. 14B is the corresponding graph when an object is present at the position P2, and FIG. 14C is the corresponding graph when an object is present at the position P3.
  • FIG. 15 is a side view of the platform shown in FIG. 12 when an object moves at high speed in the order of positions P4, P5, P6, and P7.
  • FIG. 16A is a graph showing the light receiving pattern of the target light receiving element of the three-dimensional sensor shown in FIG. 12 when an object is present at the position P4.
  • FIG. 16B is a graph showing the light receiving pattern of the target light receiving element when an object is present at the position P5.
  • FIG. 16C is a graph showing the light receiving pattern of the target light receiving element when an object is present at the position P6.
  • FIG. 16D is a graph showing the light receiving pattern of the target light receiving element when an object is present at the position P7.
  • FIG. 17A is a graph showing the output of the target light receiving element of the three-dimensional sensor shown in FIG. 12 when the outputs over a predetermined plurality of periods in the period during which the object is present at the position P4 are accumulated for each time within the period, with the object moving at high speed in the order of the positions P4, P5, P6, and P7.
  • FIG. 17B is the corresponding graph for the period during which the object is present at the position P5.
  • FIG. 17C is the corresponding graph for the period during which the object is present at the position P6.
  • FIG. 17D is the corresponding graph for the period during which the object is present at the position P7.
  • FIG. 18 is a diagram illustrating an example of a distance image generated by the three-dimensional sensor illustrated in FIG. 12.
  • FIG. 19 is a side view of the platform for explaining the positions P4 ′, P5 ′, and P6 ′ shown in FIG.
  • FIG. 1 is a side view of a platform 10 on which a platform door safety device 30 according to the present embodiment is installed.
  • FIG. 2 is a plan view of a part of the platform 10.
  • the platform 10 is provided with a platform door 20.
  • the platform door 20 includes door main bodies 21 and 22 and door pockets 23 and 24 for storing the door main bodies 21 and 22, respectively.
  • drive units (not shown) for driving the door main bodies 21 and 22 are housed.
  • the platform door safety device 30 includes a time-of-flight (TOF) type three-dimensional sensor 40 that generates a distance image, i.e., an image representing for each pixel the distance of an object relative to the sensor, and a computer 50 that detects an object based on the pixels of the distance image generated by the three-dimensional sensor 40.
  • each pixel of the distance image is a point having information on the distance of the object with respect to the three-dimensional sensor 40.
  • the distance image is a two-dimensional set of these pixels and is three-dimensional data indicating the position of the object with respect to the three-dimensional sensor 40.
  • the three-dimensional sensor 40 is fixed to the track side of the door pocket 23 of the platform door 20.
  • FIG. 3 is a front view of the three-dimensional sensor 40.
  • FIG. 4 is a side view of the platform 10 showing the detection area 30a of the platform door safety device 30.
  • FIG. 5 is a plan view of a part of the platform 10 showing the detection area 30a of the platform door safety device 30.
  • the three-dimensional sensor 40 includes a plurality of LEDs (Light Emitting Diodes) 41 serving as a light emitting unit that emits light in the vicinity of the platform door 20, a plurality of light receiving elements 42 that receive the light emitted from the LEDs 41 and reflected by an object, and a band-pass filter 43 disposed in front of the light receiving elements 42.
  • the three-dimensional sensor 40 can measure the distance of an object existing in a predetermined angle range 40a.
  • the three-dimensional sensor 40 incorporates an IC, and acquires the distance of the object relative to itself based on the peak timing of the output of the light receiving element 42 as in the conventional three-dimensional sensor 940 described above.
  • the plurality of LEDs 41 are arranged so as to surround the plurality of light receiving elements 42.
  • the LED 41 emits only light having a predetermined wavelength.
  • the plurality of light receiving elements 42 are arranged vertically and horizontally and receive light from the respective responsible ranges in the range 40a.
  • the plurality of LEDs 41 emit light simultaneously to the respective responsible ranges of the plurality of light receiving elements 42.
  • the band pass filter 43 passes only light having a wavelength in a predetermined range including the wavelength of light emitted from the LED 41.
  • FIG. 6 is a block diagram of the hardware configuration of the computer 50.
  • the computer 50 includes, for example, a CPU (Central Processing Unit) 51; a ROM (Read Only Memory) 52 that stores in advance programs such as the safety device program 52a of the present invention and various data; a RAM (Random Access Memory) 53, which is a rewritable volatile storage device used as a work area of the CPU 51; and a communication unit 54, such as a USB (Universal Serial Bus) interface, for communicating with external devices such as the drive unit of the platform door 20 and the three-dimensional sensor 40.
  • the CPU 51 is an arithmetic processing unit that operates the computer 50 by executing a program stored in the ROM 52.
  • the RAM 53 temporarily stores programs and various data when the CPU 51 executes the programs.
  • the safety device program 52a may be installed in the computer 50 at the manufacturing stage of the computer 50, or may be additionally installed in the computer 50 from a storage medium such as a CD (Compact Disc) or DVD (Digital Versatile Disc) or from a network.
  • FIG. 7 is a block diagram of functions of the computer 50.
  • the CPU 51 executes the safety device program 52a stored in the ROM 52, whereby the computer 50 functions as in-region object detection means 50a (detection means) that detects, based on the pixels of the distance image generated by the three-dimensional sensor 40, an object existing in the detection area 30a near the platform door 20 indicated by hatching in FIGS. 4 and 5;
  • detection pixel selection means 50b (selection means) that selects detection pixels, i.e., the pixels used as detection points for detection of an object by the detection means 50a; and
  • safe operation execution means 50c (operation execution means) that executes a predetermined operation for safety when an object is detected within the detection area 30a by the detection means 50a.
  • when the platform door 20 is closed, the computer 50 of the platform door safety device 30 causes the plurality of LEDs 41 of the three-dimensional sensor 40 to emit light toward the range 40a shown in FIGS. 4 and 5.
  • the three-dimensional sensor 40 receives light emitted from the plurality of LEDs 41 and reflected by the object by the plurality of light receiving elements 42.
  • like the conventional three-dimensional sensor 940 described above, the three-dimensional sensor 40 accumulates, for each time within the period, the outputs of the light receiving elements 42 over a predetermined plurality of periods, acquires the three-dimensional position of the object based on the difference between the time at which light is emitted from the plurality of LEDs 41 and the time at which each of the plurality of light receiving elements 42 receives the light, and generates a one-frame distance image.
  • the three-dimensional sensor 40 transmits the distance images generated as described above to the computer 50 one after another.
  • the three-dimensional sensor 40 transmits distance images to the computer 50 at 20 frames per second. One emission period described above is therefore a period obtained by further subdividing 1/20 second.
  • upon receiving a distance image from the three-dimensional sensor 40, the selection means 50b of the computer 50 selects as detection pixels only those pixels of the distance image for which the amount of change in distance from the distance image generated by the three-dimensional sensor 40 one frame earlier satisfies a predetermined criterion. That is, the selection means 50b applies the process shown in FIG. 8 to the distance images received one after another from the three-dimensional sensor 40. Note that, since there is no distance image one frame before the first-frame distance image, the selection means 50b selects all pixels of the first frame as detection pixels.
  • FIG. 8 is a flowchart of the operation of the selection means 50b.
  • the selection means 50b calculates the amount of change in distance from the distance image one frame before for all the pixels in the distance image of the current frame (S101). Since the time between frames of the distance image is clear, the amount of change calculated in S101 can be grasped as the speed of the object on a straight line connecting the three-dimensional sensor 40 and the object.
  • the selection unit 50b targets the first pixel in the distance image of the current frame (S102).
  • the selection unit 50b determines whether or not a predetermined criterion, that is, a criterion that the amount of change calculated in S101 is greater than a predetermined negative value is satisfied for the pixel currently being processed (S103).
  • the value used for the determination in S103 is desirably set so that a person existing in the detection region 30a is appropriately detected by the detection means 50a.
  • the value used for the determination in S103 is desirably set so that the selection means 50b selects as detection pixels the pixels representing an object moving away from the three-dimensional sensor 40, the pixels representing an object whose position relative to the three-dimensional sensor 40 does not change, and the pixels representing an object approaching the three-dimensional sensor 40 at a speed of 10 m/s or less (an illustrative choice of such a value is used in the sketch following this flowchart description).
  • if the selection means 50b determines in S103 that the amount of change calculated in S101 is greater than the predetermined negative value, it selects the pixel currently being processed as a detection pixel (S104).
  • if the selection means 50b determines in S103 that the amount of change calculated in S101 is equal to or less than the predetermined negative value, it does not select the pixel currently being processed as a detection pixel (S105).
  • the selection means 50b then determines whether or not all pixels of the distance image of the current frame have been processed (S106).
  • if the selection means 50b determines in S106 that not all pixels of the distance image of the current frame have been processed, it sets the pixel following the current one in the distance image of the current frame as the new target (S107) and returns to S103.
  • when the selection means 50b determines in S106 that all pixels of the distance image of the current frame have been processed, the process shown in FIG. 8 ends.
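  • a minimal sketch of the selection step shown in FIG. 8, under an assumed data layout (distance images as 2-D arrays of per-pixel distances in metres); the threshold is derived, purely for illustration, from the 20 frames per second and the 10 m/s figure mentioned above, giving roughly -0.5 m per frame, since the patent itself does not state a concrete value.

```python
# Minimal sketch (assumed layout and threshold) of the FIG. 8 selection:
# keep as detection pixels only those pixels whose distance change from
# the previous frame is greater than a predetermined negative value.
import numpy as np

FRAME_RATE_HZ = 20.0         # distance images per second (stated above)
MAX_APPROACH_SPEED = 10.0    # m/s; assumed design figure (see above)
# In practice the threshold would be chosen slightly below -0.5 m so that
# an object approaching at exactly 10 m/s is still selected.
NEGATIVE_THRESHOLD = -MAX_APPROACH_SPEED / FRAME_RATE_HZ  # -0.5 m per frame


def select_detection_pixels(current, previous):
    """Return a boolean mask of detection pixels for the current frame.

    `current` and `previous` are 2-D arrays of per-pixel distances in
    metres; `previous` is None for the very first frame, in which case
    every pixel is selected, as described above.
    """
    if previous is None:
        return np.ones(current.shape, dtype=bool)  # first frame: keep all
    change = current - previous                    # S101: per-pixel change
    return change > NEGATIVE_THRESHOLD             # S103-S105 for all pixels
```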
  • the detection means 50a of the computer 50 detects an object present in the detection area 30a based on the detection pixels selected by the selection means 50b from among the pixels of the distance image of the current frame. That is, for the distance images received one after another from the three-dimensional sensor 40, the detection means 50a determines whether or not any detection pixel exists in the detection area 30a, and when it determines that a detection pixel exists in the detection area 30a, it determines that an object exists in the detection area 30a.
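  • a sketch of this detection step, under an assumed representation of the detection area 30a (per pixel, a near/far distance band); the patent does not describe the area in this form, so the band arrays are hypothetical.

```python
# Sketch only: report an object in the detection area if any selected
# detection pixel's measured distance falls inside that pixel's assumed
# [near, far] band of the area.  The band representation is hypothetical.
import numpy as np


def object_in_detection_area(distance_img, detection_mask, area_near, area_far):
    # All arguments are 2-D arrays of the same shape as the distance image.
    inside = (distance_img >= area_near) & (distance_img <= area_far)
    return bool(np.any(inside & detection_mask))
```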
  • the operation execution means 50c of the computer 50 performs predetermined operations for safety, such as an operation of opening the platform door 20 or an operation of sounding an alarm when an object is detected in the detection area 30a by the detection means 50a. Execute.
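  • a minimal sketch of how the operation execution means could be wired to the detection result; the callback is a hypothetical placeholder for the real door-drive or alarm interface, which the patent does not specify as code.

```python
# Sketch only: run the predetermined safe operation when an object was
# detected.  `safe_operation` is a hypothetical callable standing in for
# an action such as reopening the platform door 20 or sounding an alarm.
def execute_safe_operation(object_detected, safe_operation):
    if object_detected:
        safe_operation()
```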
  • the platform door safety device 30 does not select as a detection pixel a pixel, among the pixels of the distance image generated by the three-dimensional sensor 40, in which the distance of the object with respect to the three-dimensional sensor 40 has changed abruptly (S105).
  • the platform door safety device 30 can more accurately detect an object existing in the detection region 30a in the vicinity of the platform door 20 than before.
  • in the present embodiment, the criterion for selecting a pixel as a detection pixel is that the amount of change in distance from the distance image one frame earlier is larger than a predetermined negative value.
  • accordingly, among the pixels in which the distance of the object to the three-dimensional sensor 40 has changed abruptly, the platform door safety device 30 excludes from the detection pixels only the pixels representing an object that has rapidly approached the three-dimensional sensor 40, and not the pixels representing an object that has rapidly moved away from it.
  • compared with a configuration that excludes both, the processing time for selecting the detection pixels can therefore be shortened.
  • alternatively, the platform door safety device 30 may adopt a configuration in which, among the pixels in which the distance of the object to the three-dimensional sensor 40 has changed abruptly, not only the pixels representing an object that has rapidly approached the three-dimensional sensor 40 but also the pixels representing an object that has rapidly moved away from it are excluded from the detection pixels.
  • by performing the process shown in FIG. 9 instead of the process shown in FIG. 8, the selection means 50b can exclude from the detection pixels both the pixels representing an object that has rapidly approached the three-dimensional sensor 40 and the pixels representing an object that has rapidly moved away from it.
  • FIG. 9 is a flowchart of the operation of the selection means 50b, and is a flowchart of an operation different from the operation shown in FIG.
  • the operation shown in FIG. 9 is obtained by adding the process of S121 to the operation shown in FIG. 8. That is, if the selection means 50b determines in S103 that the amount of change calculated in S101 is larger than the predetermined negative value, it further determines whether the pixel currently being processed satisfies the criterion that the amount of change calculated in S101 is smaller than a predetermined positive value (S121). If the selection means 50b determines in S121 that the amount of change calculated in S101 is smaller than the predetermined positive value, it selects the pixel currently being processed as a detection pixel (S104).
  • if the selection means 50b determines in S121 that the amount of change calculated in S101 is equal to or greater than the predetermined positive value, it does not select the pixel currently being processed as a detection pixel (S105).
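  • a sketch of the FIG. 9 variant under the same assumptions as the earlier sketch, adding the S121 check so that pixels whose distance increased too quickly are also excluded; both threshold values are illustrative, not taken from the patent.

```python
# Sketch (assumed thresholds) of the FIG. 9 variant: discard pixels that
# approached or receded from the sensor too quickly between frames.
import numpy as np

NEGATIVE_THRESHOLD = -0.5  # metres per frame; illustrative values only
POSITIVE_THRESHOLD = 0.5


def select_detection_pixels_fig9(current, previous):
    if previous is None:
        return np.ones(current.shape, dtype=bool)
    change = current - previous                     # S101: per-pixel change
    return (change > NEGATIVE_THRESHOLD) & (change < POSITIVE_THRESHOLD)  # S103, S121
```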
  • in the platform door safety device 30, the operation execution means 50c executes a predetermined operation for safety when an object is detected in the detection area 30a near the platform door 20 by the detection means 50a, so an erroneous detection that an object exists in the detection area 30a hinders the operation of the vehicle 90. For example, when a person on the platform 10 waves a hand toward a passenger of the vehicle 90 outside the detection area 30a after the platform door 20 has closed and before the vehicle 90 departs, the hand moves at high speed outside the detection area 30a.
  • the platform door safety device 30 has a great significance in accurately detecting an object existing in the detection region 30a in the vicinity of the platform door 20.
  • in particular, when the platform door safety device 30 is configured so that the operation execution means 50c performs an operation of opening the platform door 20 as the predetermined operation for safety when the detection means 50a detects an object in the detection area 30a, an erroneous detection that an object is present in the detection area 30a near the platform door 20 greatly hinders the operation of the vehicle 90, so the significance of accurately detecting an object present in the detection area 30a near the platform door 20 is particularly large.
  • the platform door safety device 30 does not have to perform a predetermined operation for safety when an object is detected in the detection area 30a by the detection means 50a.
  • the platform door safety device according to the second embodiment is the same as the platform door safety device 30 described above except for the operation of the selection means 50b.
  • upon receiving a distance image from the three-dimensional sensor 40, the selection means 50b of the platform door safety device according to the second embodiment selects as detection pixels only those pixels of the distance image generated by the three-dimensional sensor 40 for which the number of surrounding pixels whose distance differs by a predetermined distance or more is less than a predetermined number. That is, the selection means 50b applies the process shown in FIG. 10 to the distance images received one after another from the three-dimensional sensor 40.
  • FIG. 10 is a flowchart of the operation of the platform door safety device selecting means 50b according to the second embodiment.
  • the selection means 50b targets the first pixel in the distance image of the current frame (S201).
  • the selection unit 50b acquires the number of pixels having a difference equal to or more than a predetermined distance from the current target pixel among the pixels around the current target pixel (S202). For example, when the pixel currently targeted is the pixel 220 shown in FIG. 11, the selection unit 50b calculates the difference in distance from the pixel 220 for each of the pixels 221 to 228 around the pixel 220, and calculates the calculated distance. By determining whether or not the difference is equal to or greater than a predetermined distance, the number of pixels having a difference equal to or greater than the predetermined distance from the pixel 220 among the pixels 221 to 228 around the pixel 220 is obtained. When the pixel targeted by the selection unit 50b is a pixel forming the contour of the distance image, the number of surrounding pixels is less than eight.
  • the distance used as the determination criterion in S202 is preferably set so that a person existing in the detection area 30a is appropriately detected by the detection means 50a.
  • the selection unit 50b determines whether or not the number acquired in S202 is less than a predetermined number (S203).
  • the number used as the determination criterion in S203 may be set to an appropriate number such as 3, for example.
  • if the selection means 50b determines in S203 that the number acquired in S202 is less than the predetermined number, it selects the pixel currently being processed as a detection pixel (S204).
  • if the selection means 50b determines in S203 that the number acquired in S202 is equal to or greater than the predetermined number, it does not select the pixel currently being processed as a detection pixel (S205).
  • when the process of S204 or S205 is completed, the selection means 50b determines whether or not all pixels of the distance image of the current frame have been processed (S206).
  • if the selection means 50b determines in S206 that not all pixels of the distance image of the current frame have been processed, it sets the pixel following the current one in the distance image of the current frame as the new target (S207) and returns to S202.
  • when the selection means 50b determines in S206 that all pixels of the distance image of the current frame have been processed, the process shown in FIG. 10 ends.
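  • a sketch of the second embodiment's selection (FIG. 10), with assumed parameter values: a pixel is kept only if fewer than a predetermined number of its (up to eight) surrounding pixels differ from it by a predetermined distance or more.

```python
# Sketch (assumed parameters) of the FIG. 10 selection: count, for each
# pixel, the surrounding pixels whose distance differs by at least a
# predetermined amount, and keep the pixel only if that count is below
# a predetermined number.
import numpy as np

DIST_THRESHOLD = 0.5  # metres; illustrative value for S202
NUM_THRESHOLD = 3     # neighbours; the text suggests e.g. 3 for S203


def select_detection_pixels_fig10(distance_img):
    h, w = distance_img.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if dy == 0 and dx == 0:
                        continue
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        if abs(distance_img[ny, nx] - distance_img[y, x]) >= DIST_THRESHOLD:
                            count += 1  # S202: this neighbour differs too much
            mask[y, x] = count < NUM_THRESHOLD  # S203-S205
    return mask
```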
  • in a distance image generated by the three-dimensional sensor 40, a pixel representing an object that has been measured at a wrong position by the three-dimensional sensor 40 tends to have a distance that differs extremely from those of the surrounding pixels; the second embodiment therefore excludes such pixels from the detection pixels.
  • the second embodiment requires more processing time to select the detection pixels than the first embodiment, but it can select the detection pixels for each frame independently of other frames.
  • as described above, the present invention can provide a platform door safety device capable of detecting an object existing in a predetermined region near the platform door more accurately than in the past.
  • 10 Platform; 20 Platform door; 30 Platform door safety device; 30a Detection area (predetermined area); 40 Three-dimensional sensor; 41 LED (light emitting unit); 42 Light receiving element; 50 Computer; 50a In-region object detection means (detection means); 50b Detection pixel selection means (selection means); 50c Safe operation execution means (operation execution means); 52a Safety device program

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Transportation (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Platform Screen Doors And Railroad Systems (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a platform door safety device having a time-of-flight type three-dimensional sensor. The device is provided with: detection means configured to detect an object present in a prescribed region near a platform door on the basis of the points of the three-dimensional data sensed by the three-dimensional sensor; and selection means configured to select, from the three-dimensional data, detection points, which are the points to be used by the detection means to detect the aforementioned object. The selection means selects the detection points from among the points of the three-dimensional data by selecting only those points for which the amount of change in the distance of the object to the three-dimensional sensor between the current set of three-dimensional data and the preceding set of three-dimensional data meets the prescribed criterion.
PCT/JP2012/077136 2011-10-21 2012-10-19 Dispositif de sécurité pour porte de plateforme WO2013058372A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011231500A JP5793055B2 (ja) 2011-10-21 2011-10-21 プラットホームドア用安全装置
JP2011-231500 2011-10-21

Publications (1)

Publication Number Publication Date
WO2013058372A1 true WO2013058372A1 (fr) 2013-04-25

Family

ID=48141015

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/077136 WO2013058372A1 (fr) 2011-10-21 2012-10-19 Dispositif de sécurité pour porte de plateforme

Country Status (2)

Country Link
JP (1) JP5793055B2 (fr)
WO (1) WO2013058372A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6386263B2 (ja) * 2014-06-26 2018-09-05 東日本旅客鉄道株式会社 画像形成装置、画像形成システム及び画像形成プログラム
JP6544947B2 (ja) * 2015-03-03 2019-07-17 日本信号株式会社 ロープ式ホーム安全柵
JP6529477B2 (ja) * 2016-10-28 2019-06-12 三菱電機プラントエンジニアリング株式会社 センシングエリア調整治具

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010098454A1 (fr) * 2009-02-27 2010-09-02 パナソニック電工株式会社 Appareil de mesure de la distance
JP2010266270A (ja) * 2009-05-13 2010-11-25 Toyota Motor Corp 周辺物体検出装置、周辺物体検出方法
JP2011022088A (ja) * 2009-07-17 2011-02-03 Panasonic Electric Works Co Ltd 空間情報検出装置
JP2011093514A (ja) * 2009-09-29 2011-05-12 Nabtesco Corp プラットホームドアの安全装置


Also Published As

Publication number Publication date
JP5793055B2 (ja) 2015-10-14
JP2013086740A (ja) 2013-05-13

Similar Documents

Publication Publication Date Title
US11681029B2 (en) Detecting a laser pulse edge for real time detection
US10830881B2 (en) Active signal detection using adaptive identification of a noise floor
CN111919138B (zh) 检测激光脉冲边沿以进行实时检测
JP6272566B2 (ja) 経路予測装置
CN105702088B (zh) 警报装置
CN105100780A (zh) 使用选择的像素阵列分析的光学安全监视
JP2013225295A5 (fr)
JP2013174568A (ja) 移動体追跡装置、移動体追跡方法、及びプログラム
JP2016148514A (ja) 移動物体追跡方法および移動物体追跡装置
WO2013058372A1 (fr) Dispositif de sécurité pour porte de plateforme
KR102198809B1 (ko) 객체 추적 시스템 및 방법
KR20210056694A (ko) 충돌을 회피하는 방법 및 이를 구현한 로봇 및 서버
JPWO2018003093A1 (ja) 人数推定装置、人数推定プログラム及び通過数推定装置
CN114537474A (zh) 一种列车行驶安全的防护方法及装置
JP2018036877A (ja) 回避経路推定装置、回避経路推定方法及び回避経路推定プログラム
JP2015201010A (ja) 予測外動き判定装置
JP6197724B2 (ja) 距離画像生成装置
CN111886637B (zh) 信息处理装置、信息处理方法和计算机能读取的记录介质
JP2017111529A (ja) 障害物判定装置
JPWO2020054110A1 (ja) 物体検出システム、および物体検出方法
US20220334252A1 (en) Information processing method and information processing device
JP3592531B2 (ja) 車両用距離測定装置
US11675077B2 (en) Systems and methods for analyzing waveforms using pulse shape information
KR102134717B1 (ko) 물류이송 시스템
JP2017162281A (ja) 移動体検知システムおよびそのデータ整合性判定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12840928

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12840928

Country of ref document: EP

Kind code of ref document: A1