WO2012077511A1 - Appareil de sécurité pour porte de quai - Google Patents

Appareil de sécurité pour porte de quai (Platform door safety apparatus)

Info

Publication number
WO2012077511A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
distance
detection
platform door
image sensor
Prior art date
Application number
PCT/JP2011/077183
Other languages
English (en)
Japanese (ja)
Inventor
正憲 安武
Original Assignee
ナブテスコ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ナブテスコ株式会社
Publication of WO2012077511A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61B RAILWAY SYSTEMS; EQUIPMENT THEREFOR NOT OTHERWISE PROVIDED FOR
    • B61B1/00 General arrangement of stations, platforms, or sidings; Railway networks; Rail vehicle marshalling systems
    • B61B1/02 General arrangement of stations and platforms including protection devices for the passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61L GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00 Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04 Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61L GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L23/00 Control, warning or like safety means along the route or between vehicles or trains
    • B61L23/04 Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
    • B61L23/041 Obstacle detection
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05F DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00 Power-operated mechanisms for wings
    • E05F15/70 Power-operated mechanisms for wings with automatic actuation
    • E05F15/73 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00 Prospecting or detecting by optical means
    • G01V8/10 Detecting, e.g. by using light barriers
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05F DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00 Power-operated mechanisms for wings
    • E05F15/70 Power-operated mechanisms for wings with automatic actuation
    • E05F15/73 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • E05F2015/767 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects using cameras
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05Y INDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2900/00 Application of doors, windows, wings or fittings thereof
    • E05Y2900/40 Application of doors, windows, wings or fittings thereof for gates
    • E05Y2900/404 Application of doors, windows, wings or fittings thereof for gates for railway platform gates

Definitions

  • The present invention relates to a safety device for a platform door that determines whether a passenger or the like has been left behind between the platform door and a vehicle, and more particularly to such a device using two-dimensional distance image sensors.
  • Conventionally, a platform door safety device of this kind is disclosed, for example, in Patent Document 1.
  • In that device, one or more cameras photograph the space between the platform door and the vehicle.
  • An image taken while no person or object is present between the platform door and the vehicle is prepared in advance as a reference image.
  • The image captured by the camera is compared with the reference image, and based on the comparison result it is detected whether a person or object exists between the platform door and the vehicle.
  • In Patent Document 1, when only one camera is installed, a blind spot that the camera cannot photograph may be formed between the platform door and the vehicle, and in that case passenger safety may not be ensured.
  • Patent Document 1 also discloses installing a plurality of cameras, but it does not mention how the captured images of the plurality of cameras are to be processed. It is conceivable, for example, to prepare a reference image for each camera; however, if the area captured by each camera is large, erroneous detection may occur.
  • The present invention therefore provides a platform door safety device that eliminates blind spots and widens the detection area while preventing erroneous detection.
  • In the platform door safety device of the present invention, time-of-flight three-dimensional cameras are provided facing each other so as to form a detection area at the opening of the platform door.
  • A detection unit is provided for each three-dimensional camera. When a detection unit determines, based on the output of its three-dimensional camera, that a person or object exists, it outputs detection information of that person or object. When both detection units output detection information, a determination unit determines that a person or object exists in the detection area.
  • Because the time-of-flight three-dimensional cameras face each other across the detection area at the opening of the platform door, no blind spot is formed. Further, since it is determined that a person or object exists in the detection area when both three-dimensional cameras detect that person or object, erroneous detection does not occur.
  • Preferably, the determination unit determines the presence or absence of a person or object on the basis of the distance from each of the three-dimensional cameras to the person or object, using the detection result of the nearer three-dimensional camera rather than that of the camera whose distance is longer.
  • The output of the three-dimensional camera that is closer to the person or object detects the person or object with higher accuracy, so the presence or absence of a person or object can be determined more reliably.
  • The determination unit may determine the presence or absence of a person or object on the basis of the distance from each of the three-dimensional cameras only outside a predetermined area that includes the intermediate position of the detection area.
  • Within that predetermined area, the distances from the two three-dimensional cameras to the person or object are almost the same; outside it, one of the three-dimensional cameras is closer to the person or object than the other, and the detection accuracy of the closer camera is higher than that of the other distance image sensor. Since the presence or absence of a person or object is then determined from the output of the more accurate distance image sensor, the person or object can be detected more reliably.
  • Preferably, when the distance from one of the three-dimensional cameras to the person or object is equal to or less than a predetermined distance, or when the person or object is located in a blind spot of the other three-dimensional camera, the determination unit determines that the person or object exists regardless of the detection result of the other three-dimensional camera.
  • In such cases the detection information based on the output of the nearby 3D camera is highly accurate, and determining the presence of the person or object from that information alone allows the person or object to be detected more reliably.
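  • For illustration only, the determination rules described above can be collected into a short Python sketch; the data layout, the function and parameter names, and the numeric values are assumptions made for this example and are not part of the disclosure.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Detection:
            found: bool                    # this camera's detection unit reports a person/object
            distance: Optional[float]      # distance from this camera, in metres (when found)
            in_other_blind_spot: bool      # target lies inside the other camera's blind spot

        def person_present(a: Detection, b: Detection,
                           near: float, central_half_width: float,
                           opening_width: float) -> bool:
            # Override: a target very close to one camera, or hidden in the other
            # camera's blind spot, is accepted on that camera's output alone.
            for own in (a, b):
                if own.found and (own.distance <= near or own.in_other_blind_spot):
                    return True
            # Agreement of both cameras.
            if a.found and b.found:
                return True
            # Only one camera reports a target: outside the central part of the
            # detection area, trust it only if it is the nearer (more accurate) camera.
            for own, other in ((a, b), (b, a)):
                if own.found and not other.found:
                    centre = opening_width / 2.0
                    outside_central = abs(own.distance - centre) > central_half_width
                    nearer = own.distance < opening_width - own.distance
                    if outside_central and nearer:
                        return True
            return False

        # Example: a target 0.4 m from camera A only (assumed 2 m wide opening).
        print(person_present(Detection(True, 0.4, False), Detection(False, None, False),
                             near=0.5, central_half_width=0.3, opening_width=2.0))  # True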
  • FIG. 1 is a plan view and a front view of a platform door safety device according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing the operation of the platform door safety device of FIG. 1.
  • A platform door safety device according to one embodiment of the present invention is attached to a platform door as shown in FIGS. 1(a) and 1(b).
  • Fixed walls 4 and 4 are arranged at an interval on the platform 2 shown in FIG. 1(b).
  • Movable fences 8 and 8 are provided in the fixed walls 4 and 4 so as to open and close the opening 6 between the fixed walls 4 and 4.
  • Support bases 10 and 10 are provided on the vehicle side of the fixed walls 4 and 4 so as to be positioned on the vehicle side of the opening 6.
  • Three-dimensional cameras, for example TOF (Time-of-Flight) distance image sensors 12a and 12b, are provided at the same height on the support bases 10 and 10 so as to face each other across the opening 6.
  • The distance image sensors 12a and 12b each output two-dimensional video information and distance information of the subjects they photograph.
  • Reference numeral 14a denotes the imaging region of the distance image sensor 12a, and reference numerals 16a and 16a denote the blind spot areas of the distance image sensor 12a.
  • Reference numeral 14b denotes the imaging region of the distance image sensor 12b, and reference numerals 16b and 16b denote the blind spot areas of the distance image sensor 12b.
  • Since the imaging regions 14a and 14b of the distance image sensors 12a and 12b partially overlap, when a person or object is present in the overlapping area both distance image sensors 12a and 12b simultaneously output video information and distance information for that same person or object.
  • The imaging region 14a of the distance image sensor 12a includes the blind spot areas 16b and 16b of the distance image sensor 12b, and the imaging region 14b of the distance image sensor 12b includes the blind spot areas 16a and 16a of the distance image sensor 12a. Accordingly, there is no region that is not photographed by at least one of the distance image sensors 12a and 12b. These two imaging regions 14a and 14b together serve as the detection area.
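  • The absence of blind spots can be checked numerically with a toy two-dimensional model, given purely as an illustration; the sensor positions, opening angles and sampling grid below are assumed values, not values taken from the disclosure.

        import math

        def visible(sensor_xy, facing_deg, fov_deg, point_xy):
            # True when the point lies inside this sensor's angular field of view.
            dx = point_xy[0] - sensor_xy[0]
            dy = point_xy[1] - sensor_xy[1]
            angle = math.degrees(math.atan2(dy, dx))
            diff = (angle - facing_deg + 180.0) % 360.0 - 180.0
            return abs(diff) <= fov_deg / 2.0

        # Two sensors at the ends of an assumed 2 m wide opening, facing each other.
        sensor_a = ((0.0, 0.0),   0.0, 90.0)    # position, facing angle, field of view
        sensor_b = ((2.0, 0.0), 180.0, 90.0)

        uncovered = []
        for i in range(21):                      # sample a 2 m x 0.5 m strip along the opening
            for j in range(6):
                p = (i * 0.1, 0.05 + j * 0.08)
                if not (visible(*sensor_a, p) or visible(*sensor_b, p)):
                    uncovered.append(p)
        print("points seen by neither sensor:", uncovered)   # expected: an empty list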
  • The outputs of the distance image sensors 12a and 12b are input to detection units 18a and 18b provided for the respective sensors.
  • The detection unit 18a determines, based on the output of the distance image sensor 12a, whether a person or object exists and outputs the detection result; when a person or object exists, it also outputs distance information indicating the distance from the distance image sensor 12a to that person or object.
  • Likewise, the detection unit 18b determines, based on the output of the distance image sensor 12b, whether a person or object exists and outputs the detection result; when a person or object exists, it also outputs distance information indicating the distance from the distance image sensor 12b to that person or object.
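  • The disclosure does not state how the detection units 18a and 18b decide that a person or object is present; the following sketch shows one plausible approach (comparing the current TOF distance image against a reference image of the empty scene), with all array sizes and thresholds assumed for the example.

        import numpy as np

        def detect(frame, background, min_depth_change=0.15, min_pixels=40):
            # Pixels that are now markedly closer to the sensor than in the empty scene.
            closer = (background - frame) > min_depth_change
            if closer.sum() < min_pixels:            # too few pixels changed: nothing detected
                return False, None
            return True, float(frame[closer].min())  # distance of the nearest detected pixel

        # Quick self-test with synthetic 240 x 320 distance images (metres).
        rng = np.random.default_rng(0)
        background = 3.0 + 0.01 * rng.standard_normal((240, 320))
        frame = background.copy()
        frame[100:140, 150:190] = 1.2                # simulated person 1.2 m from the sensor
        print(detect(frame, background))             # -> (True, 1.2)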
  • The detection results and the distance information of the person or object are supplied from the detection units 18a and 18b to a determination unit 20.
  • When the determination unit 20 determines, based on the detection results and distance information from both detection units 18a and 18b, that a person or object exists, a detection signal is output from an output unit 22 and supplied to a host system 24 for the safety device, for example a comprehensive control panel.
  • On receiving the detection signal, the host system 24 opens the movable fences 8 and 8.
  • When the outputs of the detection units 18a and 18b have been read in (step S2), the determination unit 20 determines whether a person or the like has been detected by the distance image sensor 12a (step S4). This determination is made, for example, by examining the detection result of the detection unit 18a.
  • If the answer to the determination in step S4 is yes, the determination unit 20 determines whether the distance between the distance image sensor 12a and the person (this distance is included in the output of the detection unit 18a) is equal to or less than a first predetermined distance, or whether the person or the like is present in a blind spot area 16b of the distance image sensor 12b (step S6).
  • The first predetermined distance is a distance set in advance from each of the distance image sensors 12a and 12b toward the opening 6, for example the distance to approximately the center of the corresponding movable fence 8 in its closed state (that is, roughly half the distance from the distance image sensors 12a and 12b to the center of the opening 6). If the distance from the distance image sensor 12a to the person or object is shorter than the first predetermined distance, the person or the like is at a position quite close to the distance image sensor 12a and far from the distance image sensor 12b.
  • In that case, it can be determined that a person or the like exists from the information of the distance image sensor 12a alone. Therefore, when the answer to the determination in step S6 is yes, the determination unit 20 outputs a detection signal from the output unit 22 (step S8) and then executes step S2 again.
  • Similarly, when a person or the like is present in the blind spot areas 16b and 16b, which the distance image sensor 12b cannot photograph, the determination unit 20 likewise executes step S8.
  • The positions and sizes of the blind spot areas 16b and 16b are set in the determination unit 20 in advance.
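  • Purely as an illustration of the check made in step S6 (mirrored by step S12 below), the condition could be coded as follows; the blind-spot rectangle and the value of the first predetermined distance are assumed example figures.

        def in_any_rect(x, y, rects):
            # rects: (xmin, ymin, xmax, ymax) boxes stored in advance, analogous to
            # the preset positions and sizes of the blind spot areas.
            return any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in rects)

        def step_s6(detected_by_12a, dist_from_12a, target_xy, blind_spots_16b, first_distance):
            # Accept on sensor 12a's information alone when the target is within the
            # first predetermined distance of 12a or inside 12b's blind spot areas 16b.
            if not detected_by_12a:                        # corresponds to step S4 being "no"
                return False
            if dist_from_12a is not None and dist_from_12a <= first_distance:
                return True
            return in_any_rect(*target_xy, blind_spots_16b)

        # Example call with made-up figures: target 0.8 m from sensor 12a, 1.1 m threshold.
        print(step_s6(True, 0.8, (0.3, 0.2), [(1.7, 0.0, 2.0, 0.3)], 1.1))   # -> True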
  • If the answer to the determination in step S4 or step S6 is no, the determination unit 20 determines whether the distance image sensor 12b has detected a person or the like (step S10).
  • If so, it is then determined whether the distance between the person or the like and the distance image sensor 12b is equal to or less than the first predetermined distance, or whether the person or the like is present in the blind spot areas 16a and 16a of the distance image sensor 12a (step S12). If the answer to this determination is yes, then irrespective of whether the distance image sensor 12a has detected the person or the like, either (1) a person or the like exists within the first predetermined distance of the distance image sensor 12b, or (2) a person or the like exists in the blind spot areas 16a and 16a of the distance image sensor 12a. Accordingly, to ensure safety, the determination unit 20 executes step S8 in the same manner as when the answer in step S6 is yes. The positions and sizes of the blind spot areas 16a and 16a are also set in the determination unit 20 in advance.
  • If the answer to the determination in step S12 is no, a person or the like has been detected by the distance image sensor 12b, but at a position closer to the distance image sensor 12a than the position at the first predetermined distance from the distance image sensor 12b. If the answer to the determination in step S10 is no, no person or the like has been detected by the distance image sensor 12b.
  • In either case, the determination unit 20 next determines whether the distance between the person or the like and the distance image sensor 12a is equal to or less than a second predetermined distance (step S14).
  • The second predetermined distance is a distance longer than the first predetermined distance, measured from each of the distance image sensors 12a and 12b, for example to a position slightly closer to the respective sensor than the center of the opening 6.
  • Determining whether the distance between the person or the like and the distance image sensor 12a is equal to or less than the second predetermined distance therefore amounts to determining whether the person or the like is located in a region 24 whose end on the center side of the opening 6 is at the second predetermined distance and whose end on the distance image sensor 12a side is at the first predetermined distance.
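  • As a small numeric illustration (the opening width and the exact fractions are assumed, not specified values), the first and second predetermined distances divide the detection area into the regions 24, 26 and 28 described in this and the following paragraphs:

        OPENING_WIDTH = 2.0                      # assumed width of the opening 6, in metres
        FIRST_DISTANCE = OPENING_WIDTH / 4.0     # about half-way to the centre of the opening
        SECOND_DISTANCE = 0.45 * OPENING_WIDTH   # slightly short of the centre

        def region_of(dist_from_12a):
            # Sensor 12b sits at the opposite end, so its distance is the complement.
            dist_from_12b = OPENING_WIDTH - dist_from_12a
            if FIRST_DISTANCE < dist_from_12a <= SECOND_DISTANCE:
                return "region 24 (near sensor 12a)"
            if FIRST_DISTANCE < dist_from_12b <= SECOND_DISTANCE:
                return "region 26 (near sensor 12b)"
            if dist_from_12a > SECOND_DISTANCE and dist_from_12b > SECOND_DISTANCE:
                return "region 28 (central part of the opening)"
            return "within the first predetermined distance of one sensor"

        for d in (0.3, 0.7, 1.0, 1.3, 1.7):
            print(d, "->", region_of(d))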
  • If the answer to the determination in step S14 is yes, the determination unit 20 determines whether it has been set in advance to give priority to the detection result of the distance image sensor 12a (step S16).
  • If the answer to that determination is yes, the determination unit 20 executes step S8. That is, when the distance image sensor 12a and the detection unit 18a, which are close to the region 24, determine that a person or the like exists in the region 24, that determination result is given priority. If the answer to the determination in step S16 is no, the determination unit 20 executes step S2 on the assumption that no person or the like has been detected.
  • If the answer to the determination in step S14 is no and the distance image sensor 12b has detected a person or the like, the person or the like exists either (1) in a region 26 whose end on the center side of the opening 6 is at the second predetermined distance and whose end on the distance image sensor 12b side is at the first predetermined distance, or (2) in a region 28 that is farther than the second predetermined distance from each of the distance image sensors 12a and 12b. If the answer to the determination in step S10 was no, no person or the like has been detected by the distance image sensor 12b.
  • In that case, the determination unit 20 determines whether the distance between the person or the like detected by the distance image sensor 12b and the distance image sensor 12b is equal to or less than the second predetermined distance (step S18); that is, it determines whether a person or the like exists in the region 26. When the answer to this determination is yes, the determination unit 20 determines whether it has been set in advance to give priority to the detection result of the distance image sensor 12b (step S20). If so, the determination unit 20 executes step S8; that is, when the distance image sensor 12b and the detection unit 18b, which are close to the region 26, determine that a person or the like exists in the region 26, that determination result is given priority. If the answer to the determination in step S20 is no, the determination unit 20 executes step S2 on the assumption that no person or the like has been detected.
  • If the answer to the determination in step S18 is no, the determination unit 20 determines whether a person or the like has been detected by both of the distance image sensors 12a and 12b (step S22). When the answer to this determination is yes, the determination unit 20 determines that a person or the like exists and executes step S8. If the answer to the determination in step S22 is no, a person or the like has not been detected by either of the distance image sensors 12a and 12b, and the determination unit 20 executes step S2.
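  • The decision sequence of steps S2 to S22 can be summarised, for illustration only, by the following sketch; the data structure, the helper names, the priority flags and all numeric values are assumptions, and the real device obtains its inputs from the detection units 18a and 18b rather than from function arguments.

        from dataclasses import dataclass
        from typing import List, Optional, Tuple

        @dataclass
        class Report:                               # what a detection unit (18a or 18b) supplies
            found: bool
            distance: Optional[float] = None        # metres from its own sensor, when found
            position: Optional[Tuple[float, float]] = None

        def in_any_rect(pos, rects):
            return pos is not None and any(
                x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1 for x0, y0, x1, y1 in rects)

        def decide(rep_a: Report, rep_b: Report, first_d: float, second_d: float,
                   blind_spots_16a: List[tuple], blind_spots_16b: List[tuple],
                   prefer_12a: bool = True, prefer_12b: bool = True) -> bool:
            # Returns True when a detection signal should be output (step S8).
            # S4 / S6: target near sensor 12a, or hidden from 12b inside areas 16b.
            if rep_a.found and (rep_a.distance <= first_d
                                or in_any_rect(rep_a.position, blind_spots_16b)):
                return True
            # S10 / S12: target near sensor 12b, or hidden from 12a inside areas 16a.
            if rep_b.found and (rep_b.distance <= first_d
                                or in_any_rect(rep_b.position, blind_spots_16a)):
                return True
            # S14 / S16: target in region 24; accepted if priority is given to 12a.
            if rep_a.found and rep_a.distance <= second_d:
                return prefer_12a
            # S18 / S20: target in region 26; accepted if priority is given to 12b.
            if rep_b.found and rep_b.distance <= second_d:
                return prefer_12b
            # S22: central region 28; both sensors must agree.
            return rep_a.found and rep_b.found

        # Example: target 0.8 m from sensor 12a and 1.4 m from sensor 12b.
        a = Report(True, 0.8, (0.8, 0.2))
        b = Report(True, 1.4, (0.8, 0.2))
        print(decide(a, b, first_d=0.5, second_d=0.9,
                     blind_spots_16a=[(0.0, 0.0, 0.1, 0.3)],
                     blind_spots_16b=[(1.9, 0.0, 2.0, 0.3)]))   # -> True (via steps S14/S16)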
  • In the above description, the case where only a single person or the like is detected by the distance image sensors 12a and 12b has been explained; however, a plurality of persons or the like may be detected simultaneously by the distance image sensors 12a and 12b.
  • In that case, when the outputs of the detection units 18a and 18b are input to the determination unit 20 in step S2, each person represented in the output of the distance image sensor 12a is associated, on the basis of the distance information, with the corresponding person represented in the output of the distance image sensor 12b, and the above-described processing is performed for each associated person.
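  • The disclosure states only that the persons in the two sensors' outputs are associated "based on the distance information"; one simple way to realise this, sketched below under the assumption that the two sensors face each other across an opening of known width, is to pair detections whose two distances sum approximately to that width (a hypothetical greedy matching, not a method taken from the disclosure).

        def associate(dists_from_12a, dists_from_12b, opening_width, tolerance=0.3):
            # Greedily pair each distance measured from sensor 12a with the distance
            # from sensor 12b that best satisfies d_a + d_b ~ opening_width.
            pairs, used_b = [], set()
            for i, da in enumerate(dists_from_12a):
                best_j, best_err = None, tolerance
                for j, db in enumerate(dists_from_12b):
                    if j in used_b:
                        continue
                    err = abs((da + db) - opening_width)
                    if err <= best_err:
                        best_j, best_err = j, err
                if best_j is not None:
                    used_b.add(best_j)
                    pairs.append((i, best_j))
            return pairs

        # Two people, 0.6 m and 1.5 m from sensor 12a, seen at 1.45 m and 0.5 m from 12b.
        print(associate([0.6, 1.5], [1.45, 0.5], opening_width=2.0))   # -> [(0, 0), (1, 1)]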
  • In the above embodiment, steps S16 and S20 are executed, but they may also be omitted. Moreover, in the above embodiment steps S4 and S6 are performed before steps S10 and S12; conversely, steps S10 and S12 may be performed first, followed by steps S4 and S6, and then step S14. Further, in the above embodiment step S16 is executed when the answer to the determination in step S14 is yes and step S18 is executed when that answer is no; alternatively, the configuration may be such that step S18 is executed first, step S20 is executed when its answer is yes, step S14 is executed when its answer is no, step S16 is executed when the answer to the determination in step S14 is yes, and step S22 is executed when that answer is no.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Geophysics (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Platform Screen Doors And Railroad Systems (AREA)

Abstract

Time-of-flight three-dimensional cameras (12a, 12b) are installed facing each other so as to provide detection areas (24, 26) in an opening (6) of platform doors. Detection units (18a, 18b) output detection information of a person or object when they determine, based on the outputs from the time-of-flight three-dimensional cameras (12a, 12b), that a person or object is present. A determination unit (20) determines that a person or object is present in the detection areas (24, 26) when the detection units (18a, 18b) output the detection information.
PCT/JP2011/077183 2010-12-06 2011-11-25 Appareil de sécurité pour porte de quai WO2012077511A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010271382 2010-12-06
JP2010-271382 2010-12-06

Publications (1)

Publication Number Publication Date
WO2012077511A1 (fr) 2012-06-14

Family

ID=46207005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/077183 WO2012077511A1 (fr) 2010-12-06 2011-11-25 Appareil de sécurité pour porte de quai

Country Status (1)

Country Link
WO (1) WO2012077511A1 (fr)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1191567A (ja) * 1997-09-18 1999-04-06 Koito Ind Ltd Platform safety monitoring device
JP2002037056A (ja) * 2000-07-21 2002-02-06 Nabco Ltd Safety device for platform door apparatus
JP2004042777A (ja) * 2002-07-11 2004-02-12 Matsushita Electric Ind Co Ltd Obstacle detection device
JP2004182077A (ja) * 2002-12-03 2004-07-02 Nabco Ltd Safety system for platform door apparatus
JP2005239013A (ja) * 2004-02-26 2005-09-08 Mitsubishi Heavy Ind Ltd Platform fence
JP2008511068A (ja) * 2004-08-27 2008-04-10 Singapore Technologies Dynamics Pte Ltd Multi-sensor intrusion detection system
JP2006224842A (ja) * 2005-02-18 2006-08-31 Nabtesco Corp Safety device for platform door apparatus
JP2006231948A (ja) * 2005-02-22 2006-09-07 Hitachi Ltd Movable platform fence system
JP2011016421A (ja) * 2009-07-08 2011-01-27 Higashi Nippon Transportec Kk Obstacle detection device, platform door system provided with the same, and obstacle detection method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105649467A (zh) * 2015-12-09 2016-06-08 重庆川仪自动化股份有限公司 Method and system for synchronizing the two leaves of a safety door
JP2018199434A (ja) * 2017-05-29 2018-12-20 三菱電機株式会社 Movable platform fence and movable platform fence row
JP6999292B2 (ja) 2017-05-29 2022-01-18 三菱電機株式会社 Movable platform fence and movable platform fence row

Similar Documents

Publication Publication Date Title
US9760784B2 (en) Device, method and program for measuring number of passengers
US9477881B2 (en) Passenger counting system, passenger counting method and passenger counting program
  • JP4066168B2 (ja) Intruding object monitoring device
US9785842B2 (en) Safety alarm system and method for vehicle
US20090128632A1 (en) Camera and image processor
  • WO2016117401A1 (fr) On-board camera device
  • WO2011114624A1 (fr) Vehicle surroundings monitoring device
US9137498B1 (en) Detection of mobile computing device use in motor vehicle
  • JP6051608B2 (ja) Vehicle-periphery object detection device
  • JP6722051B2 (ja) Object detection device and object detection method
  • EP2978206A1 (fr) Image-based monitoring system
  • KR20140076415A (ko) Apparatus and method for providing blind spot information of a vehicle
  • WO2020157901A1 (fr) Driving assistance device
  • JP6702578B1 (ja) Elevator user detection system
  • JP2011093514A (ja) Platform door safety device
  • WO2012077511A1 (fr) Platform door safety apparatus
  • JP2014084064A (ja) Platform door monitoring device and platform door monitoring method
  • JP6702579B1 (ja) Elevator user detection system
  • JP2012022646A (ja) Gaze direction detection device, gaze direction detection method, and safe driving evaluation system
  • JP6117089B2 (ja) Person detection device
  • KR20210023859A (ko) Image processing device, mobile device and method, and program
  • JP2020201881A (ja) Door opening warning device, door opening warning method, and door opening warning program
  • JP2011175405A (ja) Display device and method for displaying the operable range on the display device
  • JP6456761B2 (ja) Road environment recognition device, vehicle control device, and vehicle control method
  • CN112456287B (zh) Elevator user detection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11847691; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 11847691; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: JP)