CN114926790A - Subway station scene joint control system - Google Patents

Subway station scene joint control system

Info

Publication number
CN114926790A
CN114926790A
Authority
CN
China
Prior art keywords
camera
output
input end
subway
big data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210451337.9A
Other languages
Chinese (zh)
Inventor
丁建隆
蔡昌俊
祝唯
金辉
王晓夏
张杰
李漾
罗伟庭
杨志强
郭婷婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Metro Group Co Ltd
Original Assignee
Guangzhou Metro Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Metro Group Co Ltd filed Critical Guangzhou Metro Group Co Ltd
Priority to CN202210451337.9A priority Critical patent/CN114926790A/en
Priority to PCT/CN2022/105982 priority patent/WO2023206825A1/en
Publication of CN114926790A publication Critical patent/CN114926790A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53Recognition of crowd images, e.g. recognition of crowd congestion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a subway station scene joint control system comprising a subway information terminal, the input/output end of which is connected to the input/output end of a neural network algorithm module; the input/output end of the neural network algorithm module is in turn connected to the input/output ends of a first 3D camera, a second 3D camera, and a third 3D camera. According to the invention, the first 3D camera is installed inside the subway carriages, while the second and third 3D cameras are installed at the platform and other waiting positions, so that the first 3D camera can photograph and compute the crowded areas of passengers inside the carriages, and the second and third 3D cameras can photograph and compute the crowded areas of passengers inside the station.

Description

Subway station scene joint control system
Technical Field
The invention relates to the technical field of subway stations, in particular to a scene joint control system of a subway station.
Background
Most mainstream system-integration products for urban rail transit aim at realizing specific functions and lack the intelligent attributes of perception, reaction, learning, and evolution. For example, the integrated supervisory control system aims at real-time centralized monitoring of electromechanical equipment and linkage between subsystems; it is a customized system that is difficult to change once development is complete. At present, integrated monitoring relies mainly on manual debugging and control of point tables, obtains data from other subsystems only through communication messages for status display, and lacks intelligent functions such as big-data analysis, hidden-danger mining, and fault prediction.
Most mainstream intelligent products for urban rail transit grow within a single system and do not support multi-specialty business collaboration or capability evolution. For example, diversified payment realizes ticket-card mobile payment only within the automatic fare collection system; the intelligent video analysis system only analyzes and raises alarms for abnormal conditions appearing in the CCTV video area, such as intrusion across warning lines, moving against the flow, or abnormal crowd density; and the intelligent train inspection system automatically analyzes the running state of key components such as the bogie and the pantograph. At the present stage, intelligent products are researched separately within each specialty, and no system architecture linking multi-service coordination and capability progression has been formed.
At present, existing subway station scene joint control systems have several shortcomings: they cannot coordinate the distribution of waiting passengers along the platform according to the number of people in each carriage, they cannot effectively monitor the speed of trains entering and leaving the station, and, although they adopt artificial-intelligence algorithms, their efficiency is low and they cannot accurately link big data to perform the various analyses and calculations.
Disclosure of Invention
The invention aims to provide a subway station scene joint control system, which solves the problems in the background technology.
In order to achieve the above purpose, the invention provides the following technical solution: a subway station scene joint control system comprises a subway information terminal, the input/output end of which is connected to the input/output end of a neural network algorithm module; the input/output end of the neural network algorithm module is connected to the input/output ends of a first 3D camera, a second 3D camera, and a third 3D camera respectively; the outputs of the first, second, and third 3D cameras are connected to the inputs of a first, second, and third speed sensor; and the outputs of the first, second, and third speed sensors are connected to the input of a platform screen door LED screen.
As a preferred embodiment of the present invention, the output ends of the first 3D camera, the second 3D camera and the third 3D camera are respectively connected to the input ends of the spectral feature library and the big data feature library.
As a preferred embodiment of the present invention, the output and input ends of the spectral feature library and the big data feature library are connected with the output and input end of the neural network algorithm module.
As a preferred embodiment of the present invention, the output ends of the first speed sensor, the second speed sensor and the third speed sensor are connected to the input end of the big data analysis module.
As a preferred embodiment of the present invention, an output end of the big data analysis module is connected to an input end of the big data comparison module.
As a preferred embodiment of the present invention, an output end of the big data comparing module is connected to an input end of the neural network algorithm module.
As a preferred embodiment of the present invention, an output end of the subway information terminal is connected to an input end of a large screen of a station control room.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention installs the first 3D camera inside the subway carriages and the second and third 3D cameras at the platform and similar waiting positions. The first 3D camera photographs and computes the crowded areas of passengers inside the carriages, while the second and third 3D cameras photograph and compute the crowded areas of passengers inside the station. Through the subway information terminal and the neural network algorithm module, the system performs calculation and analysis and displays carriage numbers on the LED screens of the platform screen doors. While passengers queue to board, the screen-door LED screens display the crowding degree of each carriage and recommend boarding other, less-crowded carriages, effectively achieving joint control of the carriage and waiting scenes; in addition, according to the station's passenger flow, the opening of the gates is controlled in linkage to change the entry and exit directions.
2. By providing the spectral feature library and the big data feature library, the neural network algorithm module can draw on both libraries during computation to enlarge the data available to the artificial-intelligence operation; comparing the images captured by the cameras against big data and spectral references increases both the quantity of data and the detection modes, improving the diversity of the computed data.
3. By providing the large screen of the station control room, the carriages and waiting positions can be imaged in 3D by the first, second, and third 3D cameras, and the 3D images are presented on the large screen in the control room, making it convenient to monitor the carriages and the station in 3D in real time.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a flow chart of a subway station scene joint control system according to the present invention.
In the figure: 1, subway information terminal; 2, neural network algorithm module; 3, first 3D camera; 4, second 3D camera; 5, third 3D camera; 6, first speed sensor; 7, second speed sensor; 8, third speed sensor; 9, platform screen door LED screen; 10, big data analysis module; 11, big data comparison module; 12, spectral feature library; 13, big data feature library; 14, large screen of the station control room.
Detailed Description
Referring to fig. 1, the present invention provides the following technical solution: a subway station scene joint control system comprises a subway information terminal 1, the input/output end of which is connected to the input/output end of a neural network algorithm module 2; the input/output end of the neural network algorithm module 2 is connected to the input/output ends of a first 3D camera 3, a second 3D camera 4, and a third 3D camera 5 respectively; the outputs of the first 3D camera 3, second 3D camera 4, and third 3D camera 5 are connected to the inputs of a first speed sensor 6, a second speed sensor 7, and a third speed sensor 8; and the outputs of the first, second, and third speed sensors are connected to the input of a platform screen door LED screen 9.
It should be noted that, by arranging the first 3D camera 3, the second 3D camera 4, and the third 3D camera 5, the first 3D camera 3 can be installed inside the subway carriages and the second 3D camera 4 and third 3D camera 5 at the station waiting positions. The first 3D camera 3 photographs and computes the crowded areas of passengers inside the carriages, while the second and third cameras photograph and compute the crowded areas of passengers inside the station. Through the subway information terminal 1 and the neural network algorithm module 2, the system performs calculation and analysis and displays carriage numbers on the platform screen door LED screens 9. While passengers queue to board, the screen-door LED screens display the crowding degree of each carriage and recommend boarding other, less-crowded carriages, effectively achieving joint control of the carriage and waiting scenes; in addition, according to the station's passenger flow, the opening and closing of the gates is controlled in linkage to change the entry and exit directions.
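The carriage-recommendation logic described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the per-carriage passenger counts are assumed to come from the 3D-camera crowd estimation, and the capacity and crowding-threshold values are hypothetical.

```python
# Minimal sketch: recommend less-crowded carriages on the screen-door LED screens.
# Passenger counts per carriage are assumed to come from the 3D-camera analysis;
# CAPACITY and CROWDED_RATIO are hypothetical values for illustration.

CAPACITY = 250          # assumed maximum passengers per carriage
CROWDED_RATIO = 0.8     # assumed load ratio above which a carriage counts as crowded

def crowding_levels(counts):
    """Return the load ratio of each carriage (carriage number -> ratio)."""
    return {car: n / CAPACITY for car, n in counts.items()}

def recommend(counts):
    """For each crowded carriage, suggest the least-loaded carriage instead."""
    levels = crowding_levels(counts)
    least_loaded = min(levels, key=levels.get)
    messages = {}
    for car, ratio in levels.items():
        if ratio >= CROWDED_RATIO and car != least_loaded:
            messages[car] = f"Carriage {car} crowded - please board carriage {least_loaded}"
        else:
            messages[car] = f"Carriage {car}: {ratio:.0%} full"
    return messages

counts = {1: 240, 2: 120, 3: 60, 4: 230}   # example per-carriage passenger counts
for car, msg in recommend(counts).items():
    print(msg)
```

In this sketch, each message dictionary entry corresponds to what one screen-door LED screen would show for its carriage.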
In this embodiment, referring to fig. 1, output ends of the first 3D camera 3, the second 3D camera 4, and the third 3D camera 5 are respectively connected to input ends of the spectral feature library 12 and the big data feature library 13.
It should be noted that, by providing the spectral feature library 12 and the big data feature library 13, the neural network algorithm module 2 can draw on both libraries during computation to enlarge the data available to the artificial-intelligence operation; comparing the images captured by the cameras against big data and spectral references increases both the quantity of data and the detection modes, improving the diversity of the computed data.
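The comparison against the two feature libraries could take many forms; the patent does not specify one. The following sketch illustrates one hypothetical possibility, where a frame's feature vector is scored against each library by cosine similarity and the best-match scores are appended as extra input channels for the downstream network. All names and data here are invented for illustration.

```python
# Sketch: augment the neural-network input by comparing a camera frame's
# feature vector against entries in the spectral and big-data feature
# libraries. Cosine similarity stands in for whatever comparison the
# patented system actually uses; all vectors here are hypothetical.
import math

def cosine(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def enrich(frame_features, spectral_lib, bigdata_lib):
    """Append the best-match similarity from each library to the raw
    features, giving the network extra comparison channels."""
    best_spectral = max(cosine(frame_features, ref) for ref in spectral_lib)
    best_bigdata = max(cosine(frame_features, ref) for ref in bigdata_lib)
    return list(frame_features) + [best_spectral, best_bigdata]

frame = [0.2, 0.8, 0.1]                              # hypothetical frame features
spectral_lib = [[0.2, 0.8, 0.1], [0.9, 0.1, 0.0]]    # hypothetical library entries
bigdata_lib = [[0.1, 0.7, 0.3]]
print(enrich(frame, spectral_lib, bigdata_lib))
```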
In this embodiment, referring to fig. 1, the output and input ends of the spectral feature library 12 and the big data feature library 13 are connected to the output and input end of the neural network algorithm module 2.
In this embodiment, referring to fig. 1, the output ends of the first speed sensor 6, the second speed sensor 7 and the third speed sensor 8 are connected to the input end of the big data analysis module 10.
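One use of the speed-sensor feed into the big data analysis module, mentioned in the background as monitoring the speed of trains entering and leaving the station, could be a baseline comparison like the sketch below. The baseline speed, tolerance, and sample values are hypothetical, not taken from the patent.

```python
# Sketch: monitor train entry/exit speed from the speed-sensor readings and
# flag deviations from a historical baseline, as the big data analysis and
# comparison modules might. Baseline and tolerance are hypothetical.

def mean(xs):
    return sum(xs) / len(xs)

def check_speed(samples_kmh, baseline_kmh, tolerance=0.15):
    """Compare the mean measured speed against the baseline; return
    (mean_speed, within_tolerance)."""
    m = mean(samples_kmh)
    ok = abs(m - baseline_kmh) <= tolerance * baseline_kmh
    return m, ok

entry_samples = [42.0, 40.5, 39.0, 37.5]   # hypothetical readings on approach
m, ok = check_speed(entry_samples, baseline_kmh=40.0)
print(f"mean entry speed {m:.1f} km/h, within tolerance: {ok}")
```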
In this embodiment, referring to fig. 1, an output end of the big data analysis module 10 is connected to an input end of the big data comparison module 11.
In this embodiment, referring to fig. 1, an output end of the big data comparing module 11 is connected to an input end of the neural network algorithm module 2.
In this embodiment, referring to fig. 1, an output end of the subway information terminal 1 is connected to an input end of a large screen 14 of a station control room.
It should be noted that, by providing the large screen 14 of the station control room, the carriages and waiting positions can be imaged in 3D by the first 3D camera 3, the second 3D camera 4, and the third 3D camera 5, and the 3D images are presented on the large screen 14 in the control room, making it convenient to monitor the carriages and the station in 3D in real time.
When the subway station scene joint control system is used, the first 3D camera 3 is installed inside the subway carriages and the second 3D camera 4 and third 3D camera 5 at the station waiting positions. The first 3D camera 3 photographs and computes the crowded areas of passengers inside the carriages, while the second and third cameras photograph and compute the crowded areas of passengers inside the station. Through the subway information terminal 1 and the neural network algorithm module 2, the system performs calculation and analysis and displays carriage numbers on the platform screen door LED screens 9; while passengers queue to board, the screens display the crowding degree of each carriage and recommend other, less-crowded carriages, achieving joint control of the carriage and waiting scenes, and the gates are opened and closed in linkage according to the station's passenger flow to change the entry and exit directions. During computation, the neural network algorithm module 2 can use the spectral feature library 12 and the big data feature library 13 to enlarge the data available to the artificial-intelligence operation; comparing the images captured by the cameras against big data and spectral references increases the quantity of data and the detection modes and improves the diversity of the computed data. Finally, the first 3D camera 3, the second 3D camera 4, and the third 3D camera 5 perform 3D imaging of the carriages and waiting positions, and the large screen 14 of the station control room presents the 3D images in the control room, making it convenient to monitor the carriages and the station in 3D in real time.
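The gate-linkage idea above, switching fare-gate direction according to station passenger flow, can be sketched as follows. The total gate count, the proportional-allocation rule, and the flow figures are all hypothetical; the patent only states that gate direction is changed in linkage with passenger flow.

```python
# Sketch of the gate linkage: split the station's fare gates between entry
# and exit in proportion to the measured flows, keeping at least one gate
# open in each direction. TOTAL_GATES and the rule itself are hypothetical.

TOTAL_GATES = 8

def allocate_gates(entering, exiting):
    """Return (entry_gates, exit_gates) for the given passenger flows."""
    total = entering + exiting
    if total == 0:
        half = TOTAL_GATES // 2
        return half, TOTAL_GATES - half
    entry_gates = round(TOTAL_GATES * entering / total)
    entry_gates = max(1, min(TOTAL_GATES - 1, entry_gates))  # keep both directions open
    return entry_gates, TOTAL_GATES - entry_gates

print(allocate_gates(300, 100))  # heavy inbound flow -> more entry gates
```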

Claims (7)

1. A subway station scene joint control system, comprising a subway information terminal (1), characterized in that: the input/output end of the subway information terminal (1) is connected to the input/output end of a neural network algorithm module (2); the input/output end of the neural network algorithm module (2) is connected to the input/output ends of a first 3D camera (3), a second 3D camera (4), and a third 3D camera (5) respectively; the outputs of the first 3D camera (3), the second 3D camera (4), and the third 3D camera (5) are connected to the inputs of a first speed sensor (6), a second speed sensor (7), and a third speed sensor (8); and the outputs of the first speed sensor (6), the second speed sensor (7), and the third speed sensor (8) are connected to the input of a platform screen door LED screen (9).
2. The subway station scene joint control system as claimed in claim 1, wherein: the output ends of the first 3D camera (3), the second 3D camera (4) and the third 3D camera (5) are respectively connected with the input ends of the spectral feature library (12) and the big data feature library (13).
3. The subway station scene joint control system of claim 2, wherein: the output and input ends of the spectral characteristic library (12) and the big data characteristic library (13) are connected with the output and input end of the neural network algorithm module (2).
4. The subway station scene joint control system of claim 1, wherein: the output ends of the first speed measuring sensor (6), the second speed measuring sensor (7) and the third speed measuring sensor (8) are connected with the input end of the big data analysis module (10).
5. The subway station scene joint control system of claim 4, wherein: the output end of the big data analysis module (10) is connected with the input end of the big data comparison module (11).
6. The subway station scene joint control system as claimed in claim 5, wherein: the output end of the big data comparison module (11) is connected with the input end of the neural network algorithm module (2).
7. The subway station scene joint control system of claim 1, wherein: the output end of the subway information terminal (1) is connected with the input end of a large screen (14) of a station control room.
CN202210451337.9A 2022-04-26 2022-04-26 Subway station scene joint control system Pending CN114926790A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210451337.9A CN114926790A (en) 2022-04-26 2022-04-26 Subway station scene joint control system
PCT/CN2022/105982 WO2023206825A1 (en) 2022-04-26 2022-07-15 Subway station scene joint control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210451337.9A CN114926790A (en) 2022-04-26 2022-04-26 Subway station scene joint control system

Publications (1)

Publication Number Publication Date
CN114926790A true CN114926790A (en) 2022-08-19

Family

ID=82806929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210451337.9A Pending CN114926790A (en) 2022-04-26 2022-04-26 Subway station scene joint control system

Country Status (2)

Country Link
CN (1) CN114926790A (en)
WO (1) WO2023206825A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007001381A (en) * 2005-06-22 2007-01-11 Oki Electric Ind Co Ltd System for collecting and providing train congestion rate information
JP2016168876A (en) * 2015-03-11 2016-09-23 株式会社東芝 Congestion predictor and congestion prediction method
CN204993663U (en) * 2015-08-28 2016-01-20 中电科二十二所(青岛)天博信息科技公司 Subway carriage passenger flow density monitored control system
CN112347814A (en) * 2019-08-07 2021-02-09 中兴通讯股份有限公司 Passenger flow estimation and display method, system and computer readable storage medium
CN113276913B (en) * 2021-05-25 2022-11-01 五邑大学 Method and system for dynamically balancing passenger flow of subway carriage
CN113505644B (en) * 2021-06-08 2022-11-18 同济大学 Carriage passenger flow detection alarm system and method thereof
CN114285970A (en) * 2021-12-22 2022-04-05 南京国电南自轨道交通工程有限公司 Intelligent station operation auxiliary linkage method based on video analysis technology

Also Published As

Publication number Publication date
WO2023206825A1 (en) 2023-11-02

Similar Documents

Publication Publication Date Title
CN112598182B (en) Intelligent scheduling method and system for rail transit
CN111553314B (en) Urban rail transit passenger guidance system and guidance method
CN108351967A (en) A kind of plurality of human faces detection method, device, server, system and storage medium
CN104092988A (en) Method, device and system for managing passenger flow in public place
CN112434566B (en) Passenger flow statistics method and device, electronic equipment and storage medium
CN107123274A (en) Double parking stall video detecting devices and method
CN112669497A (en) Pedestrian passageway perception system and method based on stereoscopic vision technology
CN110175533A (en) Overpass traffic condition method of real-time, device, terminal and storage medium
CN113762766A (en) Rail transit station transport pipe system, method and device
CN114202711A (en) Intelligent monitoring method, device and system for abnormal behaviors in train compartment
CN112988830A (en) People flow statistical method, device, system, storage medium and computer equipment
CN113788051A (en) Train on-station running state monitoring and analyzing system
CN106710253A (en) High-reliability intelligent intersection traffic control system and control method
CN114118470B (en) Intelligent control method and system for production operation of full-automatic driving vehicle base
CN110263622A (en) Train fire monitoring method, apparatus, terminal and storage medium
CN109887303A (en) Random change lane behavior early warning system and method
CN113691778A (en) Panoramic station patrol system for urban rail transit station
CN114926790A (en) Subway station scene joint control system
CN112911233A (en) Intelligent train system
CN111080753A (en) Station information display method, device, equipment and storage medium
CN103606280B (en) A kind of information identifying method, device and system
CN116033240A (en) Equipment inspection method and system based on station operation cockpit
CN105574499B (en) A kind of number detection statistics method and system based on SOC
CN107369237A (en) A kind of big data detection method based on multi parameter analysis
CN106355925A (en) Method and device for achieving evaluation of control effect of intersection signal machine by using internet

Legal Events

Date Code Title Description
PB01 Publication