CN112637453A - Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal - Google Patents

Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal

Info

Publication number
CN112637453A
Authority
CN
China
Prior art keywords
data
module
intelligent
monitoring
personnel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011440865.1A
Other languages
Chinese (zh)
Other versions
CN112637453B (en)
Inventor
孙世军
栾晓嵘
朱坤双
孙娟子
韩洪
梁雅洁
付奇
倪家春
李猛
韩智海
宫梓超
许圣佳
张天宝
乔立同
李学昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Shandong Electric Power Co Emergency Management Center
Original Assignee
State Grid Shandong Electric Power Co Emergency Management Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Shandong Electric Power Co Emergency Management Center filed Critical State Grid Shandong Electric Power Co Emergency Management Center
Priority to CN202011440865.1A priority Critical patent/CN112637453B/en
Publication of CN112637453A publication Critical patent/CN112637453A/en
Application granted granted Critical
Publication of CN112637453B publication Critical patent/CN112637453B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B19/00Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/08Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/10Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using wireless transmission systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00Economic sectors
    • G16Y10/35Utilities, e.g. electricity, gas or water
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00IoT characterised by the purpose of the information processing
    • G16Y40/50Safety; Security of things, users, data or systems
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J13/00Circuit arrangements for providing remote indication of network conditions, e.g. an instantaneous record of the open or closed condition of each circuitbreaker in the network; Circuit arrangements for providing remote control of switching means in a power distribution network, e.g. switching in and out of current consumers by using a pulse code signal carried by the network
    • H02J13/00001Circuit arrangements for providing remote indication of network conditions, e.g. an instantaneous record of the open or closed condition of each circuitbreaker in the network; Circuit arrangements for providing remote control of switching means in a power distribution network, e.g. switching in and out of current consumers by using a pulse code signal carried by the network characterised by the display of information or by user interaction, e.g. supervisory control and data acquisition systems [SCADA] or graphical user interfaces [GUI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The invention provides a following type intelligent sensing interaction device and method for electric power emergency disposal based on the ubiquitous Internet of Things, and relates to the field of electric power emergency disposal. The intelligent sensing interaction device communicates with the monitoring and dispatching platform, so that decision support is provided for field emergency disposal personnel. The intelligent sensing interaction equipment acquires on-site pictures, videos and environmental parameters, together with somatosensory data and human body parameter data of personnel, and sends the data to the monitoring and scheduling platform; the monitoring and dispatching platform analyzes the data, predicts the evolution of the disaster, and sends the evolution result and corresponding instructions to each intelligent perception interaction device so that field emergency disposal personnel can complete disaster tracing and decision making. Meanwhile, data processing is carried out at the perception interaction equipment end to form processing results, and a combined access mode of parallel controllers, a memory and a RAID disk array is arranged on the monitoring and scheduling platform, which reduces the data transmission quantity, improves the overall processing efficiency of the system and improves its emergency response capability.

Description

Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal
Technical Field
The invention relates to the field of electric power emergency disposal, and in particular to a following type intelligent perception interaction device and method for electric power emergency disposal based on the ubiquitous Internet of Things.
Background
At present, the field environment for emergency disposal of high-risk power grid emergencies is complex and changeable, and the slightest carelessness can endanger personal safety. Moreover, because information interaction is not sufficiently convenient, the rear command center cannot grasp the field situation in time, cannot provide reliable and timely information support for the front, and cannot make timely, scientific and accurate emergency disposal decisions, so field emergency disposal personnel cannot deal with the emergency effectively.
Therefore, a mobile terminal integrating communication, high-precision positioning, intelligent sensing, audio and video and the like can provide real-time information interaction between the front and the rear, provide weather and environment monitoring and early warning for front-line emergency disposal personnel, help field disposal personnel guard against sudden danger, and improve their disposal decision-making and risk-avoidance capabilities.
Disclosure of Invention
The purpose of the invention is realized by the following technical scheme.
In order to solve the problems, the invention provides a following type electric power emergency disposal intelligent perception interaction device and method based on ubiquitous internet of things.
The invention relates to a ubiquitous Internet of things-based following type intelligent sensing interaction device for electric power emergency disposal, which specifically comprises:
the intelligent sensing interaction equipment is portable equipment; power grid field emergency disposal personnel can carry the intelligent sensing interaction equipment to monitor and survey field conditions and to communicate with a monitoring and scheduling platform, helping field disposal personnel guard against sudden danger; the intelligent perception interaction devices are in communication connection with one another and can communicate with each other;
the intelligent perception interaction device is in communication connection with the monitoring and scheduling platform through a wireless network; the communication mode includes, but is not limited to, 4G, 5G and Wi-Fi.
The intelligent perception interaction device specifically comprises: the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent lighting module, the camera module, the communication module and the intelligent control panel.
The camera module is responsible for acquiring picture or video data of the site environment and transmitting them to the monitoring and scheduling platform in real time through the communication module connected with the camera module; the camera module is connected with the intelligent lighting module and the sensor cluster module; when the intelligent sensing interaction equipment finds, through the ambient light sensor in the sensor cluster module, that the current ambient light intensity is lower than a certain threshold, the intelligent lighting module is started together with the camera module, and its illumination intensity and illumination angle are dynamically adjusted according to the current ambient light intensity value;
furthermore, the camera module is also connected with the intelligent control panel, and the video content of the camera module can be synchronously displayed in the display screen of the control panel in real time;
furthermore, the camera module also comprises an automatic tracking module; the field emergency treatment personnel can select a target area of interest in the display screen of the control panel, and the automatic tracking module automatically adjusts the rotation angle of the camera so as to keep the selected target enlarged and displayed in the central area of the display screen of the control panel;
the lighting module is responsible for field lighting; the illumination module is connected with the sensor cluster module and the camera module, when the current ambient light intensity measured by an ambient light sensor in the sensor cluster module is lower than a certain threshold value, the illumination module is automatically started, and the illumination intensity of the illumination module is automatically adjusted according to the current ambient light intensity value; the illumination module dynamically adjusts the light intensity and the illumination direction according to the image definition and the key attention area of the camera module so as to improve the definition of the picture shot by the camera module.
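The threshold-triggered start-up and the light-following adjustment described above can be illustrated with a short sketch. The following Python fragment is a minimal illustration only; the lux threshold, the sharpness score and the light_sensor, lamp and camera interfaces are assumptions made for the example, not part of the disclosure.

    import time

    LUX_THRESHOLD = 50.0  # assumed ambient-light threshold (lux)

    def lamp_intensity(ambient_lux, threshold=LUX_THRESHOLD):
        """Map measured ambient light to a normalized lamp intensity in [0, 1]."""
        if ambient_lux >= threshold:
            return 0.0                      # bright enough: lamp stays off
        return min(1.0, (threshold - ambient_lux) / threshold)

    def lighting_loop(light_sensor, lamp, camera, period_s=0.5):
        """Start the lamp with the camera and keep adjusting it from the
        ambient light and the sharpness of the camera's key attention area."""
        while camera.is_on():
            intensity = lamp_intensity(light_sensor.read_lux())
            if intensity > 0.0:
                lamp.turn_on()
                lamp.set_intensity(intensity)
                lamp.aim_at(camera.roi_center())       # steer the beam at the key attention area
                if camera.sharpness() < 0.6:           # picture still not clear enough
                    lamp.set_intensity(min(1.0, intensity + 0.1))
            else:
                lamp.turn_off()
            time.sleep(period_s)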
The sensor cluster module is responsible for detecting field data and comprises an ultrasonic wind-measuring sensor, a gravity sensor, an air pressure sensor, a temperature and humidity sensor, a distance sensor, an ambient light sensor, a smoke sensor, and combustible gas and toxic and harmful gas sensors, so as to realize acquisition, calculation, monitoring and early warning of meteorological environment elements such as temperature, humidity, air pressure, wind speed and wind direction, as well as data such as landform, smoke, combustible gas and toxic gas;
the high-precision positioning module is responsible for positioning field emergency disposal personnel, is connected with the communication module and sends the current position of the personnel to the monitoring and dispatching platform in real time through the communication module; meanwhile, the high-precision positioning module is connected with the control panel to realize the real-time display of the position of the personnel on the display screen of the control panel;
the motion sensing module is a sensing device and comprises a personnel motion acquisition device and an intelligent helmet device; the motion acquisition equipment acquires the skeletal motion data of the person in real time; the intelligent helmet equipment displays the text and video data sent by the monitoring and scheduling platform in real time; meanwhile, the motion sensing module is connected with the communication module so as to transmit data acquired by the motion sensing module to the monitoring scheduling platform in real time through the communication module and receive character and video data transmitted by the monitoring scheduling platform;
the intelligent sensing module is responsible for collecting human body parameters of field emergency treatment personnel, and the human body parameters comprise data such as heart rate, heartbeat, blood pressure, blood oxygen and the like; the intelligent sensing module is connected with the communication module so as to realize the real-time transmission of the human body parameter data to the monitoring and scheduling platform;
the intelligent panel is a touch display screen and is responsible for displaying data sent by the camera module, the high-precision positioning module, the motion sensing module, the intelligent sensing module and the monitoring scheduling platform;
the communication module is connected with the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent illuminating module, the camera module, the intelligent control panel and the monitoring and scheduling platform and is responsible for sending and receiving data;
furthermore, the communication modules of the intelligent perception interaction devices are in communication connection in an ad hoc network mode, and communication of field emergency treatment personnel can be achieved.
Furthermore, in order to further improve the data processing speed of the system and the data transmission speed of the system, the intelligent sensing interaction devices complete data cleaning of results acquired by the sensors at a local end, complete interception of key attention areas of videos acquired by the camera, complete arrangement of human skeleton action data acquired by the motion sensing module, and send processed results to the monitoring and dispatching platform in a classified manner, so that the data transmission amount is reduced, and the data processing efficiency of the monitoring and dispatching platform is improved;
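As a rough illustration of this local pre-processing and classified upload, consider the sketch below. It is not the patent's implementation; the payload categories, the cleaning rule and the sensors, camera, skeleton and uplink interfaces are assumptions made for the example.

    import math
    import time

    def clean_sensor_samples(samples):
        """Drop missing or non-finite readings before transmission."""
        return [s for s in samples if s is not None and math.isfinite(s)]

    def preprocess_and_send(device_id, sensors, camera, skeleton, uplink):
        """Clean, crop and tidy the raw data locally, then send it by category
        so the monitoring and dispatching platform receives less, pre-sorted data."""
        payloads = {
            "sensor":   clean_sensor_samples(sensors.read_all()),
            "skeleton": skeleton.normalized_joint_frames(),  # tidied joint data
            "video":    camera.crop_to_roi_bytes(),          # only the key attention area
        }
        for category, data in payloads.items():
            uplink.send(device=device_id, category=category,
                        timestamp=time.time(), data=data)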
further, the monitoring and scheduling platform establishes a virtual interaction platform for each intelligent perception interaction device, establishes a three-dimensional virtual person by using the human skeleton action data of each site emergency disposal person with the site environment and the key attention area as the background, and keeps the action data of the three-dimensional virtual person consistent with the skeleton action data of the site emergency disposal person, so that the environment, the action and the motion track of each site emergency disposal person can be played in real time in the monitoring and scheduling platform and each intelligent perception interaction device;
the monitoring and scheduling platform comprises a data storage system, a field disaster analysis and preview system, a data communication system and a dual-control processor system;
the data communication system is responsible for receiving data sent by each intelligent perception interactive device and sending the data to each intelligent perception interactive device;
the data storage system comprises a memory and a RAID disk array, wherein the memory has a fixed address space size while the capacity of the RAID can be expanded;
further, after the monitoring and scheduling platform receives the data, the data are written into the memory; the memory carries out data reading and writing in a circular-write mode, and before previously written data are overwritten by new data, they are backed up to the RAID disk array;
specifically, in order to improve the efficiency of data storage and processing so that the field situation can be displayed and the disaster analyzed and previewed as quickly as possible, data are written into the memory as follows:
a head pointer and a tail pointer are set, where the head pointer points to the latest data writing position and the tail pointer points to the data reading position; when the monitoring and scheduling platform receives new data, the position indicated by the head pointer is used as the first position for writing the new data, and after the writing is finished the head pointer points to the position following the latest data; when the monitoring platform analyzes data, it reads from the tail pointer and stops reading when the tail pointer and the head pointer have the same address;
furthermore, when the head pointer points to the last address position of the memory and new data arrive, the head pointer wraps around to the first storage position of the memory, so that the memory can be reused cyclically; when the address of the head pointer coincides with the address of the tail (read) pointer, it is judged whether the system is currently reading or writing; if it is writing, the data at the positions pointed to by the tail pointer are backed up to the RAID disk array in sequence, and writing continues until all the data have been written.
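The circular write scheme can be pictured with a small model. The sketch below is illustrative only: a plain Python list stands in for the fixed memory region, a callback stands in for the RAID disk array, and entries about to be overwritten are backed up one at a time; none of the names come from the patent.

    class RingBuffer:
        """Fixed-size memory with a write (head) pointer and a read (tail) pointer."""

        def __init__(self, capacity, backup_to_raid):
            self.buf = [None] * capacity
            self.capacity = capacity
            self.head = 0            # next position to write
            self.tail = 0            # next position to read
            self.full = False
            self.backup_to_raid = backup_to_raid   # stands in for the RAID disk array

        def write(self, item):
            if self.full:
                # the slot under the head still holds unread data: back it up first
                self.backup_to_raid(self.buf[self.head])
                self.tail = (self.tail + 1) % self.capacity
            self.buf[self.head] = item
            self.head = (self.head + 1) % self.capacity   # wrap around to reuse the memory
            self.full = self.head == self.tail

        def read(self):
            """Yield unread entries until the tail pointer catches up with the head."""
            while self.full or self.tail != self.head:
                item = self.buf[self.tail]
                self.tail = (self.tail + 1) % self.capacity
                self.full = False
                yield item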
The field disaster analysis and preview system comprises a data analysis module and a disaster preview module; the data analysis module is responsible for sorting and analyzing the collected data and displaying them on the monitoring and scheduling platform display screen; the disaster preview module is responsible for predicting and simulating the disaster according to the collected real-time data, generating a preview result, displaying the preview result on the display screen of the monitoring and dispatching platform, and sending it to each intelligent perception interaction device. The dual-control processor system comprises a first controller and a second controller; the first controller is responsible for the data reception of the monitoring and dispatching platform, for writing the data into the memory, and for data analysis and display; the second controller is responsible for reading the stored data in real time and computing the disaster preview, so that data reading and writing, data analysis and disaster preview are processed in parallel, further improving the system's operating efficiency and response speed.
Furthermore, the disaster preview module is connected with the RAID disk array, and the RAID stores historical disaster evolution process data and the interaction relations among the data; the disaster preview module performs machine learning and modeling on the historical disaster evolution process data and the relations among them stored in the RAID, forming several types of disaster evolution models; when the monitoring and scheduling platform receives the current real-time field data (including video data, sensor data and the like), the disaster preview module performs a disaster preview according to the received data and the corresponding disaster evolution model, forming a virtual video of the disaster evolution process; when a disaster occurs, the disaster preview module uses the disaster evolution model and the current real-time field data to backtrack the disaster development and generate a backtracking result; the backtracking result includes, but is not limited to, the disaster occurrence point and a video of the evolution process from the occurrence of the disaster to the current time.
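As a rough illustration of selecting an evolution model, running it forward for the preview and backtracking to the origin, consider the following sketch. The model interface (match_score, predict_forward, infer_origin) and the result fields are assumptions made for the example, not the patent's algorithm.

    from dataclasses import dataclass

    @dataclass
    class BacktrackResult:
        origin: tuple            # estimated disaster occurrence point
        evolution_frames: list   # synthesized frames from the origin to now

    class DisasterPreviewModule:
        def __init__(self, models):
            # models: mapping from disaster type to a model trained on the
            # historical evolution data held in the RAID disk array
            self.models = models

        def select_model(self, live_data):
            """Pick the evolution model that best matches the live field data."""
            return max(self.models.values(), key=lambda m: m.match_score(live_data))

        def preview(self, live_data, horizon_steps=30):
            """Run the selected model forward to build the virtual evolution video."""
            model = self.select_model(live_data)
            return [model.predict_forward(live_data, step) for step in range(horizon_steps)]

        def backtrack(self, live_data):
            """Reverse the model from the current field data back to the origin."""
            model = self.select_model(live_data)
            origin, frames = model.infer_origin(live_data)
            return BacktrackResult(origin=origin, evolution_frames=frames)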
Further, after the disaster preview module generates the virtual video of the disaster evolution process and the backtracking result, the video and the backtracking result are sent to each intelligent perception interaction device through the data communication system;
furthermore, the intelligent helmet equipment in each intelligent perception interaction device displays the disaster evolution process video and the backtracking result in real time, so as to support the decisions of field emergency disposal personnel.
The invention has the following technical effects:
1. through the interaction between the intelligent sensing interaction equipment and the monitoring and dispatching platform as well as the intelligent sensing interaction equipment, the decision of on-site emergency disposal personnel is supported;
2. the intelligent camera and the intelligent lighting equipment are combined, the light intensity and the light angle of the intelligent lighting equipment dynamically change along with the definition of a picture shot by the intelligent camera and the key attention area, and the intelligent camera and the intelligent lighting equipment can be automatically controlled by a monitoring and scheduling platform and manually controlled by field emergency treatment personnel; the definition and the accuracy of a scene shot picture are greatly improved;
3. the intelligent sensing interaction equipment carries out local data processing on the acquired data and sends the data processing result to the monitoring and dispatching platform and other intelligent sensing interaction equipment, so that the data transmission quantity is greatly reduced, the data processing quantity of the monitoring and dispatching platform is reduced, and the emergency processing speed is improved;
4. the monitoring and scheduling platform adopts a dual-controller mode to carry out division and parallel control of data access, analysis and preview, so that the data processing efficiency is further improved;
5. the monitoring and scheduling platform stores data in a mode combining the memory and the RAID disk array, and stores the data using a fixed memory, double pointers and cyclic storage, which improves the efficiency of data reading and writing and effectively avoids memory overflow or a low memory hit rate for data reads caused by large data volumes.
6. The monitoring and scheduling platform builds models from historical disaster data, can use current field data to predict disaster evolution and backtrack the disaster, and can feed the results back to each intelligent perception interaction device in real time, providing powerful support for disaster tracing, decision making, disposal and escape by field emergency disposal personnel.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of a following type electric power emergency disposal intelligent perception interaction method based on ubiquitous internet of things.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The invention relates to a ubiquitous Internet of things-based following type intelligent sensing interaction device for electric power emergency disposal, which specifically comprises:
the intelligent perception interaction device is a portable device, and power grid field emergency treatment personnel can carry the intelligent perception interaction device to realize monitoring and surveying of field conditions and communicate with a monitoring and scheduling platform to ensure that field treatment personnel can prevent sudden danger.
The intelligent perception interaction device is in communication connection with the monitoring and scheduling platform through a wireless network; the communication mode includes, but is not limited to, 4G, 5G and Wi-Fi.
The intelligent perception interaction device specifically comprises: the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent lighting module, the camera module, the communication module and the intelligent control panel.
The camera module is responsible for acquiring picture or video data of the site environment and transmitting them to the monitoring and scheduling platform in real time through the communication module connected with the camera module; the camera module is connected with the intelligent lighting module and the sensor cluster module; when the intelligent sensing interaction equipment finds, through the ambient light sensor in the sensor cluster module, that the current ambient light intensity is lower than a certain threshold, the intelligent lighting module is started together with the camera module, and its illumination intensity and illumination angle are dynamically adjusted according to the current ambient light intensity value;
furthermore, the camera module is also connected with the intelligent control panel, and the video content of the camera module can be synchronously displayed in the display screen of the control panel in real time;
furthermore, the camera module also comprises an automatic tracking module; the field emergency treatment personnel can select a target area of interest in the display screen of the control panel, and the automatic tracking module automatically adjusts the rotation angle of the camera so as to keep the selected target enlarged and displayed in the central area of the display screen of the control panel;
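The automatic tracking behaviour can be sketched as a simple proportional controller that re-centres and enlarges the selected target. The gain, the gimbal interface and the detector below are assumptions made for illustration only, not the patent's implementation.

    def track_target(camera, gimbal, detector, kp=0.05):
        """Keep the operator-selected target centred and enlarged in the frame.

        detector.locate(frame) is assumed to return the target's pixel centre
        and bounding box, or None if the target is lost."""
        while camera.is_on():
            frame = camera.grab_frame()
            hit = detector.locate(frame)
            if hit is None:
                continue
            (cx, cy), box = hit
            err_x = cx - frame.width / 2        # horizontal offset from the frame centre
            err_y = cy - frame.height / 2       # vertical offset from the frame centre
            gimbal.rotate(pan=-kp * err_x, tilt=-kp * err_y)   # proportional correction
            if box.width < frame.width / 3:     # zoom until the target fills roughly a third of the frame
                camera.zoom(1.05)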
the lighting module is responsible for field lighting; the illumination module is connected with the sensor cluster module and the camera module, when the current ambient light intensity measured by an ambient light sensor in the sensor cluster module is lower than a certain threshold value, the illumination module is automatically started, and the illumination intensity of the illumination module is automatically adjusted according to the current ambient light intensity value; the illumination module dynamically adjusts the light intensity and the illumination direction according to the image definition and the key attention area of the camera module so as to improve the definition of the picture shot by the camera module.
The sensor cluster module is responsible for detecting field data and comprises an ultrasonic wind-measuring sensor, a gravity sensor, an air pressure sensor, a temperature and humidity sensor, a distance sensor, an ambient light sensor, a smoke sensor, and combustible gas and toxic and harmful gas sensors, so as to realize acquisition, calculation, monitoring and early warning of meteorological environment elements such as temperature, humidity, air pressure, wind speed and wind direction, as well as data such as landform, smoke, combustible gas and toxic gas;
the high-precision positioning module is responsible for positioning field emergency disposal personnel, is connected with the communication module and sends the current position of the personnel to the monitoring and dispatching platform in real time through the communication module; meanwhile, the high-precision positioning module is connected with the control panel to realize the real-time display of the position of the personnel on the display screen of the control panel;
the motion sensing module is a sensing device and comprises a personnel motion acquisition device and an intelligent helmet device; the motion acquisition equipment acquires the skeletal motion data of the person in real time; the intelligent helmet equipment displays the text and video data sent by the monitoring and scheduling platform in real time; meanwhile, the motion sensing module is connected with the communication module so as to transmit data acquired by the motion sensing module to the monitoring scheduling platform in real time through the communication module and receive character and video data transmitted by the monitoring scheduling platform;
the intelligent sensing module is responsible for collecting human body parameters of field emergency treatment personnel, and the human body parameters comprise data such as heart rate, heartbeat, blood pressure, blood oxygen and the like; the intelligent sensing module is connected with the communication module so as to realize the real-time transmission of the human body parameter data to the monitoring and scheduling platform;
the intelligent panel is a touch display screen and is responsible for displaying data sent by the camera module, the high-precision positioning module, the motion sensing module, the intelligent sensing module and the monitoring scheduling platform;
the communication module is connected with the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent illuminating module, the camera module, the intelligent control panel and the monitoring and scheduling platform and is responsible for sending and receiving data;
furthermore, the communication modules of the intelligent perception interaction devices are in communication connection in an ad hoc network mode, and communication of field emergency treatment personnel can be achieved.
Furthermore, in order to further improve the data processing speed of the system and the data transmission speed of the system, the intelligent sensing interaction devices complete data cleaning of results acquired by the sensors at a local end, complete interception of key attention areas of videos acquired by the camera, complete arrangement of human skeleton action data acquired by the motion sensing module, and send processed results to the monitoring and dispatching platform in a classified manner, so that the data transmission amount is reduced, and the data processing efficiency of the monitoring and dispatching platform is improved;
further, the monitoring and scheduling platform establishes a virtual interaction platform for each intelligent perception interaction device, establishes a three-dimensional virtual person by using the human skeleton action data of each site emergency disposal person with the site environment and the key attention area as the background, and keeps the action data of the three-dimensional virtual person consistent with the skeleton action data of the site emergency disposal person, so that the environment, the action and the motion track of each site emergency disposal person can be played in real time in the monitoring and scheduling platform and each intelligent perception interaction device;
the monitoring and scheduling platform comprises a data storage system, a field disaster analysis and preview system, a data communication system and a dual-control processor system;
the data communication system is responsible for receiving data sent by each intelligent perception interaction device and sending data to each intelligent perception interaction device;
the data storage system comprises a memory and a RAID disk array, wherein the memory has a fixed address space size while the capacity of the RAID can be expanded;
further, after the monitoring and scheduling platform receives the data, the data are written into the memory; the memory carries out data reading and writing in a circular-write mode, and before previously written data are overwritten by new data, they are backed up to the RAID disk array;
specifically, in order to improve the efficiency of data storage and processing so that the field situation can be displayed and the disaster analyzed and previewed as quickly as possible, data are written into the memory as follows:
a head pointer and a tail pointer are set, where the head pointer points to the latest data writing position and the tail pointer points to the data reading position; when the monitoring and scheduling platform receives new data, the position indicated by the head pointer is used as the first position for writing the new data, and after the writing is finished the head pointer points to the position following the latest data; when the monitoring platform analyzes data, it reads from the tail pointer and stops reading when the tail pointer and the head pointer have the same address;
furthermore, when the head pointer points to the last address position of the memory and new data arrive, the head pointer wraps around to the first storage position of the memory, so that the memory can be reused cyclically; when the address of the head pointer coincides with the address of the tail (read) pointer, it is judged whether the system is currently reading or writing; if it is writing, the data at the positions pointed to by the tail pointer are backed up to the RAID disk array in sequence, and writing continues until all the data have been written.
The field disaster analysis and preview system comprises a data analysis module and a disaster preview module; the data analysis module is responsible for sorting and analyzing the collected data and displaying them on the monitoring and scheduling platform display screen; the disaster preview module is responsible for predicting and simulating the disaster according to the collected real-time data, generating a preview result, displaying the preview result on the display screen of the monitoring and dispatching platform, and sending it to each intelligent perception interaction device. The dual-control processor system comprises a first controller and a second controller; the first controller is responsible for the data reception of the monitoring and dispatching platform, for writing the data into the memory, and for data analysis and display; the second controller is responsible for reading the stored data in real time and computing the disaster preview, so that data reading and writing, data analysis and disaster preview are processed in parallel, further improving the system's operating efficiency and response speed.
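The division of labour between the two controllers can be mimicked with two worker threads sharing the circular memory (modelled as in the earlier RingBuffer sketch), as below. The thread-based model, the packet queue and the analyzer, preview_module and downlink interfaces are illustrative assumptions; locking around the shared buffer is omitted for brevity.

    import queue
    import threading
    import time

    def first_controller(incoming: queue.Queue, ring, analyzer):
        """Receive packets, write them into the memory and run real-time analysis."""
        while True:
            packet = incoming.get()
            ring.write(packet)
            analyzer.analyze_and_display(packet)

    def second_controller(ring, preview_module, downlink):
        """Read the stored data and compute the disaster preview in parallel."""
        while True:
            for packet in ring.read():
                downlink.broadcast(preview_module.preview(packet))
            time.sleep(0.1)   # wait briefly when no unread data is available

    def start_dual_controllers(incoming, ring, analyzer, preview_module, downlink):
        t1 = threading.Thread(target=first_controller,
                              args=(incoming, ring, analyzer), daemon=True)
        t2 = threading.Thread(target=second_controller,
                              args=(ring, preview_module, downlink), daemon=True)
        t1.start()
        t2.start()
        return t1, t2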
Furthermore, the disaster preview module is connected with the RAID disk array, and the RAID stores historical disaster evolution process data and the interaction relations among the data; the disaster preview module performs machine learning and modeling on the historical disaster evolution process data and the relations among them stored in the RAID, forming several types of disaster evolution models; when the monitoring and scheduling platform receives the current real-time field data (including video data, sensor data and the like), the disaster preview module performs a disaster preview according to the received data and the corresponding disaster evolution model, forming a virtual video of the disaster evolution process; when a disaster occurs, the disaster preview module uses the disaster evolution model and the current real-time field data to backtrack the disaster development and generate a backtracking result; the backtracking result includes, but is not limited to, the disaster occurrence point and a video of the evolution process from the occurrence of the disaster to the current time.
Further, after the disaster preview module generates the virtual video of the disaster evolution process and the backtracking result, the video and the backtracking result are sent to each intelligent perception interaction device through the data communication system;
furthermore, the intelligent helmet equipment in each intelligent perception interaction device displays the disaster evolution process video and the backtracking result in real time, so as to support the decisions of field emergency disposal personnel.
The invention also provides a ubiquitous Internet of things-based following type intelligent sensing interaction method for electric power emergency disposal, which comprises the following specific steps:
101. each intelligent sensing interaction device collects on-site picture and video data, sensor data, personnel skeleton action data, human body parameter data and the like through a camera, a sensor cluster module, a somatosensory module and an intelligent sensing module;
102. each intelligent perception interaction device carries out data processing on each type of collected data to form a data processing result and sends the data processing result to the monitoring and dispatching platform;
103. the monitoring and scheduling platform receives various types of data collected by various intelligent perception interactive devices and stores the data into a memory;
104. the first controller in the monitoring and scheduling platform controls and completes the writing of the various types of data into the memory, and controls the data analysis module to complete the analysis of the real-time data; the second controller reads the disaster evolution models stored in the RAID as well as the real-time field data stored in the memory and the RAID, and controls the disaster preview module to complete the disaster evolution so as to generate a virtual video of the disaster evolution process and a backtracking result;
105. the monitoring and scheduling platform generates a virtual character model of the field emergency disposal personnel in the virtual interaction platform according to the personnel skeleton action data and the human body parameter data which are acquired by the somatosensory module and the intelligent sensing module; human body parameters of personnel are marked beside the virtual character model;
106. the monitoring and scheduling platform sends the virtual video of the disaster evolution process, the backtracking result and control instructions to each intelligent perception interaction device;
the control instructions comprise instructions in text, voice and video form, including instructions for controlling the camera, the sensor module and the intelligent lighting module of the intelligent sensing interaction equipment; for example, if the monitoring and scheduling platform needs to pay close attention to the storage location of on-site hazardous chemicals, it can directly send a camera control instruction and a text reminder to the on-site emergency disposal personnel, adjust the camera angle in real time so that the hazardous chemicals, as the key attention object, are enlarged and displayed in the center of the shot picture, and dynamically adjust the intelligent lighting intensity according to the clarity of the hazardous chemicals in the picture (a sketch of such an instruction structure is given after step 108);
107. the intelligent helmets in the intelligent perception interaction devices display disaster evolution processes and the actions, environment and human body parameters of virtual characters of field emergency treatment personnel displayed by the virtual platform;
108. and each field emergency disposal person interacts in the virtual interaction platform through the intelligent perception interaction equipment.
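One way to picture the control instructions exchanged in steps 106 to 108 is as a small tagged message. The field names and the JSON encoding in the sketch below are assumptions made for illustration only.

    import json
    from dataclasses import dataclass, asdict
    from typing import Optional

    @dataclass
    class ControlInstruction:
        kind: str                               # "text", "voice", "video", "camera", "lighting"
        target_device: str                      # which intelligent perception interaction device
        text: Optional[str] = None              # e.g. a reminder about on-site hazardous chemicals
        pan_deg: Optional[float] = None         # requested camera pan
        tilt_deg: Optional[float] = None        # requested camera tilt
        zoom: Optional[float] = None            # requested zoom factor
        lamp_intensity: Optional[float] = None  # requested lighting intensity, 0..1

        def encode(self) -> str:
            return json.dumps(asdict(self))

    # Example: ask one device to centre and enlarge the hazardous-chemical area
    # and raise the lighting until the shot is clear.
    cmd = ControlInstruction(kind="camera", target_device="device-07",
                             pan_deg=12.0, tilt_deg=-3.5, zoom=2.0, lamp_intensity=0.8,
                             text="Keep clear of the hazardous-chemical storage area")
    print(cmd.encode())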
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (11)

1. A ubiquitous Internet of things-based following type power emergency treatment intelligent perception interaction device is characterized by comprising: the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent lighting module, the camera module, the communication module and the intelligent control panel;
the intelligent perception interaction equipment is portable equipment and is communicated with the monitoring and dispatching platform;
the monitoring and scheduling platform comprises a data storage system, a field disaster analysis and preview system, a data communication system and a dual-control processor system;
the power grid field emergency disposal personnel can carry the intelligent perception interaction equipment to realize monitoring and surveying of field conditions, and communicate with a monitoring and dispatching platform to ensure that the field disposal personnel can prevent sudden danger.
The camera module is responsible for acquiring the picture or video data of the site environment and transmitting the picture or video data to the monitoring scheduling platform in real time through the communication module connected with the camera module;
the camera module is connected with the intelligent lighting module and the sensor cluster module, when the intelligent sensing interaction equipment finds that the current ambient light intensity is lower than a certain threshold value through an ambient light sensor in the sensor cluster module, the intelligent lighting module is started while the camera module is started, and the illumination intensity and the illumination angle of the intelligent lighting module are dynamically adjusted according to the current ambient light intensity value;
the camera module is also connected with the intelligent control panel, and the video content of the camera module can be synchronously displayed in the display screen of the control panel in real time;
the high-precision positioning module is responsible for positioning field emergency disposal personnel, is connected with the communication module and sends the current position of the personnel to the monitoring and dispatching platform in real time through the communication module; meanwhile, the high-precision positioning module is connected with the control panel to realize the real-time display of the position of the personnel on the display screen of the control panel;
the motion sensing module is a sensing device and comprises a personnel motion acquisition device and an intelligent helmet device; the motion acquisition equipment acquires the skeletal motion data of the person in real time; the intelligent helmet equipment displays the text and video data sent by the monitoring and scheduling platform in real time; meanwhile, the motion sensing module is connected with the communication module so as to transmit data acquired by the motion sensing module to the monitoring scheduling platform in real time through the communication module and receive character and video data transmitted by the monitoring scheduling platform;
the intelligent sensing module is responsible for collecting human body parameters of field emergency treatment personnel, and the human body parameters comprise data such as heart rate, heartbeat, blood pressure, blood oxygen and the like; the intelligent sensing module is connected with the communication module to realize the real-time transmission of the human body parameter data to the monitoring and scheduling platform.
2. The smart aware interactive device of claim 1, wherein the camera module further comprises an auto-tracking module; the field emergency treatment personnel can select a target area of interest in the display screen of the control panel, and the automatic tracking module automatically adjusts the rotation angle of the camera so as to keep the selected target enlarged and displayed in the central area of the display screen of the control panel.
3. The smart aware interaction device of claim 1, wherein the lighting module is responsible for field lighting; the illumination module is connected with the sensor cluster module and the camera module, when the current ambient light intensity measured by an ambient light sensor in the sensor cluster module is lower than a certain threshold value, the illumination module is automatically started, and the illumination intensity of the illumination module is automatically adjusted according to the current ambient light intensity value;
the illumination module dynamically adjusts the light intensity and the illumination direction according to the image definition and the key attention area of the camera module so as to improve the definition of the picture shot by the camera module.
4. The intelligent sensing interaction device according to claim 1, wherein the sensor cluster module is responsible for detecting field data, and comprises an ultrasonic anemometer sensor, a gravity sensor, an air pressure sensor, a temperature and humidity sensor, a distance sensor, an ambient light sensor, a smoke sensor, a combustible gas sensor and a toxic and harmful gas sensor, so as to realize data acquisition, calculation, monitoring and early warning of a plurality of meteorological environment elements such as temperature, humidity, air pressure, wind speed and wind direction, and terrain, landform, smoke, combustible gas, toxic gas and the like.
5. The intelligent sensing interaction device of claim 1, wherein the intelligent panel is a touch-enabled display screen and is responsible for displaying data sent by the camera module, the high-precision positioning module, the motion sensing module, the intelligent sensing module, and the monitoring and scheduling platform.
6. The intelligent sensing interaction device of claim 1, wherein the communication module is connected to the intelligent sensing module, the sensor cluster module, the motion sensing module, the high-precision positioning module, the intelligent lighting module, the camera module, the intelligent control panel, and the monitoring and scheduling platform, and is responsible for sending and receiving data.
7. The intelligent sensing interaction device of claim 1, wherein the communication modules of the intelligent sensing interaction devices are in communication connection in an ad hoc network form, so that communication of field emergency treatment personnel can be realized;
the intelligent sensing interaction equipment completes data cleaning of results acquired by the sensors at a local end, completes interception of a key attention area of a video acquired by the camera, completes arrangement of human skeleton action data acquired by the motion sensing module, and sends processed results to the monitoring and dispatching platform in a classified manner;
the monitoring and dispatching platform establishes a virtual interaction platform for each intelligent perception interaction device, establishes three-dimensional virtual personnel by using the personnel skeleton action data of each site emergency disposal personnel with the site environment and the key attention area as backgrounds, and keeps the action data of the three-dimensional virtual personnel consistent with the skeleton action data of the site emergency disposal personnel, so that the environment, the action and the motion track of each site emergency disposal personnel can be played in real time in the monitoring and dispatching platform and each intelligent perception interaction device.
8. The intelligent perception interactive device according to claim 1, wherein the monitoring and scheduling platform includes a data storage system, a field disaster analysis and preview system, a data communication system, and a dual-control processor system;
the data communication system is responsible for receiving data sent by each intelligent perception interactive device and sending the data to each intelligent perception interactive device;
the data storage system comprises a memory and a RAID disk array; wherein the memory is a fixed address space size, and the RAID can expand the capacity;
after receiving the data, the monitoring and scheduling platform writes the data into a memory; the memory carries out data reading and writing in a circulating writing mode, and when new data is written and before the previous written data needs to be covered, the previous written data can be backed up to the RAID disk array;
the field disaster analysis and preview system comprises a data analysis module and a disaster preview module; the data analysis module is responsible for sorting and analyzing the collected data and displaying them on the monitoring and scheduling platform display screen; the disaster preview module is responsible for predicting and simulating the disaster according to the collected real-time data, generating a preview result, displaying the preview result on the display screen of the monitoring and scheduling platform, and sending the preview result to each intelligent perception interaction device; the dual-control processor system comprises a first controller and a second controller, wherein the first controller is responsible for the data reception of the monitoring and scheduling platform, for writing the data into the memory, and for data analysis and display; the second controller is responsible for real-time reading of the stored data and computation of the disaster preview, so as to achieve parallel processing of data reading and writing, data analysis and disaster preview.
9. The intelligent sensing interaction device of claim 8, wherein the specific method for writing data into the memory is as follows:
a head pointer and a tail pointer are set, where the head pointer points to the latest data writing position and the tail pointer points to the data reading position; when the monitoring and scheduling platform receives new data, the position indicated by the head pointer is used as the first position for writing the new data, and after the writing is finished the head pointer points to the position following the latest data; when the monitoring platform analyzes data, it reads from the tail pointer and stops reading when the tail pointer and the head pointer have the same address;
furthermore, when the head pointer points to the last address position of the memory and new data arrive, the head pointer wraps around to the first storage position of the memory, so that the memory can be reused cyclically; when the address of the head pointer coincides with the address of the tail (read) pointer, it is judged whether the system is currently reading or writing; if it is writing, the data at the positions pointed to by the tail pointer are backed up to the RAID disk array in sequence, and writing continues until all the data have been written.
10. A ubiquitous Internet of things-based following type electric power emergency disposal intelligent perception interaction method is characterized by comprising the following steps:
101. each intelligent sensing interaction device collects on-site picture and video data, sensor data, personnel skeleton action data, human body parameter data and the like through a camera, a sensor cluster module, a somatosensory module and an intelligent sensing module;
102. each intelligent perception interaction device carries out data processing on each type of collected data to form a data processing result and sends the data processing result to the monitoring and dispatching platform;
103. the monitoring and scheduling platform receives various types of data collected by various intelligent perception interactive devices and stores the data into a memory;
104. the first controller in the monitoring and scheduling platform controls and completes the writing of the various types of data into the memory, and controls the data analysis module to complete the analysis of the real-time data; the second controller reads the disaster evolution models stored in the RAID as well as the real-time field data stored in the memory and the RAID, and controls the disaster preview module to complete the disaster evolution so as to generate a virtual video of the disaster evolution process and a backtracking result.
11. The smart aware interaction method of claim 10, further comprising:
105. the monitoring and scheduling platform generates a virtual character model of the field emergency disposal personnel in the virtual interaction platform according to the personnel skeleton action data and the human body parameter data which are acquired by the somatosensory module and the intelligent sensing module; human body parameters of personnel are marked beside the virtual character model;
106. the monitoring and scheduling platform sends the virtual video of the disaster evolution process, the backtracking result and control instructions to each intelligent perception interaction device;
the control instructions comprise instructions in text, voice and video form, including instructions for controlling the camera, the sensor module and the intelligent lighting module of the intelligent sensing interaction equipment; for example, if the monitoring and scheduling platform needs to pay close attention to the storage location of on-site hazardous chemicals, it can directly send a camera control instruction and a text reminder to the on-site emergency disposal personnel, adjust the camera angle in real time so that the hazardous chemicals, as the key attention object, are enlarged and displayed in the center of the shot picture, and dynamically adjust the intelligent lighting intensity according to the clarity of the hazardous chemicals in the picture;
107. the intelligent helmets in the intelligent perception interaction devices display disaster evolution processes and the actions, environment and human body parameters of virtual characters of field emergency treatment personnel displayed by the virtual platform;
108. and each field emergency disposal person interacts in the virtual interaction platform through the intelligent perception interaction equipment.
CN202011440865.1A 2020-12-07 2020-12-07 Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal Active CN112637453B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011440865.1A CN112637453B (en) 2020-12-07 2020-12-07 Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011440865.1A CN112637453B (en) 2020-12-07 2020-12-07 Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal

Publications (2)

Publication Number Publication Date
CN112637453A true CN112637453A (en) 2021-04-09
CN112637453B CN112637453B (en) 2022-06-21

Family

ID=75309319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011440865.1A Active CN112637453B (en) 2020-12-07 2020-12-07 Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal

Country Status (1)

Country Link
CN (1) CN112637453B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071363A (en) * 2021-11-16 2022-02-18 全球能源互联网研究院有限公司 Real-time information interaction system for electric power emergency disposal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003150890A (en) * 2001-11-16 2003-05-23 Mitsubishi Heavy Ind Ltd Plant dynamic characteristic model construction system and construction method and plant dynamic characteristic simulator
CN103211578A (en) * 2013-03-26 2013-07-24 中国人民解放军成都军区总医院 System for monitoring environmental parameters and human body vital signs
CN105701614A (en) * 2016-01-13 2016-06-22 天津中科智能识别产业技术研究院有限公司 Emergency commanding platform based on three-dimensional landform and building model
CN106780116A (en) * 2016-12-01 2017-05-31 全球能源互联网研究院 The construction method of power emergency drilling method, device and its scenario models, device
CN108399709A (en) * 2018-05-03 2018-08-14 温利军 Multifunctional remote monitoring and warning system and monitoring method
CN111223263A (en) * 2020-03-11 2020-06-02 四川路桥建设集团交通工程有限公司 Full-automatic comprehensive fire early warning response system
CN111857070A (en) * 2020-07-09 2020-10-30 国网浙江省电力有限公司嘉兴供电公司 Construction site monitoring system and monitoring method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003150890A (en) * 2001-11-16 2003-05-23 Mitsubishi Heavy Ind Ltd Plant dynamic characteristic model construction system and construction method and plant dynamic characteristic simulator
CN103211578A (en) * 2013-03-26 2013-07-24 中国人民解放军成都军区总医院 System for monitoring environmental parameters and human body vital signs
CN105701614A (en) * 2016-01-13 2016-06-22 天津中科智能识别产业技术研究院有限公司 Emergency commanding platform based on three-dimensional landform and building model
CN106780116A (en) * 2016-12-01 2017-05-31 全球能源互联网研究院 The construction method of power emergency drilling method, device and its scenario models, device
CN108399709A (en) * 2018-05-03 2018-08-14 温利军 Multifunctional remote monitoring and warning system and monitoring method
CN111223263A (en) * 2020-03-11 2020-06-02 四川路桥建设集团交通工程有限公司 Full-automatic comprehensive fire early warning response system
CN111857070A (en) * 2020-07-09 2020-10-30 国网浙江省电力有限公司嘉兴供电公司 Construction site monitoring system and monitoring method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨鸿昌 (Yang Hongchang): 《基于物联网技术的电力应急救援智能通信系统》 [Intelligent communication system for electric power emergency rescue based on Internet of Things technology], 《电力信息化》 [Electric Power Informatization] *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071363A (en) * 2021-11-16 2022-02-18 全球能源互联网研究院有限公司 Real-time information interaction system for electric power emergency disposal

Also Published As

Publication number Publication date
CN112637453B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
CN102930598B (en) Three-dimensional model is used to locate and show the system and method for Tunnel testing equipment state
CN201845345U (en) Facial expression identifying data collecting system based on active vision
EP3681169A1 (en) Method and terminal for displaying dynamic image
CN202736348U (en) Car driver driving skill practice guiding and examination scoring device
US11417106B1 (en) Crowd evacuation system based on real time perception, simulation, and warning
CN112929602B (en) Data monitoring method and device based on image processing and related equipment
CN108564274B (en) Guest room booking method and device and mobile terminal
CN112637453B (en) Ubiquitous Internet of things-based following type intelligent sensing interaction equipment and method for electric power emergency disposal
CN204119396U (en) The large online data monitor and early warning system of a kind of power transmission network
CN109040968A (en) Road conditions based reminding method, mobile terminal and computer readable storage medium
CN110657841A (en) Intelligent dust monitoring equipment for railway construction site
CN111464825A (en) Live broadcast method based on geographic information and related device
CN114723904A (en) Method, system, computer device and storage medium for dynamic management of airport data
CN115146540A (en) Method, system, device and storage medium for simulating fire-fighting risks of stadium
CN103776970A (en) Regional carbon dioxide concentration detection device based on plurality of sensors and method of regional carbon dioxide concentration detection device
CN113922502A (en) Intelligent video operation and maintenance management system and management method
CN109327568A (en) A kind of method and mobile terminal switching camera
CN111563689B (en) Aircraft operation scoring method and system
CN110710102A (en) Method and apparatus for locating an energy harvesting device in an environment
CN114897507A (en) Building construction management system and management method based on BIM
CN215987316U (en) Visual monitoring management system
CN103561245A (en) Space information monitoring device and technology like human brain
CN113570808A (en) Wireless smoke detector based on ZYNQ7020
CN207939657U (en) Working region monitoring system
CN112416135A (en) Projection parameter determination method and device based on indoor positioning and projection system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant