CN112954280B - Artificial live working system and method based on intelligent wearable equipment - Google Patents

Artificial live working system and method based on intelligent wearable equipment

Info

Publication number
CN112954280B
Authority
CN
China
Prior art keywords
working
intelligent
operator
safety helmet
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110430397.8A
Other languages
Chinese (zh)
Other versions
CN112954280A (en)
Inventor
李帅
李惠宇
王新建
周文涛
田鹏云
林德政
冬旭
任青亭
冯俐
肖雁起
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Ruijia Tianjin Intelligent Robot Co ltd
Original Assignee
State Grid Ruijia Tianjin Intelligent Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Ruijia Tianjin Intelligent Robot Co ltd filed Critical State Grid Ruijia Tianjin Intelligent Robot Co ltd
Priority to CN202110430397.8A priority Critical patent/CN112954280B/en
Publication of CN112954280A publication Critical patent/CN112954280A/en
Application granted granted Critical
Publication of CN112954280B publication Critical patent/CN112954280B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/0406 Accessories for helmets
    • A42B3/0433 Detecting, signalling or lighting devices
    • A42B3/0466 Means for detecting that the user is wearing a helmet
    • A42B3/30 Mounting radio sets or communication systems
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223 Execution procedure of a spoken command

Abstract

The application provides an artificial live working system and method based on intelligent wearable equipment. An intelligent working tool acquires a working video and sends it to an intelligent safety helmet. After receiving the working video sent by the hand-held intelligent working tool, the intelligent safety helmet determines the current working position and the equipment to be worked on from the working video, generates a working instruction and sends it to an intranet server. The intranet server determines a working tool identifier and the working action to be executed from the working instruction, generates a corresponding working control instruction and sends it to the intelligent working tool. After receiving the working control instruction, the intelligent working tool controls the corresponding working unit to complete the working task according to the instruction. In this way the distance between the operator and the equipment to be worked on is increased, and the time the operator spends temporarily adjusting the working mode to the on-site situation is reduced, which ensures the operator's safety while improving working efficiency.

Description

Artificial live working system and method based on intelligent wearable equipment
Technical Field
The application relates to the technical field of intelligent control, in particular to an artificial live working system and method based on intelligent wearable equipment.
Background
With the development of intelligent equipment technology, intelligent devices are increasingly used to assist daily work in many fields, electric power operation being one of them. When climbing poles to perform live working at height with an insulating rod, power line workers must wear intelligent protective equipment, both to assist the work and to keep the workers safe.
At present, during power line work an operator must manually carry the appropriate working tool to the designated working position and operate it by hand close to the work site, while constantly adjusting the working mode to the conditions on site. This increases the difficulty of the work and greatly affects the operator's efficiency.
Disclosure of Invention
In view of this, the purpose of the application is to provide an artificial live working system and method based on intelligent wearable equipment. The operator issues instructions directly through the worn intelligent safety helmet to determine the target working tool that will execute the work; the intelligent working tool is fixed to an insulating rod by a docking connection and brought to the designated working position, and the corresponding working task is completed according to a working control instruction. The distance between the operator and the equipment to be worked on is thereby increased, and the time the operator spends temporarily adjusting the working mode to the on-site situation is reduced, so that working efficiency is improved while the operator's safety is ensured.
The embodiment of the application provides an artificial live working system based on intelligent wearable equipment, comprising an intelligent safety helmet, an intelligent working tool and an intranet server, wherein the intelligent safety helmet and the intelligent working tool are each in communication connection with the intranet server;
the intelligent safety helmet is used for receiving a working video sent by the intelligent working tool held by an operator, determining the working position of the current operation and the equipment to be worked on based on the working video, and sending a working instruction to the intranet server based on the equipment to be worked on;
the intranet server is used for determining a working tool identifier and the working action to be executed by the working tool from the received working instruction, generating a working control instruction based on the working tool identifier and the working action to be executed, and sending the working control instruction to the intelligent working tool;
the intelligent working tool is used for controlling the corresponding working unit to reach the designated working position and complete the working task based on the working tool identifier indicated in the working control instruction; the intelligent working tool is connected with an insulating rod, and the operator controls the working action of the target working tool through the insulating rod.
Further, the artificial live working system also comprises intelligent glasses and an intelligent safety belt, the intelligent glasses being arranged on the intelligent safety helmet, and the intelligent safety helmet is further used for:
determining the wearing states of the intelligent glasses, the intelligent safety helmet and the intelligent safety belt based on the received in-place information sent by the intelligent glasses, the wearing information of the intelligent safety helmet and the pressure information at the hook sent by the intelligent safety belt, and prompting the operator that normal work may proceed when all three wearing states are determined to be normal.
Further, the intelligent safety helmet comprises a core processing unit, a voice control unit and a positioning unit; the voice control unit and the positioning unit are respectively in communication connection with the core processing unit;
the voice control unit is used for receiving the voice instruction of the operator, converting the voice instruction into voice control information and sending the voice control information to the core processing unit;
the positioning unit is used for collecting the position information of the operator and sending the position information to the core processing unit;
the core processing unit is used for generating the job control instruction based on the received voice control information and position information, and sending the job control instruction to the intranet server.
Further, the intelligent safety helmet further comprises a broadcasting unit, wherein the broadcasting unit is used for:
when the wearing state of the intelligent safety belt, the wearing state of the intelligent safety helmet or the wearing state of the intelligent glasses is determined to have an abnormal wearing state, an abnormal alarm is sent to prompt the operator.
Further, the intelligent safety helmet is also used for:
generating marking request information based on a marking request instruction sent by the operator, and sending the received marking position information generated based on the marking request information to the intranet server.
Further, the intranet server is further configured to:
and marking the corresponding position of the operation video based on the marking position indicated by the marking position information, generating a marking video, and forwarding the marking video to the intelligent glasses of the intelligent safety helmet for display.
Further, the intelligent work implement is further configured to:
determining the working unit to perform the work based on the received working tool identifier, moving the intelligent working tool to the designated working position, and controlling the working unit in the intelligent working tool to complete the working task according to the working control instruction.
Further, the artificial live working system based on intelligent wearable equipment provided by the embodiment of the application further comprises a ground terminal, an extranet server and an integrated management platform;
the ground terminal is used for acquiring, at a preset time node, the working conditions fed back by the intranet server in its working area, displaying the fed-back working conditions, and uploading the working conditions in the working area to the extranet server at a preset time period;
the extranet server is used for forwarding the working conditions of each working area uploaded by each ground terminal to the integrated management platform;
the integrated management platform is used for displaying the received working conditions of each working area by working area, so that management staff can monitor the work according to the displayed working conditions.
Further, the integrated management platform is further configured to:
and determining a job instruction requested in the mark request information based on the received mark request information, determining a job appliance identifier and a job action to be executed of the job appliance, generating mark position information based on the job appliance identifier and the job action to be executed of the job appliance, and sending the mark position information to the external network server so that the external network server forwards the mark position information to a corresponding internal network server.
The embodiment of the application also provides a job control method applied to the artificial live working system described above, the job control method comprising:
controlling the intelligent safety helmet to receive a working video sent by the intelligent working tool held by an operator, to determine the working position of the current operation and the equipment to be worked on based on the working video, and to send a working instruction to the intranet server based on the equipment to be worked on;
controlling the intranet server to determine a working tool identifier and the working action to be executed by the working tool from the received working instruction, to generate a working control instruction based on the working tool identifier and the working action to be executed, and to send the working control instruction to the intelligent working tool;
controlling the intelligent working tool to drive the corresponding target working tool to the designated working position to complete the working task based on the working tool identifier indicated in the working control instruction; wherein the target working tool is connected with the insulating rod, and the operator controls the working action of the target working tool through the insulating rod.
The embodiment of the application also provides electronic equipment, which comprises: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the job control method as described above.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the job control method as described above.
According to the artificial live working system and method based on intelligent wearable equipment provided by the application, the intelligent working tool acquires a working video and sends it to the intelligent safety helmet; after receiving the working video sent by the hand-held intelligent working tool, the intelligent safety helmet determines the working position of the current operation and the equipment to be worked on from the working video, generates a working instruction and sends it to the intranet server; the intranet server determines the working tool identifier and the working action to be executed from the working instruction, generates a corresponding working control instruction and sends it to the intelligent working tool; after receiving the working control instruction, the intelligent working tool controls the corresponding working unit to reach the designated working position and complete the corresponding working task. The distance between the operator and the equipment to be worked on is thereby increased and the time the operator spends temporarily adjusting the working mode to the on-site situation is reduced, ensuring the operator's safety while improving working efficiency.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to explain the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and should therefore not be regarded as limiting the scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a first schematic structural diagram of an artificial live working system according to an embodiment of the present application;
FIG. 2 is a second schematic structural diagram of an artificial live working system according to an embodiment of the present application;
FIG. 3 is a first schematic structural diagram of the intelligent safety helmet;
FIG. 4 is a second schematic structural diagram of the intelligent safety helmet;
FIG. 5 is a third schematic structural diagram of an artificial live working system according to an embodiment of the present application;
FIG. 6 is a flowchart of a job control method according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals: 100 - artificial live working system; 110 - intelligent safety helmet; 1101 - core processing unit; 1102 - voice control unit; 1103 - positioning unit; 1104 - broadcasting unit; 1105 - intelligent glasses; 120 - intranet server; 130 - intelligent working tool; 140 - intelligent safety belt; 150 - ground terminal; 160 - extranet server; 170 - integrated management platform; 700 - electronic device; 710 - processor; 720 - memory; 730 - bus.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. Based on the embodiments of the present application, every other embodiment that a person skilled in the art would obtain without making any inventive effort is within the scope of protection of the present application.
First, the application scenarios to which the present application is applicable will be described. The system and method provided by the present application can be applied in the technical field of intelligent control.
Research shows that, at the present stage, during power line work an operator must manually carry the appropriate working tool to the designated working position and operate it by hand close to the work site, while constantly adjusting the working mode to the conditions on site. This increases the difficulty of the work and greatly affects the operator's efficiency.
Based on this, the embodiment of the application provides an artificial live working system, which improves the operator's working efficiency while ensuring the operator's safety.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an artificial live working system 100 according to an embodiment of the present application, and as shown in fig. 1, the artificial live working system 100 according to an embodiment of the present application includes an intelligent helmet 110, an intranet server 120, and an intelligent working tool 130; the intelligent safety helmet 110 and the intelligent working tool 130 are respectively connected to the intranet server 120 in a communication manner.
The intelligent working tool 130 acquires a working video and sends it to the intelligent safety helmet 110. After receiving the working video sent by the hand-held intelligent working tool 130, the intelligent safety helmet 110 determines the working position of the current operation and the equipment to be worked on from the working video, generates a working instruction and sends it to the intranet server 120. The intranet server 120 determines the working tool identifier and the working action to be executed from the working instruction, generates a corresponding working control instruction and sends it to the intelligent working tool 130. Upon receiving the working control instruction, the intelligent working tool 130 controls the corresponding working unit to reach the designated working position and complete the corresponding working task.
Specifically, the intelligent safety helmet 110 is configured to receive a working video sent by the intelligent working tool held by the operator, determine the working position of the current operation and the equipment to be worked on based on the working video, and send a working instruction to the intranet server based on the equipment to be worked on.
Here, since the operation scene corresponding to the embodiment of the present application is a live operation scene, the smart helmet 110 is made of a high voltage resistant insulating material, so as to ensure the operation safety of operators under the condition of live operation; the internal circuit board of the intelligent safety helmet 110 is wrapped by an insulating shielding device, so as to prevent electromagnetic interference and ensure normal communication between all devices.
Here, the operator may directly issue an instruction to the intelligent helmet 110 through voice to acquire a current operation video through the intelligent helmet 110.
Here, when the operator gives an instruction to acquire a work video to the smart helmet 110 by voice, a specific position of the acquired image may be specified, for example, a video image of the current work position of the smart work implement 130 or the like is acquired.
Here, a button may be provided at a specific position of the smart helmet 110, and an operator may notify the smart helmet 110 to acquire a video by pressing the button.
Here, the operator can also send a voice command to the intelligent safety helmet 110 directly through voice, and the voice module in the intelligent safety helmet 110 receives the voice command for the operator to acquire the video.
The voice command of the operator may be "acquire the work video of XX position", etc.
Here, the intelligent working tool 130 may be placed at a corresponding working position, and after determining that the intelligent working tool 130 is in place, the operator may directly issue a working video acquisition instruction through voice, so as to acquire the working video of the current working position.
The operation position of the current operation and the equipment to be operated are determined according to the operation video, and the target operation tool for operation can be determined according to the operation position and the equipment to be operated, so that the target operation tool is controlled to complete the corresponding operation task at the corresponding position.
Here, when generating the video viewing information, the intelligent safety helmet 110 needs to add the identification information of the video object to be viewed to the job instruction and send the identification information to the intranet server 120 together, so as to inform the intranet server 120 of the specific video object to be viewed by requesting the video, so that the intranet server 120 provides the video of the object to be viewed in a targeted manner.
The identification information of the user may be the current position of the operator acquired by the intelligent safety helmet 110, or the unique identification information corresponding to the intelligent safety helmet 110 in the intranet server 120, etc.
Here, the intelligent safety helmet 110 may also implement voice intercom between operators, and may also receive a job execution result fed back by the intelligent working tool 130, and broadcast the job result of the intelligent working tool 130.
Here, the intelligent safety helmet 110 can also determine the operator's working height and working position through its positioning function; when the operator enters the dangerous live working range, a voice broadcast warns the operator that the dangerous working area is being approached, and alarm information is sent to the integrated management platform at the same time.
Further, the intranet server 120 is configured to determine the working tool identifier and the working action to be executed by the working tool from the received working instruction, generate a working control instruction based on the working position, the working tool identifier and the working action to be executed, and send the working control instruction to the intelligent working tool 130.
Here, the intranet server 120 is provided at the operation site of the live working, and can acquire the requirement information and the operation information of each worker through the smart helmets 110 worn by each worker at the operation site.
Communication between each intelligent safety helmet 110 and the intranet server 120 is wireless and is carried out over a WIFI connection provided by the Bluetooth module arranged in each intelligent safety helmet 110.
After acquiring the working instruction, the intranet server 120 determines the current working tool identifier and the working action to be executed according to the content of the working instruction, and determines the working tool identifier required for the work according to the information about the equipment to be worked on carried in the working instruction, thereby generating the corresponding working control instruction.
The working control instruction includes the working tool identifier and the working action to be executed, and controls the working unit in the intelligent working tool 130 corresponding to the working tool identifier to complete the corresponding working task in the corresponding working mode.
Here, after receiving the operation instruction sent by the smart helmet 110, the intranet server 120 stores the information sent by the smart helmet 110 in a fixed location in the form of a file.
Further, the intelligent working tool 130 is configured to control, based on the working tool identifier indicated in the working control instruction, the corresponding working unit to reach the designated working position and complete the working task; the intelligent working tool 130 is connected with the insulating rod, and the operator controls the intelligent working tool through the insulating rod to reach the corresponding working position.
Here, the circuit board inside the intelligent working tool 130 is also wrapped by an insulating shielding device, so as to prevent electromagnetic interference and ensure normal communication between the devices.
Here, the intelligent working tool 130 is connected to and combined with the insulating rod, and the operator controls the target working tool to perform corresponding work at a remote place by holding the insulating rod by hand, so that the safety of the operator is ensured.
Here, the working tool may be an intelligent wire stripping tool, an intelligent wire connecting tool, or the like. An imaging device (such as a camera) is installed in the direction in which the working tool approaches the working position, and a working video at the current working position can be acquired according to an instruction of the smart helmet 110.
In this embodiment, the working video is transmitted in real time, that is, the intelligent working tool 130 transmits the working video to the intranet server 120 in real time after acquiring it. In other embodiments, the intelligent working tool 130 may collect and store working video during the work at a preset time interval and, after receiving a video acquisition instruction from the intranet server 120, feed the stored video images back to the intranet server 120 according to the working area and/or working time indicated in the video acquisition instruction.
Here, when it is determined that the intelligent working tool 130 has reached the designated position and the operator has finished docking the target working tool with the insulating rod, the intelligent working tool 130 is notified to perform the corresponding work through the voice broadcast function of the intelligent safety helmet 110.
For example, the current working content is a nut screwing process, and the target working tool is an intelligent nut installing tool, and at this time, the voice broadcast content of the intelligent safety helmet 110 may be "start to screw the nut" so as to inform the intelligent nut installing tool to start working.
Further, referring to fig. 2, fig. 2 is a second schematic structural diagram of the artificial live working system 100 according to an embodiment of the present application. As shown in fig. 2, the artificial live working system 100 further includes intelligent glasses 1105, which are embedded in the intelligent safety helmet 110, and an intelligent safety belt 140.
Here, the internal circuit board of the smart glasses 1105 is also wrapped by an insulating shielding device, so as to prevent electromagnetic interference and ensure normal communication between the devices.
Here, after receiving the video image sent by the intranet server 120, the intelligent safety helmet 110 displays the video image on the intelligent glasses 1105, and displays the operation video image to the operator in real time through the intelligent glasses 1105, so that the operator can know the operation condition of the current operation area of the intelligent operation tool 130.
Here, the smart glasses 1105 are embedded in the smart helmet 110 and are retractable, and after the smart helmet 110 receives the video image sent by the intranet server 120, the smart glasses 1105 extend out of the smart helmet 110 to display the video image to the operator.
The smart belt 140 is configured to: pressure information at the hooks is collected and transmitted to the intelligent safety helmet 110.
Here, the internal circuit board of the intelligent safety belt 140 is also wrapped by an insulating shielding device, so as to prevent electromagnetic interference and ensure normal communication between the devices.
Here, the intelligent safety belt 140 collects pressure sensor state information of the hook, and determines a pressure value of the intelligent safety belt hook, so as to analyze whether the intelligent safety belt 140 of the operator is worn in place according to the pressure value.
Here, the intelligent safety belt 140 connects to the Bluetooth module built into the intelligent safety helmet 110 through its own built-in Bluetooth module over WIFI wireless communication and transmits data to it.
Further, the smart helmet 110 is further configured to:
based on the received in-place information sent by the intelligent glasses 1105, the wearing information of the intelligent safety helmet 110 and the pressure information of the hook part sent by the intelligent safety belt 140, the wearing states of the intelligent glasses 1105, the intelligent safety helmet 110 and the intelligent safety belt 140 are determined, and when the wearing states of the intelligent glasses 1105, the intelligent safety helmet 110 and the intelligent safety belt 140 are determined to be normal wearing states, the operator is prompted to normally operate.
Here, the intelligent safety helmet 110 forwards the received in-place information sent by the intelligent glasses 1105, the wearing information of the intelligent safety helmet 110 and the pressure information at the intelligent safety belt 140 to the intranet server 120 or the extranet server, and forwards the information to the integrated management platform, and the two servers analyze whether the intelligent glasses 1105, the intelligent safety helmet 110 and the intelligent safety belt 140 of the operator are worn in place according to the pressure information, if the intelligent glasses 1105, the intelligent safety helmet 110 and the intelligent safety belt 140 of the operator are worn in place, the information that the wearing states of the intelligent glasses 1105, the intelligent safety helmet 110 and the intelligent safety belt 140 are normal is sent to the intelligent safety helmet 110, and the intelligent safety helmet 110 prompts the operator to normally operate.
Here, the smart helmet 110 may prompt the operator to normally operate through voice.
Further, referring to fig. 3, fig. 3 is a schematic structural diagram of the smart helmet 110, and as shown in fig. 3, the smart helmet 110 includes a core processing unit 1101, a voice control unit 1102, and a positioning unit 1103; the voice control unit 1102 and the positioning unit 1103 are respectively connected to the core processing unit 1101 in a communication manner.
The voice control unit 1102 is configured to receive a voice command of the operator, convert the voice command into voice control information, and send the voice control information to the core processing unit 1101;
the positioning unit 1103 is configured to collect location information of the operator and send the location information to the core processing unit 1101;
the core processing unit 1101 is configured to generate the video viewing information based on the received voice control information and the location information.
Here, the voice control unit 1102 may receive voice commands of operators wearing the smart helmet 110, and also receive voice commands of other operators, and perform operation interaction between operators; the voice command sent by the integrated management platform can be received, so that the operation can be performed according to the command of the auxiliary personnel at the integrated management platform.
Here, after determining the position information of the worker wearing the smart helmet 110, the positioning unit 1103 may directly send the position information to the intranet server 120, so that the intranet server 120 positions the worker wearing the smart helmet 110 and issues a subsequent instruction.
Here, the smart helmet 110 may be further configured with a camera unit to collect operation environment information of an operator wearing the smart helmet so as to facilitate subsequent operation assistance and guidance of the operator.
Further, referring to fig. 4, fig. 4 is a second schematic structural diagram of the smart helmet 110, as shown in fig. 4, the smart helmet 110 further includes a broadcasting unit 1104, and the broadcasting unit 1104 is configured to: when it is determined that there is an abnormal wearing state in the wearing state of the smart harness 140, the wearing state of the smart helmet 110, or the wearing state of the smart glasses 1105, an abnormal alarm is issued to prompt the worker.
Here, the broadcasting unit 1104 in the smart helmet 110 may send out a voice prompt to prompt the operator to wear the working tool correctly when it is determined that the smart belt 140, the smart glasses 1105, or the smart helmet 110 itself is not worn correctly.
The voice content of the prompt may indicate which operator's protective equipment is worn incorrectly and how it should be worn correctly.
For example, the voice message may be something like "Operator A's safety belt is not worn in place, please check again".
Further, the smart helmet 110 is further configured to: based on the marking request instruction sent by the operator, marking request information is generated, and the received marking position information generated based on the marking request information is sent to the intranet server 120.
Here, the operator may request work guidance from the integrated management platform through the voice control unit 1102 in the intelligent safety helmet 110, asking the platform to indicate the next working position, i.e. sending marking request information. The integrated management platform generates marking position information according to the request and sends it to the intelligent safety helmet 110, and the intelligent safety helmet 110 sends the marking position information to the intranet server 120 so that the intranet server 120 marks the indicated position.
Here, the marked position may be the working position of the next operation, or the position of a working error by the operator that prevents further operations from continuing.
Further, the intranet server 120 is further configured to: mark the corresponding position of the working video according to the marking position indicated by the marking position information, generate a marked video, and forward the marked video to the intelligent glasses 1105 of the intelligent safety helmet 110 for display.
Here, after receiving the marking position information, the intranet server 120 marks at the job video acquired from the intelligent work implement 130 according to the marking position information to prompt the worker of the progress of the job.
Here, the mark may be an augmented reality (AR) marker; in the working video it may be displayed as a square, a circle or the like, or highlighted in a different color.
Further, the intelligent working tool 130 is further configured to: determine the target working tool for the work based on the received working tool identifier, move the target working tool to the designated working position, and complete the working task according to the working control instruction.
Here, a target working tool to be worked is determined according to the corresponding working tool identifier, a worker holds the corresponding target working tool by holding the insulating rod, moves the target working tool to a designated working position, and completes a working task according to a step of a working control instruction.
According to the artificial live working system provided by the embodiment of the application, the intelligent working tool acquires a working video and sends it to the intelligent safety helmet; after receiving the working video sent by the hand-held intelligent working tool, the intelligent safety helmet determines the working position of the current operation and the equipment to be worked on from the working video, generates a working instruction and sends it to the intranet server; the intranet server determines the working tool identifier and the working action to be executed from the working instruction, generates a corresponding working control instruction and sends it to the intelligent working tool; after receiving the working control instruction, the intelligent working tool controls the corresponding working unit to reach the designated working position and complete the corresponding working task. The distance between the operator and the equipment to be worked on is thereby increased and the time the operator spends temporarily adjusting the working mode to the on-site situation is reduced, ensuring the operator's safety while improving working efficiency.
Referring to fig. 5, fig. 5 is a third schematic structural diagram of the artificial live working system provided in the embodiment of the present application. As shown in fig. 5, the artificial live working system 100 further includes a ground terminal 150, an extranet server 160 and an integrated management platform 170;
specifically, the ground terminal 150 is configured to obtain, at a preset time node, a feedback operation condition of the intranet server 120 in the operation area, display the feedback operation condition, and upload, in a preset time period, the operation condition in the operation area to the extranet server 160;
here, in the manual live working system 100, a plurality of working areas may be monitored at the same time, a corresponding ground terminal 150 is disposed on a working site of each working area, the ground terminal 150 in each working area may be in communication connection with an intelligent safety helmet 110 worn by a worker in the working area, collect the working conditions fed back by each intranet server 120, and periodically upload the collected working conditions in the working area to the extranet server 160.
The division for each working area may be performed according to the communication capability of each ground terminal 150 and the allocation situation of specific working personnel in the working area, so the number of working personnel included in each working area may be different.
Here, the fed back operation condition may include a current operation position, a current operation progress, a remaining power of the intelligent work implement 130, and the like.
Here, after receiving each working condition, the ground terminal 150 may display each working condition, so that a field manager may monitor the working condition in the working area in real time.
The ground terminal 150 may be a computer, a tablet computer, or the like.
Here, the ground terminal 150 in each operation area may upload the operation condition in the present operation area in a preset period of time, or may upload the operation condition in real time, that is, after receiving the operation condition sent by the intranet server 120, upload the operation condition in real time.
Further, the ground terminal 150 may collect the location information of each intranet server 120, and upload the location information of each intranet server to the extranet server 160 in a preset period of time.
Further, the extranet server 160 is configured to forward the job status received in the affiliated job area uploaded by each ground terminal 150 to the integrated management platform 170.
Here, the extranet server 160, upon receiving the job situation in the affiliated job area uploaded by each of the ground terminals 150, forwards the job situation uploaded by each of the ground terminals 150 to the integrated management platform 170 according to a different job area.
Further, after receiving the location information of each intranet server 120 uploaded by the ground terminal 150, the corresponding extranet server 160 forwards the location information of each intranet server 120 to the integrated management platform 170.
Similarly, the external network server 160 may upload the received operation conditions sent by the plurality of ground terminals 150 in a preset period of time, or may upload the received operation conditions in real time, that is, after receiving the operation conditions sent by the ground terminals 150, forward the operation conditions uploaded by the ground terminals 150 in real time.
Further, the integrated management platform 170 is configured to display the received job situation in each job area according to the job area, so that a manager monitors the job according to the displayed job situation.
Here, after receiving the job situation uploaded by the ground terminal 150 in each job area, the integrated management platform 170 visually displays the received job situation according to the job area, so that the manager monitors the job according to the displayed job situation.
Further, the integrated management platform 170 displays the real-time position of each operator in real time after receiving the position information of each intranet server 120 sent by the extranet server 160.
Here, the integrated management platform 170 may display the job status of all the workers within the monitored entire job range, thereby monitoring the entire job system.
Here, the intranet server 120 in each operation area is also in communication connection with the integrated management platform 170, and when an abnormal situation occurs in the operation process, the abnormal situation can be directly uploaded to the integrated management platform 170, and the integrated management platform 170 performs early warning of the operation abnormality.
The intranet server 120 and the integrated management platform 170 communicate through a 4G network.
The early warning may also be issued by voice: the integrated management platform 170 can interact with the voice control unit 1102 in the intelligent safety helmet 110, and the broadcasting unit 1104 of the intelligent safety helmet 110 plays the early warning information to the operator wearing it.
Further, the integrated management platform 170 is further configured to: determine the working instruction requested in the marking request information based on the received marking request information, determine the working tool identifier and the working action to be executed by the working tool, generate marking position information based on the working tool identifier and the working action to be executed, and send the marking position information to the extranet server so that the extranet server forwards the marking position information to the corresponding intranet server 120.
Here, after receiving the marking request information forwarded by the extranet server 160, the integrated management platform 170 determines a next operation position of the corresponding operator according to the operation progress and the operation requirement of the operator, and sends the marking position information to the intelligent safety helmet 110 of the corresponding operator to guide the operation.
Here, the auxiliary personnel at the integrated management platform 170 may also perform voice interaction with the operators, and send the mark position information to the intelligent safety helmet 110 of the corresponding operators, and then, the operators are informed of the corresponding operation steps by matching with voice explanation.
Further, the integrated management platform 170 may monitor the operation condition of each operator through the received operation video, and when an operator with abnormal operation is found, the auxiliary personnel of the integrated management platform 170 may direct the operation of the corresponding operator through voice.
Further, the operators in the artificial live working system 100 may exchange information with one another. For example, operator A may send information to the other intelligent safety helmets 110 on the working site through the voice control unit 1102 of the intelligent safety helmet 110 worn by operator A, and the other operators receive operator A's information through the broadcasting unit 1104 in their own intelligent safety helmets 110.
According to the artificial live working system, the integrated management platform monitors the wearing state of every live working device in real time, stores the received information and guides the relevant operators according to each received request, providing the operators with live working guidance that is rich in function and convenient to use.
Referring to fig. 6, fig. 6 is a flowchart of a job control method according to an embodiment of the present application. As shown in fig. 6, the job control method provided in the embodiment of the present application includes:
s601, controlling the intelligent safety helmet to receive a working video sent by the intelligent working tool held by an operator, and determining the working position of the current working and equipment to be worked based on the working video; and sending a working instruction to the intranet server based on the equipment to be worked.
The operation position of the current operation and the equipment to be operated are determined according to the operation video, and the target operation tool for operation can be determined according to the operation position and the equipment to be operated, so that the target operation tool is controlled to complete the corresponding operation task at the corresponding position.
S602, controlling the intranet server to determine a working tool identifier and the working action to be executed by the working tool from the received working instruction, generating a working control instruction based on the working tool identifier and the working action to be executed, and sending the working control instruction to the intelligent working tool.
After acquiring the working instruction, the intranet server determines the working tool identifier and the working action to be executed according to the content of the working instruction, and determines the working tool identifier required for the work according to the information about the equipment to be worked on carried in the working instruction, thereby generating the corresponding working control instruction.
S603, controlling the intelligent working tool to drive the corresponding working unit to the designated working position to complete the working task based on the working tool identifier indicated in the working control instruction; the intelligent working tool is connected with the insulating rod, and the operator controls the intelligent working tool to reach the corresponding working position through the insulating rod.
Here, the intelligent working tool is formed by connecting the target working tool with the insulating rod, and the operator controls the target working tool from a distance by holding the insulating rod, so that the operator's safety is ensured.
When the intelligent working tool has reached the designated position and the operator has finished docking the target working tool with the insulating rod, the intelligent working tool is notified to perform the corresponding work through the voice broadcast function of the intelligent safety helmet.
Further, before step S601, the job control method further includes: controlling the intelligent safety helmet to determine the wearing states of the intelligent glasses, the intelligent safety helmet and the intelligent safety belt based on the received in-place information sent by the intelligent glasses, the wearing information of the intelligent safety helmet and the pressure information at the hook sent by the intelligent safety belt, and prompting the operator that normal work may proceed when all three wearing states are determined to be normal.
Further, step S601 includes: controlling the voice control unit to receive the operator's voice instruction, convert it into voice control information and send the voice control information to the core processing unit; controlling the positioning unit to collect the operator's position information and send it to the core processing unit; and controlling the core processing unit to generate the job control instruction based on the received voice control information and position information and send it to the intranet server.
Further, after step S603, the job control method further includes: when the wearing state of the intelligent safety belt, the wearing state of the intelligent safety helmet or the wearing state of the intelligent glasses is determined to have an abnormal wearing state, an abnormal alarm is sent to prompt the operator.
Further, after step S603, the job control method further includes: generating marking request information based on a marking request instruction sent by the operator, and sending the received marking position information generated based on the marking request information to the intranet server.
Further, step S603 includes: determining the working unit to perform the work based on the received working tool identifier, moving the intelligent working tool to the designated working position, and controlling the working unit in the intelligent working tool to complete the working task according to the working control instruction.
According to the job control method, the intelligent working tool is controlled to collect the working video of the working position; the intelligent safety helmet is controlled to determine, from the received working video, the working tool identifier and the working action to be executed and to generate the corresponding working instruction, which is forwarded to the intelligent working tool through the intranet server; the intelligent working tool is then controlled to select the corresponding working unit according to the instruction and drive it to the designated working position to complete the corresponding working task. The distance between the operator and the equipment to be worked on is thereby increased and the time the operator spends temporarily adjusting the working mode to the on-site situation is reduced, ensuring the operator's safety while improving working efficiency.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device 700 includes a processor 710, a memory 720, and a bus 730.
The memory 720 stores machine-readable instructions executable by the processor 710, when the electronic device 700 is running, the processor 710 communicates with the memory 720 through the bus 730, and when the machine-readable instructions are executed by the processor 710, the steps of the job control method in the method embodiment shown in fig. 6 can be executed, and the specific implementation can be referred to the method embodiment and will not be described herein.
The embodiment of the present application further provides a computer readable storage medium, where a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the steps of the job control method in the method embodiment shown in fig. 6 may be executed, and a specific implementation manner may refer to the method embodiment and will not be repeated herein.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative: for example, the division into units is merely a division by logical function, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present application, intended to illustrate rather than limit its technical solution, and the protection scope of the present application is not limited thereto. Although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently substituted, within the technical scope disclosed in the present application; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. An artificial live working system based on intelligent wearable equipment, characterized by comprising an intelligent safety helmet, an intelligent working tool and an intranet server, wherein the intelligent safety helmet and the intelligent working tool are each in communication connection with the intranet server;
the intelligent safety helmet is used for receiving a working video sent by the intelligent working tool held by an operator, the working video being obtained based on a voice instruction received by the intelligent safety helmet; determining the working position of the current operation and the equipment to be worked on based on the working video; and transmitting a working instruction to the intranet server based on the equipment to be worked on; the intelligent safety helmet is also provided with intelligent glasses, which are used for displaying the marked video fed back by the intranet server according to the marking position information sent by the intelligent safety helmet, so as to guide the operator in the operation; the intelligent glasses are embedded in the intelligent safety helmet and extend out to display the video after the intelligent safety helmet receives the marked video;
the intelligent safety helmet is also used for: determining the working height and the working position of the operator through a positioning function, and reminding the operator of approaching a dangerous working area through voice broadcasting when the operator enters a dangerous live working range;
The intelligent safety helmet is also used for:
generating marking request information based on a marking request instruction sent by the operator, and sending to the intranet server the received marking position information generated based on the marking request information;
the intranet server is used for determining, from the received working instruction, a working tool identifier and a working action to be executed by the working tool, generating a working control instruction based on the working tool identifier and the working action to be executed by the working tool, and sending the working control instruction to the intelligent working tool;
the intelligent working tool is used for controlling the corresponding working unit to reach a designated working position to complete a working task based on the working tool identifier indicated in the working control instruction; the intelligent working tool is connected to an insulating rod, and the operator controls the intelligent working tool to reach the corresponding working position through the insulating rod.
2. The artificial live working system of claim 1, further comprising intelligent glasses and an intelligent safety belt, the intelligent glasses being disposed on the intelligent safety helmet, and the intelligent safety helmet being further configured to:
determine the wearing states of the intelligent glasses, the intelligent safety helmet and the intelligent safety belt based on received in-place information sent by the intelligent glasses, wearing information of the intelligent safety helmet, and pressure information of a hook part sent by the intelligent safety belt, and, when the wearing states of the intelligent glasses, the intelligent safety helmet and the intelligent safety belt are all determined to be normal, prompt the operator that normal operation may proceed.
3. The artificial live working system of claim 1, wherein the intelligent safety helmet comprises a core processing unit, a voice control unit, and a positioning unit; the voice control unit and the positioning unit are each in communication connection with the core processing unit;
the voice control unit is used for receiving the voice instruction of the operator, converting the voice instruction into voice control information and sending the voice control information to the core processing unit;
the positioning unit is used for collecting the position information of the operator and sending the position information to the core processing unit;
the core processing unit is used for generating the job control instruction based on the received voice control information and the received position information and sending the job control instruction to the intranet server.
4. The artificial live working system of claim 2, wherein the intelligent safety helmet further comprises a broadcasting unit for:
issuing an abnormal-state alarm to prompt the operator when any of the wearing states of the intelligent safety belt, the intelligent safety helmet and the intelligent glasses is determined to be abnormal.
5. The artificial live working system of claim 1, wherein the intranet server is further configured to:
and marking the corresponding position of the operation video based on the marking position indicated by the marking position information, generating a marking video, and forwarding the marking video to the intelligent glasses of the intelligent safety helmet for display.
6. The artificial live working system of claim 1, wherein the intelligent working tool is further configured to:
and determining a working unit for working based on the received working tool identifier, moving the intelligent working tool to a designated working position, and controlling the working unit in the intelligent working tool to complete a working task according to the working control instruction.
7. The artificial live working system of claim 1, further comprising a ground terminal, an extranet server, and a comprehensive management platform;
the ground terminal is used for acquiring, at a preset time node, the operation conditions fed back by the intranet server in the operation area to which the ground terminal belongs, displaying the fed-back operation conditions, and uploading the operation conditions in the operation area to the extranet server within a preset time period;
the extranet server is used for forwarding the operation conditions in the respective operation areas uploaded by the ground terminals to the comprehensive management platform;
the comprehensive management platform is used for displaying the received operation conditions in each operation area according to the operation areas so that management staff can monitor the operation according to the displayed operation conditions.
8. The artificial live working system of claim 7, wherein the comprehensive management platform is further configured to:
and determining a job instruction requested in the mark request information based on the received mark request information, determining a job appliance identifier and a job action to be executed of the job appliance, generating mark position information based on the job appliance identifier and the job action to be executed of the job appliance, and sending the mark position information to the external network server so that the external network server forwards the mark position information to a corresponding internal network server.
9. A job control method, applied to the artificial live working system according to any one of claims 1 to 7, the method comprising:
controlling the intelligent safety helmet to receive a working video sent by the intelligent working tool held by an operator, the working video being obtained based on a voice instruction received by the intelligent safety helmet, to determine the working position of the current operation and the equipment to be worked on based on the working video, and to transmit a working instruction to the intranet server based on the equipment to be worked on;
controlling the intranet server to determine, from the received working instruction, a working tool identifier and a working action to be executed by the working tool, to generate a working control instruction based on the working tool identifier and the working action to be executed by the working tool, and to send the working control instruction to the intelligent working tool;
controlling the intelligent working tool to control the corresponding working unit to reach a designated working position to complete a working task based on the working tool identifier indicated in the working control instruction; wherein the intelligent working tool is connected to an insulating rod, and the operator controls the intelligent working tool to reach the corresponding working position through the insulating rod;
generating marking request information based on a marking request instruction sent by the operator, sending to the intranet server the received marking position information generated based on the marking request information, and guiding the operator in the operation according to the marked video fed back; the marked video is displayed through the intelligent glasses embedded in the intelligent safety helmet;
the job control method further includes:
controlling the intelligent safety helmet to determine the working height and the working position of the operator through a positioning function, and, when the operator enters a dangerous live working range, reminding the operator through voice broadcasting that a dangerous working area is being approached;
generating marking request information based on a marking request instruction sent by the operator, and sending to the intranet server the received marking position information generated based on the marking request information.
CN202110430397.8A 2021-04-21 2021-04-21 Artificial live working system and method based on intelligent wearable equipment Active CN112954280B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110430397.8A CN112954280B (en) 2021-04-21 2021-04-21 Artificial live working system and method based on intelligent wearable equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110430397.8A CN112954280B (en) 2021-04-21 2021-04-21 Artificial live working system and method based on intelligent wearable equipment

Publications (2)

Publication Number Publication Date
CN112954280A CN112954280A (en) 2021-06-11
CN112954280B true CN112954280B (en) 2023-08-04

Family

ID=76233220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110430397.8A Active CN112954280B (en) 2021-04-21 2021-04-21 Artificial live working system and method based on intelligent wearable equipment

Country Status (1)

Country Link
CN (1) CN112954280B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114209118A (en) * 2021-12-29 2022-03-22 国网瑞嘉(天津)智能机器人有限公司 High-altitude operation intelligent early warning method and device and intelligent safety helmet
CN115047814A (en) * 2022-06-13 2022-09-13 国网山东省电力公司胶州市供电公司 Pole climbing operation safety supervision system and method based on 5G
CN115198818B (en) * 2022-08-31 2023-12-26 上海三一重机股份有限公司 Work machine control method, device, equipment, medium and work machine

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010166665A (en) * 2009-01-14 2010-07-29 The Chugoku Electric Power Co Inc Slip-off state determining system
CN201656302U (en) * 2010-02-05 2010-11-24 厦门立林高压电气有限公司 On-line maintenance device of high-voltage equipment
WO2015123810A1 (en) * 2014-02-18 2015-08-27 广东电网公司江门供电局 Electric power field operation system combined with smartphone
CN108170273A (en) * 2017-12-28 2018-06-15 南京华讯方舟通信设备有限公司 A kind of expert's remote guide system and implementation method based on hololens glasses
CN108839037A (en) * 2018-07-11 2018-11-20 天津滨电电力工程有限公司 A kind of distribution network live line connects drainage thread robot system
CN108858121A (en) * 2018-07-27 2018-11-23 国网江苏省电力有限公司徐州供电分公司 Hot line robot and its control method
CN109318204A (en) * 2018-10-24 2019-02-12 国网江苏省电力有限公司徐州供电分公司 A kind of livewire work tow-armed robot intelligence control system
CN109514520A (en) * 2018-11-28 2019-03-26 广东电网有限责任公司 A kind of high-voltage hot-line work principal and subordinate robot apparatus for work and method
CN109782703A (en) * 2018-12-13 2019-05-21 湖南铁路科技职业技术学院 A kind of railway standardization operation control system based on intelligent glasses
CN109938442A (en) * 2018-12-07 2019-06-28 云南电网有限责任公司保山供电局 A kind of Intelligent safety helmet and householder method for electric operating
CN110492382A (en) * 2019-07-25 2019-11-22 安徽送变电工程有限公司 A kind of livewire work screening clothing Multidimensional Awareness information integrated system
CN111232094A (en) * 2020-03-18 2020-06-05 三门峡珑启物联网科技有限公司 Power transformation operation and maintenance vehicle
CN210779805U (en) * 2019-08-02 2020-06-16 国网浙江省电力有限公司金华供电公司 Safety monitoring device for live working process
CN111412952A (en) * 2020-04-28 2020-07-14 中国东方电气集团有限公司 Industrial environment wearable equipment
CN111645077A (en) * 2020-06-19 2020-09-11 国电南瑞科技股份有限公司 Ground monitoring system and monitoring method for distribution network line live working robot
CN111905297A (en) * 2020-06-23 2020-11-10 武汉瑞莱保能源技术有限公司 Safety management system and method for high-altitude operation
CN112388630A (en) * 2020-10-12 2021-02-23 北京国电富通科技发展有限责任公司 Distribution network live working wire stripping robot based on binocular vision and working method thereof
CN112565701A (en) * 2020-12-10 2021-03-26 国家电网有限公司 System and method for intelligent safety management and control of transformer substation based on VR glasses
CN112633238A (en) * 2020-12-31 2021-04-09 上海蓬渤机电设备有限公司 Electric welding construction detection method based on deep learning image processing

Also Published As

Publication number Publication date
CN112954280A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN112954280B (en) Artificial live working system and method based on intelligent wearable equipment
US8032253B2 (en) Automatic machine system and wireless communication method thereof
CN110212451B (en) Electric power AR intelligence inspection device
US9946342B2 (en) Programmable display device, portable terminal device, data processing method, and program
KR101431424B1 (en) Plant system for supporting operation/maintenance using smart helmet capable of bi-directional communication and method thereof
KR101339928B1 (en) Integrated wireless ship management system based on local information of worker
KR102041784B1 (en) Safety mornitoring system combining ioe and drone
CN108700912B (en) Method and system for operating a device through augmented reality
DE102018220039B4 (en) CONTROL DEVICE, ELECTRONIC DEVICE AND CONTROL SYSTEM
US10203686B2 (en) Operation management system for directly displaying work instruction based on operation management information on machine tool
CN209787471U (en) Power plant operating personnel positioning system
CN110883775A (en) Man-machine interaction system and man-machine cooperation system of single-arm live working robot
JP2019079144A (en) Work support system, imaging device, wearable device, and work support method
US20210076757A1 (en) Garment and alert system
CN109450090B (en) Lack department operation safety protection system
KR101377706B1 (en) Local information based wireless ship management system
CN113708275A (en) Transformer substation inspection system and method
JP6935729B2 (en) Power outage information provision system
CN111641263A (en) Secondary equipment intelligent operation and maintenance system and method based on three-dimensional navigation
KR101218404B1 (en) Self-adjustment device of monitor camera using for industry
TWM596938U (en) Operation Management System
JP2006331196A (en) Operation confirmation system in distributed control system
CN211628275U (en) Job management system
EP3330850A1 (en) Apparatus for compiling script
CN113285998B (en) Driver management terminal and unmanned aerial vehicle supervisory systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant