CN112527100A - Remote assistance method and device based on intelligent wearable equipment - Google Patents
- Publication number
- CN112527100A (application CN202011238081.0A)
- Authority
- CN
- China
- Legal status: Pending (status assumed, not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/06—Energy or water supply
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Abstract
The invention discloses a remote assistance method and device based on an intelligent wearable device. The method comprises the following steps: performing three-dimensional perception through the intelligent wearable device to determine a current position; generating display information based on the current position and a real-time image; generating interaction information based on a user's operations; and performing remote assistance based on the interaction information and the display information. The remote assistance method and device based on the intelligent wearable device can provide remote assistance to field operators efficiently, quickly and accurately, improve field operation efficiency, strengthen on-site visual management, and promote field work safety.
Description
Technical Field
The disclosure relates to the field of computer information processing, in particular to a remote assistance method and device based on intelligent wearable equipment.
Background
With the continuing growth of the power grid and the introduction of new types of equipment into production, the installation, operation and maintenance of grid equipment face problems such as diverse data, complex procedures and uneven skill levels among personnel, and the traditional field operation management model can no longer meet the needs of grid development. In field installation work, new and large-scale equipment is continuously introduced, the execution of on-site installation tasks has become more complex, and the demands on the workflow, precision and thoroughness of installation steps have risen sharply, so that field installation efficiency is low and quality is difficult to guarantee. In field operation and maintenance work, operators find it difficult to quickly and comprehensively grasp equipment information and the operating state of equipment on site.
Meanwhile, the standard operation guide card is not seamlessly connected with actual operations, and this way of working needs improvement. Although mobile internet technology now enables real-time collection, synchronization and analysis of field information through smart devices such as mobile phones or tablet computers during field operations, using these devices occupies both hands and obstructs the operator's line of sight.
Therefore, a new remote assistance method and apparatus based on intelligent wearable devices are needed.
Disclosure of Invention
In view of this, the present disclosure provides a remote assistance method and apparatus based on an intelligent wearable device, which can provide remote assistance to field operators efficiently, quickly and accurately, improve field operation efficiency, strengthen on-site visual management, and promote field work safety.
According to an aspect of the disclosure, a remote assistance method based on an intelligent wearable device is provided, which includes: performing three-dimensional perception through the intelligent wearable device to determine a current position; generating display information based on the current position and a real-time image; generating interaction information based on a user's operations; and performing remote assistance based on the interaction information and the display information.
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring remote assistance information; and displaying the real-time image, the display information and/or the remote assistance information through the intelligent wearable device.
In an exemplary embodiment of the present disclosure, displaying the real-time image, the display information and/or the remote assistance information through the intelligent wearable device includes: generating a first layer image through the real-time image; generating a second layer image through the display information; and/or generating a third layer image through the remote assistance information; and superimposing and fusing the first layer image, the second layer image and the third layer image and displaying the result through holographic waveguide head-mounted display technology.
In an exemplary embodiment of the present disclosure, performing three-dimensional sensing through the intelligent wearable device to determine the current position includes: performing three-dimensional sensing through a structured-light depth sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through a visible-light sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through an inertial sensor of the intelligent wearable device to determine the current position.
In an exemplary embodiment of the present disclosure, the three-dimensional sensing by the smart wearable device to determine the current location includes: establishing a three-dimensional scene model of the transformer substation; associating the three-dimensional scene model with a coordinate system of the intelligent wearable device; and displaying the current position of the intelligent wearable device in the three-dimensional scene.
In an exemplary embodiment of the present disclosure, associating the three-dimensional scene model with the coordinate system of the intelligent wearable device includes: associating the three-dimensional scene model with the coordinate system of the intelligent wearable device through augmented-reality markers and simultaneous localization and mapping (SLAM) technology.
In an exemplary embodiment of the present disclosure, generating display information based on the current position and the real-time image includes: extracting guidance information from a standard job database based on the current position; and extracting the display information from the guidance information through the real-time image.
In an exemplary embodiment of the present disclosure, generating interaction information based on the user's operations includes: generating the interaction information based on the user's real-time voice, operation gestures and head movements.
In an exemplary embodiment of the present disclosure, performing remote assistance based on the interaction information and the display information includes: sending the interaction information and the display information to a background server; and obtaining remote assistance information from the background server.
According to an aspect of the present disclosure, a remote assistance device based on an intelligent wearable device is provided, the device includes: the position module is used for carrying out three-dimensional perception through the intelligent wearable equipment so as to determine the current position; the display module is used for generating display information based on the current position and the real-time image; the interaction module is used for generating interaction information based on the operation of a user; and the assistance module is used for carrying out remote assistance based on the interaction information and the display information.
According to an aspect of the present disclosure, an electronic device is provided, the electronic device including: one or more processors; and storage means for storing one or more programs, which, when executed by the one or more processors, cause the one or more processors to implement the method described above.
According to an aspect of the disclosure, a computer-readable medium is proposed, on which a computer program is stored, the program, when executed by a processor, implementing the method described above.
According to the remote assistance method and device based on the intelligent wearable device of the present disclosure, the current position is determined through three-dimensional perception by the intelligent wearable device; display information is generated based on the current position and a real-time image; interaction information is generated based on the user's operations; and remote assistance is performed based on the interaction information and the display information. In this way, remote assistance can be provided to field operators efficiently, quickly and accurately, field operation efficiency is improved, on-site visual management is strengthened, and field work safety is promoted.
Drawings
Fig. 1 is a system block diagram illustrating a remote assistance method and apparatus based on an intelligent wearable device according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment.
Fig. 4 is a schematic diagram illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment.
Fig. 5 is a flowchart illustrating a remote assistance method based on a smart wearable device according to another exemplary embodiment.
Fig. 6 is a schematic diagram illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating a remote assistance apparatus based on an intelligent wearable device according to an exemplary embodiment.
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 9 is a block diagram illustrating a computer-readable medium in accordance with an example embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings.
Fig. 1 is a system block diagram illustrating a remote assistance method and apparatus based on an intelligent wearable device according to an exemplary embodiment.
As shown in fig. 1, the system architecture 100 may include smart wearable devices 101, 102, 103, a network 104 and a server 105. The network 104 is the medium used to provide communication links between the smart wearable devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
The user may use the smart wearable devices 101, 102, 103 to interact with the server 105 over the network 104 to receive or send messages, etc. Various communication client applications, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, social platform software, etc., may be installed on the smart wearable devices 101, 102, 103.
The smart wearable devices 101, 102, 103 may be various forms of wearable smart devices having a display screen and supporting data transmission.
The server 105 may be a server that provides various services, such as a background management server that provides support for video information transmitted by users of the smart wearable devices 101, 102, 103. The background management server may analyze the received video information and feed a processing result (a management and control policy) back to the smart wearable devices 101, 102 and 103.
The server 105 may also, for example, obtain remote assistance information, and display the real-time image, the display information and/or the remote assistance information through the smart wearable device.
The server 105 may be a single physical server or may be composed of multiple servers. It should be noted that the remote assistance method based on the intelligent wearable device provided by the embodiments of the present disclosure may be executed by the server 105, and accordingly, the remote assistance apparatus based on the intelligent wearable device may be disposed in the server 105.
The remote assistance method and device based on the intelligent wearable device can digitize the standard operation guide card on the wearable device and display its content through holographic virtual display. Sensors on the intelligent wearable device support on-site evidence collection, including audio recording, video recording and photographing, and provide personnel positioning, whose information can be displayed synchronously in a remote three-dimensional substation scene over the network. Equipment data is associated with the actual position of the equipment, so that field personnel can conveniently query and view it through gesture recognition.
Fig. 2 is a flowchart illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment. The remote assistance method 20 based on the smart wearable device at least includes steps S202 to S208.
As shown in fig. 2, in S202, three-dimensional sensing is performed by the intelligent wearable device to determine the current position. This may include: performing three-dimensional sensing through a structured-light depth sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through a visible-light sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through an inertial sensor of the intelligent wearable device to determine the current position.
Through the three-dimensional perception module, the intelligent wearable device can know its relative position in the real world (positioning) and the three-dimensional structure of the real world (map building). With this capability, the wearable device can know the position of the power grid equipment and, using preset or real-time equipment information together with augmented reality technology, display information on the corresponding equipment. Three-dimensional perception can be realized through computer vision or SLAM (Simultaneous Localization and Mapping) from robotics; more specifically, the accurate position of the device in three-dimensional space can be obtained by fusing multiple sensors (lidar, optical camera, depth camera and inertial sensor), while the surrounding three-dimensional space is reconstructed in real time.
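The multi-sensor fusion just described can be pictured with a minimal complementary-filter sketch in Python: inertial data is dead-reckoned between frames and periodically corrected by an absolute fix from the visual/SLAM pipeline. The class name, the 0.98 weighting and the update rates are illustrative assumptions, not details from the patent:

```python
import numpy as np

class PoseFusion:
    """Toy complementary filter: IMU dead reckoning corrected by SLAM fixes."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha            # trust placed in the inertial prediction (assumed)
        self.position = np.zeros(3)   # device position in the site frame, metres
        self.velocity = np.zeros(3)   # estimated velocity, m/s

    def predict(self, accel: np.ndarray, dt: float) -> None:
        """Dead-reckon from one inertial sample between visual updates."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def correct(self, visual_position: np.ndarray) -> None:
        """Blend in an absolute position fix from the visual/SLAM pipeline."""
        self.position = self.alpha * self.position + (1 - self.alpha) * visual_position

fusion = PoseFusion()
fusion.predict(accel=np.array([0.0, 0.1, 0.0]), dt=0.01)    # e.g. a 100 Hz IMU sample
fusion.correct(visual_position=np.array([12.3, 0.0, 4.5]))  # e.g. a 30 Hz camera fix
print(fusion.position)
```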
In one embodiment, further comprising: establishing a three-dimensional scene model of the transformer substation; associating the three-dimensional scene model with a coordinate system of the intelligent wearable device; and displaying the current position of the intelligent wearable device in the three-dimensional scene. The details will be described in detail in the embodiment corresponding to fig. 5.
In S204, display information is generated based on the current position and the real-time image. This may include: extracting guidance information from a standard job database based on the current position; and extracting the display information from the guidance information using the real-time image.
The wearable device can digitize the standard operation guide card and display its content through holographic virtual display. Sensors on the intelligent wearable device support on-site evidence collection, including audio recording, video recording and photographing, and provide personnel positioning, whose information can be displayed synchronously in a remote three-dimensional substation scene over the network. Equipment data is associated with the actual position of the equipment, so that field personnel can conveniently query and view it through gesture recognition.
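As a rough sketch of the first half of S204, extracting guidance information from the standard job database keyed on the current position could be a simple proximity query over the modeled equipment positions. The GuideCard structure, its field names and the 3-metre radius are assumptions made for illustration:

```python
from dataclasses import dataclass
import math

@dataclass
class GuideCard:
    device_id: str        # e.g. a State Grid equipment object ID
    position: tuple       # (x, y, z) of the device in the substation model, metres
    steps: list           # ordered standard-operation steps for this device

def guidance_for_location(cards, current_pos, radius=3.0):
    """Return guide cards whose target devices lie within `radius` metres
    of the worker's position obtained from three-dimensional perception."""
    return [c for c in cards if math.dist(c.position, current_pos) <= radius]

cards = [GuideCard("breaker_112", (10.0, 2.0, 0.0), ["isolate", "ground", "verify"])]
print(guidance_for_location(cards, current_pos=(9.0, 2.5, 0.0)))  # ~1.1 m away -> matched
```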
In S206, interaction information is generated based on the user's operations. This may include: generating the interaction information based on the user's real-time voice, operation gestures and head movements. After the user's interaction input is fed into the computer, the interaction result can be displayed and output; for example, based on touch and gesture recognition algorithms, the user can change the calibration content of the current picture to display content of interest, or interactively manipulate the displayed virtual objects.
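The interaction information that combines real-time voice, operation gestures and head movements could be modeled as a single event record, for example as below; all field names and the dictionary format sent to the background server are assumptions for the sketch:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple
import time

@dataclass
class InteractionEvent:
    """One interaction record covering the three input channels the method names."""
    voice_command: Optional[str] = None   # recognised phrase, e.g. "show device history"
    gesture: Optional[str] = None         # e.g. "air_tap", "pinch_drag"
    head_pose: Optional[Tuple[float, float, float]] = None  # yaw, pitch, roll in degrees
    timestamp: float = field(default_factory=time.time)

def to_interaction_info(event: InteractionEvent) -> dict:
    """Serialise one event for transmission to the background server."""
    return {"voice": event.voice_command, "gesture": event.gesture,
            "head_pose": event.head_pose, "ts": event.timestamp}

evt = InteractionEvent(voice_command="show device history", gesture="air_tap",
                       head_pose=(15.0, -5.0, 0.0))
print(to_interaction_info(evt))
```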
In S208, remote assistance is performed based on the interaction information and the display information. This may include: sending the interaction information and the display information to a background server; and obtaining remote assistance information from the background server. The position of the field worker can be displayed in the remote three-dimensional scene, together with the content observed from the field worker's first-person view. The remote three-dimensional scene can synchronously display the content recorded by the field staff and can retrieve the history of the equipment.
The high-precision real-time positioning information of the wearable device, which includes a schematic model of the field worker, can be transmitted to the substation server through a 5G network or a power wireless private network. Back-end support personnel can then follow the real-time, accurate position of field workers in the virtual digital substation scene by viewing the three-dimensional model of the substation. More specifically, a high-definition screenshot of the field can serve as the input interface for back-end support personnel through preset software: after selecting a position point on the target equipment, the support person can enter text content, the actual space coordinates of the field operation are determined through two-dimensional to three-dimensional space mapping, and the content to be operated on is displayed on the field worker's wearable device in real time.
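The two-dimensional to three-dimensional space mapping mentioned above (turning the point a support person selects on the field screenshot into actual space coordinates) can be illustrated with a standard pinhole-camera unprojection; the intrinsic matrix, depth value and pose naming below are assumptions, since the patent does not specify a camera model:

```python
import numpy as np

def unproject_annotation(pixel, depth_m, K, T_world_from_cam):
    """Map a 2-D screenshot point back to 3-D site coordinates.

    K is the camera intrinsic matrix and T_world_from_cam the 4x4 headset
    pose at capture time; depth_m comes from the depth sensor (all assumed)."""
    u, v = pixel
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project the pixel
    p_cam = ray_cam * depth_m                           # scale by the sensed depth
    p_world = T_world_from_cam @ np.append(p_cam, 1.0)  # move into the site frame
    return p_world[:3]

K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
T = np.eye(4)  # identity headset pose, just for the demo
print(unproject_annotation((400, 260), depth_m=2.5, K=K, T_world_from_cam=T))
```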
According to the remote assistance method based on the intelligent wearable device, remote assistance can be provided to field operators efficiently, quickly and accurately, field operation efficiency is improved, on-site visual management is strengthened, and field work safety is promoted.
It should be clearly understood that this disclosure describes how to make and use particular examples, but the principles of this disclosure are not limited to any details of these examples. Rather, these principles can be applied to many other embodiments based on the teachings of the present disclosure.
Fig. 3 is a flowchart illustrating a remote assistance method based on an intelligent wearable device according to another exemplary embodiment.
As shown in fig. 3, in S302, remote assistance information is acquired.
In S304, a first layer image is generated from the real-time image.
In S306, a second layer image is generated through the display information.
In S308, a third layer image is generated from the remote assistance information.
In S310, the first layer image, the second layer image and the third layer image are superimposed and fused, and displayed through holographic waveguide head-mounted display technology.
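The superposition and fusion of the three layers can be pictured as ordinary back-to-front alpha compositing; this is a toy sketch of the fusion step only (the holographic waveguide handles the optical display), and the HxWx4 RGBA-float convention is an assumption:

```python
import numpy as np

def compose_layers(live, overlay, assist):
    """Composite three HxWx4 RGBA float layers back to front:
    live camera image, display-information overlay, remote-assistance notes."""
    out = live.copy()
    for layer in (overlay, assist):
        a = layer[..., 3:4]  # per-pixel alpha of the upper layer
        out[..., :3] = a * layer[..., :3] + (1.0 - a) * out[..., :3]
        out[..., 3:4] = np.maximum(out[..., 3:4], a)
    return out

h, w = 480, 640
live = np.zeros((h, w, 4)); live[..., 3] = 1.0  # opaque camera frame
overlay = np.zeros((h, w, 4))                   # transparent until annotated
assist = np.zeros((h, w, 4))
fused = compose_layers(live, overlay, assist)   # frame handed to the head display
```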
The basic principle of holographic optical waveguide technology is total internal reflection and diffraction of light. A holographic waveguide helmet display system mainly comprises a microdisplay, holographic gratings and a flat waveguide. The image generated by the microdisplay is collimated into parallel light by a micro collimating lens; the parallel light enters the optical waveguide and reaches the in-coupling holographic grating, whose diffraction changes the light's direction of propagation so that it satisfies the total internal reflection condition and propagates forward along the waveguide without loss. When the parallel light reaches the out-coupling holographic grating, the total reflection condition is broken, so the light exits the holographic waveguide and enters the human eye for imaging. Because of the holographic waveguide, the optical image can be deflected as it propagates. This not only shortens the propagation distance but also keeps the center of the optical system close to the head; it also reduces the use of folding mirrors, which helps simplify and miniaturize the optical system.
When an image propagates in the waveguide as parallel light, it remains free of distortion and aberration because the faces of the waveguide plate are parallel.
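The total internal reflection condition the waveguide relies on can be stated compactly. This is the standard optics relation rather than a formula given in the patent: light inside a guide of refractive index $n_1$, surrounded by a medium of lower index $n_2$, stays trapped whenever its internal angle of incidence $\theta$ exceeds the critical angle:

```latex
\theta_c = \arcsin\left(\frac{n_2}{n_1}\right), \qquad
\text{total internal reflection for } \theta > \theta_c
```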
Fig. 4 is a schematic diagram illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment. As shown in fig. 4, mixed reality is the bridge through which the virtual substation scene and the physical scene interact. A wearable device based on a laser depth camera, an image sensor and a high-precision inertial measurement unit can perceive the whole space through simultaneous localization and mapping (SLAM) technology, determine the position and head posture of the field worker in that space through spatial computing, and finally, through mixed reality technology, display the corresponding content of the digital twin substation together with the actual equipment in front of the field worker.
In fig. 4, "Halt! High voltage danger" is the content of a virtual safety measure placed in the digital twin substation, which the field worker can see directly through the wearable device. Because the equipment in the digital twin substation is bound to State Grid equipment object IDs, the wearable device can directly read, in advance, the relevant information and defect records of the equipment in the line of sight.
Fig. 5 is a flowchart illustrating a remote assistance method based on an intelligent wearable device according to another exemplary embodiment. The flow shown in fig. 5 is a detailed description of S202, "performing three-dimensional perception by the intelligent wearable device to determine the current position", in the flow shown in fig. 2.
as shown in fig. 5, in S502, a three-dimensional scene model of the substation is established.
In S504, the three-dimensional scene model is associated with the coordinate system of the intelligent wearable device, for example through augmented-reality markers and simultaneous localization and mapping (SLAM) technology.
In S506, the current position of the smart wearable device is displayed in the three-dimensional scene.
The substation can be surveyed on site, the whole 110 kV substation modeled to centimeter-level precision, and software capable of reproducing the whole substation on a VR or PC display developed with a three-dimensional engine.
AR (augmented reality) markers and SLAM (simultaneous localization and mapping) technology are then used to algorithmically align the coordinate system of the whole 110 kV substation with the coordinate system generated by the intelligent wearable device. The association between the VR scene and the coordinate system of the real augmented-reality scene is completed by associating the augmented-reality area-learning model with the whole VR scene of the substation in the three-dimensional engine.
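One way to picture this coordinate-system association is through a single AR marker whose pose is known both in the headset (device) frame, from detection, and in the surveyed substation model, from the centimeter-level survey. The 4x4 homogeneous-transform naming below is an illustrative assumption:

```python
import numpy as np

def model_from_device(T_device_from_marker, T_model_from_marker):
    """Return the transform carrying headset (device) coordinates into the
    substation model frame, chained through one observed AR marker."""
    # model <- marker <- device
    return T_model_from_marker @ np.linalg.inv(T_device_from_marker)

T_dev_marker = np.eye(4); T_dev_marker[:3, 3] = [0.0, 0.0, 1.5]   # marker seen 1.5 m ahead
T_mod_marker = np.eye(4); T_mod_marker[:3, 3] = [10.0, 2.0, 0.0]  # marker's surveyed position
T_model_from_device = model_from_device(T_dev_marker, T_mod_marker)
print(T_model_from_device[:3, 3])  # headset origin expressed in model coordinates
```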
Digitization of the standard operation guide card is realized on the wearable device, and its content is displayed through holographic virtual display. Sensors on the intelligent wearable device support on-site evidence collection, including audio recording, video recording and photographing, and provide personnel positioning, whose information can be displayed synchronously in a remote three-dimensional substation scene over the network. Equipment data is associated with the actual position of the equipment, so that field personnel can conveniently query and view it through gesture recognition.
Three-dimensional registration is the basic technology for associating holographic virtual images with the real world. It mainly detects the camera's pose relative to the real scene in real time and determines the position coordinates of the holographic virtual model in the real world. In power grid applications, the attributes, environmental information and historical records of the current working area can be acquired in an AR + LBS (location-based services) mode, and the attribute information of equipment such as power devices, tools and buildings can be quickly acquired in combination with AR code technology.
Fig. 6 is a schematic diagram illustrating a remote assistance method based on an intelligent wearable device according to an exemplary embodiment. As shown in fig. 6, the high-precision real-time positioning information of the wearable device, which includes a schematic model of the field worker, may be transmitted to the substation server through a 5G network or a wireless private network. Back-end support personnel can also follow the real-time, accurate position of field workers in the virtual digital substation scene by viewing the three-dimensional model of the substation. Through the mixed reality camera on the wearable device, back-end support personnel can watch in real time the first-person-view video seen by the field worker, which combines the physical scene with the virtual scene.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments are implemented as computer programs executed by a CPU. When executed by the CPU, these programs perform the functions defined by the above methods provided by the present disclosure. The programs may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic disk or an optical disk, among others.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 7 is a block diagram illustrating a remote assistance apparatus based on an intelligent wearable device according to an exemplary embodiment. The remote assisting apparatus 70 based on the smart wearable device includes: a location module 702, a presentation module 704, an interaction module 706, and an assistance module 708.
The position module 702 is configured to perform three-dimensional perception through the intelligent wearable device to determine the current position. This may include: performing three-dimensional sensing through a structured-light depth sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through a visible-light sensor of the intelligent wearable device to determine the current position; and/or performing three-dimensional perception through an inertial sensor of the intelligent wearable device to determine the current position.
The display module 704 is configured to generate display information based on the current position and the real-time image. This may include: extracting guidance information from a standard job database based on the current position; and extracting the display information from the guidance information through the real-time image.
The interaction module 706 is configured to generate interaction information based on the user's operations. This may include: generating the interaction information based on the user's real-time voice, operation gestures and head movements.
The assistance module 708 is configured to perform remote assistance based on the interaction information and the display information. This may include: sending the interaction information and the display information to a background server; and obtaining remote assistance information from the background server.
According to the remote assistance apparatus based on the intelligent wearable device, remote assistance can be provided to field operators efficiently, quickly and accurately, field operation efficiency is improved, on-site visual management is strengthened, and field work safety is promoted.
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment.
An electronic device 200 according to this embodiment of the present disclosure is described below with reference to fig. 8. The electronic device 200 shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 8, the electronic device 200 is embodied in the form of a general purpose computing device. The components of the electronic device 200 include, but are not limited to: at least one processing unit 210, at least one memory unit 220, a bus 230 connecting different system components (including the memory unit 220 and the processing unit 210), a display unit 240, and the like.
Wherein the storage unit stores program code executable by the processing unit 210, so that the processing unit 210 performs the steps according to the various exemplary embodiments of the present disclosure described in the method sections above of this specification. For example, the processing unit 210 may perform the steps shown in fig. 2.
The memory unit 220 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 2201 and/or a cache memory unit 2202, and may further include a read-only memory unit (ROM) 2203.
The storage unit 220 may also include a program/utility 2204 having a set (at least one) of program modules 2205, such program modules 2205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 200 may also communicate with one or more external devices 300 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 200, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 200 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 250. Also, the electronic device 200 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 260. The network adapter 260 may communicate with other modules of the electronic device 200 via the bus 230. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 200, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, as shown in fig. 9, the technical solution according to the embodiment of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, or a network device, etc.) to execute the above method according to the embodiment of the present disclosure.
The software product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic signals, optical signals, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
The computer readable medium carries one or more programs which, when executed by a device, cause the device to implement the following functions: performing three-dimensional perception through the intelligent wearable device to determine a current position; generating display information based on the current position and a real-time image; generating interaction information based on the user's operations; and performing remote assistance based on the interaction information and the display information.
Those skilled in the art will appreciate that the modules described above may be distributed in the apparatus according to the description of the embodiments, or may be modified accordingly in one or more apparatuses unique from the embodiments. The modules of the above embodiments may be combined into one module, or further split into multiple sub-modules.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Exemplary embodiments of the present disclosure are specifically illustrated and described above. It is to be understood that the present disclosure is not limited to the precise arrangements or instrumentalities described herein; on the contrary, the disclosure is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
Claims (10)
1. A remote assistance method based on intelligent wearable equipment is characterized by comprising the following steps:
performing three-dimensional perception through intelligent wearable equipment to determine a current position;
generating display information based on the current position and the real-time image;
generating interaction information based on the operation of a user; and
performing remote assistance based on the interaction information and the display information.
2. The method of claim 1, further comprising:
acquiring remote assistance information; and
displaying the real-time image, the display information and/or the remote assistance information through the intelligent wearable device.
3. The method of claim 2, wherein displaying the real-time image, the presentation information, and/or the remote assistance information through the smart wearable device comprises:
generating a first layer image through the real-time image;
generating a second layer image through the display information; and/or
generating a third layer image through the remote assistance information;
and superimposing and fusing the first layer image, the second layer image and the third layer image, and displaying the result through a holographic waveguide head-mounted display technology.
4. The method of claim 1, wherein the performing three-dimensional perception by the smart wearable device to determine the current location comprises:
performing three-dimensional sensing through a structured-light depth sensor of the intelligent wearable device to determine the current position; and/or
performing three-dimensional perception through a visible-light sensor of the intelligent wearable device to determine the current position; and/or
performing three-dimensional perception through an inertial sensor of the intelligent wearable device to determine the current position.
5. The method of claim 1, wherein the performing three-dimensional perception by the smart wearable device to determine the current location comprises:
establishing a three-dimensional scene model of the transformer substation;
associating the three-dimensional scene model with a coordinate system of the intelligent wearable device; and
displaying the current position of the intelligent wearable device in the three-dimensional scene.
6. The method of claim 5, wherein associating the three-dimensional scene model with a coordinate system of the smart wearable device comprises:
associating the three-dimensional scene model with the coordinate system of the intelligent wearable device through augmented-reality markers and simultaneous localization and mapping (SLAM) technology.
7. The method of claim 1, wherein generating presentation information based on the current location and a real-time image comprises:
extracting guidance information from a standard job database based on the current location; and
extracting the display information from the guidance information through the real-time image.
8. The method of claim 1, wherein generating interaction information based on the user's actions comprises:
generating the interaction information based on the real-time voice, operation gesture and head action of the user.
9. The method of claim 1, wherein remotely assisting based on the interaction information and the presentation information comprises:
sending the interaction information and the display information to a background server; and
obtaining the remote assistance information from the background server.
10. A remote assistance apparatus based on an intelligent wearable device, characterized by comprising:
the position module is used for carrying out three-dimensional perception through the intelligent wearable equipment so as to determine the current position;
the display module is used for generating display information based on the current position and the real-time image;
the interaction module is used for generating interaction information based on the operation of a user; and
the assistance module, used for performing remote assistance based on the interaction information and the display information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011238081.0A | 2020-11-09 | 2020-11-09 | Remote assistance method and device based on intelligent wearable equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011238081.0A | 2020-11-09 | 2020-11-09 | Remote assistance method and device based on intelligent wearable equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112527100A (en) | 2021-03-19 |
Family
ID=74979903
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011238081.0A | Remote assistance method and device based on intelligent wearable equipment | 2020-11-09 | 2020-11-09 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112527100A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113445987A (en) * | 2021-08-05 | 2021-09-28 | 中国铁路设计集团有限公司 | Railway drilling auxiliary operation method based on augmented reality scene under mobile terminal |
CN117289791A (en) * | 2023-08-22 | 2023-12-26 | 杭州空介视觉科技有限公司 | Meta universe artificial intelligence virtual equipment data generation method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160049004A1 (en) * | 2014-08-15 | 2016-02-18 | Daqri, Llc | Remote expert system |
CN108089696A (en) * | 2016-11-08 | 2018-05-29 | 罗克韦尔自动化技术公司 | For the virtual reality and augmented reality of industrial automation |
CN108879440A (en) * | 2018-06-20 | 2018-11-23 | 国网山东省电力公司济宁供电公司 | Intelligent examination and repair system and method based on wearable augmented reality display terminal and cloud platform |
CN110751734A (en) * | 2019-09-23 | 2020-02-04 | 华中科技大学 | Mixed reality assistant system suitable for job site |
WO2020165885A1 (en) * | 2019-02-13 | 2020-08-20 | Quaqua Experiences Pvt. Ltd. | Computer-implemented method and system for providing interaction rules in mixed reality |
CN111815000A (en) * | 2020-07-10 | 2020-10-23 | 国网上海市电力公司 | Method and system for reproducing power scene and computer readable storage medium |
- 2020-11-09: application CN202011238081.0A filed in CN; published as CN112527100A (en), status Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160049004A1 (en) * | 2014-08-15 | 2016-02-18 | Daqri, Llc | Remote expert system |
CN108089696A (en) * | 2016-11-08 | 2018-05-29 | 罗克韦尔自动化技术公司 | For the virtual reality and augmented reality of industrial automation |
CN108879440A (en) * | 2018-06-20 | 2018-11-23 | 国网山东省电力公司济宁供电公司 | Intelligent examination and repair system and method based on wearable augmented reality display terminal and cloud platform |
WO2020165885A1 (en) * | 2019-02-13 | 2020-08-20 | Quaqua Experiences Pvt. Ltd. | Computer-implemented method and system for providing interaction rules in mixed reality |
CN110751734A (en) * | 2019-09-23 | 2020-02-04 | 华中科技大学 | Mixed reality assistant system suitable for job site |
CN111815000A (en) * | 2020-07-10 | 2020-10-23 | 国网上海市电力公司 | Method and system for reproducing power scene and computer readable storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113445987A (en) * | 2021-08-05 | 2021-09-28 | 中国铁路设计集团有限公司 | Railway drilling auxiliary operation method based on augmented reality scene under mobile terminal |
CN117289791A (en) * | 2023-08-22 | 2023-12-26 | 杭州空介视觉科技有限公司 | Meta universe artificial intelligence virtual equipment data generation method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Machado et al. | Conceptual framework for integrating BIM and augmented reality in construction management | |
US10685489B2 (en) | System and method for authoring and sharing content in augmented reality | |
Casini | Extended reality for smart building operation and maintenance: A review | |
RU2524836C2 (en) | Information processor, processing method and programme | |
Maharjan et al. | Enabling human–infrastructure interfaces for inspection using augmented reality | |
García-Pereira et al. | A collaborative augmented reality annotation tool for the inspection of prefabricated buildings | |
Hajirasouli et al. | Augmented reality in design and construction: thematic analysis and conceptual frameworks | |
Oke et al. | An analysis of the application areas of augmented reality technology in the construction industry | |
CN112306233A (en) | Inspection method, inspection system and inspection management platform | |
KR102418994B1 (en) | Method for providng work guide based augmented reality and evaluating work proficiency according to the work guide | |
CN112527100A (en) | Remote assistance method and device based on intelligent wearable equipment | |
US10429926B2 (en) | Physical object addition and removal based on affordance and view | |
Wu et al. | Cognitive ergonomics-based Augmented Reality application for construction performance | |
JP2022130575A (en) | Road guidance method, apparatus, electronic device, and storage medium | |
Moreu et al. | Augmented reality enhancing the inspections of transportation infrastructure: Research, education, and industry implementation | |
Alavikia et al. | Pragmatic industrial augmented reality in electric power industry | |
JP2022155553A (en) | Business management support device, business management support system, business management support method, and business management support program | |
Wang et al. | User-centric immersive virtual reality development framework for data visualization and decision-making in infrastructure remote inspections | |
CN105138130A (en) | Information communication instructing method and system in same scene at different places | |
Gilson et al. | Leveraging augmented reality for highway construction | |
Khorrami Shad et al. | State-of-the-art analysis of the integration of augmented reality with construction technologies to improve construction safety | |
Tadeja et al. | PhotoTwinVR: An Immersive System for Manipulation, Inspection and Dimension Measurements of the 3D Photogrammetric Models of Real-Life Structures in Virtual Reality | |
CA3182255A1 (en) | A system and method for remote inspection of a space | |
KR20220041981A (en) | Head mounted display based augmented reality | |
CN112527101A (en) | Remote control method and device for variable electric field |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210319 |