CN110609913A - Information processing method and device, electronic device and medium


Info

Publication number
CN110609913A
CN110609913A
Authority
CN
China
Prior art keywords
maintenance
original image
interactive
operated
target image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910883853.7A
Other languages
Chinese (zh)
Inventor
陈庆
张冉颀
黄香
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN201910883853.7A
Publication of CN110609913A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an information processing method executed by an interactive home device, including: acquiring an original image while maintaining voice communication with an interactive peer device based on a real operation and maintenance scene, wherein the original image contains an object to be operated and maintained and that object is located in the real operation and maintenance scene; sending the original image to the interactive peer device so that the interactive peer device can generate and return a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and, upon receiving the target image, superimposing the target image onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained. The disclosure also provides an information processing apparatus applied to the interactive home device, an electronic device, and a medium.

Description

Information processing method and device, electronic device and medium
Technical Field
The present disclosure relates to an information processing method and apparatus, an electronic device, and a medium.
Background
Equipment operation and maintenance is an important part of data center infrastructure operation and maintenance. To ensure safe production operation, equipment operation and maintenance in a data center basically has to be carried out outside the primary business hours. For a banking data center, the non-primary business period is at night. However, the skill level of the technicians performing field operations at night is relatively limited, and the field personnel can perform only a few routine operations.
While conceiving the present disclosure, the inventors found that when field personnel encounter difficulty in equipment operation and maintenance work, they generally seek help and guidance from remote senior technical support personnel by telephone, or collect field information and transmit it to the back office for further analysis, both of which are inefficient. When planned, high-difficulty operation and maintenance actions are carried out, such as replacing rare parts or upgrading the microcode of important equipment, technical experts are required to travel to the site for support, which makes the labor cost high.
Disclosure of Invention
One aspect of the present disclosure provides an information processing method performed by an interactive home device, including: acquiring an original image while maintaining voice communication with an interactive peer device based on a real operation and maintenance scene, wherein the original image contains an object to be operated and maintained and the object is located in the real operation and maintenance scene; sending the original image to the interactive peer device so that the interactive peer device can generate and return a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and, upon receiving the target image, superimposing the target image onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained.
According to an embodiment of the present disclosure, acquiring the original image while maintaining voice communication with the interactive peer device based on the real operation and maintenance scene includes: obtaining an acquisition operation for the object to be operated and maintained while voice communication with the interactive peer device is maintained based on the real operation and maintenance scene, and capturing the original image in the real operation and maintenance scene in response to the acquisition operation.
According to an embodiment of the present disclosure, the acquisition operation includes a voice acquisition operation and/or a gesture acquisition operation.
Another aspect of the present disclosure provides an information processing method performed by an interactive peer device, including: receiving an original image sent by an interactive home device, wherein the original image is captured while voice communication with the interactive peer device is maintained based on a real operation and maintenance scene, the original image contains an object to be operated and maintained, and the object is located in the real operation and maintenance scene; generating a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and sending the target image to the interactive home device, so that the interactive home device, upon receiving the target image, superimposes the target image onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained.
Another aspect of the present disclosure provides an information processing apparatus applied to an interactive home device, including an original image acquisition module, an original image sending module, and a target image superimposing module. The original image acquisition module is configured to acquire an original image while maintaining voice communication with an interactive peer device based on a real operation and maintenance scene, wherein the original image contains an object to be operated and maintained and the object is located in the real operation and maintenance scene. The original image sending module is configured to send the original image to the interactive peer device so that the interactive peer device can generate and return a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained. The target image superimposing module is configured to, upon receiving the target image, superimpose the target image onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained.
According to an embodiment of the present disclosure, the original image acquisition module includes an acquisition operation obtaining sub-module and an original image capturing sub-module. The acquisition operation obtaining sub-module is configured to obtain an acquisition operation for the object to be operated and maintained while voice communication with the interactive peer device is maintained based on the real operation and maintenance scene, and the original image capturing sub-module is configured to capture the original image in the real operation and maintenance scene in response to the acquisition operation.
According to an embodiment of the present disclosure, the acquisition operation includes a voice acquisition operation and/or a gesture acquisition operation.
Another aspect of the present disclosure provides an information processing apparatus applied to an interactive peer device, including an original image receiving module, a target image generating module, and a target image sending module. The original image receiving module is configured to receive an original image sent by an interactive home device, wherein the original image is captured while voice communication with the interactive peer device is maintained based on a real operation and maintenance scene, the original image contains an object to be operated and maintained, and the object is located in the real operation and maintenance scene. The target image generating module is configured to generate a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained. The target image sending module is configured to send the target image to the interactive home device, so that the interactive home device, upon receiving the target image, superimposes the target image onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained.
Another aspect of the present disclosure provides an electronic device including: one or more processors; memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the methods described above.
Another aspect of the disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, perform any of the methods described above.
With this remote assisted operation and maintenance method for data center equipment, augmented reality (AR) glasses enable interaction between the equipment operation and maintenance personnel in the data center machine room and remote technical support personnel. The live operation and maintenance picture of the machine room is transmitted in real time through the AR glasses to the mobile terminal of the remote technical support personnel, so that the remote technicians can guide the on-site operation and maintenance personnel by voice, image annotation and return transmission, and the like, based on the real-time picture transmitted from the site, thereby assisting the equipment operation and maintenance work. The method improves working efficiency, strengthens remote technical support capability, and reduces labor cost.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically shows an application scenario of an information processing method according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an information processing method performed by an interactive home device according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart for acquiring an original image according to an embodiment of the present disclosure;
fig. 4 schematically shows a flowchart of an information processing method performed by an interactive peer device according to an embodiment of the present disclosure;
FIG. 5 schematically shows an overall flow diagram of an information processing method according to an embodiment of the disclosure;
fig. 6 schematically shows a block diagram of an information processing apparatus applied to an interactive home device according to an embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of an original image acquisition module, in accordance with an embodiment of the present disclosure;
fig. 8 schematically shows a block diagram of an information processing apparatus applied to an interactive peer device according to an embodiment of the present disclosure; and
fig. 9 schematically shows a block diagram of an electronic device adapted to perform the information processing method of the embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable information processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
With the rapid development of cloud computing and artificial intelligence, the data center industry has grown quickly. Data centers, however, are a high-energy-consumption industry. To save energy, large Chinese cities such as Beijing, Shanghai, Guangzhou and Shenzhen have successively issued policies restricting the construction of data centers within those cities. As a result, newly built large data centers are generally located in second- and third-tier cities with scarce technical resources, or in northern cities with low average year-round temperatures. In this situation, improving the level of remote technical support and assisted operations becomes increasingly important.
Based on this, the present disclosure provides an information processing method executed by an interactive home device, which uses augmented reality (AR) glasses to enable interaction between the equipment operation and maintenance personnel in the machine room and remote technical support personnel. The machine room operation and maintenance scene picture is transmitted through the AR glasses to a mobile terminal of a remote technical support person, and the remote technician guides the on-site operation and maintenance personnel by voice, returned annotated images, and the like, based on the real-time picture transmitted from the site, to assist the equipment operation and maintenance work. The method improves working efficiency, strengthens remote technical support capability, and reduces labor cost. Specifically, first, an original image is captured while voice communication with the interactive peer device is maintained based on a real operation and maintenance scene; the original image contains an object to be operated and maintained, and the object is located in the real operation and maintenance scene. Then, the original image is sent to the interactive peer device so that the interactive peer device can generate and return a target image based on the original image, where the target image contains annotation information for the object to be operated and maintained. Finally, upon receiving the target image, the target image is superimposed onto the real operation and maintenance scene to guide the operation and maintenance of the object to be operated and maintained.
Traditional equipment maintenance is carried out by maintenance personnel relying on experience or consulting a maintenance manual on site; having to read the manual while performing the maintenance leads to low efficiency and a tendency toward misoperation. There is therefore an urgent need for a user-centered, intelligent maintenance system that helps operators reduce operation difficulty, reduce operation errors, and improve operation efficiency, so that the work can be completed smoothly.
The equipment operation and maintenance is the key point of professional daily work of data center equipment and machine room infrastructure, and the operation and maintenance operating system based on augmented reality conforms to the trend of future intelligent operation and maintenance, and represents the development direction of the next generation of equipment operation and maintenance.
At present, the operation and maintenance of data center equipment faces the following difficulties: staff have few opportunities to manually operate mainframes, high-end servers, centralized storage and other equipment, so their technical level is difficult to improve; emergency drills for equipment and for power and air-conditioning infrastructure usually take the form of desktop exercises, which cannot rehearse the actual operation steps or evaluate the emergency repair time; and equipment operation and maintenance lacks effective remote technical support.
According to the embodiment of the disclosure, the AR technology is applied to operation and maintenance of complex equipment, and a virtual-real combined operation and maintenance environment and a more intuitive and flexible operation mode are provided for a user. The research related to the key technology of AR auxiliary operation and maintenance is developed, and the method has important guiding significance and reference value for optimizing the implementation of the AR auxiliary operation and maintenance system.
Fig. 1 schematically shows an application scenario 100 of an information processing method according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of an application scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
The present disclosure may be applied in the application scenario of equipment operation and maintenance of the data center 100. Equipment operation and maintenance is an important part of data center infrastructure operation and maintenance. The equipment of the data center 100 may include IT equipment 110 and infrastructure equipment 120. The IT equipment 110 may include, but is not limited to, a mainframe 111, a minicomputer 112, a server 113, a network device 114, and (centralized and distributed) storage devices 115, among others. The infrastructure equipment 120 may include, but is not limited to, transformers 121, uninterruptible power supplies (UPS) 122, power distribution cabinets 123, chillers 124, and machine room precision air conditioners 125.
In the present disclosure, the operation and maintenance work of the equipment of the data center 100 mainly includes: equipment installation and removal, equipment maintenance (including emergency maintenance), microcode upgrade, and the like. It should be noted that, unlike the operation and maintenance operations of the software system that can be performed remotely, the devices of the data center 100 are usually hardware devices, and the operation and maintenance operations basically have to be performed on site in the machine room of the data center.
The local operation and maintenance initiator 130 is connected to the remote operation and maintenance supporter 140 through a network, realizing interaction between the equipment operation and maintenance personnel in the machine room and the remote technical support personnel. The machine room operation and maintenance scene picture is transmitted through the AR glasses to the mobile terminal of a remote technical support person, and the remote technicians guide the field operation and maintenance personnel in the machine room by voice, returned annotated images, and the like, based on the real-time pictures transmitted from the site, to assist the equipment operation and maintenance work.
It should be noted that equipment operation and maintenance has become a focal point of data center work, and as the integration level and complexity of the equipment grow, operation and maintenance becomes more and more difficult. Augmented reality technology makes the operation and maintenance of complex equipment visual and convenient, improves workers' understanding of complex operation and maintenance tasks and processes, and improves the efficiency of operation and maintenance support. In data center equipment maintenance management and assisted operation and maintenance, AR technology may have the following application scenarios:
the training method is applied to training of new employees in equipment specialties. When a device operation and maintenance person maintains a strange device, the person often needs to see a lot of manuals, and the AR system can be used for avoiding the manuals. The chance that the staff is manual to maintain the host equipment, the high-end centralized storage and the high-end server equipment is not many, so that some common maintenance operation procedures of the equipment can be made into AR software, and auxiliary information is provided for operation and maintenance personnel.
(II) Applied to emergency drills in equipment specialties. At present, emergency drills on the hardware equipment side mostly take the form of desktop exercises: they mainly verify the handling process and do not involve the actual emergency maintenance actions on the equipment. With AR technology, the maintenance actions of an emergency procedure can be rendered as 3D images superimposed onto the real machine room environment and physical equipment, realizing a new, visual mode of emergency handling. For example, if an emergency scenario involves operating a certain switch of the power distribution system, then during the drill an operator wearing AR glasses approaches the power distribution cabinet, the glasses show the switch to be acted on next and the operation steps, and the on-site personnel carry out the operation by comparison or use the display for learning.
(III) Applied to assisted equipment maintenance. There are two specific application scenarios in this respect. The first is remote support for the repair of difficult faults: when on-site personnel cannot repair a fault by following the routine maintenance process, a remote video guidance scheme allows a remote expert to support the on-site emergency at any time and provide a repair plan. The situation on site is transmitted to the remote support person through the AR device, the remote support person marks the components to be operated and the operation steps, and the images are sent back to the site to guide the work. The second is assisting front-line personnel on site in carrying out maintenance actions: when field personnel are unfamiliar with an operating procedure, the AR device displays the specific steps to assist the maintenance.
In this disclosure, "interactive home device" and "interactive peer device" are merely names for the different roles that electronic devices play in the communication process; they do not imply that the two must be completely different types of electronic devices. That is, the interactive home device and the interactive peer device may be the same type of electronic device or different types.
The disclosure provides a remote assisted operation and maintenance method for data center equipment. The method uses augmented reality (AR) glasses to free both hands of the field operators in the machine room and to transmit field images to remote technical support personnel, who receive the field images, annotate them, and send them back to the AR glasses of the field personnel. The remote technician can also hold a voice conversation with the on-site personnel through the AR glasses system. This improves the level of remote technical support for equipment operation and maintenance, improves working efficiency, and reduces labor cost.
Fig. 2 schematically shows a flowchart of an information processing method performed by an interactive home device according to an embodiment of the present disclosure.
As shown in fig. 2, the information processing method performed by the interactive home device may include operations S210 to S230.
In operation S210, in the process of maintaining voice communication with the interactive peer device based on the real operation and maintenance scene, an original image is collected, where the original image includes an object to be operated and maintained, and the object to be operated and maintained is in the real operation and maintenance scene.
According to the embodiments of the disclosure, the interactive home device can be any of various mature AR devices. For convenience of operation, the on-site operation and maintenance operators can wear AR glasses while carrying out the operation and maintenance work, so that both hands are freed. Equipment maintenance personnel in the data center wearing AR glasses can transmit the operation and maintenance pictures in the machine room to the terminals (such as a mobile phone, a tablet computer, or a computer) of remote technicians. According to the embodiments of the present disclosure, the object to be operated and maintained may be any one of the IT devices 110 (the mainframe 111, the minicomputer 112, the server 113, the network device 114, or the storage device 115) or the infrastructure devices 120 (the transformer 121, the uninterruptible power supply 122, the power distribution cabinet 123, the chiller 124, or the machine room precision air conditioner 125) shown in fig. 1, depending on the specific operation and maintenance needs.
In the method, the application range of the augmented reality auxiliary operation and maintenance technology can be greatly expanded based on the automatic identification technology of the natural features of the operation and maintenance object. The research on the natural feature recognition technology of the operation and maintenance object needs to break through the multi-view and large-sample feature extraction and target recognition and positioning method, and meanwhile, the real-time performance of the algorithm needs to be improved on the premise of ensuring the robustness of the algorithm.
In operation S220, the original image is sent to the interactive peer device, so that the interactive peer device can generate and return a target image based on the original image, where the target image includes annotation information for the object to be operated and maintained.
In the disclosure, when the on-site operation and maintenance personnel need to obtain accurate guidance for an object to be operated and maintained, such as the position of an operation component, the position of an operation switch, and the position of a selection menu, the on-site operation and maintenance personnel can operate the AR glasses to shoot an on-site real operation and maintenance scene, collect an original image, and transmit an on-site picture to a terminal of a remote technician.
According to the embodiments of the disclosure, when the field operation and maintenance personnel need fine-grained guidance, the original image can be captured from the real field scene according to their actual needs and sent to the interactive peer device. After the interactive peer device receives the original image, annotation information for the object to be operated and maintained can be generated on the original image, assisting the equipment operation and maintenance work and guiding the field personnel; the operation and maintenance information is intuitive and clear, allowing the personnel to execute the operation and maintenance quickly and conveniently.
In operation S230, in the case that the target image is received, the target image is superimposed on the real operation and maintenance scene to guide the operation and maintenance operation on the object to be operated and maintained.
It can be understood that AR technology, built on computer graphics and image processing, superimposes computer-generated virtual objects onto a real scene in real time using visualization and virtual-real registration techniques, and presents the mixed scene of virtual information and the real scene to the user through a head-mounted display, a projector, a mobile phone, or another display device, thereby providing the user with an immersive "real" environment. By superimposing computer-generated virtual objects, scenes, or system prompts onto the real scene in real time, AR technology enhances the real operation and maintenance scene.
In the disclosure, the interactive home terminal device superimposes the target image on the real operation and maintenance scene under the condition that the target image is received, so as to guide the operation and maintenance operation for the object to be operated and maintained.
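As a concrete illustration of the superimposition in operation S230, the sketch below blends a returned target image over a live camera frame with OpenCV. This is a minimal sketch under stated assumptions, not the disclosed implementation: the function name, the fixed alpha value, and the use of a simple resize in place of full virtual-real registration are all illustrative choices.

```python
# Minimal sketch of operation S230: superimposing the annotated target image
# onto the real operation and maintenance scene. OpenCV-based; illustrative only.
import cv2
import numpy as np

def overlay_target_image(camera_frame: np.ndarray, target_jpeg: bytes,
                         alpha: float = 0.6) -> np.ndarray:
    """Blend the annotated target image over the live camera frame."""
    target = cv2.imdecode(np.frombuffer(target_jpeg, dtype=np.uint8),
                          cv2.IMREAD_COLOR)
    # Resize the annotated image to the current view so the annotations
    # roughly line up with the real scene (a simple stand-in for
    # virtual-real registration).
    target = cv2.resize(target, (camera_frame.shape[1], camera_frame.shape[0]))
    return cv2.addWeighted(target, alpha, camera_frame, 1.0 - alpha, 0.0)

# Usage (assuming `frame` comes from the glasses camera and `target_jpeg`
# was returned by the interactive peer device):
# guided_view = overlay_target_image(frame, target_jpeg)
# cv2.imshow("operation and maintenance guidance", guided_view)
```

In a real AR glasses system the blended result would be rendered in the see-through display rather than in an ordinary window.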
Through the embodiment of the disclosure, the remote auxiliary operation and maintenance method of the data center equipment based on the augmented reality technology improves the remote technical support level of the equipment operation and maintenance, and reduces the labor cost.
Fig. 3 schematically shows a flow chart of acquiring an original image according to an embodiment of the present disclosure.
As shown in fig. 3, the aforementioned operation S210 may include operations S310 and S320.
In operation S310, in the process of maintaining voice communication with the interactive peer device based on the real operation and maintenance scene, an acquisition operation for the object to be operated and maintained is obtained.
According to an embodiment of the present disclosure, the acquisition operation includes a voice acquisition operation and/or a gesture acquisition operation.
According to the embodiments of the disclosure, an augmented reality assisted operation and maintenance prototype system should, as far as possible, adopt direct interaction through gestures or other human behavior rather than relying on indirect input devices.
In the disclosure, the original image containing the object to be operated and maintained may be captured in the real operation and maintenance scene under gesture control, under voice control, or under combined gesture and voice control.
In operation S320, in response to the acquisition operation, an original image is acquired in the real operation and maintenance scene.
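The following minimal sketch illustrates operations S310 and S320: a hypothetical trigger loop treats a recognized voice keyword or gesture label as the acquisition operation and, in response, captures one frame of the real scene as the original image. The recognizer callables, keyword set, and gesture labels are assumptions introduced for illustration and are not specified by the disclosure.

```python
# Illustrative sketch of S310/S320: obtain a voice and/or gesture acquisition
# operation, then capture the original image from the real scene in response.
from typing import Callable, Optional
import cv2

CAPTURE_VOICE_KEYWORDS = {"capture", "take photo"}   # assumed vocabulary
CAPTURE_GESTURES = {"pinch", "air_tap"}              # assumed gesture labels

def wait_for_acquisition_operation(next_voice_word: Callable[[], Optional[str]],
                                   next_gesture: Callable[[], Optional[str]]) -> str:
    """Block until a voice or gesture acquisition operation is detected (S310)."""
    while True:
        word = next_voice_word()
        if word and word.lower() in CAPTURE_VOICE_KEYWORDS:
            return f"voice:{word}"
        gesture = next_gesture()
        if gesture in CAPTURE_GESTURES:
            return f"gesture:{gesture}"

def capture_original_image(camera_index: int = 0) -> bytes:
    """Capture one frame of the real operation and maintenance scene (S320)."""
    camera = cv2.VideoCapture(camera_index)
    try:
        ok, frame = camera.read()
        if not ok:
            raise RuntimeError("camera frame could not be read")
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            raise RuntimeError("JPEG encoding failed")
        return jpeg.tobytes()
    finally:
        camera.release()
```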
Through the embodiment of the disclosure, a virtual-real combined operation and maintenance environment and a more visual and flexible operation mode can be provided for a user, so that the technical effects of improving the working efficiency, improving the support capability of a remote operation and maintenance technology and reducing the input of the labor cost can be achieved.
Fig. 4 schematically shows a flowchart of an information processing method performed by an interactive peer device according to an embodiment of the present disclosure.
As shown in fig. 4, the information processing method performed by the interactive counterpart device may include operations S410 to S430.
In operation S410, an original image sent by the interactive home device is received, where the original image is collected in a process of maintaining voice communication with the interactive peer device based on a real operation and maintenance scene, the original image includes an object to be operated and maintained, and the object to be operated and maintained is in the real operation and maintenance scene.
In operation S420, a target image is generated based on the original image, and the target image includes annotation information for the object to be operated and maintained.
In operation S430, the target image is sent to the interactive home device, so that the interactive home device superimposes the target image on a real operation and maintenance scene when receiving the target image, so as to guide an operation and maintenance operation on the object to be operated and maintained.
According to the embodiments of the disclosure, after receiving the original image sent by the interactive home device, the remote operation and maintenance technical support personnel using the interactive peer device can annotate the scene picture through the companion program of the AR glasses and can also hold voice communication with the on-site personnel. The remote technician guides the field personnel by voice according to the live picture. If necessary, to avoid ambiguity in speech, the remote technician marks the received picture, for example indicating the position of the part to be operated or of the menu item to be clicked, and then transmits the marked picture back to the AR glasses of the on-site operation and maintenance personnel to guide them in carrying out the maintenance action.
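For illustration, a hypothetical peer-side annotation helper is sketched below: it decodes the received original image, draws a rectangle and a short note at the position selected by the remote technician (for example by dragging a mouse), and re-encodes the result as the target image. The marker style, function name, and parameters are assumptions, not the patented program.

```python
# Illustrative sketch of S410-S430 on the interactive peer device: annotate the
# received original image and produce the target image to send back.
import cv2
import numpy as np

def annotate_original_image(original_jpeg: bytes,
                            box: tuple,
                            note: str) -> bytes:
    """Draw a rectangle around the part to operate plus a short text label."""
    image = cv2.imdecode(np.frombuffer(original_jpeg, dtype=np.uint8),
                         cv2.IMREAD_COLOR)
    x, y, w, h = box  # e.g. the region the remote technician dragged with a mouse
    cv2.rectangle(image, (x, y), (x + w, y + h), color=(0, 0, 255), thickness=3)
    cv2.putText(image, note, (x, max(y - 10, 20)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
    ok, target_jpeg = cv2.imencode(".jpg", image)
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    return target_jpeg.tobytes()

# Usage: target = annotate_original_image(original, box=(120, 80, 200, 60),
#                                          note="open this switch")
```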
By the embodiment of the disclosure, under the condition that the on-site operation and maintenance personnel need accurate guidance, the remote technical support personnel can label the received original image to clearly guide the operation of the on-site operation and maintenance personnel, so that the operation and maintenance guidance for the object to be operated and maintained is more intuitive, and the effectiveness of the operation and maintenance is improved.
In the present disclosure, in terms of hardware, the interactive home device may be implemented with various types of mature AR glasses on the market. On-site operation and maintenance operators wear AR glasses to carry out the operation and maintenance work, which frees their hands and makes it convenient to operate on the operation and maintenance objects according to the operation and maintenance information provided by the remote technical support personnel; this is an important reason for using AR glasses to assist equipment operation and maintenance.
In terms of software, the interactive peer device can run a control program customized and developed from an existing AR glasses application (APP) according to the specific requirements.
As an alternative embodiment, the following generally describes the information processing method of the present disclosure with reference to a device of an interactive home terminal and a device of an interactive peer terminal. Fig. 5 schematically shows an overall flowchart of an information processing method according to an embodiment of the present disclosure.
As shown in fig. 5, the method may include operations S510 to S560.
In operation S510, the on-site operation and maintenance personnel wear the AR glasses to start the operation and maintenance operation.
In operation S520, the remote technical support staff starts a terminal control program and receives a live picture returned by the AR glasses.
In operation S530, according to the requirement, the on-site operation and maintenance staff and the remote technician perform a voice call through the AR glasses system to remotely guide the on-site operation.
In operation S540, when the on-site operation and maintenance personnel need precise guidance, such as the position of a part to operate, the position of a switch to operate, or the position of a menu to select, they operate the AR glasses by voice or gesture to photograph the scene, and the picture is transmitted to the terminal of the remote technician.
In operation S550, after receiving the field picture, the remote technician uses a mouse, handwriting, or the like to mark the position of the component or menu to be operated on the image, and then, from the control program, transmits the marked image back to the AR glasses of the field operator.
In operation S560, the field operator receives the image of the AR glasses and performs operation and maintenance operations according to the annotation content.
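To tie operations S510 through S560 together, the sketch below shows one hypothetical way to frame the two image transfers between the AR glasses and the remote terminal over a TCP connection: each JPEG is sent with a 4-byte length prefix. The framing format is an assumption chosen for illustration; any reliable transport available to an AR glasses system would serve.

```python
# Illustrative length-prefixed framing for the two image transfers in the
# overall flow (S540: original image to the remote terminal; S550: annotated
# target image back to the AR glasses). Transport details are assumptions.
import socket
import struct

def send_image(sock: socket.socket, jpeg_bytes: bytes) -> None:
    """Send one JPEG image, prefixed with its 4-byte big-endian length."""
    sock.sendall(struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes)

def recv_image(sock: socket.socket) -> bytes:
    """Receive one length-prefixed JPEG image."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed before message completed")
        buf += chunk
    return buf

# Glasses side (S540 then S560):        Remote terminal side (S520, S550):
#   send_image(sock, original_jpeg)       original = recv_image(conn)
#   target = recv_image(sock)             send_image(conn, annotated_jpeg)
```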
Through the embodiment of the disclosure, the equipment of the interactive home terminal and the equipment of the interactive opposite terminal are mutually matched to finish the operation and maintenance work of the data center equipment together, so that technical experts can remotely provide technical support and also can assist in guiding the technical effect of the operation and maintenance work, and the technical problem of high labor cost caused by the fact that the technical experts must support on site is solved.
Fig. 6 schematically shows a block diagram of an information processing apparatus applied to an interactive home device according to an embodiment of the present disclosure.
As shown in fig. 6, the information processing apparatus 600 applied to the interactive home device may include an original image acquisition module 610, an original image transmission module 620, and a target image superposition module 630.
The original image acquisition module 610 is configured to, for example, perform the foregoing operation S210: acquiring an original image while voice communication with the interactive peer device is maintained based on the real operation and maintenance scene, where the original image includes an object to be operated and maintained and the object is in the real operation and maintenance scene.
The original image sending module 620 is configured to, for example, perform the foregoing operation S220: sending the original image to the interactive peer device so that the interactive peer device can generate and return a target image based on the original image, where the target image includes annotation information for the object to be operated and maintained.
And a target image overlaying module 630 configured to, for example, perform the foregoing operation S230, and in a case that the target image is received, overlay the target image into the real operation and maintenance scene to guide the operation and maintenance operation on the object to be operated and maintained.
Through the embodiment of the disclosure, the remote auxiliary operation and maintenance method of the data center equipment based on the augmented reality technology improves the remote technical support level of the equipment operation and maintenance, and reduces the labor cost.
FIG. 7 schematically illustrates a block diagram of an original image acquisition module, in accordance with an embodiment of the present disclosure.
As shown in fig. 7, the original image acquisition module 610 may include an acquisition operation obtaining sub-module 710 and an original image capturing sub-module 720.
The acquisition operation obtaining sub-module 710 is configured to, for example, perform the foregoing operation S310, and obtain an acquisition operation for an object to be operated and maintained in a process of maintaining voice communication with an interactive peer device based on a real operation and maintenance scene.
The original image capturing sub-module 720 is configured to, for example, perform the aforementioned operation S320, and capture an original image in the real operation and maintenance scene in response to the capturing operation.
According to an embodiment of the present disclosure, the acquisition operation includes a voice acquisition operation and/or a gesture acquisition operation.
Through the embodiment of the disclosure, a virtual-real combined operation and maintenance environment and a more visual and flexible operation mode can be provided for a user, so that the technical effects of improving the working efficiency, improving the support capability of a remote operation and maintenance technology and reducing the input of the labor cost can be achieved.
Fig. 8 is a block diagram schematically illustrating an information processing apparatus applied to an interactive peer device according to an embodiment of the present disclosure.
As shown in fig. 8, the information processing apparatus 800 applied to the interactive counterpart device may include an original image receiving module 810, a target image generating module 820, and a target image transmitting module 830.
The original image receiving module 810 is configured to, for example, execute the foregoing operation S410, and receive an original image sent by the interactive home terminal device, where the original image is collected in a process of maintaining voice communication with the interactive peer terminal device based on a real operation and maintenance scene, the original image includes an object to be operated and maintained, and the object to be operated and maintained is in the real operation and maintenance scene.
A target image generating module 820 configured to, for example, execute the foregoing operation S420: generating a target image based on the original image, where the target image includes annotation information for the object to be operated and maintained.
The target image sending module 830 is configured to, for example, execute the foregoing operation S430, and send the target image to the interactive home device, so that the interactive home device superimposes the target image on the real operation and maintenance scene under the condition that the target image is received, so as to guide the operation and maintenance operation on the object to be operated and maintained.
By the embodiment of the disclosure, under the condition that the on-site operation and maintenance personnel need accurate guidance, the remote technical support personnel can label the received original image to clearly guide the operation of the on-site operation and maintenance personnel, so that the operation and maintenance guidance for the object to be operated and maintained is more intuitive, and the effectiveness of the operation and maintenance is improved.
It should be noted that, the embodiment of the information processing apparatus portion applied to the interactive home terminal device is similar to the embodiment of the information processing method portion executed by the interactive home terminal device, and correspondingly, the embodiment of the information processing apparatus portion applied to the interactive peer terminal device is similar to the embodiment of the information processing method portion executed by the interactive peer terminal device, and the achieved technical effects are also similar, and are not described herein again.
Any of the modules according to embodiments of the present disclosure, or at least part of the functionality of any of them, may be implemented in one module. Any one or more of the modules according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules according to the embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging the circuit, or in any one of three implementations, or in any suitable combination of any of the software, hardware, and firmware. Alternatively, one or more of the modules according to embodiments of the disclosure may be implemented at least partly as computer program modules which, when executed, may perform corresponding functions.
For example, any plurality of the original image capturing module 610, the original image sending module 620, the target image superimposing module 630, the capturing operation obtaining sub-module 710, the original image capturing sub-module 720, the original image receiving module 810, the target image generating module 820, and the target image sending module 830 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to the embodiment of the present disclosure, at least one of the original image capturing module 610, the original image sending module 620, the target image superimposing module 630, the capturing operation obtaining sub-module 710, the original image capturing sub-module 720, the original image receiving module 810, the target image generating module 820, and the target image sending module 830 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementation manners of software, hardware, and firmware, or implemented by a suitable combination of any several of them. Alternatively, at least one of the original image capturing module 610, the original image transmitting module 620, the target image superimposing module 630, the capturing operation acquiring sub-module 710, the original image capturing sub-module 720, the original image receiving module 810, the target image generating module 820, and the target image transmitting module 830 may be at least partially implemented as a computer program module, which when executed, may perform corresponding functions.
Fig. 9 schematically shows a block diagram of an electronic device adapted to implement the information processing method according to an embodiment of the present disclosure. The electronic device shown in fig. 9 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 9, a computer system 900 according to an embodiment of the present disclosure includes a processor 901 which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. The processor 901 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 901 may also include on-board memory for caching purposes. The processor 901 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
In the RAM 903, various programs and data necessary for the operation of the system 900 are stored. The processor 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. The processor 901 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 902 and/or the RAM 903. Note that the programs may also be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform various operations of the method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the system 900 may also include an input/output (I/O) interface 905, which is also connected to the bus 904. The system 900 may also include one or more of the following components connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output section 907 including a cathode ray tube (CRT), a liquid crystal display (LCD), and the like, and a speaker; a storage section 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the Internet. A drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 910 as necessary, so that a computer program read out therefrom is installed into the storage section 908 as necessary.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911. The computer program, when executed by the processor 901, performs the above-described functions defined in the system of the embodiment of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 902 and/or the RAM903 described above and/or one or more memories other than the ROM 902 and the RAM 903.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure may be combined or recombined in various ways, even if such combinations are not expressly recited in the present disclosure. In particular, such combinations may be made without departing from the spirit or teaching of the present disclosure, and all of them fall within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these embodiments are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that measures in different embodiments cannot be used in advantageous combination. The scope of the present disclosure is defined by the appended claims and their equivalents. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to fall within the scope of the present disclosure.

Claims (10)

1. An information processing method executed by an interactive home device, comprising:
acquiring an original image in a process of maintaining voice communication with an interactive peer device based on a real operation and maintenance scene, wherein the original image contains an object to be operated and maintained, and the object to be operated and maintained is located in the real operation and maintenance scene;
sending the original image to the interactive peer device, so that the interactive peer device generates and returns a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and
in a case where the target image is received, superimposing the target image onto the real operation and maintenance scene so as to guide an operation and maintenance operation on the object to be operated and maintained.
2. The method of claim 1, wherein the acquiring of the original image in the process of maintaining voice communication with the interactive peer device based on the real operation and maintenance scene comprises:
obtaining an acquisition operation for the object to be operated and maintained in the process of maintaining voice communication with the interactive peer device based on the real operation and maintenance scene; and
acquiring, in response to the acquisition operation, the original image in the real operation and maintenance scene.
3. The method of claim 2, wherein the acquisition operation comprises:
a voice collection operation; and/or
a gesture collection operation.
4. An information processing method executed by an interactive peer device, the method comprising:
receiving an original image sent by an interactive home device, wherein the original image is acquired in a process of maintaining voice communication with the interactive peer device based on a real operation and maintenance scene, the original image contains an object to be operated and maintained, and the object to be operated and maintained is located in the real operation and maintenance scene;
generating a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and
sending the target image to the interactive home device, so that the interactive home device, in a case where the target image is received, superimposes the target image onto the real operation and maintenance scene so as to guide an operation and maintenance operation on the object to be operated and maintained.
5. An information processing apparatus applied to an interactive home device, comprising:
an original image acquisition module configured to acquire an original image in a process of maintaining voice communication with an interactive peer device based on a real operation and maintenance scene, wherein the original image contains an object to be operated and maintained, and the object to be operated and maintained is located in the real operation and maintenance scene;
an original image sending module configured to send the original image to the interactive peer device, so that the interactive peer device generates and returns a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and
a target image superimposing module configured to superimpose, in a case where the target image is received, the target image onto the real operation and maintenance scene so as to guide an operation and maintenance operation on the object to be operated and maintained.
6. The apparatus of claim 5, wherein the original image acquisition module comprises:
an acquisition operation obtaining sub-module configured to obtain an acquisition operation for the object to be operated and maintained in the process of maintaining voice communication with the interactive peer device based on the real operation and maintenance scene; and
an original image acquisition sub-module configured to acquire, in response to the acquisition operation, the original image in the real operation and maintenance scene.
7. The apparatus of claim 6, wherein the acquisition operation comprises:
a voice collection operation; and/or
a gesture collection operation.
8. An information processing apparatus applied to an interactive peer device, the apparatus comprising:
an original image receiving module configured to receive an original image sent by an interactive home device, wherein the original image is acquired in a process of maintaining voice communication with the interactive peer device based on a real operation and maintenance scene, the original image contains an object to be operated and maintained, and the object to be operated and maintained is located in the real operation and maintenance scene;
a target image generation module configured to generate a target image based on the original image, wherein the target image contains annotation information for the object to be operated and maintained; and
a target image sending module configured to send the target image to the interactive home device, so that the interactive home device, in a case where the target image is received, superimposes the target image onto the real operation and maintenance scene so as to guide an operation and maintenance operation on the object to be operated and maintained.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-4.
10. A computer-readable storage medium storing computer-executable instructions that, when executed, implement the method of any one of claims 1 to 4.
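To make the claimed interaction easier to follow, the sketch below walks through one round trip between the two sides: the interactive home device of claims 1 to 3 acquires an original image during the voice call and superimposes the returned target image, while the interactive peer device of claim 4 generates the target image with annotation information. This is an illustrative sketch only, not an implementation from the disclosure: the Python class names (HomeDevice, PeerDevice), the dataclasses, the in-process callbacks standing in for the network link, and the placeholder annotation text are all hypothetical.

# Illustrative sketch only (not from the patent): hypothetical names showing
# one round trip of the claimed method between home device and peer device.
from dataclasses import dataclass, field
from typing import Callable, List, Optional


@dataclass
class OriginalImage:
    # Frame captured on site; it contains the object to be operated and
    # maintained, which is located in the real operation and maintenance scene.
    pixels: bytes


@dataclass
class TargetImage:
    # Image generated by the peer; it carries annotation information for the
    # object to be operated and maintained.
    pixels: bytes
    annotations: List[str] = field(default_factory=list)


class HomeDevice:
    # Interactive home device (e.g. AR glasses worn by the on-site engineer).

    def __init__(self, send_to_peer: Callable[[OriginalImage], None]) -> None:
        self._send_to_peer = send_to_peer
        self.in_voice_call = False  # voice communication with the peer device

    def acquire_original_image(self, trigger: str) -> Optional[OriginalImage]:
        # Claims 2 and 3: acquisition happens during the voice call and is
        # triggered by a voice or gesture collection operation.
        if not self.in_voice_call or trigger not in ("voice", "gesture"):
            return None
        return OriginalImage(pixels=b"<camera frame>")

    def send_original_image(self, image: OriginalImage) -> None:
        # Claim 1: send the original image so the peer can annotate it.
        self._send_to_peer(image)

    def overlay_target_image(self, target: TargetImage) -> None:
        # Claim 1: superimpose the returned target image onto the real scene
        # to guide the operation and maintenance operation (stubbed as print).
        print("overlaying onto real scene:", target.annotations)


class PeerDevice:
    # Interactive peer device operated by the remote expert (claim 4).

    def __init__(self, send_to_home: Callable[[TargetImage], None]) -> None:
        self._send_to_home = send_to_home

    def on_original_image(self, image: OriginalImage) -> None:
        # Receive the original image, generate the target image with
        # annotation information, and return it to the home device.
        target = TargetImage(pixels=image.pixels,
                             annotations=["check the circled connector"])
        self._send_to_home(target)


# One round trip: the in-process callbacks stand in for the network channel.
home = HomeDevice(send_to_peer=lambda img: peer.on_original_image(img))
peer = PeerDevice(send_to_home=home.overlay_target_image)
home.in_voice_call = True
frame = home.acquire_original_image(trigger="gesture")
if frame is not None:
    home.send_original_image(frame)

Splitting the home-device behavior into acquisition, sending, and superimposing methods mirrors the module structure of apparatus claims 5 to 8; a real system would carry the images over the communication channel of the voice call and render the target image in the AR view rather than printing it.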
CN201910883853.7A 2019-09-18 2019-09-18 Information processing method and device, electronic device and medium Pending CN110609913A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910883853.7A CN110609913A (en) 2019-09-18 2019-09-18 Information processing method and device, electronic device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910883853.7A CN110609913A (en) 2019-09-18 2019-09-18 Information processing method and device, electronic device and medium

Publications (1)

Publication Number Publication Date
CN110609913A true CN110609913A (en) 2019-12-24

Family

ID=68891340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910883853.7A Pending CN110609913A (en) 2019-09-18 2019-09-18 Information processing method and device, electronic device and medium

Country Status (1)

Country Link
CN (1) CN110609913A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130157239A1 (en) * 2011-12-16 2013-06-20 Board Of Regents Of The Nevada System Of Higher Education, On Behalf Of The University Of Nevada Augmented reality tele-mentoring (art) platform for laparoscopic training
CN107277451A (en) * 2017-07-14 2017-10-20 福建铁工机智能机器人有限公司 A kind of utilization AR realizes the method and apparatus of remote guide scene investigation failure
CN107645651A (en) * 2017-10-12 2018-01-30 北京临近空间飞艇技术开发有限公司 A kind of remote guide method and system of augmented reality
CN107741785A (en) * 2017-10-12 2018-02-27 北京临近空间飞艇技术开发有限公司 A kind of remote guide method and system for protecting front end safety
KR20190056935A (en) * 2017-11-17 2019-05-27 주식회사 코이노 Mobile terminal providing augmented reality based maintenance guidance, remote managing apparatus and method for remote guidance using the same
CN109841217A (en) * 2019-01-18 2019-06-04 苏州意能通信息技术有限公司 A kind of AR interactive system and method based on speech recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Dou Dawei (ed.): "New Manufacturing: 'Intelligence+' Empowering the Transformation and Upgrading of Manufacturing" (《新制造"智能+"赋能制造业转型升级》), 31 July 2019 *
Tao Wenyuan et al. (eds.): "Introduction to Virtual Reality" (《虚拟现实概论》), 28 February 2019 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111144306A (en) * 2019-12-27 2020-05-12 联想(北京)有限公司 Information processing method, information processing apparatus, and information processing system
CN111372057A (en) * 2020-04-01 2020-07-03 中国工商银行股份有限公司 Information interaction method, system, device, augmented reality equipment and medium
CN111372057B (en) * 2020-04-01 2021-07-27 中国工商银行股份有限公司 Information interaction method, system, device, augmented reality equipment and medium
CN113780706A (en) * 2020-10-27 2021-12-10 北京京东尚科信息技术有限公司 On-site operation and maintenance operation method and device based on visual enhancement
CN115617234A (en) * 2022-12-20 2023-01-17 中科航迈数控软件(深圳)有限公司 Operation and maintenance guiding method and system

Similar Documents

Publication Publication Date Title
CN110609913A (en) Information processing method and device, electronic device and medium
CN110909898B (en) AR (augmented reality) glasses-based system and AR glasses-based method for diagnosing, maintaining and guiding faults of bank machine room
WO2021179399A1 (en) Mixed reality-based remote operation guidance system and method
CN110189416A (en) The remote guide method and system of Overhaul site
CN113011723B (en) Remote equipment maintenance system based on augmented reality
CN109271881A (en) Personnel safety management-control method, device and server in a kind of substation
US20230186583A1 (en) Method and device for processing virtual digital human, and model training method and device
JP2022529876A (en) Illegal building identification methods, equipment, equipment and storage media
CN112578907A (en) Method and device for realizing remote guidance operation based on AR
CN111062504A (en) AR technology-based intelligent power distribution station operation and maintenance system and method
CN112306233A (en) Inspection method, inspection system and inspection management platform
CN109002633B (en) Device network modeling method based on separate space
CN112542172A (en) Communication auxiliary method, device, equipment and medium based on online conference
CN117424924A (en) Tour enhancement method based on meta-universe interaction
CN104657902A (en) Integrated support intelligent terminal system for equipment and implementation method of intelligent terminal system
CN109469962A (en) A kind of air-conditioning defrosting method, device and storage medium
Schlueter Remote maintenance assistance using real-time augmented reality authoring
CN115756256A (en) Information labeling method, system, electronic equipment and storage medium
CN210534865U (en) Sign-in system
CN210896535U (en) Automatic guide and explanation device for museum
CN113780706A (en) On-site operation and maintenance operation method and device based on visual enhancement
CN111882676A (en) Remote monitoring support system and monitoring support method
CN214175139U (en) Cable AR remote expert diagnostic system
US10979777B2 (en) Processing system for performing reverse video content output generation
CN109034167A (en) Visual indicia method, system, electric terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20191224