CN117640914B - Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation - Google Patents

Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation

Info

Publication number
CN117640914B
Authority
CN
China
Prior art keywords
video
terminal
site
frozen
real
Prior art date
Legal status
Active
Application number
CN202410112646.2A
Other languages
Chinese (zh)
Other versions
CN117640914A (en)
Inventor
郭引商
Current Assignee
Qizhixinlian Nanjing Information Software Development Co ltd
Original Assignee
Qizhixinlian Nanjing Information Software Development Co ltd
Priority date
Filing date
Publication date
Application filed by Qizhixinlian Nanjing Information Software Development Co ltd filed Critical Qizhixinlian Nanjing Information Software Development Co ltd
Priority to CN202410112646.2A
Publication of CN117640914A
Application granted
Publication of CN117640914B

Landscapes

  • Information Transfer Between Computers (AREA)

Abstract

The invention discloses a remote equipment maintenance method based on AR video and multiparty real-time cooperation, relating to the field of communication technology. The method comprises: recording the rotation degree parameter R and the corresponding displacement parameter T of the on-site AR terminal and requesting maintenance support service from a server; the participants give voice guidance to the site directly and perform frozen-screen collaborative annotation of the site picture through back-end visual terminals, so that dynamic rectangular labels are formed on the moving video picture at the positions of the marked objects, all participants see the video with the dynamic rectangular labels, and AR video and multiparty real-time remote maintenance collaboration is completed. With the invention, the labels are superimposed on the AR video in real time and follow it after the frozen screen is released; multiple labels can be placed on the video and tracked and displayed simultaneously, so that the marked objects in the image are accurately tracked in real time, on-site operation and maintenance personnel can intuitively see the objects marked at the back end, and operation and maintenance efficiency is improved.

Description

Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation
Technical Field
The invention belongs to the technical field of communication, and particularly relates to a remote equipment maintenance method and system based on AR video and multiparty real-time cooperation.
Background
Existing remote maintenance positioning, such as for power equipment, wind power equipment and photovoltaic equipment, relies on the field experience of on-site personnel. When on-site personnel cannot handle a problem, they fall back on point-to-point contact by traditional telephone or WeChat video. Guidance given by telephone, text or WeChat voice/video often suffers from unclear understanding and inaccurate instructions, and if a single remote expert cannot solve the problem effectively, operation and maintenance time is delayed.
At present, remote inspection with AR glasses can make use of the on-site video from the glasses: with multiparty visual remote real-time video collaboration, a remote expert can extract a static image from the video transmitted by the AR glasses by freezing the screen and then mark faults on that static image. However, once the operation and maintenance personnel return to the live video, the marks made on the static image cannot be tracked on the moving video; because the marks do not follow the dynamic video, they are displayed with a deviation from the actual maintenance target, and multiple fault marks cannot be tracked and fed back simultaneously. Remote operation and maintenance therefore still has certain defects and inconveniences, and the present remote equipment maintenance method and system based on AR video and multiparty real-time cooperation is proposed.
Disclosure of Invention
The invention aims to overcome the defect in the prior art that a mark on a static image cannot be tracked on a dynamic video, and provides a remote equipment maintenance method and system based on real-time cooperation of AR video and multiple parties. With the method and system, the labels are superimposed on the AR video in real time after the frozen screen is released, multiple labels can be placed on the video, and the multiple labels are tracked and displayed on the video simultaneously.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
a remote equipment maintenance method based on AR video and multiparty real-time cooperation is designed, which comprises the following steps:
step 1, adding a 6-axis sensor on a site AR terminal, recording a rotation degree parameter R and a corresponding displacement parameter T of the site AR terminal, and requesting maintenance support service from a server;
step 2, the server responds to the request of the terminal, matches the visual terminal corresponding to the support engineer with the request, and establishes a two-way video call;
step 3, when third-party support is needed, the terminal system sends a third-party personnel participation request to the service control system, and the service control system adds the third-party personnel's visual terminal;
step 4, the on-site AR terminal requester synchronously uploads the local AR video to a media server system, and the media server system sends the received on-site image to all participants;
step 5, through the APP on the back-end visual terminals, the participants give voice guidance directly on the live video and perform frozen-screen collaborative annotation on the live picture, and a frozen-screen collaborative annotation request is sent to the signaling system;
step 6, when a back-end participant clicks the freeze-screen button, the back-end visual terminal acquires the currently stored parameters Ri and Ti from the on-site AR terminal and saves them;
step 7, the back-end participants draw a rectangular label around the labeled object on the frozen static picture, forming the label coordinates (X1, Y1, X2, Y2);
step 8, after the frozen screen is released, the current Rt and Tt are acquired for each frame of the video, and the coordinates labeled on the frozen frame are converted using Ri and Ti; the conversion formula is as follows:
wherein ΔR and ΔT are the differences between the Ri and Ti recorded at freeze time and the current Rt and Tt (an illustrative sketch of this per-frame conversion is given after step 10 below);
step 9, obtaining coordinates of rectangular labels on each frame of the video through the conversion formula, and forming dynamic rectangular labels on dynamic video pictures corresponding to the label objects;
and step 10, the media server system sends the video with the dynamic rectangular labels to all participants, so that all the participants can see the video with the dynamic rectangular labels, and AR video and multiparty real-time remote maintenance collaboration are completed.
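For illustration, a minimal Python sketch of the per-frame tracking in steps 6-9 follows. Since the conversion formula of step 8 appears only as a figure, the combination of a fitted rotation correction f(ΔR) with a linear displacement offset k_t is an assumption, and all names in the sketch are hypothetical.

```python
# Minimal sketch of steps 6-9 (assumed form; the patent's conversion formula
# figure is not reproduced here). f_dR is a fitted function mapping a rotation
# change to a pixel shift (x, y); k_t is an assumed pixels-per-displacement scale.
import numpy as np

def track_label(frozen_box, R_i, T_i, R_t, T_t, f_dR, k_t=1.0):
    """Re-project a rectangle annotated on the frozen frame onto the current frame."""
    dR = np.asarray(R_t, dtype=float) - np.asarray(R_i, dtype=float)  # rotation change since freeze
    dT = np.asarray(T_t, dtype=float) - np.asarray(T_i, dtype=float)  # displacement change since freeze
    shift_x, shift_y = f_dR(dR) + k_t * dT[:2]        # combined pixel shift (assumed form)
    x1, y1, x2, y2 = frozen_box
    return (x1 + shift_x, y1 + shift_y, x2 + shift_x, y2 + shift_y)

# Each stored annotation is converted independently for every frame, so several
# labels can be tracked and drawn on the live AR video at the same time.
```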
Further, in step 1, the on-site AR terminal displays a two-dimensional planar video, so the original coordinates need to be mapped and converted; the conversion steps are as follows:
step 11, converting the object's world coordinates into the camera coordinate system using the recorded R and T; the relationship is as follows:
step 12, projecting the camera coordinate system to a pixel coordinate system:
wherein u and v are the pixel coordinates; fx and fy are the focal lengths in the x and y directions, in pixels; and (Cx, Cy) is the principal point, i.e. the image center, in pixels.
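The formulas of steps 11 and 12 appear as figures in the original publication; for reference, the standard relations implied by the variable definitions above (a rigid world-to-camera transform followed by a pinhole projection) can be written as:

```latex
% World -> camera, using the recorded rotation R and displacement T:
\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}
  = R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + T

% Camera -> pixel (pinhole projection with intrinsics f_x, f_y, C_x, C_y):
u = f_x \frac{X_c}{Z_c} + C_x, \qquad v = f_y \frac{Y_c}{Z_c} + C_y
```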
Further, in step 3, additional experts can be called in successively to form a multi-person maintenance positioning expert group.
Further, in step 5, multiple frozen-screen collaborative annotations can be made on the live picture, and these are processed in parallel in the image computation.
Further, in step 6, to reduce the amount of computation, the values of Ri and Ti may be sampled once per second and saved in the storage of the on-site AR terminal.
Further, in step 8, to obtain the function f(ΔR), multiple frozen images together with the AR glasses sensor data are collected in a laboratory, a multi-order Taylor expansion is used for fitting, and an accurate f(ΔR) function is obtained by successive approximation to the data, specifically comprising the following steps:
step 81, the gray levels of all pixels inside the rectangle are quantized to 64 levels and an 8×8 image centered on the rectangle is obtained; the mean gray value of this image is computed, the gray value of each of the 8×8 pixels is compared with the mean, pixels above the mean are set to 1 and pixels below it to 0, and the result is arranged into a 64-bit vector that serves as the feature vector of the image;
step 82, the image acquired at the second frozen screen, together with its R and T, is taken; the image is divided into 8×8 square blocks and the vector calculation of step 81 is carried out on each block, the block with the nearest gray-value feature vector is found, and the rotated counterpart of the rectangle is obtained with that block as its center;
and step 83, after the screen has been frozen repeatedly, multiple groups of corresponding coordinate data are acquired and an accurate f(ΔR) function is obtained through polynomial fitting.
Further, to prevent over-fitting, some of the frozen-screen pictures are set aside as a validation set; when the deviation on the validation set is within the allowable range, the f(ΔR) function meets the requirements of the computation and display.
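As an illustration of the step 81 fingerprint, which is essentially an average-hash of the annotated rectangle, a minimal Python sketch follows. The use of OpenCV/NumPy and the choice of downscaling the whole rectangle to 8×8 (rather than cropping its central 8×8 pixels) are assumptions, since no implementation is named.

```python
# Sketch of the step-81 fingerprint: quantize the annotated rectangle to 64 gray
# levels, reduce it to 8x8, threshold against the mean, and flatten to 64 bits.
import cv2
import numpy as np

def rect_feature_vector(frame_bgr, box):
    x1, y1, x2, y2 = box
    gray = cv2.cvtColor(frame_bgr[y1:y2, x1:x2], cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (8, 8), interpolation=cv2.INTER_AREA)  # 8x8 summary of the rectangle
    small = small // 4                               # 256 gray levels -> 64 levels
    bits = (small > small.mean()).astype(np.uint8)   # 1 above the mean, 0 below
    return bits.flatten()                            # 64-bit feature vector ("fingerprint")

def block_distance(vec_a, vec_b):
    # Step 82's "nearest gray value image": the block with the smallest Hamming distance.
    return int(np.count_nonzero(vec_a != vec_b))
```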
In order to solve the technical problems, the invention also provides a remote equipment maintenance system based on AR video and multiparty real-time cooperation, which is used for the remote equipment maintenance method based on AR video and multiparty real-time cooperation, and comprises a site AR terminal, a plurality of visual terminals and a server;
the on-site AR terminal is an AR glasses, and a 6-axis sensor is arranged on the AR glasses and used for recording a rotation degree parameter R and a corresponding displacement parameter T of the AR glasses;
the server comprises a media server system, a signaling system and a service control system;
the on-site AR terminal and the visual terminals register to a signaling system through a designated signaling, and the audio and video streams and the sharing collaboration streams of the visual terminals are synthesized through the media server system and then distributed to the participating visual terminals and on-site AR terminals;
the service control system controls the whole service flow, and can issue commands to the on-site AR terminal and the participated visual terminal.
Further, the visual terminals include a mobile phone APP terminal and a PCWEB terminal.
The remote equipment maintenance method based on AR video and multiparty real-time cooperation has the following beneficial effects. By means of collaborative multiparty video sharing, multiparty video cooperation and shared on-site data acquisition, remote inspection and maintenance are completed with the video from the AR glasses, which improves equipment inspection and maintenance efficiency, allows equipment maintenance to be located and analyzed in real time by multiple parties on site, reduces on-site maintenance difficulty, shortens equipment downtime and lowers the company's maintenance cost. In addition, based on the frozen-screen labels and the calculated rotation angle of the AR glasses, the algorithm provided by the invention superimposes the labels on the AR video in real time after the frozen screen is released; multiple labels can be placed on the video and are tracked and displayed on the video simultaneously. Specifically:
(1) In the invention, a two-way video call is established among the on-site AR terminal, the plurality of visual terminals and the server to form a maintenance positioning expert group. On the basis of the AR glasses video, the video is shared with multiple participants through multiparty video communication; at the same time, on the basis of the shared AR video, the participants can mark components in the video in real time and give the relevant parameters of those components, which are superimposed on the AR video and shared simultaneously with on-site personnel and other personnel for expert consultation, and related documents can also be opened and a local desktop shared with the on-site personnel.
(2) In the invention, expert annotation is carried out by means of a 'frozen screen'. A freeze-screen function is provided so that background participants can freeze and annotate the live picture at an appropriate moment; after the frozen screen is released, the annotation frame follows the transformation of the AR glasses image, the objects marked in the image are accurately tracked in real time, on-site operation and maintenance personnel can intuitively see the on-site objects marked at the back end, and operation and maintenance efficiency is improved.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of the present invention;
fig. 2 is a system block diagram of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention; it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments, and that all other embodiments obtained by persons of ordinary skill in the art without making creative efforts based on the embodiments in the present invention are within the protection scope of the present invention.
The structural features of the present invention will now be described in detail with reference to the accompanying drawings.
Referring to fig. 1, the present embodiment proposes a remote device maintenance method based on AR video and multiparty real-time collaboration, which specifically includes the following steps:
s1, adding a 6-axis sensor on a site AR terminal, recording a rotation degree parameter R and a corresponding displacement parameter T of the site AR terminal, and requesting maintenance support service from a server;
the on-site maintenance personnel carries the AR glasses to appointed on-site maintenance, requests remote assistance after finding faults, initiates remote fault maintenance positioning assistance through voice control of the AR glasses, such as voice instructions of calling remote assistance or pressing a specific button, adds a 6-axis sensor on an AR glasses device, records rotation degree parameters R and corresponding displacement parameters T of the glasses, displays an on-site AR terminal as a two-dimensional plane video, and needs to map and convert original coordinates, wherein the conversion steps are as follows:
s11, converting a world coordinate system of an object into a camera coordinate system, and using the recorded R and t, the actual relationship is as follows:
s12, projecting the camera coordinate system to a pixel coordinate system:
wherein u and v are the pixel coordinates; fx and fy are the focal lengths in the x and y directions, in pixels; and (Cx, Cy) is the principal point, i.e. the image center, in pixels.
Since the above intrinsic values are fixed, when R and T change, the pixel coordinates of a point change accordingly.
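A brief, self-contained sketch of this projection (S11/S12) and of how a pose change moves the pixel coordinates is given below; the numeric intrinsics and the example point are purely illustrative values, not values taken from the patent.

```python
# Hedged sketch of S11/S12: the same world point lands on different pixels when
# the recorded rotation R and displacement T change, while fx, fy, Cx, Cy stay fixed.
import numpy as np

def project(p_world, R, T, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    p_cam = R @ np.asarray(p_world, dtype=float) + np.asarray(T, dtype=float)  # world -> camera
    return (fx * p_cam[0] / p_cam[2] + cx,   # u
            fy * p_cam[1] / p_cam[2] + cy)   # v

def yaw(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

p = [0.2, 0.0, 2.0]                           # a point about 2 m in front of the camera
print(project(p, np.eye(3), np.zeros(3)))     # pixel position at freeze time
print(project(p, yaw(5.0), np.zeros(3)))      # the same point after a 5-degree head turn
```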
S2, the server responds to the request of the terminal and matches the visual terminal corresponding to the support engineer with the request;
the system automatically matches the remote support personnel according to the type of the calling personnel and establishes a two-way video call.
S3, if the support of a third party is needed, the terminal system sends a participation request of a third party to the service control system;
the service control system adds a visual terminal of a third party, remote support personnel cannot support the visual terminal effectively, a button such as expert call of a clickable system calls the third party expert to assist in positioning, the system calls the third party expert into a maintenance assisting video system, and a plurality of experts can be called by clicking for multiple times to form a maintenance positioning expert group.
S4, the on-site AR terminal requester uploads the local AR video to a media server system at the same time, and the media server system sends the received on-site image to all participants;
the remote support personnel cannot support effectively, a third party expert can be called to assist in positioning by clicking a button such as expert call of the system, and the system calls the third party expert into the maintenance assisting video system; clicking multiple requests may call into multiple specialists.
S5, through the APP on the back-end visual terminals, the participants give voice guidance directly on the live video and perform frozen-screen collaborative annotation on the live picture, and a frozen-screen collaborative annotation request is sent to the signaling system;
The expert can give voice guidance on the live real-time video picture and mark the live picture, for example by circling fault parts or drawing fault lines; multiple frozen-screen collaborative annotations can be made on the live picture, and these are processed in parallel in the image computation.
When the on-site picture is being collaboratively annotated, a button can be clicked to freeze the current AR video for efficient analysis: the on-site picture is frozen into a static picture, and the participants annotate, analyze and explain on this static picture.
Because the AR glasses on the on-site personnel shake, deviations would arise in the annotation process. Each participant is therefore provided with a freeze-screen function in the APP/WEB: when the on-site personnel turn the AR glasses, a background participant clicks the freeze-screen button, the APP/WEB sends a command to the signaling server, and the signaling server obtains the image of the current frame from the media server to serve as the base map for annotation; this base map is shared with all participants, who can then annotate it, and the annotations are shared with the other participants through the signaling and media servers.
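The freeze-screen request flow just described can be sketched as follows; all object and method names (signaling, media, grab_current_frame, broadcast, query_ar_pose) are hypothetical placeholders rather than a defined interface.

```python
# Illustrative flow only: freeze command -> signaling server -> media server
# supplies the current frame as the base map -> base map and annotations are
# relayed to every participant.
def on_freeze_button(signaling, media, ar_terminal):
    base_map = media.grab_current_frame()             # current frame becomes the annotation base map
    pose = ar_terminal.query_ar_pose()                # Ri, Ti stored for the later conversion
    signaling.broadcast("freeze_base_map", base_map)  # base map shared to every participant
    return base_map, pose

def on_annotation(signaling, participant_id, box):
    # Each participant's rectangle (X1, Y1, X2, Y2) is relayed through the
    # signaling/media servers so all parties see the same frozen-screen labels.
    signaling.broadcast("annotation", {"from": participant_id, "box": box})
```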
S6, when a back-end participant clicks the freeze-screen button, the back-end visual terminal acquires the currently stored parameters Ri and Ti from the on-site AR terminal and saves them for the subsequent conversion calculation. Since the AR glasses are worn on the maintainer's head, the rotation R and translation T of the glasses change from moment to moment as the head moves; assuming initial values R0 and T0, the values of Ri and Ti can be sampled once per second and saved in the AR glasses' storage to reduce the amount of computation.
S7, the back-end participants draw a rectangular label around the labeled object on the frozen static picture, forming the label coordinates (X1, Y1, X2, Y2); these label coordinates are screen coordinates.
S8, after the frozen screen is released, the current Rt and Tt are acquired for each frame of the video, and the coordinates labeled on the frozen frame are converted using Ri and Ti; the conversion formula is as follows:
wherein ΔR and ΔT are the differences between the Ri and Ti recorded at freeze time and the current Rt and Tt.
In the above formula, to obtain a function f(ΔR) that truly expresses the offset, multiple frozen images together with the AR glasses sensor data can be collected in a laboratory, a multi-order Taylor expansion is used for fitting, and a fairly accurate f(ΔR) function is obtained by successive approximation to the data. The specific steps are as follows:
s81, gray all pixels in the rectangle are changed into 64 levels, an image with the size of 8 x 8 of the center of the rectangle is obtained, the gray value average value of the image is calculated, then the gray value of each pixel with the size of 8 x 8 of the image is compared with the average value, the gray value of each pixel is 1 which is larger than the average value and 0 which is smaller than the average value, the obtained result is arranged into a 64-bit vector, and the vector is the fingerprint of the image, namely, the hash value is taken as the characteristic vector of the image;
s82, dividing each image into 8 x 8 square blocks, carrying out vector calculation in the step 82, finding out the nearest gray value image through the feature vector, and obtaining the rotated similar rectangular coordinates by taking the image as the center;
s83, after repeated screen freezing, a plurality of sets of corresponding coordinate data are obtained, and an accurate f (delta R) function is obtained through polynomial fitting, in order to prevent over fitting, partial screen freezing pictures are collected to serve as a verification set, and when the deviation of the verification set is within an allowable range, the f (delta R) function meets the requirement of calculation display.
S9, acquiring coordinates of rectangular labels on each frame of the video through the conversion formula, and forming dynamic rectangular labels on dynamic video pictures corresponding to the label objects.
And S10, the media server system sends the video with the dynamic rectangular labels to all participants, so that all the participants can see the video with the dynamic rectangular labels, and AR video and multiparty real-time remote maintenance collaboration are completed.
The AR initiator can close the maintenance collaboration and the audio/video discussion: the AR glasses send a close signal to the signaling server, the signaling server instructs the media server to release all resources, and the signaling server notifies the participants' terminals that the remote maintenance positioning has ended.
With the remote equipment maintenance method based on AR video and multiparty real-time cooperation of the invention, remote inspection and maintenance are completed with the video from the AR glasses by means of collaborative multiparty video sharing, multiparty video cooperation and shared on-site data acquisition, improving equipment inspection and maintenance efficiency. A two-way video call is established among the on-site AR terminal, the plurality of visual terminals and the server to form a maintenance positioning expert group; on the basis of the AR glasses video, the video is shared with multiple participants through multiparty video communication, and at the same time the participants can mark components in the video in real time on the basis of the shared AR video, give the relevant parameters of those components, and have them superimposed on the AR video and shared simultaneously with on-site personnel and other personnel for expert consultation; related documents can also be opened and a local desktop shared. During use, expert annotation is carried out by means of the frozen screen: based on the frozen-screen labels and the calculated rotation angle of the AR glasses, the labels are superimposed on the AR video in real time after the frozen screen is released, multiple labels can be placed on the video and tracked and displayed simultaneously, and the annotation frames follow the transformation of the AR glasses image, so that the objects marked in the image are accurately tracked in real time, on-site operation and maintenance personnel can intuitively see the on-site objects marked at the back end, and operation and maintenance efficiency is improved.
For further explanation, referring to fig. 2, the invention further provides a remote equipment maintenance system based on AR video and multiparty real-time collaboration, which comprises a site AR terminal, a plurality of visual terminals and a server. The server comprises a media server system, a signaling system and a service control system; the on-site AR terminal is a pair of AR glasses, on which a 6-axis sensor is arranged for recording the rotation degree parameter R and the corresponding displacement parameter T of the AR glasses.
The on-site AR terminal and the visual terminals register to the signaling system through the designated signaling, the audio and video streams and the shared collaboration streams of the visual terminals are synthesized through the media server system and then distributed to the participating visual terminals and the on-site AR terminal, and the service control system controls the whole service flow and can issue commands to the on-site AR terminal and the participating visual terminals. Specifically, the visual terminals are a mobile phone APP terminal and a PCWEB terminal.
The foregoing description is only a preferred embodiment of the present invention and does not limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or substitute equivalents for some of their technical features. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (8)

1. A remote equipment maintenance method based on AR video and multiparty real-time cooperation is characterized by comprising the following steps:
step 1, adding a 6-axis sensor on a site AR terminal, recording a rotation degree parameter R and a corresponding displacement parameter T of the site AR terminal, and requesting maintenance support service from a server;
step 2, the server responds to the request of the terminal, matches the visual terminal corresponding to the support engineer with the request, and establishes a two-way video call;
step 3, when third-party support is needed, the terminal system sends a third-party personnel participation request to the service control system, and the service control system adds the third-party personnel's visual terminal;
step 4, the on-site AR terminal requester synchronously uploads the local AR video to a media server system, and the media server system sends the received on-site image to all participants;
step 5, through the APP on the back-end visual terminals, the participants give voice guidance directly on the live video and perform frozen-screen collaborative annotation on the live picture, and a frozen-screen collaborative annotation request is sent to the signaling system;
step 6, when a back-end participant clicks the freeze-screen button, the back-end visual terminal acquires the current parameters Ri and Ti from the on-site AR terminal and saves them;
step 7, the back-end participants draw a rectangular label around the labeled object on the frozen static picture, forming the label coordinates (X1, Y1, X2, Y2);
step 8, after the frozen screen is released, the current Rt and Tt are acquired for each frame of the video, and the coordinates labeled on the frozen frame are converted using Ri and Ti; the conversion formula is as follows:
wherein ΔR and ΔT are the differences between the Ri and Ti recorded at freeze time and the current Rt and Tt, Xt and Yt are the converted rectangular label coordinates on the current screen, and Xi and Yi are the original rectangular label coordinates on the frozen screen;
step 9, obtaining coordinates of rectangular labels on each frame of the video through the conversion formula, and forming dynamic rectangular labels on dynamic video pictures corresponding to the label objects;
step 10, the media server system sends the video with the dynamic rectangular labels to all participants, so that all the participants can see the video with the dynamic rectangular labels, and AR video and multiparty real-time remote maintenance collaboration is completed;
in step 8, to obtain the function f(ΔR), multiple frozen images together with the AR glasses sensor data are collected in a laboratory, a multi-order Taylor expansion is used for fitting, and an accurate f(ΔR) function is obtained by successive approximation to the data, specifically comprising the following steps:
step 81, the gray levels of all pixels inside the rectangle are quantized to 64 levels and an 8×8 image centered on the rectangle is obtained; the mean gray value of this image is computed, the gray value of each of the 8×8 pixels is compared with the mean, pixels above the mean are set to 1 and pixels below it to 0, and the result is arranged into a 64-bit vector that serves as the feature vector of the image;
step 82, the image acquired at the second frozen screen, together with its R and T, is taken; the image is divided into 8×8 square blocks and the vector calculation of step 81 is carried out on each block, the block with the nearest gray-value feature vector is found, and the rotated counterpart of the rectangle is obtained with that block as its center;
and step 83, after the screen has been frozen repeatedly, multiple groups of corresponding coordinate data are acquired and an accurate f(ΔR) function is obtained through polynomial fitting.
2. The remote equipment maintenance method based on AR video and multiparty real-time collaboration according to claim 1, wherein in step 1, the on-site AR terminal displays as a two-dimensional planar video, and performs mapping conversion on the original coordinates, and the conversion steps are as follows:
step 11, converting the object's world coordinates into the camera coordinate system using the recorded R and T; the relationship is as follows:
wherein Xc, Yc and Zc are the coordinates in the camera coordinate system, and Xw, Yw and Zw are the coordinates in the world coordinate system;
step 12, projecting the camera coordinate system to a pixel coordinate system:
wherein u and v are the pixel coordinates; fx and fy are the focal lengths in the x and y directions, in pixels; and (Cx, Cy) is the principal point, i.e. the image center, in pixels.
3. The remote equipment maintenance method based on AR video and multiparty real-time collaboration according to claim 1, wherein in step 3, additional experts can be called in successively to form a multi-person maintenance positioning expert group.
4. The remote equipment maintenance method based on AR video and multiparty real-time collaboration according to claim 1, wherein in step 5, a plurality of frozen screen collaborative annotations are performed on the live view, and the operations are performed synchronously in the image calculation.
5. The remote equipment maintenance method based on AR video and multiparty real-time collaboration according to claim 1, wherein in step 6, the values of Ri and Ti are sampled once per second and saved in the storage of the on-site AR terminal.
6. The remote equipment maintenance method based on AR video and multiparty real-time collaboration according to claim 1, wherein some of the frozen-screen pictures are collected as a validation set, and the f(ΔR) function meets the requirements of the computation and display when the deviation on the validation set is within the allowable range.
7. A remote equipment maintenance system based on AR video and multiparty real-time cooperation, used for the remote equipment maintenance method based on AR video and multiparty real-time cooperation according to any one of claims 1-6, characterized by comprising a site AR terminal, a plurality of visual terminals and a server;
the on-site AR terminal is a pair of AR glasses, on which a 6-axis sensor is arranged for recording a rotation degree parameter R and a corresponding displacement parameter T of the AR glasses;
the server comprises a media server system, a signaling system and a service control system;
the on-site AR terminal and the visual terminals register to a signaling system through a designated signaling, and the audio and video streams and the sharing collaboration streams of the visual terminals are synthesized through the media server system and then distributed to the participating visual terminals and on-site AR terminals;
the service control system controls the whole service flow and issues commands to the site AR terminal and the participated visual terminal.
8. The remote equipment maintenance system based on real-time cooperation of AR video and multiple parties according to claim 7, wherein the visual terminals are a mobile phone APP terminal and a PCWEB terminal.
CN202410112646.2A 2024-01-26 2024-01-26 Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation Active CN117640914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410112646.2A CN117640914B (en) 2024-01-26 2024-01-26 Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410112646.2A CN117640914B (en) 2024-01-26 2024-01-26 Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation

Publications (2)

Publication Number Publication Date
CN117640914A CN117640914A (en) 2024-03-01
CN117640914B true CN117640914B (en) 2024-04-05

Family

ID=90018527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410112646.2A Active CN117640914B (en) 2024-01-26 2024-01-26 Remote equipment maintenance method and system based on AR video and multiparty real-time cooperation

Country Status (1)

Country Link
CN (1) CN117640914B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7185054B1 (en) * 1993-10-01 2007-02-27 Collaboration Properties, Inc. Participant display and selection in video conference calls
CN110266992A (en) * 2019-06-24 2019-09-20 苏芯物联技术(南京)有限公司 A kind of long-distance video interactive system and method based on augmented reality
CN111782035A (en) * 2020-06-12 2020-10-16 深圳增强现实技术有限公司 Remote operation guidance method and system based on augmented reality technology
CN112887657A (en) * 2021-01-27 2021-06-01 昭通亮风台信息科技有限公司 Remote conference method and system based on AR
CN114063954A (en) * 2021-11-11 2022-02-18 广西电网有限责任公司崇左供电局 Interactive system for remote cooperation of communication operation and inspection operation
CN116911823A (en) * 2023-07-05 2023-10-20 中国南方电网有限责任公司超高压输电公司贵阳局 AR (augmented reality) glasses-based substation inspection interaction method and system
CN116912722A (en) * 2023-07-05 2023-10-20 中国南方电网有限责任公司超高压输电公司贵阳局 Remote communication method and system for equipment inspection process based on AR equipment

Also Published As

Publication number Publication date
CN117640914A (en) 2024-03-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant