WO2022095651A1 - Video inspection method and device, electronic device, and readable medium - Google Patents

Video inspection method and device, electronic device, and readable medium

Info

Publication number
WO2022095651A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
video
robot
scene
information
Prior art date
Application number
PCT/CN2021/122239
Other languages
English (en)
French (fr)
Inventor
袁振江
胡海军
冯平
刘枫
杨天骄
毕研珍
Original Assignee
通号通信信息集团有限公司
Priority date
Filing date
Publication date
Application filed by 通号通信信息集团有限公司
Priority to US18/030,806 priority Critical patent/US20230377277A1/en
Publication of WO2022095651A1 publication Critical patent/WO2022095651A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0227Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
    • G05D1/0229Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area in combination with fixed guiding means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/24Arrangements for determining position or orientation
    • G05D1/243Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60Intended control result
    • G05D1/646Following a predefined trajectory, e.g. a line marked on the floor or a flight path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/781Television signal recording using magnetic recording on disks or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/45Specific applications of the controlled vehicles for manufacturing, maintenance or repairing
    • G05D2105/47Specific applications of the controlled vehicles for manufacturing, maintenance or repairing for maintenance or repairing, e.g. fuelling or battery replacement
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/80Specific applications of the controlled vehicles for information gathering, e.g. for academic research
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/10Land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10Optical signals
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • G08B21/24Reminder alarms, e.g. anti-loss alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/009Signalling of the alarm condition to a substation whose identity is signalled to a central station, e.g. relaying alarm signals in order to extend communication range

Definitions

  • the present disclosure relates to the technical field of video surveillance information, and in particular, to a video inspection method, device, electronic device, and readable medium.
  • the present disclosure provides a video inspection method, device, electronic device and readable medium.
  • a video inspection method including:
  • the video inspection information is video information obtained in a real scene
  • the 3D roaming scene is a virtual scene obtained by simulating the real scene
  • a video inspection device comprising:
  • an acquisition module to acquire video inspection information; wherein, the video inspection information is video information obtained in the real scene of the inspection area;
  • a matching and combination module configured to match and combine the video inspection information and a 3D roaming scene to obtain a 3D inspection video; wherein, the 3D roaming scene is a virtual scene obtained by simulating the real scene;
  • the display module is used to display the 3D inspection video.
  • an electronic device comprising:
  • the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can perform any one of the video inspection methods provided in the first aspect.
  • a non-transitory computer-readable storage medium storing computer instructions, the computer instructions being used to cause a computer to perform any one of the video inspection methods provided in the first aspect.
  • the video inspection information obtained in the real scene and the 3D roaming scene are matched and combined to obtain a 3D inspection video that simulates the real environment, so operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity. Compared with on-site inspection, the 3D inspection video also enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency.
  • FIG. 1 is a flowchart of a video inspection method provided by an embodiment of the present disclosure
  • FIG. 2 is a flowchart of synchronous calibration of a 3D roaming scene with a real scene provided by an embodiment of the present disclosure;
  • FIG. 3 is a flowchart of calibrating the position of a 3D roaming scene according to an embodiment of the present disclosure
  • FIG. 4 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic block diagram of an acquisition module provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure.
  • FIG. 7 is a block diagram of an electronic device used to implement the video inspection method according to an embodiment of the present disclosure.
  • 400-video inspection device; 401-acquisition module; 402-matching and combining module; 403-display module; 500-acquisition module; 501-camera; 502-robot; 503-actual inspection guide rail; 504-control module; 600-video inspection device; 601-acquisition module; 602-matching and combining module; 603-control module; 604-storage module; 605-switch; 606-router; 700-device; 701-computing unit; 702-ROM; 703-RAM; 704-bus; 705-I/O interface; 706-input unit; 707-output unit; 708-storage unit; 709-communication unit.
  • an embodiment of the present disclosure provides a video inspection method.
  • the method uses the 3D roaming technology to simulate the real scene of the inspection area (such as the computer room), so as to realize the automatic inspection of the inspection area.
  • FIG. 1 is a flowchart of a video inspection method provided by an embodiment of the present disclosure. As shown in Figure 1, the video inspection method includes:
  • Step S101 acquiring video inspection information.
  • the video inspection information is video information obtained from a real scene, and the real scene refers to a real scene in the inspection area.
  • the inspection area is a computer room, and the video inspection information is video information obtained from the computer room.
  • the video inspection information is a video obtained by using a camera or other device with a shooting function.
  • for example, a camera is used to capture video of the various electronic and electrical devices in the computer room.
  • the camera may acquire video inspection information according to a pre-planned path in the inspection area.
  • the planned path can be pre-planned by the user or by the designer.
  • the planned path enables the camera to capture all of the electronic and electrical equipment in the inspection area, so that the operating status of the equipment can be inspected without blind spots.
  • Step S102 matching and combining the video inspection information and the 3D roaming scene to obtain a 3D inspection video.
  • the 3D roaming scene is a virtual scene obtained by simulating a real scene. Use 3D technology to build a 3D model of the inspection area.
  • the inspection area is the computer room
  • a 3D virtual scene at the same scale is constructed according to the actual length x, width y, and height z of the computer room, and virtual devices are added at the corresponding positions in the 3D virtual scene according to the electronic and electrical equipment arranged in the computer room, so as to obtain the 3D roaming scene.
  • the electronic and electrical equipment are the inspection objects, and monitoring their operating status is the main purpose of the inspection.
  • the video inspection information is matched and combined with the 3D roaming scene so that the position of the video inspection information obtained in the real scene is consistent with the corresponding position in the 3D roaming scene, and the camera footage is thus synchronized with the 3D roaming scene.
  • Step S103 displaying the 3D inspection video.
  • the combined 3D inspection video is displayed using a VR device such as VR glasses or a VR headset; alternatively, the 3D inspection video can be viewed on an ordinary display screen in combination with 3D glasses.
  • the video inspection method provided by the embodiments of the present disclosure matches and combines the video inspection information obtained in the real scene with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment.
  • operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity.
  • the 3D inspection video can enable the operation and maintenance personnel to monitor more inspection objects and improve the inspection efficiency.
  • the video inspection information is obtained by a camera moving in a real scene
  • the camera is set on the robot
  • the robot carries the camera to move in the real scene.
  • the robot can carry the camera to move in the plane of the ground, and can also move in the vertical direction perpendicular to the ground and rotate 360°, so that the camera can perform inspection without blind spots.
  • when the robot moves on the actual inspection track, inspection commands can be sent to the robot through an operation module in the monitoring system, such as start, forward, backward, stop, and return to origin for movement in the plane of the ground, as well as commands such as ascend, descend, and rotate.
  • the robot performs corresponding actions according to the inspection instructions.
  • an actual inspection guide rail is set in the inspection area, and the robot moves along the actual inspection guide rail.
  • the actual inspection guide rail is a guide rail pre-set in the inspection area.
  • the actual inspection guide rail defines the robot's inspection route; the robot only needs to move along the actual inspection guide rail, combined with vertical movement and rotation, to inspect the inspection objects without blind spots.
  • in order to synchronize the video inspection information obtained by the camera with the 3D roaming scene, the camera and the 3D roaming scene need to be calibrated for synchronization before the actual inspection, so that the 3D roaming scene keeps pace with the real scene captured by the camera, thereby improving the accuracy of the video inspection. Since the camera is carried by the robot, the camera is synchronized with the 3D roaming scene as long as the 3D roaming scene is synchronized with the robot.
  • FIG. 2 is a flowchart of synchronous calibration of a 3D roaming scene and a real scene according to an embodiment of the present disclosure.
  • the video inspection information is matched and combined with the 3D roaming scene, and before obtaining the 3D inspection video, it also includes:
  • Step S201 obtaining a 3D roaming scene based on the real scene.
  • step S201 a 3D roaming scene is obtained by simulating a real scene by using a 3D technology.
  • the 3D roaming scene is obtained by simulating the real scene at the same scale; that is, the 3D roaming scene is simulated in proportion to the dimensions of the inspection area in the real scene.
  • simulation process and simulation method may adopt the prior art, which is not limited in the present disclosure.
  • Step S202 simulate the inspection track in the 3D roaming scene according to the actual inspection guide rail.
  • the inspection track is simulated in the 3D roaming scene according to the length and position of the actual inspection guide rail set in the inspection area; that is, the inspection track is the virtual counterpart of the actual inspection guide rail in the 3D roaming scene.
  • Step S203 the virtual position in the 3D roaming scene is calibrated based on the current actual position of the robot and the current inspection duration.
  • the current inspection duration is the time it takes for the robot to reach the current actual position from the initial position of the actual inspection track.
  • when the robot moves on the actual inspection track, the current inspection duration and the corresponding current actual position of the robot are continuously acquired at preset time intervals, and the inspection position in the 3D roaming scene is calibrated based on the current inspection duration and the corresponding current actual position.
  • the preset time interval is 1 second, that is, when the robot moves on the actual inspection track, the current actual position is reported every 1 second, and the current inspection duration is recorded once.
  • the inspection speed of the robot can be set by a built-in setting unit.
  • the roaming speed in the 3D roaming scene is determined based on the roaming track length and the roaming inspection period.
  • the roaming track length and roaming inspection period are obtained by running multiple inspections with the simulation linked to the robot.
  • the inspection position in the 3D roaming scene is determined according to the current inspection duration, inspection cycle and inspection track length, and the inspection speed is determined by the inspection track length and inspection cycle.
  • the robot's travel from the starting position to the end position of the actual inspection track is defined as one inspection cycle; the total inspection duration of each of multiple inspection cycles is obtained, the average inspection duration is calculated from these durations, and the average inspection duration is taken as the total inspection duration.
  • the total inspection duration can be determined as follows: when the position of the robot starts to change (the robot starts moving), record the start time; when the position of the robot no longer changes, record the end time; the difference between the start time and the end time is the total inspection duration.
  • the calibration of the position of the 3D roaming scene may also be performed in other ways, such as calibration by using a position scale parameter.
  • FIG. 3 is a flowchart of calibrating the position of a 3D roaming scene according to an embodiment of the present disclosure.
  • the inspection position in the 3D roaming scene is calibrated, including:
  • step S301 the current actual position of the robot on the actual inspection track and the corresponding current inspection duration are obtained.
  • the current actual position refers to the position of the robot based on the starting point of the actual inspection track.
  • the current inspection duration refers to the duration of the robot starting from the starting point of the actual inspection track and reaching the current actual position.
  • the robot moves on the actual inspection track at a certain speed.
  • in theory, the current actual position of the robot can be determined from the current inspection duration.
  • in practice, however, due to factors such as hardware errors, the current actual position of the robot does not correspond exactly to the current inspection duration; therefore, the inspection position in the 3D roaming scene needs to be calibrated.
  • Step S302 Determine the current actual position proportional parameter based on the current actual position and the total length of the actual inspection track.
  • the current actual position ratio parameter is the ratio of the current actual position to the total length of the actual inspection track.
  • for example, if the total length of the actual inspection track is 10 meters and the current actual position of the robot is 1 meter, i.e., the distance from the starting point of the actual inspection track to the robot's actual position is 1 meter, then the current actual position scale parameter is 0.1.
  • Step S303 calibrating the inspection position in the 3D roaming scene based on the current actual position scale parameter and the current inspection duration.
  • the inspection position in the 3D roaming scene is determined according to the current inspection duration, and then the current actual position scale parameter is used to adjust the inspection position in the 3D roaming scene, so that the inspection position in the 3D roaming scene is consistent with the actual position of the robot in the real scene.
  • if the current actual position scale parameter is 0.1 while the position scale parameter determined in the 3D roaming scene is 0.09, the inspection speed in the 3D roaming scene lags behind the moving speed of the robot in the real scene; therefore, the inspection speed in the 3D roaming scene is adjusted to be consistent with the robot's moving speed in the real scene, so that the inspection position in the 3D roaming scene is consistent with the robot's position, which improves the inspection experience.
  • the inspection position in the 3D roaming scene can be calibrated through multiple inspection cycles. Multiple position calibrations can ensure that the inspection position in the 3D roaming scene is synchronized with the robot's inspection.
  • the video inspection method further includes acquiring alarm information, and sending the alarm information to the client, so that the client sends an alarm reminder to remind the inspection personnel.
  • the client is a mobile terminal or a fixed terminal used by the inspection personnel.
  • mobile terminals include, but are not limited to, mobile phones, tablets (such as iPads), and similar terminals.
  • Fixed terminals include but are not limited to terminals such as computers.
  • the alarm information is information determined based on signals collected by sensors arranged in the inspection area, and the sensors are used to monitor the environment of the inspection area or the state of electronic and electrical equipment.
  • a temperature sensor is set in the inspection area, and when it is determined based on the temperature signal obtained by the temperature sensor that the temperature of the inspection area is higher than a preset temperature, an alarm message is issued.
  • a humidity sensor is set in the inspection area, and an alarm message is issued when it is determined that the humidity in the inspection area is higher than the preset humidity based on the humidity signal of the humidity sensor.
  • a current sensor is set in the inspection area to monitor the current value of the electronic and electrical equipment. If it is determined based on the current value obtained by the current sensor that the current flowing through the electronic and electrical equipment exceeds the preset current value, an alarm message is issued.
  • the client reminds the inspector in the form of an acoustic signal and/or an optical signal, so that the inspector can handle the fault in time, improve the efficiency of the fault processing, and avoid unnecessary losses.
  • the video inspection method provided by the embodiments of the present disclosure matches and combines the video inspection information obtained in the real scene with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment.
  • the 3D inspection video can enable the operation and maintenance personnel to monitor more inspection objects and improve the inspection efficiency.
  • the inspection trajectory in the 3D roaming scene is consistent with the robot's inspection trajectory, which improves the inspection experience.
  • the present disclosure also provides a video inspection device, which uses 3D roaming technology to simulate a real scene of an inspection area (such as a computer room), so as to realize automatic inspection of the inspection area.
  • FIG. 4 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure. As shown in FIG. 4 , the video inspection device 400 includes:
  • the obtaining module 401 obtains video inspection information.
  • the video inspection information is video information obtained in the real scene of the inspection area, and the real scene refers to the real scene of the inspection area.
  • the inspection area is a computer room, and the video inspection information is video information obtained from the computer room.
  • the video inspection information is a video obtained by using a camera or other device with a shooting function.
  • for example, a camera is used to capture video of the various electronic and electrical devices in the computer room.
  • the matching and combining module 402 is used for matching and combining the video inspection information and the 3D roaming scene to obtain a 3D inspection video.
  • the 3D roaming scene is a virtual scene obtained by simulating a real scene. Use 3D technology to build a 3D model of the inspection area.
  • the inspection area is the computer room
  • a 3D virtual scene at the same scale is constructed according to the actual length x, width y, and height z of the computer room, and virtual devices are added at the corresponding positions in the 3D virtual scene according to the electronic and electrical equipment arranged in the computer room, so as to obtain the 3D roaming scene.
  • the electronic and electrical equipment are the inspection objects, and monitoring their operating status is the main purpose of the inspection.
  • the video inspection information is matched and combined with the 3D roaming scene so that the position of the video inspection information obtained in the real scene is consistent with the corresponding position in the 3D roaming scene, and the camera footage is thus synchronized with the 3D roaming scene.
  • the display module 403 is used to display the 3D inspection video.
  • the display module 403 may be a VR device such as VR glasses or a VR headset; alternatively, the display module 403 may be a display screen, which can be viewed in combination with 3D glasses.
  • the display module 403 can clearly present the status of the inspection objects in the inspection area.
  • the acquisition module 401 is used to obtain video inspection information
  • the matching and combining module is used to match and combine the video inspection information obtained in the real scene with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment; operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video presented by the display module without going to the site, which reduces their labor intensity and, compared with on-site inspection, enables them to monitor more inspection objects and improves inspection efficiency.
  • FIG. 5 is a schematic block diagram of an acquisition module provided by an embodiment of the present disclosure.
  • the acquisition module 500 includes: a camera 501, a robot 502, an actual inspection guide rail 503, and the video inspection device further includes a control module 504, wherein,
  • the actual inspection guide rail 503 is set in the real scene of the inspection area, and is used to provide the running track of the robot.
  • the robot 502 is used to carry the camera 501 to move along the actual inspection guide rail 503; the camera 501 is arranged on the robot 502, and the robot 502 carries the camera 501 to move in the real scene for obtaining video inspection information.
  • the control module 504 is signally connected with the robot 502 and the camera 501 , and is used for controlling the running state of the robot 502 and the shooting position and angle of the camera 501 .
  • the actual inspection guide rail 503 may be a track or a track line.
  • a component matching the rail is provided on the robot 502 .
  • a device for track line recognition is provided on the robot 502 .
  • FIG. 6 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure.
  • the video inspection apparatus 600 includes an acquisition module 601, a matching and combining module 602, a control module 603, and a storage module 604, wherein the acquisition module 601 and the matching and combining module 602 are the same as the acquisition module 401 and the matching and combining module 402 described above and are not described again here.
  • the storage module 604 is used to store the video inspection information obtained by the camera.
  • when the matching and combining module 602 needs video inspection information, it can obtain the video inspection information from the storage module 604.
  • the matching and combining module 602 can flexibly obtain the video inspection information.
  • the matching and combining module can obtain the video inspection information at the current moment, and can also obtain the video inspection information at any previous moment.
  • the control module includes a calibration unit (not shown in the figure); the calibration unit obtains the current actual position of the robot on the actual inspection track and the corresponding current inspection duration, determines the current actual position scale parameter based on the current actual position and the total length of the actual inspection track, and calibrates the inspection position in the 3D roaming scene based on the current actual position scale parameter and the current inspection duration.
  • the current actual position refers to the position of the robot based on the starting point of the actual inspection track.
  • the current inspection duration refers to the duration of the robot starting from the starting point of the actual inspection track and reaching the current actual position.
  • the current actual position ratio parameter is the ratio of the current actual position to the total length of the actual inspection track.
  • for example, if the total length of the actual inspection track is 10 meters and the current actual position of the robot is 1 meter, i.e., the distance from the starting point of the actual inspection track to the robot's actual position is 1 meter, then the current actual position scale parameter is 0.1.
  • the calibration unit determines the inspection position in the 3D roaming scene according to the current inspection duration, and then uses the current actual position scale parameter to adjust the inspection position in the 3D roaming scene, so that the inspection position in the 3D roaming scene is consistent with the actual position of the robot in the real scene.
  • the inspection position in the 3D roaming scene can be calibrated through multiple inspection cycles. Multiple position calibrations can ensure that the inspection position in the 3D roaming scene is synchronized with the robot's inspection.
  • the storage module 604 includes a hard disk video recorder.
  • the hard disk video recorder can be signal-connected to multiple cameras.
  • the hard disk video recorder includes at least one storage channel, and the storage channel stores video inspection information obtained by one camera.
  • the matching combination module can obtain the video inspection information of the corresponding camera through the storage channel.
  • the video inspection apparatus 600 further includes:
  • the switch 605 is used for signally connecting the control module 603 and the storage module 604 .
  • when the storage module 604 is a hard disk video recorder, the control module 603 and the hard disk video recorder can be signal-connected through the switch 605.
  • the router 606 is used to signally connect the robot with the control module 603, so that the control module 603 can control the movement modes of the robot, such as forward, backward, ascending, descending and other moving modes.
  • the video inspection apparatus further includes an alarm module (not shown in the figure) for acquiring fault information of the inspection objects in the equipment room, and sending the fault information to the display module for display.
  • the client reminds the inspector in the form of an acoustic signal and/or an optical signal, so that the inspector can handle the fault in time, improve the efficiency of the fault processing, and avoid unnecessary losses.
  • the functions of or modules included in the apparatuses provided in the embodiments of the present disclosure may be used to perform the methods described in the above method embodiments; for their specific implementation and technical effects, reference may be made to the description of the above method embodiments, and for brevity, details are not repeated here.
  • the video inspection device matches and combines the video inspection information obtained in the real scene with the 3D roaming scene, and obtains the 3D inspection video that simulates the real environment.
  • the 3D inspection video can enable the operation and maintenance personnel to monitor more inspection objects and improve the inspection efficiency.
  • the inspection trajectory in the 3D roaming scene is consistent with the robot's inspection trajectory, which improves the inspection experience.
  • the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
  • FIG. 7 shows a schematic block diagram of an example electronic device 700 that may be used to implement embodiments of the present disclosure.
  • Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
  • Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular phones, smart phones, wearable devices, and other similar computing devices.
  • the components shown herein, their connections and relationships, and their functions are by way of example only, and are not intended to limit implementations of the disclosure described and/or claimed herein.
  • the device 700 includes a computing unit 701, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 702 or a computer program loaded from a storage unit 708 into a random access memory (RAM) 703. The RAM 703 can also store various programs and data required for the operation of the device 700.
  • the computing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704.
  • An input/output (I/O) interface 705 is also connected to bus 704 .
  • various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard or a mouse; an output unit 707 such as various types of displays and speakers; a storage unit 708 such as a magnetic disk or an optical disc; and a communication unit 709 such as a network card, a modem, or a wireless communication transceiver.
  • the communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
  • the computing unit 701 may be any of various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, central processing units (CPUs), graphics processing units (GPUs), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, digital signal processors (DSPs), and any suitable processors, controllers, microcontrollers, and the like.
  • the computing unit 701 performs the various methods and processes described above, such as the video inspection method.
  • the video inspection method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 708 .
  • part or all of the computer program may be loaded and/or installed on device 700 via ROM 702 and/or communication unit 709 .
  • the computer program When the computer program is loaded into RAM 703 and executed by computing unit 701, one or more steps of the video inspection method described above can be performed.
  • the computing unit 701 may be configured to perform the video inspection method by any other suitable means (e.g., by means of firmware).
  • various implementations of the systems and techniques described herein above may be implemented in digital electronic circuit systems, integrated circuit systems, field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof.
  • these various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor and which may receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
  • program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or devices, or any suitable combination of the foregoing.
  • more specific examples of machine-readable storage media include electrical connections based on one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fibers, compact disc read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
  • the systems and techniques described herein may be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which the user can provide input to the computer.
  • other kinds of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form (including acoustic input, speech input, or tactile input).
  • the systems and techniques described herein may be implemented in a computing system that includes back-end components (e.g., as a data server), or a computing system that includes middleware components (e.g., an application server), or a computing system that includes front-end components (e.g., a user computer having a graphical user interface or a web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end components, middleware components, or front-end components.
  • the components of the system may be interconnected by any form or medium of digital data communication (eg, a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
  • a computer system can include clients and servers.
  • Clients and servers are generally remote from each other and usually interact through a communication network.
  • the relationship of client and server arises by computer programs running on the respective computers and having a client-server relationship to each other.
  • the present disclosure also provides a computer program product, including a computer program, which implements any one of the above video inspection methods when executed by a processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Testing And Monitoring For Control Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a video inspection method and apparatus, an electronic device, and a readable medium, which belong to the technical field of video surveillance information. The video inspection method includes: acquiring video inspection information, wherein the video inspection information is video information obtained in a real scene; matching and combining the video inspection information with a 3D roaming scene to obtain a 3D inspection video, wherein the 3D roaming scene is a virtual scene obtained by simulating the real scene; and displaying the 3D inspection video. The present disclosure can reduce the labor intensity of operation and maintenance personnel and improve inspection efficiency.

Description

Video inspection method and device, electronic device, and readable medium
Technical Field
The present disclosure relates to the technical field of video surveillance information, and in particular to a video inspection method and apparatus, an electronic device, and a readable medium.
Background
With the increasing informatization and intelligence of road traffic, the amount of equipment in computer rooms keeps growing, which puts more and more pressure on operation and maintenance work. However, current operation and maintenance work relies mainly on manual inspection, which is inefficient and costly and cannot meet the needs of road traffic informatization and intelligence.
Summary
The present disclosure provides a video inspection method and apparatus, an electronic device, and a readable medium.
According to a first aspect of the present disclosure, a video inspection method is provided, including:
acquiring video inspection information, wherein the video inspection information is video information obtained in a real scene;
matching and combining the video inspection information with a 3D roaming scene to obtain a 3D inspection video, wherein the 3D roaming scene is a virtual scene obtained by simulating the real scene; and
displaying the 3D inspection video.
According to a second aspect of the present disclosure, a video inspection apparatus is provided, including:
an acquisition module configured to acquire video inspection information, wherein the video inspection information is video information obtained in a real scene of an inspection area;
a matching and combining module configured to match and combine the video inspection information with a 3D roaming scene to obtain a 3D inspection video, wherein the 3D roaming scene is a virtual scene obtained by simulating the real scene; and
a display module configured to display the 3D inspection video.
According to a third aspect of the present disclosure, an electronic device is provided, including:
at least one processor; and
a memory communicatively connected to the at least one processor, wherein
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can perform any one of the video inspection methods provided in the first aspect.
According to a fourth aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, wherein the computer instructions are used to cause a computer to perform any one of the video inspection methods provided in the first aspect.
According to the video inspection method of the present disclosure, the video inspection information obtained in the real scene is matched and combined with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment. Operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity; moreover, compared with on-site inspection, the 3D inspection video enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency.
It should be understood that the content described in this section is not intended to identify key or important features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become easy to understand from the following description.
Brief Description of the Drawings
The accompanying drawings are used to provide a further understanding of the present disclosure and constitute a part of the specification; together with the embodiments of the present disclosure, they serve to explain the present disclosure and do not limit it. The above and other features and advantages will become more apparent to those skilled in the art from the description of detailed example embodiments with reference to the accompanying drawings.
FIG. 1 is a flowchart of a video inspection method provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart of synchronous calibration of a 3D roaming scene with a real scene provided by an embodiment of the present disclosure;
FIG. 3 is a flowchart of calibrating a position in a 3D roaming scene provided by an embodiment of the present disclosure;
FIG. 4 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of an acquisition module provided by an embodiment of the present disclosure;
FIG. 6 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure;
FIG. 7 is a block diagram of an electronic device used to implement the video inspection method according to an embodiment of the present disclosure.
In the drawings:
400 - video inspection apparatus; 401 - acquisition module; 402 - matching and combining module; 403 - display module; 500 - acquisition module; 501 - camera; 502 - robot; 503 - actual inspection guide rail; 504 - control module; 600 - video inspection apparatus; 601 - acquisition module; 602 - matching and combining module; 603 - control module; 604 - storage module; 605 - switch; 606 - router; 700 - device; 701 - computing unit; 702 - ROM; 703 - RAM; 704 - bus; 705 - I/O interface; 706 - input unit; 707 - output unit; 708 - storage unit; 709 - communication unit.
Detailed Description
To enable those skilled in the art to better understand the technical solutions of the present disclosure, exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, including various details of the embodiments of the present disclosure to facilitate understanding; they should be regarded as merely exemplary. Therefore, those of ordinary skill in the art should recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, descriptions of well-known functions and structures are omitted from the following description for clarity and conciseness.
The embodiments of the present disclosure and the features in the embodiments may be combined with one another provided there is no conflict.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terms used herein are only intended to describe specific embodiments and are not intended to limit the present disclosure. As used herein, the singular forms "a" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that when the terms "comprise" and/or "made of" are used in this specification, they specify the presence of features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Words such as "connected" or "coupled" are not limited to physical or mechanical connections, and may include electrical connections, whether direct or indirect.
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meanings as commonly understood by those of ordinary skill in the art. It will also be understood that terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the related art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In industries with a high degree of informatization and intelligence, such as railways and highways, a large amount of information processing equipment is installed in computer rooms. To ensure that this equipment operates stably, the daily operation and maintenance workload is heavy; video inspection can reduce the workload of operation and maintenance personnel while improving operation and maintenance efficiency.
In a first aspect, an embodiment of the present disclosure provides a video inspection method. The method uses 3D roaming technology to simulate the real scene of an inspection area (such as a computer room), so as to realize automatic inspection of the inspection area.
FIG. 1 is a flowchart of a video inspection method provided by an embodiment of the present disclosure. As shown in FIG. 1, the video inspection method includes:
Step S101: acquiring video inspection information.
The video inspection information is video information obtained from a real scene, and the real scene refers to the real scene of the inspection area. For example, the inspection area is a computer room, and the video inspection information is video information obtained in the computer room.
In some embodiments, the video inspection information is video obtained by a camera or another device with a shooting function, for example, video obtained by using a camera to capture the electronic and electrical devices in the computer room.
In some embodiments, the camera may acquire the video inspection information along a pre-planned path in the inspection area. The planned path may be planned in advance by the user or by the designer, and the planned path enables the camera to capture all of the electronic and electrical equipment in the inspection area, so that the operating status of the equipment can be inspected without blind spots.
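A minimal sketch of how such a pre-planned path might be represented, assuming a simple list of waypoints with camera presets; the names (`Waypoint`, `plan_covers`) and all coordinate values are illustrative and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    """One stop on the pre-planned inspection path (illustrative only)."""
    x_m: float          # position on the room floor, in meters
    y_m: float
    pan_deg: float      # camera pan preset at this stop
    tilt_deg: float     # camera tilt preset at this stop
    target: str         # device the camera should cover from here

# A hypothetical path that passes every cabinet in the computer room.
PLANNED_PATH = [
    Waypoint(1.0, 0.5,  0.0, -10.0, "cabinet-A"),
    Waypoint(3.0, 0.5, 45.0, -10.0, "cabinet-B"),
    Waypoint(5.0, 0.5, 90.0, -20.0, "UPS-1"),
]

def plan_covers(path, required_targets):
    """Check that the planned path leaves no device uncovered."""
    covered = {wp.target for wp in path}
    return set(required_targets) <= covered

assert plan_covers(PLANNED_PATH, ["cabinet-A", "cabinet-B", "UPS-1"])
```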
Step S102: matching and combining the video inspection information with the 3D roaming scene to obtain a 3D inspection video.
The 3D roaming scene is a virtual scene obtained by simulating the real scene. A 3D model of the inspection area is built using 3D technology. When the inspection area is a computer room, a 3D virtual scene at the same scale is constructed according to the actual length x, width y, and height z of the computer room, and virtual devices are added at the corresponding positions in the 3D virtual scene according to the electronic and electrical equipment arranged in the computer room, so as to obtain the 3D roaming scene. The electronic and electrical equipment are the inspection objects, and monitoring their operating status is the main purpose of the inspection.
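Because the virtual scene is built at the same scale as the room, placing a virtual device reduces to copying (or uniformly scaling) its real-world coordinates. A minimal sketch under that assumption; the `scale` factor and the device list are illustrative values, not data from the patent:

```python
def to_scene_coords(real_xyz, scale=1.0):
    """Map a real-world position (in meters) into the 3D roaming scene.

    With a same-scale scene the mapping is a uniform scale factor;
    scale=1.0 means one scene unit per meter.
    """
    x, y, z = real_xyz
    return (x * scale, y * scale, z * scale)

# Illustrative room dimensions (length x, width y, height z) and device positions.
room_size_m = (12.0, 8.0, 3.5)
devices = {"cabinet-A": (1.0, 0.5, 0.0), "UPS-1": (5.0, 0.5, 0.0)}

virtual_devices = {name: to_scene_coords(pos) for name, pos in devices.items()}
print(virtual_devices)
```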
In some embodiments, the video inspection information is matched and combined with the 3D roaming scene so that the position of the video inspection information obtained in the real scene is consistent with the corresponding position in the 3D roaming scene, and the camera footage is thus synchronized with the 3D roaming.
Step S103: displaying the 3D inspection video.
In some embodiments, the combined 3D inspection video is displayed using a VR device such as VR glasses or a VR headset; alternatively, the 3D inspection video is viewed on an ordinary display screen in combination with 3D glasses.
According to the video inspection method provided by the embodiments of the present disclosure, the video inspection information obtained in the real scene is matched and combined with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment. Operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity; moreover, compared with on-site inspection, the 3D inspection video enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency.
In some embodiments, the video inspection information is obtained by a camera moving in the real scene; the camera is mounted on a robot, and the robot carries the camera to move in the real scene. For example, the robot can carry the camera to move in the plane of the ground, and can also move in the vertical direction perpendicular to the ground and rotate 360°, so that the camera can perform inspection without blind spots.
When the robot moves on the actual inspection track, inspection commands can be sent to the robot through an operation module in the monitoring system, such as start, forward, backward, stop, and return to origin for movement in the plane of the ground, as well as commands such as ascend, descend, and rotate. The robot performs the corresponding actions according to the inspection commands.
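A sketch of how an operation module might dispatch these inspection commands; the command names mirror those listed above, but the `Robot` interface and its fields are hypothetical stand-ins for a real controller:

```python
from enum import Enum, auto

class Command(Enum):
    START = auto(); FORWARD = auto(); BACKWARD = auto(); STOP = auto()
    RETURN_TO_ORIGIN = auto(); ASCEND = auto(); DESCEND = auto(); ROTATE = auto()

class Robot:
    """Hypothetical robot interface; a real controller would drive hardware."""
    def __init__(self):
        self.position_m = 0.0   # distance travelled along the guide rail
        self.height_m = 0.0
        self.heading_deg = 0.0

    def execute(self, cmd: Command, value: float = 0.0):
        if cmd is Command.FORWARD:
            self.position_m += value
        elif cmd is Command.BACKWARD:
            self.position_m = max(0.0, self.position_m - value)
        elif cmd is Command.RETURN_TO_ORIGIN:
            self.position_m = 0.0
        elif cmd is Command.ASCEND:
            self.height_m += value
        elif cmd is Command.DESCEND:
            self.height_m = max(0.0, self.height_m - value)
        elif cmd is Command.ROTATE:
            self.heading_deg = (self.heading_deg + value) % 360
        # START / STOP would toggle motion in a real controller.

robot = Robot()
robot.execute(Command.FORWARD, 2.0)
robot.execute(Command.ROTATE, 90.0)
```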
In some embodiments, an actual inspection guide rail is set in the inspection area, and the robot moves along the actual inspection guide rail. The actual inspection guide rail is a guide rail pre-installed in the inspection area and defines the robot's inspection route; the robot only needs to move along the guide rail, combined with vertical movement and rotation, to inspect the inspection objects without blind spots.
In order to synchronize the video inspection information obtained by the camera with the 3D roaming scene, the camera and the 3D roaming scene need to be calibrated for synchronization before the actual inspection, so that the 3D roaming scene keeps pace with the real scene captured by the camera, thereby improving the accuracy of the video inspection. Since the camera is carried by the robot, the camera is synchronized with the 3D roaming scene as long as the 3D roaming scene is synchronized with the robot.
FIG. 2 is a flowchart of synchronous calibration of a 3D roaming scene with a real scene provided by an embodiment of the present disclosure. As shown in FIG. 2, before the video inspection information is matched and combined with the 3D roaming scene to obtain the 3D inspection video, the method further includes:
Step S201: obtaining the 3D roaming scene based on the real scene.
In step S201, the real scene is simulated using 3D technology to obtain the 3D roaming scene.
In some embodiments, the 3D roaming scene is obtained by simulating the real scene at the same scale; that is, the 3D roaming scene is simulated in proportion to the dimensions of the inspection area in the real scene.
It should be noted that the simulation process and simulation method may adopt existing techniques, which are not limited in the present disclosure.
Step S202: simulating an inspection track in the 3D roaming scene according to the actual inspection guide rail.
In some embodiments, the inspection track is simulated in the 3D roaming scene according to the length and position of the actual inspection guide rail set in the inspection area; that is, the inspection track is the virtual counterpart of the actual inspection guide rail in the 3D roaming scene.
Step S203: calibrating the virtual position in the 3D roaming scene based on the robot's current actual position and current inspection duration.
The current inspection duration is the time it takes the robot to reach the current actual position from the initial position of the actual inspection track.
In some embodiments, while the robot moves on the actual inspection track, the current inspection duration and the corresponding current actual position of the robot are continuously acquired at preset time intervals, and the inspection position in the 3D roaming scene is calibrated based on the current inspection duration and the corresponding current actual position.
For example, the preset time interval is 1 second; that is, while the robot moves on the actual inspection track, the current actual position is reported once every second and the current inspection duration is recorded at the same time.
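A sketch of that per-interval reporting loop, assuming a hypothetical `robot.position_m` attribute that reflects the distance travelled along the rail:

```python
import time

def sample_inspection(robot, interval_s=1.0, max_samples=5):
    """Report (current inspection duration, current position) once per interval.

    Returns a list of (elapsed_s, position_m) pairs, mirroring the
    once-per-second reporting described in the text.
    """
    samples = []
    start = time.monotonic()
    for _ in range(max_samples):
        time.sleep(interval_s)
        elapsed = time.monotonic() - start
        samples.append((elapsed, robot.position_m))
    return samples
```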
In some embodiments, the robot's inspection speed can be set by a built-in setting unit. The roaming speed in the 3D roaming scene is determined based on the roaming track length and the roaming inspection period. In some embodiments, the roaming track length and roaming inspection period are obtained by running multiple inspections with the simulation linked to the robot.
In some embodiments, the inspection position in the 3D roaming scene is determined according to the current inspection duration, the inspection period, and the inspection track length, and the inspection speed is determined by the inspection track length and the inspection period.
In some embodiments, the robot's travel from the starting position to the end position of the actual inspection track is defined as one inspection cycle; the total inspection duration of each of multiple inspection cycles is obtained, the average inspection duration is calculated from these durations, and the average inspection duration is taken as the total inspection duration.
The total inspection duration can be determined as follows: when the position of the robot starts to change (the robot starts moving), record the start time; when the position of the robot no longer changes, record the end time; the difference between the start time and the end time is the total inspection duration.
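A sketch of deriving the average inspection duration from several recorded cycles and, from it, the constant roaming speed and expected position used by the virtual scene; the track length and cycle times below are illustrative numbers only:

```python
def average_cycle_duration(cycles):
    """cycles: list of (start_time_s, end_time_s) for complete inspection runs."""
    durations = [end - start for start, end in cycles]
    return sum(durations) / len(durations)

def roaming_speed(track_length_m, cycle_duration_s):
    """Roaming speed = virtual track length / inspection period."""
    return track_length_m / cycle_duration_s

def expected_position(speed_m_s, elapsed_s, track_length_m):
    """Position the virtual camera should reach after elapsed_s, capped at the track end."""
    return min(speed_m_s * elapsed_s, track_length_m)

# Illustrative numbers: three measured cycles on a 10 m track.
cycles = [(0.0, 98.0), (0.0, 102.0), (0.0, 100.0)]
period = average_cycle_duration(cycles)       # 100.0 s
speed = roaming_speed(10.0, period)           # 0.1 m/s
print(expected_position(speed, 30.0, 10.0))   # 3.0 m into the virtual track
```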
It should be noted that the position in the 3D roaming scene may also be calibrated in other ways, for example, by using a position scale parameter.
FIG. 3 is a flowchart of calibrating a position in a 3D roaming scene provided by an embodiment of the present disclosure.
As shown in FIG. 3, calibrating the inspection position in the 3D roaming scene based on the robot's current actual position and current inspection duration in the real scene includes:
Step S301: obtaining the robot's current actual position on the actual inspection track and the corresponding current inspection duration.
The current actual position refers to the position of the robot relative to the starting point of the actual inspection track. The current inspection duration refers to the time it takes the robot to reach the current actual position, timed from the starting point of the actual inspection track.
In some embodiments, the robot moves on the actual inspection track at a certain speed. In theory, the robot's current actual position can be determined from the current inspection duration; in practice, however, due to factors such as hardware errors, the robot's current actual position does not correspond exactly to the current inspection duration, so the inspection position in the 3D roaming scene needs to be calibrated.
Step S302: determining a current actual position scale parameter based on the current actual position and the total length of the actual inspection track.
The current actual position scale parameter is the ratio of the current actual position to the total length of the actual inspection track. For example, if the total length of the actual inspection track is 10 meters and the robot's current actual position is 1 meter, i.e., the distance from the starting point of the actual inspection track to the robot's actual position is 1 meter, then the current actual position scale parameter is 0.1.
Step S303: calibrating the inspection position in the 3D roaming scene based on the current actual position scale parameter and the current inspection duration.
In some embodiments, the inspection position in the 3D roaming scene is determined according to the current inspection duration, and then the current actual position scale parameter is used to adjust the inspection position in the 3D roaming scene, so that the inspection position in the 3D roaming scene is consistent with the actual position of the robot in the real scene.
If the current actual position scale parameter is 0.1 while the position scale parameter determined in the 3D roaming scene is 0.09, the inspection speed in the 3D roaming scene lags behind the robot's moving speed in the real scene; therefore, the inspection speed in the 3D roaming scene is adjusted to be consistent with the robot's moving speed in the real scene, so that the inspection position in the 3D roaming scene is consistent with the robot's position, which improves the inspection experience.
To make the inspection position in the 3D roaming scene match the robot's actual position in the real scene more accurately, the inspection position in the 3D roaming scene can be calibrated over multiple inspection cycles; multiple position calibrations over multiple inspection cycles can ensure that the inspection position in the 3D roaming scene is synchronized with the robot's inspection.
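A minimal sketch of the ratio-based calibration in steps S301-S303: compute the robot's actual position ratio, compare it with the ratio currently shown by the 3D roaming scene, and rescale the roaming speed when the two drift apart. The function and variable names are illustrative, not taken from the patent:

```python
def position_ratio(position_m, track_length_m):
    """Current actual position scale parameter: position / total track length."""
    return position_m / track_length_m

def calibrate_roaming_speed(actual_ratio, virtual_ratio, roaming_speed):
    """Rescale the roaming speed so the virtual position catches up with the robot.

    Example from the text: an actual ratio of 0.1 versus a virtual ratio of 0.09
    means the roaming lags, so the speed is increased by the factor 0.1 / 0.09.
    """
    if virtual_ratio == 0:
        return roaming_speed          # nothing to compare against yet
    return roaming_speed * (actual_ratio / virtual_ratio)

# Worked example matching the numbers above (10 m rail, robot at 1 m).
actual = position_ratio(1.0, 10.0)            # 0.1
virtual = 0.09                                # ratio reported by the 3D scene
new_speed = calibrate_roaming_speed(actual, virtual, roaming_speed=0.1)
print(round(new_speed, 4))                    # ~0.1111 m/s until the lag is closed
```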
In some embodiments, the video inspection method further includes acquiring alarm information and sending the alarm information to a client, so that the client issues an alarm reminder to alert the inspection personnel. The client is a mobile terminal or a fixed terminal used by the inspection personnel. Mobile terminals include, but are not limited to, mobile phones, tablets (such as iPads), and similar terminals. Fixed terminals include, but are not limited to, computers and similar terminals.
The alarm information is determined based on signals collected by sensors arranged in the inspection area; the sensors are devices used to monitor the environment of the inspection area or the status of the electronic and electrical equipment.
For example, a temperature sensor is set in the inspection area, and an alarm message is issued when it is determined, based on the temperature signal obtained by the temperature sensor, that the temperature of the inspection area is higher than a preset temperature. A humidity sensor is set in the inspection area, and an alarm message is issued when it is determined, based on the humidity signal of the humidity sensor, that the humidity of the inspection area is higher than a preset humidity. As another example, a current sensor is set in the inspection area to monitor the current of the electronic and electrical equipment, and an alarm message is issued if it is determined, based on the current value obtained by the current sensor, that the current flowing through the equipment exceeds a preset current value.
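A sketch of the threshold checks described here, producing alarm messages that would then be pushed to the client; the threshold values and the sensor-reading structure are illustrative assumptions:

```python
THRESHOLDS = {          # illustrative preset limits
    "temperature_c": 35.0,
    "humidity_pct": 70.0,
    "current_a": 16.0,
}

def check_alarms(readings):
    """readings: dict like {"temperature_c": 36.2, ...}; returns alarm messages."""
    alarms = []
    for key, limit in THRESHOLDS.items():
        value = readings.get(key)
        if value is not None and value > limit:
            alarms.append(f"{key} = {value} exceeds preset limit {limit}")
    return alarms

# Example: over-temperature in the inspection area triggers one alarm message.
print(check_alarms({"temperature_c": 36.2, "humidity_pct": 55.0, "current_a": 10.0}))
```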
In some embodiments, the client alerts the inspection personnel by means of an acoustic signal and/or an optical signal, so that the inspection personnel can handle the fault in time, which improves fault-handling efficiency and avoids unnecessary losses.
It should be noted that, provided there is no logical contradiction, different embodiments of the present application may be combined with one another; the descriptions of different embodiments have different emphases, and for parts not described in detail, reference may be made to the descriptions of other embodiments.
According to the video inspection method provided by the embodiments of the present disclosure, the video inspection information obtained in the real scene is matched and combined with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment. Operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity; compared with on-site inspection, the 3D inspection video enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency. In addition, through position calibration, the inspection trajectory in the 3D roaming scene is kept consistent with the robot's inspection track, which improves the inspection experience.
In a second aspect, the present disclosure further provides a video inspection apparatus, which uses 3D roaming technology to simulate the real scene of an inspection area (such as a computer room), so as to realize automatic inspection of the inspection area.
FIG. 4 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure. As shown in FIG. 4, the video inspection apparatus 400 includes:
an acquisition module 401, which acquires video inspection information.
The video inspection information is video information obtained in the real scene of the inspection area, and the real scene refers to the real scene of the inspection area. For example, the inspection area is a computer room, and the video inspection information is video information obtained in the computer room.
In some embodiments, the video inspection information is video obtained by a camera or another device with a shooting function, for example, video obtained by using a camera to capture the electronic and electrical devices in the computer room.
a matching and combining module 402, which is used to match and combine the video inspection information with the 3D roaming scene to obtain a 3D inspection video.
The 3D roaming scene is a virtual scene obtained by simulating the real scene. A 3D model of the inspection area is built using 3D technology. When the inspection area is a computer room, a 3D virtual scene at the same scale is constructed according to the actual length x, width y, and height z of the computer room, and virtual devices are added at the corresponding positions in the 3D virtual scene according to the electronic and electrical equipment arranged in the computer room, so as to obtain the 3D roaming scene. The electronic and electrical equipment are the inspection objects, and monitoring their operating status is the main purpose of the inspection.
In some embodiments, the video inspection information is matched and combined with the 3D roaming scene so that the position of the video inspection information obtained in the real scene is consistent with the corresponding position in the 3D roaming scene, and the camera footage is thus synchronized with the 3D roaming.
a display module 403, which is used to display the 3D inspection video.
In some embodiments, the display module 403 may be a VR device such as VR glasses or a VR headset; alternatively, the display module 403 is a display screen, which can be viewed in combination with 3D glasses. The display module 403 can clearly present the status of the inspection objects in the inspection area.
According to the video inspection apparatus provided by the embodiments of the present disclosure, the acquisition module 401 acquires the video inspection information, and the matching and combining module matches and combines the video inspection information obtained in the real scene with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment. Operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video presented by the display module without going to the site, which reduces their labor intensity; moreover, compared with on-site inspection, the 3D inspection video enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency.
FIG. 5 is a schematic block diagram of an acquisition module provided by an embodiment of the present disclosure. As shown in FIG. 5, the acquisition module 500 includes a camera 501, a robot 502, and an actual inspection guide rail 503, and the video inspection apparatus further includes a control module 504, wherein:
the actual inspection guide rail 503 is set in the real scene of the inspection area and is used to provide the robot's running track; the robot 502 is used to carry the camera 501 and moves along the actual inspection guide rail 503; the camera 501 is mounted on the robot 502, and the robot 502 carries the camera 501 to move in the real scene in order to acquire the video inspection information; and the control module 504 is signal-connected to the robot 502 and the camera 501 and is used to control the running state of the robot 502 as well as the shooting position and angle of the camera 501.
In some embodiments, the actual inspection guide rail 503 may be a physical track or a track line. When the actual inspection guide rail 503 is a physical track, a component matching the track is provided on the robot 502; when the actual inspection guide rail 503 is a track line, a track-line recognition device is provided on the robot 502.
FIG. 6 is a schematic block diagram of a video inspection apparatus provided by an embodiment of the present disclosure. As shown in FIG. 6, the video inspection apparatus 600 includes an acquisition module 601, a matching and combining module 602, a control module 603, and a storage module 604, wherein the acquisition module 601 and the matching and combining module 602 are the same as the acquisition module 401 and the matching and combining module 402 described above and are not described again here.
The storage module 604 is used to store the video inspection information obtained by the camera. When the matching and combining module 602 needs video inspection information, it can obtain the video inspection information from the storage module 604.
It should be noted that the storage module 604 allows the matching and combining module 602 to obtain the video inspection information flexibly; for example, the matching and combining module can obtain the video inspection information of the current moment, or the video inspection information of any previous moment.
In some embodiments, the control module includes a calibration unit (not shown in the figure); the calibration unit obtains the robot's current actual position on the actual inspection track and the corresponding current inspection duration, determines the current actual position scale parameter based on the current actual position and the total length of the actual inspection track, and calibrates the inspection position in the 3D roaming scene based on the current actual position scale parameter and the current inspection duration.
The current actual position refers to the position of the robot relative to the starting point of the actual inspection track. The current inspection duration refers to the time it takes the robot to reach the current actual position, timed from the starting point of the actual inspection track.
The current actual position scale parameter is the ratio of the current actual position to the total length of the actual inspection track. For example, if the total length of the actual inspection track is 10 meters and the robot's current actual position is 1 meter, i.e., the distance from the starting point of the actual inspection track to the robot's actual position is 1 meter, then the current actual position scale parameter is 0.1.
The calibration unit determines the inspection position in the 3D roaming scene according to the current inspection duration, and then uses the current actual position scale parameter to adjust the inspection position in the 3D roaming scene, so that the inspection position in the 3D roaming scene is consistent with the actual position of the robot in the real scene.
To make the inspection position in the 3D roaming scene match the robot's actual position in the real scene more accurately, the inspection position in the 3D roaming scene can be calibrated over multiple inspection cycles; multiple position calibrations over multiple inspection cycles can ensure that the inspection position in the 3D roaming scene is synchronized with the robot's inspection.
In some embodiments, the storage module 604 includes a hard disk video recorder; the hard disk video recorder can be signal-connected to multiple cameras and includes at least one storage channel, with each storage channel storing the video inspection information obtained by one camera. The matching and combining module can obtain the video inspection information of the corresponding camera through its storage channel.
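A sketch of the camera-to-storage-channel mapping, and of how the matching and combining module could fetch either the latest clip or one from an earlier moment, as described above; the in-memory store is a purely illustrative stand-in for a real hard disk video recorder:

```python
import bisect

class RecorderStore:
    """Toy stand-in for a hard disk video recorder with one channel per camera."""
    def __init__(self):
        self._channels = {}      # camera_id -> list of (timestamp_s, clip_ref), in time order

    def record(self, camera_id, timestamp_s, clip_ref):
        self._channels.setdefault(camera_id, []).append((timestamp_s, clip_ref))

    def clip_at(self, camera_id, timestamp_s=None):
        """Return the latest clip, or the newest clip not later than timestamp_s."""
        clips = self._channels.get(camera_id, [])
        if not clips:
            return None
        if timestamp_s is None:
            return clips[-1][1]
        idx = bisect.bisect_right([t for t, _ in clips], timestamp_s) - 1
        return clips[idx][1] if idx >= 0 else None

store = RecorderStore()
store.record("camera-1", 10.0, "clip_0010.mp4")
store.record("camera-1", 20.0, "clip_0020.mp4")
print(store.clip_at("camera-1"))        # latest clip
print(store.clip_at("camera-1", 12.0))  # clip covering an earlier moment
```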
In some embodiments, the video inspection apparatus 600 further includes:
a switch 605, used to signal-connect the control module 603 and the storage module 604. When the storage module 604 is a hard disk video recorder, the control module 603 and the hard disk video recorder can be signal-connected through the switch 605; and
a router 606, used to signal-connect the robot to the control module 603, so that the control module 603 can control the robot's movement modes, such as forward, backward, ascend, and descend.
In some embodiments, the video inspection apparatus further includes an alarm module (not shown in the figure), which is used to acquire fault information of the inspection objects in the computer room and send the fault information to the display module for display.
In some embodiments, the client alerts the inspection personnel by means of an acoustic signal and/or an optical signal, so that the inspection personnel can handle the fault in time, which improves fault-handling efficiency and avoids unnecessary losses.
In some embodiments of the present disclosure, the functions of or modules included in the apparatus provided by the embodiments of the present disclosure may be used to perform the methods described in the above method embodiments; for their specific implementation and technical effects, reference may be made to the description of the above method embodiments, and for brevity, details are not repeated here.
According to the video inspection apparatus provided by the embodiments of the present disclosure, the video inspection information obtained in the real scene is matched and combined with the 3D roaming scene to obtain a 3D inspection video that simulates the real environment. Operation and maintenance personnel can learn the operating status of the inspection objects from the 3D inspection video without going to the site, which reduces their labor intensity; compared with on-site inspection, the 3D inspection video enables operation and maintenance personnel to monitor more inspection objects and improves inspection efficiency. In addition, through position calibration, the inspection trajectory in the 3D roaming scene is kept consistent with the robot's inspection track, which improves the inspection experience.
According to the embodiments of the present disclosure, the present disclosure further provides an electronic device, a readable storage medium, and a computer program product.
FIG. 7 shows a schematic block diagram of an example electronic device 700 that may be used to implement the embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are merely examples and are not intended to limit the implementations of the present disclosure described and/or claimed herein.
As shown in FIG. 7, the device 700 includes a computing unit 701, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 702 or a computer program loaded from a storage unit 708 into a random access memory (RAM) 703. The RAM 703 can also store various programs and data required for the operation of the device 700. The computing unit 701, the ROM 702, and the RAM 703 are connected to one another through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard or a mouse; an output unit 707 such as various types of displays and speakers; a storage unit 708 such as a magnetic disk or an optical disc; and a communication unit 709 such as a network card, a modem, or a wireless communication transceiver. The communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
The computing unit 701 may be any of various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, central processing units (CPUs), graphics processing units (GPUs), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, digital signal processors (DSPs), and any suitable processors, controllers, microcontrollers, and the like. The computing unit 701 performs the various methods and processes described above, such as the video inspection method. For example, in some embodiments, the video inspection method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the video inspection method described above can be performed. Alternatively, in other embodiments, the computing unit 701 may be configured to perform the video inspection method by any other suitable means (for example, by means of firmware).
Various implementations of the systems and techniques described herein above may be implemented in digital electronic circuit systems, integrated circuit systems, field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor and which may receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of machine-readable storage media include electrical connections based on one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fibers, compact disc read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
To provide interaction with a user, the systems and techniques described herein may be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which the user can provide input to the computer. Other kinds of devices may also be used to provide interaction with the user; for example, the feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form (including acoustic input, speech input, or tactile input).
The systems and techniques described herein may be implemented in a computing system that includes back-end components (e.g., as a data server), or a computing system that includes middleware components (e.g., an application server), or a computing system that includes front-end components (e.g., a user computer with a graphical user interface or a web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end components, middleware components, or front-end components. The components of the system may be interconnected by digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), and the Internet.
A computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The client-server relationship arises by virtue of computer programs running on the respective computers and having a client-server relationship with each other.
According to the embodiments of the present disclosure, the present disclosure further provides a computer program product, including a computer program, which, when executed by a processor, implements any one of the above video inspection methods.
It should be understood that steps may be reordered, added, or deleted using the various forms of processes shown above. For example, the steps described in the present disclosure may be performed in parallel, sequentially, or in a different order, as long as the desired result of the technical solution disclosed in the present disclosure can be achieved, and no limitation is imposed herein.
The above specific embodiments do not constitute a limitation on the protection scope of the present disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and substitutions may be made according to design requirements and other factors. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present disclosure shall be included within the protection scope of the present disclosure.

Claims (16)

  1. 一种视频巡检方法,其特征在于,包括:
    获取视频巡检信息;其中,所述视频巡检信息是在真实场景中获取的视频信息;
    将所述视频巡检信息与3D漫游场景匹配组合,获得3D巡检视频;其中,所述3D漫游场景是模拟所述真实场景获得的虚拟场景;
    展示所述3D巡检视频。
  2. 根据权利要求1所述的方法,其特征在于,所述获取视频巡检信息,包括:
    通过所述真实场景中移动的摄像头获取所述视频巡检信息,所述摄像头设置在机器人上,所述机器人在实际巡检轨道上移动。
  3. 根据权利要求2所述的方法,其特征在于,所述将所述视频巡检信息与3D漫游场景匹配组合,获得3D巡检视频之前,还包括:
    基于所述真实场景获得所述3D漫游场景;
    根据所述实际巡检导轨在所述3D漫游场景中模拟出3D漫游巡检轨迹;
    基于所述机器人的当前实际位置和当前巡检时长对所述3D漫游场景中的虚拟位置进行校准;其中,所述当前巡检时长是所述机器人从所述实际巡检轨道的初始位置到达所述当前实际位置所花费的时长。
  4. 根据权利要求3所述的方法,其特征在于,所述3D漫游场景中的巡检位置是依据所述机器人在所述实际巡检轨道上的巡检总时长与所述当前巡检时长确定。
  5. 根据权利要求3所述的方法,其特征在于,所述基于所述真实场景中所述机器人的当前实际位置和当前巡检时长对所述3D漫游场景中的巡检位置进行校准,包括:
    获取所述机器人在所述实际巡检轨道上的当前实际位置和对应的当前巡检时长;
    基于所述当前实际位置和所述实际巡检轨道的总长度确定当前实际位置比例参数;
    基于所述当前实际位置比例参数和所述当前巡检时长对所述3D漫游场景中的巡检位置进行校准。
  6. 根据权利要求5所述的方法,其特征在于,所述获取所述机器人在所述实际巡检轨道上的当前实际位置和对应的当前巡检时长,包括:
    在一个巡检周期内,按照固定频率获取所述机器人在所述实际巡检轨道上的当前实际位置和对应的当前巡检时长。
  7. 根据权利要求3所述的方法,其特征在于,按照等比例模拟所述真实场景获得所述3D漫游场景。
  8. 根据权利要求1-7任意一项所述的方法,其特征在于,还包括:
    获取告警信息,并将所述告警信息发送至客户端,以供所述客户端发出告警提醒。
  9. A video patrol apparatus, comprising:
    an acquisition module configured to acquire video patrol information, wherein the video patrol information is video information acquired in a real scene of a patrol area;
    a matching and combining module configured to match and combine the video patrol information with a 3D roaming scene to obtain a 3D patrol video, wherein the 3D roaming scene is a virtual scene obtained by simulating the real scene; and
    a display module configured to display the 3D patrol video.
  10. The apparatus according to claim 9, wherein the acquisition module comprises a camera, a robot, an actual patrol rail and a control module, wherein
    the actual patrol rail is arranged in the real scene of the patrol area and is configured to provide a running trajectory for the robot;
    the robot moves along the actual patrol rail;
    the camera is arranged on the robot, and the robot carries the camera to move in the real scene so as to acquire the video patrol information; and
    the control module is in signal connection with the robot and the camera, and is configured to control a running state of the robot as well as a shooting position and angle of the camera.
  11. The apparatus according to claim 10, further comprising a storage module configured to store the video patrol information acquired by the camera.
  12. The apparatus according to claim 11, wherein the storage module comprises a hard disk video recorder, the hard disk video recorder comprises at least one storage channel, and each storage channel is configured to store the video patrol information acquired by one camera.
  13. The apparatus according to claim 12, further comprising:
    a switch configured to provide a signal connection between the control module and the hard disk video recorder; and
    a router configured to provide a signal connection between the robot and the control module.
  14. The apparatus according to any one of claims 9-13, further comprising:
    an alarm module configured to acquire fault information of an object under patrol in an equipment room, and to send the fault information to the display module for display.
  15. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor, wherein
    the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method according to any one of claims 1-8.
  16. A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are configured to cause a computer to perform the method according to any one of claims 1-8.
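As an illustrative, non-limiting example of the calibration recited in claims 3 to 6, the following Python sketch computes the time-based position of claim 4 and the position-ratio calibration of claim 5, sampled at a fixed frequency as in claim 6. All identifiers and the numeric values are assumptions introduced for this example only and do not define or limit the claims.

```python
"""Illustrative sketch of the virtual-position calibration of claims 3-6 (assumed names/values)."""

from dataclasses import dataclass

@dataclass
class PatrolSample:
    current_position_m: float   # current actual position of the robot on the actual patrol track
    current_duration_s: float   # current patrol duration since the initial position of the track

def time_based_position(current_duration_s: float,
                        total_patrol_duration_s: float,
                        roaming_track_length: float) -> float:
    """Claim 4: patrol position in the 3D roaming scene estimated from the ratio of the
    current patrol duration to the total patrol duration."""
    return (current_duration_s / total_patrol_duration_s) * roaming_track_length

def calibrated_position(sample: PatrolSample,
                        total_track_length_m: float,
                        roaming_track_length: float) -> float:
    """Claim 5: derive the current actual position ratio parameter from the actual position
    and the total track length, and use it to calibrate the position in the roaming scene."""
    ratio = sample.current_position_m / total_track_length_m
    return ratio * roaming_track_length

if __name__ == "__main__":
    # Assumed values: a 50 m track, a 600 s patrol cycle, roaming trajectory normalized to 1.0.
    total_track_m, total_duration_s, roam_len = 50.0, 600.0, 1.0
    # Claim 6: such samples would be taken at a fixed frequency within one patrol cycle.
    sample = PatrolSample(current_position_m=12.5, current_duration_s=160.0)
    print(time_based_position(sample.current_duration_s, total_duration_s, roam_len))  # ~0.267
    print(calibrated_position(sample, total_track_m, roam_len))                        # 0.25
```

In this reading, the time-based estimate drives the motion of the roaming viewpoint between samples, while the position-ratio value re-seats the viewpoint at each sample so that the virtual position stays aligned with the robot's actual position.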
PCT/CN2021/122239 2020-11-05 2021-09-30 视频巡检方法、装置、电子设备及可读介质 WO2022095651A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/030,806 US20230377277A1 (en) 2020-11-05 2021-09-30 Video patrol method and device, electronic device, and readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011222345.3 2020-11-05
CN202011222345.3A CN112399145A (zh) 2020-11-05 2020-11-05 一种机房3d漫游与视频巡检同步系统和方法

Publications (1)

Publication Number Publication Date
WO2022095651A1 true WO2022095651A1 (zh) 2022-05-12

Family

ID=74597378

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/122239 WO2022095651A1 (zh) 2020-11-05 2021-09-30 视频巡检方法、装置、电子设备及可读介质

Country Status (3)

Country Link
US (1) US20230377277A1 (zh)
CN (1) CN112399145A (zh)
WO (1) WO2022095651A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112399145A (zh) * 2020-11-05 2021-02-23 通号通信信息集团有限公司 一种机房3d漫游与视频巡检同步系统和方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104809758B (zh) * 2015-05-08 2018-11-23 山东康威通信技术股份有限公司 基于三维实景漫游技术的现场隧道巡检及设备控制方法
CN106647586B (zh) * 2017-01-20 2019-07-19 重庆邮电大学 一种基于b/s架构的虚拟机房可视化监控管理系统及方法
CN108388194A (zh) * 2018-03-27 2018-08-10 中铁第勘察设计院集团有限公司 铁路机房智能机器人巡检系统及其方法
CN109531533B (zh) * 2018-11-30 2019-11-05 北京海益同展信息科技有限公司 一种机房巡检系统及其工作方法
CN110796727A (zh) * 2019-09-17 2020-02-14 国网天津市电力公司 基于虚拟现实技术的机房远程全景监控方法
CN110722559A (zh) * 2019-10-25 2020-01-24 国网山东省电力公司信息通信公司 一种智能巡检机器人辅助巡检定位方法
CN111476921B (zh) * 2020-04-10 2021-03-30 宁波思高信通科技有限公司 一种机房智能巡检系统

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160225179A1 (en) * 2015-01-29 2016-08-04 Institute Of Environmental Science And Research Limited Three-dimensional visualization of a scene or environment
CN109066416A (zh) * 2018-08-09 2018-12-21 深圳供电局有限公司 一种基于vr的变电站巡检系统及方法
CN110290350A (zh) * 2019-06-26 2019-09-27 广东康云科技有限公司 一种巡检机器人的实时状态监控方法、系统和存储介质
CN110400387A (zh) * 2019-06-26 2019-11-01 广东康云科技有限公司 一种基于变电站的联合巡检方法、系统和存储介质
CN110417120A (zh) * 2019-06-26 2019-11-05 广东康云科技有限公司 一种变电站实景三维智能巡检系统及方法
CN111798572A (zh) * 2020-06-12 2020-10-20 广东电网有限责任公司揭阳供电局 一种机房虚拟巡检方法及系统
CN112399145A (zh) * 2020-11-05 2021-02-23 通号通信信息集团有限公司 一种机房3d漫游与视频巡检同步系统和方法

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115278063A (zh) * 2022-07-08 2022-11-01 深圳市施罗德工业集团有限公司 一种巡检方法、巡检装置及巡检机器人
CN115225867A (zh) * 2022-09-16 2022-10-21 苏州万店掌网络科技有限公司 一种门店远程巡检方法、装置、设备及介质
CN115225867B (zh) * 2022-09-16 2022-12-09 苏州万店掌网络科技有限公司 一种门店远程巡检方法、装置、设备及介质
CN115240289A (zh) * 2022-09-20 2022-10-25 泰豪信息技术有限公司 监所巡更人员巡更监测方法、装置、存储介质及电子设备
CN115240289B (zh) * 2022-09-20 2022-12-20 泰豪信息技术有限公司 监所巡更人员巡更监测方法、装置、存储介质及电子设备
CN115793673A (zh) * 2023-01-10 2023-03-14 北京飞渡科技股份有限公司 一种基于vr技术的天然气场站机器人巡检方法和装置
CN116796008A (zh) * 2023-08-15 2023-09-22 北京安录国际技术有限公司 一种基于知识图谱的运维分析管理系统以及方法
CN116796008B (zh) * 2023-08-15 2024-02-13 北京安录国际技术有限公司 一种基于知识图谱的运维分析管理系统以及方法
CN116972855A (zh) * 2023-09-22 2023-10-31 北京洛斯达科技发展有限公司 基于bim的选煤厂设备三维漫游巡检方法及系统
CN116972855B (zh) * 2023-09-22 2023-12-26 北京洛斯达科技发展有限公司 基于bim的选煤厂设备三维漫游巡检方法及系统

Also Published As

Publication number Publication date
CN112399145A (zh) 2021-02-23
US20230377277A1 (en) 2023-11-23

Similar Documents

Publication Publication Date Title
WO2022095651A1 (zh) 视频巡检方法、装置、电子设备及可读介质
US10789771B2 (en) Method and apparatus for fusing point cloud data
CN110162083B (zh) 利用无人机执行三维结构检查及维护任务的方法和系统
CN107291879B (zh) 一种虚拟现实系统中三维环境地图的可视化方法
US11557083B2 (en) Photography-based 3D modeling system and method, and automatic 3D modeling apparatus and method
CN114140528A (zh) 数据标注方法、装置、计算机设备及存储介质
CN112530021B (zh) 用于处理数据的方法、装置、设备以及存储介质
CN113485156A (zh) 一种变压器数字孪生云平台及其实现方法
CN107515891A (zh) 一种机器人地图制作方法、装置和存储介质
CN110796738B (zh) 巡检设备状态跟踪的三维可视化方法及装置
WO2022135138A1 (zh) 一种机器人任务部署方法、系统、设备和存储介质
CN108259858B (zh) 变电站场景与设备的监控方法及装置
CN111578951B (zh) 一种自动驾驶中用于生成信息的方法和装置
KR102606423B1 (ko) 교통량 모니터링 시스템의 테스트 방법, 장치 및 기기
CN114693785A (zh) 一种目标定位的方法、系统及相关设备
CN103634571A (zh) 基于移动客户端的不规则多边形监控预警区域划定方法
EP4336385A1 (en) Method and apparatus for updating target detection model
CN103714199A (zh) 目标运动特性图像仿真输出系统
WO2019069436A1 (ja) 監視装置、監視システムおよび監視方法
CN116661477A (zh) 一种变电站无人机巡检方法、装置、设备及存储介质
CN111998959A (zh) 基于实时测温系统的温度校准方法、装置及存储介质
CN108665522A (zh) 一种高帧频短延时动态场景仿真生成系统和方法
CN113610702B (zh) 一种建图方法、装置、电子设备及存储介质
CN109931889A (zh) 基于图像识别技术的偏差检测系统及方法
Lam et al. Bluetooth mesh networking: An enabler of smart factory connectivity and management

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21888348

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21888348

Country of ref document: EP

Kind code of ref document: A1