CN118034343A - Inspection method and system for remote unmanned machine room equipment

Info

Publication number: CN118034343A
Application number: CN202410218460.5A
Authority: CN (China)
Prior art keywords: equipment, result, image, patrolled, image recognition
Legal status: Pending (assumed; not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 王浩, 郦俊岭, 郭炜
Current and original assignee: Shenzhen Blue Yidian Technology Co., Ltd.
Priority and filing date: 2024-02-28
Publication date: 2024-05-14
Classification (landscape): Management, Administration, Business Operations System, And Electronic Commerce

Abstract

The invention relates to an inspection method and system for remote unmanned machine room equipment. The method comprises: acquiring state data and images of equipment to be patrolled in a remote unmanned machine room, the display content of the images comprising the equipment to be patrolled and its surrounding environment; processing the state data through a device state prediction model to obtain a state prediction result of the equipment to be patrolled; processing the images through an image recognition model to obtain an image recognition result of the equipment to be patrolled; generating a path planning result for the inspection robot based on the state prediction result and the image recognition result; controlling the inspection robot to inspect the equipment to be patrolled along the path of the path planning result to obtain a corresponding inspection result; and, based on the inspection result, reporting abnormal equipment among the plurality of devices to be patrolled and sending a remote operation request to the management center of the abnormal equipment. The invention can automatically inspect remote unmanned machine room equipment and improves inspection efficiency and accuracy.

Description

Inspection method and system for remote unmanned machine room equipment
Technical Field
The invention relates to the technical field of intelligent inspection, and in particular to an inspection method and system for remote unmanned machine room equipment, an electronic device, and a non-transitory computer-readable storage medium.
Background
A data center transmits, accelerates, displays, computes, and stores data information. In the existing inspection mode, operation and maintenance personnel manually check the temperature and humidity of the machine room and the state of the equipment.
However, because data-center machine rooms are large and house numerous devices, the inspection task is time-consuming and error-prone. In addition, parameters differ greatly between devices, so operation and maintenance personnel need extensive knowledge and must frequently consult manuals, which greatly increases the complexity of inspection work.
Therefore, the existing inspection mode for machine room equipment is inefficient and error-prone.
Disclosure of Invention
Aiming at the above technical problems in the prior art, the invention provides an inspection method, a system, an electronic device, and a non-transitory computer-readable storage medium, which realize automatic inspection of remote unmanned machine room equipment with high inspection efficiency and high accuracy.
The technical scheme for solving the technical problems is as follows:
the invention provides a patrol method of remote unmanned machine room equipment, which comprises the following steps:
acquiring state data and images of equipment to be patrolled in a remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
Processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
Generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
Controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
Based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
Optionally, the processing the state data through the device state prediction model to obtain a state prediction result of the device to be patrolled includes:
Performing coding processing on the state data to obtain corresponding input characteristics;
Acquiring a first weight corresponding to each input feature;
the equipment state prediction model obtains the state prediction result based on the input features and the first weights; the expression of the state prediction result is:
s = σ( Σ_{i=1}^{n} ω_i·x_i + b );
where n is the number of input features; x_i is the i-th input feature; ω_i is the first weight associated with the i-th input feature; b is a bias term; σ is the first activation function.
Optionally, the processing the image through the image recognition model to obtain an image recognition result of the device to be patrolled includes:
Performing coding processing and feature extraction processing on each image to obtain a corresponding input image;
Carrying out convolution processing on the input image by combining the second weight of each convolution kernel through the convolution kernel of the image recognition model to obtain a corresponding image recognition result; each input image corresponds to each convolution kernel one by one;
the expression of the image recognition result is as follows:
r = g( Σ_{i=1}^{o} Conv(X_i)·W_i + b );
where o is the number of convolution kernels; X_i is the i-th channel or feature map of the input image; W_i is the second weight associated with the i-th convolution kernel; b is a bias term; Σ_{i=1}^{o} Conv(X_i)·W_i + b represents the linear combination obtained by convolving each convolution kernel with its corresponding input feature map and adding the bias term; g is the second activation function.
Optionally, the generating a path planning result for the inspection robot based on the state prediction result and the image recognition result includes:
Determining target equipment and key areas for patrol of the patrol robot according to the state prediction result and the image recognition result;
determining a distance between the current position of the inspection robot and the target equipment, a terrain and obstacle influence factor and an equipment layout influence factor according to the image recognition result;
determining, according to the distance, the terrain and obstacle influence factor, and the device layout influence factor, a first duration for reaching the target device, and planning an initial path that satisfies the first duration;
and optimizing the initial path according to the state prediction result and the image recognition result to obtain the path planning result.
Optionally, the expression of the first duration is:
T = (d/v)·(1/f)·(1/l)·(1/s)·(1/r);
wherein T is the first duration; d is the distance between the current position of the inspection robot and the target device; v is the travel speed of the inspection robot; f is the terrain and obstacle influence factor; l is the device layout influence factor; s is the state prediction result; r is the image recognition result.
Optionally, the terrain and obstacle influence factor is expressed as:
f = ( Σ_{i=1}^{m} a_i ) / ( m · max(a_1, a_2, …, a_m) );
where m is the number of target terrains or obstacles; a_i is the influence factor of each target terrain or obstacle; Σ_{i=1}^{m} a_i represents the sum of the influence factors of all target terrains or obstacles; max(a_1, a_2, …, a_m) represents the largest influence factor among all target terrains or obstacles;
the device layout influence factor is expressed as:
l = ( Σ_{i=1}^{k} b_i ) / ( k · max(b_1, b_2, …, b_k) );
wherein k is the number of devices to be patrolled; b_i is the layout influence factor of each device to be patrolled; Σ_{i=1}^{k} b_i represents the sum of the influence factors of all device layouts; max(b_1, b_2, …, b_k) represents the largest influence factor among all device layouts.
Optionally, the controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result includes:
sending the path planning result to the inspection robot;
controlling the inspection robot to detect the equipment to be inspected in real time to obtain corresponding inspection data;
And analyzing the patrol data to obtain the patrol result.
Optionally, the method further comprises:
Optimizing parameters of the equipment state prediction model and parameters of the image recognition model according to the inspection result to obtain an updated state prediction model and an updated image recognition model;
Processing the state data according to the updated state prediction model to obtain an updated state prediction result;
processing the image according to the updated image recognition model to obtain an updated image recognition result;
And updating the path planning result according to the updated state prediction result and the updated image recognition result to obtain an updated path planning result.
Optionally, the obtaining the state data and the image of the equipment to be patrolled in the remote unmanned machine room includes:
Acquiring first state data through a sensor arranged on the equipment to be patrolled, and acquiring second state data through a remote management tool installed in the equipment to be patrolled;
Acquiring a first image of the equipment to be patrolled and a second image of the surrounding environment of the equipment to be patrolled through image acquisition equipment arranged in the remote unmanned machine room;
Acquiring a set of the first state data and the second state data to obtain the state data, and acquiring a set of the first image and the second image to obtain the image.
The invention also provides a patrol system of the remote unmanned machine room equipment, which comprises:
the data acquisition module is used for acquiring state data and images of equipment to be patrolled in the remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
The state prediction module is used for processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
The image recognition module is used for processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
the path planning module is used for generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
the automatic inspection module is used for controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
And the patrol processing module is used for reporting, based on the patrol result, abnormal equipment with abnormality among the plurality of devices to be patrolled, and sending a remote operation request to a management center of the abnormal equipment.
In addition, to achieve the above object, the present invention also proposes an electronic device including: a memory for storing a computer software program; and the processor is used for reading and executing the computer software program so as to realize the inspection method of the remote unmanned machine room equipment.
In addition, in order to achieve the above object, the present invention also proposes a non-transitory computer readable storage medium, in which a computer software program is stored, which when executed by a processor, implements a method for patrolling a remote unmanned room device as described above.
The beneficial effects of the invention are as follows:
(1) According to the invention, equipment state prediction and image recognition are carried out on equipment in the machine room through a machine learning algorithm, so that the inspection robot can automatically detect and recognize equipment states and environmental conditions without manual intervention, and the inspection efficiency is greatly improved;
(2) Compared with the traditional manual inspection mode, the robot inspection system can continuously work without being limited by manpower, and can perform remote monitoring and operation when needed, so that the cost of manpower and time is reduced;
(3) According to the invention, the equipment state prediction and the image recognition are carried out through the machine learning algorithm, so that the equipment state and the environment condition can be more accurately recognized, misjudgment and omission caused by human factors are avoided, and the monitoring accuracy and reliability are improved;
(4) The inspection robot can monitor the equipment state and the environment condition in real time, and can immediately take corresponding measures, such as sending an alarm or remotely operating equipment, when abnormal conditions are found, so that problems can be timely handled, and potential losses are reduced;
(5) According to the invention, by combining the equipment state prediction, the image recognition and the path planning algorithm, the tour path can be optimized, so that the robot can reach the target equipment in the shortest time and the optimal path, and the tour efficiency and the tour accuracy are further improved;
In conclusion, the invention can realize automatic inspection of the remote unmanned machine room equipment, obviously improve inspection efficiency, accuracy and response speed of the machine room equipment, and reduce inspection cost, thereby bringing a plurality of benefits to the operation and maintenance of the data center.
Drawings
Fig. 1 is a scene diagram of a patrol method of a remote unmanned machine room device provided by the invention;
Fig. 2 is a flowchart of a method for patrolling a remote unmanned machine room device provided by the invention;
fig. 3 is a schematic structural diagram of a patrol system of the remote unmanned machine room equipment provided by the invention;
fig. 4 is a schematic hardware structure of one possible electronic device according to the present invention;
Fig. 5 is a schematic hardware structure of a possible computer readable storage medium according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the description of the present invention, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present invention, the term "for example" is used to mean "serving as an example, instance, or illustration." Any embodiment described as "for example" in this disclosure is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the invention. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and processes have not been described in detail so as not to obscure the description of the invention with unnecessary detail. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Referring to fig. 1, fig. 1 is a schematic diagram of a method for inspecting a remote unmanned machine room device according to the present invention. As shown in fig. 1, the terminal and the server are connected through a network, for example, a wired or wireless network connection. The terminal may include, but is not limited to, mobile terminals such as mobile phones and tablets, and fixed terminals such as computers, inquiry machines and advertising machines, where applications of various network platforms are installed. The server provides various business services for the user, including a service push server, a user recommendation server and the like.
It should be noted that the scene diagram of the inspection method of the remote unmanned machine room device shown in fig. 1 is only an example. The terminal, the server, and the application scenario described in the embodiments of the present invention are intended to describe the technical solution of the embodiments more clearly and do not limit the technical solution provided by the embodiments; as a person of ordinary skill in the art will appreciate, with the evolution of the system and the appearance of new service scenarios, the technical solution provided by the embodiments of the present invention is equally applicable to similar technical problems.
Wherein the terminal may be configured to:
acquiring state data and images of equipment to be patrolled in a remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
Processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
Generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
Controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
Based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
Referring to fig. 2, a flowchart of a method for patrolling a remote unmanned machine room device according to the present invention is provided, including the following steps:
step 201, acquiring state data and images of equipment to be patrolled in a remote unmanned machine room.
In some embodiments, the display content of the image may include the equipment to be patrolled and the surrounding environment of the equipment to be patrolled.
In some embodiments, step 201 may include:
Acquiring first state data through a sensor arranged on the equipment to be patrolled, and acquiring second state data through a remote management tool installed in the equipment to be patrolled;
Acquiring a first image of the equipment to be patrolled and a second image of the surrounding environment of the equipment to be patrolled through image acquisition equipment arranged in the remote unmanned machine room;
Acquiring a set of the first state data and the second state data to obtain the state data, and acquiring a set of the first image and the second image to obtain the image.
In some embodiments, various sensors, such as temperature sensors, humidity sensors, and pressure sensors, may be disposed on the equipment to be patrolled for monitoring its state parameters in real time. These sensors periodically collect status data of the device, such as temperature, humidity, and pressure, and then send the data to a central data center or cloud server for storage and processing.
In some embodiments, a remote management tool may be installed within the device to be patrolled for remotely connecting to the device and querying its status information. Accordingly, the remote management tool may obtain information about the operating state of the device, the system log, the software version, and so on, and transmit this information to the central data center for storage and processing.
In some embodiments, a camera or other image acquisition device may be installed in the remote unmanned machine room for real-time monitoring of the operational status and surrounding environment of the equipment to be patrolled. Correspondingly, the image acquisition device can periodically capture images of the device and transmit the image data to the central data center for storage and processing.
In some embodiments, the first state data and the second state data acquired by the sensors and the remote management tool may be aggregated to form complete device state data. This device state data set may include various state parameters of the device, such as temperature, humidity, pressure, operating state, and system log information.
In some embodiments, the first image of the device to be patrolled acquired by the image acquisition device may be aggregated with the second image of the surrounding environment to form complete image data. This image set may include real-time images of the device and views of the surrounding environment for subsequent image recognition and analysis tasks.
In summary, the present invention can acquire the state data and images of the equipment to be patrolled through the sensors disposed on the equipment, the remote management tool installed in the equipment, and the image acquisition equipment in the remote unmanned machine room.
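As an illustrative aid (not part of the patent), the aggregation described above can be sketched as follows; all class names, field names, and example values are hypothetical:

```python
# Illustrative sketch only: one way to aggregate the first/second state data
# and the two image sources from step 201 into a single per-device record.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DeviceSnapshot:
    device_id: str
    state: Dict[str, float] = field(default_factory=dict)  # e.g. temperature, humidity
    images: List[bytes] = field(default_factory=list)      # raw frames from cameras

def aggregate(first_state: Dict[str, float], second_state: Dict[str, float],
              first_image: bytes, second_image: bytes, device_id: str) -> DeviceSnapshot:
    """Merge sensor readings with remote-management readings, and the device image
    with the surrounding-environment image, into one snapshot for later steps."""
    merged_state = {**first_state, **second_state}  # union of the two state sources
    return DeviceSnapshot(device_id, merged_state, [first_image, second_image])

snap = aggregate({"temperature_c": 38.5, "humidity_pct": 45.0},  # from sensors
                 {"cpu_load": 0.72, "uptime_h": 1200.0},         # from remote tool
                 b"<jpeg bytes>", b"<jpeg bytes>",
                 device_id="rack-07/server-03")
```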
And 202, processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled.
In some embodiments, step 202 may include:
Performing coding processing on the state data to obtain corresponding input characteristics;
Acquiring a first weight corresponding to each input feature;
the equipment state prediction model obtains the state prediction result based on the input characteristics and the first weight; the expression of the state prediction result is:
Where n is the number of input features; x i is the ith input feature; omega i is the first weight associated with the ith input feature; b is a bias term; sigma is the first activation function.
In a specific implementation, the state data acquired from the sensors arranged on the equipment to be patrolled may be encoded to obtain the corresponding input features. This may involve normalizing, standardizing, or otherwise feature-converting the state data to fit the input requirements of the state prediction model. Then, a first weight corresponding to each input feature is acquired. These weights are learned by a machine learning algorithm during training and measure the importance or impact of each input feature on the state prediction. Next, the state prediction result is obtained by the device state prediction model based on the encoded input features and the corresponding first weights.
Here, Σ_{i=1}^{n} ω_i·x_i + b is the linear combination of all input features and their corresponding weights, plus the bias term. This corresponds to a linear model, but introducing the activation function σ enables the model to learn nonlinear relationships. In a classification task, σ is typically the Sigmoid function, which is defined as follows:
σ(z) = 1 / (1 + e^(−z));
where z is the result of the linear combination. The output of the Sigmoid function lies in (0, 1), so it maps any real value to a probability value, which makes it very common in classification problems.
Thus, the formula s = σ( Σ_{i=1}^{n} ω_i·x_i + b ) represents the predicted probability value s obtained by linearly combining the input features with the weights and bias and then applying the Sigmoid activation function; this value represents the probability that the sample belongs to the positive class.
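To make the computation concrete, here is a minimal sketch of this prediction step, assuming illustrative feature values and already-trained weights (all numbers are invented):

```python
# Minimal sketch of the linear-plus-Sigmoid state prediction described above.
import numpy as np

def sigmoid(z: float) -> float:
    """sigma(z) = 1 / (1 + e^-z), mapping any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_state(x: np.ndarray, w: np.ndarray, b: float) -> float:
    """s = sigma(sum_i w_i * x_i + b): probability of the positive class."""
    return sigmoid(float(np.dot(w, x) + b))

x = np.array([0.8, 0.3, 0.5])    # encoded features, e.g. scaled temperature/humidity/load
w = np.array([1.2, -0.4, 0.9])   # first weights (learned during training)
s = predict_state(x, w, b=-0.5)  # ~0.69: device fairly likely in the positive class
```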
And 203, processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled.
In some embodiments, step 203 may comprise:
Performing coding processing and feature extraction processing on each image to obtain a corresponding input image;
Carrying out convolution processing on the input image by combining the second weight of each convolution kernel through the convolution kernel of the image recognition model to obtain a corresponding image recognition result; each input image corresponds to each convolution kernel one by one;
the expression of the image recognition result is as follows:
r = g( Σ_{i=1}^{o} Conv(X_i)·W_i + b );
where o is the number of convolution kernels; X_i is the i-th channel or feature map of the input image; W_i is the second weight associated with the i-th convolution kernel; b is a bias term; Σ_{i=1}^{o} Conv(X_i)·W_i + b represents the linear combination obtained by convolving each convolution kernel with its corresponding input feature map and adding the bias term; g is the second activation function.
In a specific implementation, the encoding process and the feature extraction process may be performed on each image to obtain a corresponding input image. This includes pre-processing the original image, such as resizing, cropping, normalizing, etc., and then extracting a feature representation of the image using an image recognition model (e.g., convolutional neural network).
Then, convolution kernels of the image recognition model can be utilized, and convolution processing is carried out on the input image by combining the second weight of each convolution kernel, so that a corresponding image recognition result is obtained. Each input image corresponds to each convolution kernel one by one, a series of feature images (also called convolution features) are obtained through convolution operation, and abstract representations of the images in different feature spaces are represented.
Here, Conv(X_i) represents the convolution operation on the input feature map X_i, which is one of the core operations in a CNN. The convolution operation multiplies the input feature map element-wise with a convolution kernel (or filter) and sums the products to obtain an output feature map. This makes it possible to extract local features from the image and to learn different features through multiple convolution kernels.
Σ_{i=1}^{o} Conv(X_i)·W_i + b represents the linear combination of all convolution kernels with their corresponding input feature maps, plus a bias term. This corresponds to a linear model, but introducing the activation function g makes it possible to learn nonlinear relationships. In the image recognition task, the activation function g typically used is the ReLU (Rectified Linear Unit) function, which is defined as follows:
g(z)=max(0,z);
The ReLU function sets the portion of the input z that is less than zero to zero and leaves the portion greater than or equal to zero unchanged, which increases the nonlinear expression capacity of the model without introducing excessive computational cost.
Thus, the formula r = g( Σ_{i=1}^{o} Conv(X_i)·W_i + b ) represents the output feature map r obtained by convolving the input feature maps with a plurality of convolution kernels and their corresponding weights and then applying the ReLU activation function.
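As a concrete, NumPy-only sketch of this convolution-plus-ReLU step (shapes, kernels, and weights are invented for illustration; a real system would use a CNN framework):

```python
# Each input channel X_i is convolved with one kernel, scaled by its second
# weight W_i, summed with a bias, then passed through ReLU.
import numpy as np

def conv2d_valid(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """'Valid' 2D cross-correlation of one channel x with one kernel k."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def recognize(channels, kernels, weights, b):
    """r = g(sum_i Conv(X_i) * W_i + b) with g = ReLU."""
    z = sum(w * conv2d_valid(x, k) for x, k, w in zip(channels, kernels, weights)) + b
    return np.maximum(0.0, z)  # ReLU: g(z) = max(0, z), applied element-wise

rng = np.random.default_rng(0)
channels = [rng.standard_normal((8, 8)) for _ in range(3)]  # o = 3 input channels
kernels = [rng.standard_normal((3, 3)) for _ in range(3)]   # one kernel per channel
r = recognize(channels, kernels, weights=[0.5, 1.0, -0.3], b=0.1)  # 6x6 feature map
```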
And 204, generating a path planning result for the inspection robot based on the state prediction result and the image recognition result.
In some embodiments, step 204 may include:
Determining target equipment and key areas for patrol of the patrol robot according to the state prediction result and the image recognition result;
determining a distance between the current position of the inspection robot and the target equipment, a terrain and obstacle influence factor and an equipment layout influence factor according to the image recognition result;
determining, according to the distance, the terrain and obstacle influence factor, and the device layout influence factor, a first duration for reaching the target device, and planning an initial path that satisfies the first duration;
and optimizing the initial path according to the state prediction result and the image recognition result to obtain the path planning result.
In some embodiments, the target device and the key area of the patrol robot may be determined according to the state prediction result and the image recognition result. The state prediction results may indicate which devices have problems or need attention, while the image recognition results may help determine areas or devices that need particular attention.
In some embodiments, a distance between the current position of the inspection robot and the target device, as well as terrain and obstacle influencing factors, device layout influencing factors, which may influence the travel speed and path selection of the inspection robot, may be determined from the image recognition result.
In some embodiments, the expression of the first duration is:
T = (d/v)·(1/f)·(1/l)·(1/s)·(1/r);
wherein T is the first duration; d is the distance between the current position of the inspection robot and the target device; v is the travel speed of the inspection robot; f is the terrain and obstacle influence factor; l is the device layout influence factor; s is the state prediction result; r is the image recognition result.
In a specific implementation, the first duration may be calculated by dividing the linear distance d by the travel speed v of the robot and then multiplying by the reciprocal of each influence factor, so that the influence of each factor on the robot's travel time is taken into account.
For example, f represents the influence factor of terrain and obstacles, l represents the influence factor of the device layout, s represents the influence factor of the device state prediction, and r represents the influence factor of the image recognition result; each of these factors scales, through its reciprocal, the time T for the robot to reach the target device.
The formula comprehensively considers the influence of various factors on the travel of the robot, can help optimize path planning, and improves the efficiency of the robot reaching the target equipment.
In some embodiments, the terrain and obstacle influence factor may be expressed as:
f = ( Σ_{i=1}^{m} a_i ) / ( m · max(a_1, a_2, …, a_m) );
where m is the number of target terrains or obstacles; a_i is the influence factor of each target terrain or obstacle; Σ_{i=1}^{m} a_i represents the sum of the influence factors of all target terrains or obstacles; max(a_1, a_2, …, a_m) represents the largest influence factor among all target terrains or obstacles.
In a specific implementation, the calculation formula of the terrain and obstacle influence factor f averages the influence factors of all terrains or obstacles and normalizes the result to the range [0, 1]. Thus, when the influence factors of the terrain or obstacles are relatively small, the value of f is close to 0; when there is a large terrain feature or obstacle, the value of f is close to 1.
The purpose of the terrain and obstacle influence factor f is to account for the different degrees to which different terrains or obstacles affect the robot's travel and to take their combined influence on path planning into account. For example, if a large obstacle lies between the robot and the target device, the value of f may be close to 1, indicating that this obstacle may significantly increase the time for the robot to reach the target device.
In some embodiments, the device layout influence factor may be expressed as:
l = ( Σ_{i=1}^{k} b_i ) / ( k · max(b_1, b_2, …, b_k) );
wherein k is the number of devices to be patrolled; b_i is the layout influence factor of each device to be patrolled; Σ_{i=1}^{k} b_i represents the sum of the influence factors of all device layouts; max(b_1, b_2, …, b_k) represents the largest influence factor among all device layouts.
In a specific implementation, the calculation formula of the device layout influence factor l averages the influence factors of all device layouts and normalizes the result to the range [0, 1]. Thus, when the influence factors of the device layout are relatively small, the value of l is close to 0; when the device layout is denser, the value of l is close to 1.
The purpose of this formula is to account for the effect of the compactness of the device layout on the robot's travel: a dense layout can increase the difficulty of travel and thus the complexity of path planning. Comprehensively considering the layout influence factors therefore allows the difficulty and time of the robot's travel to be evaluated more accurately. A numerical sketch of f, l, and T is given below.
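The following minimal sketch evaluates the three expressions as reconstructed above; all factor values, the distance, and the speed are invented for illustration:

```python
# Numerical sketch of the reconstructed expressions for f, l, and T.

def normalized_mean(factors):
    """sum(factors) / (len(factors) * max(factors)); lies in (0, 1] for positive inputs."""
    return sum(factors) / (len(factors) * max(factors))

def first_duration(d, v, f, l, s, r):
    """T = (d/v) * (1/f) * (1/l) * (1/s) * (1/r), per the reciprocal description."""
    return (d / v) / (f * l * s * r)

f = normalized_mean([0.2, 0.9, 0.4])  # a_i: per-terrain/obstacle influence factors
l = normalized_mean([0.5, 0.6])       # b_i: per-device layout influence factors
T = first_duration(d=24.0, v=1.5,     # 24 m at 1.5 m/s
                   f=f, l=l, s=0.8, r=0.9)  # s, r: prediction/recognition factors
```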
And 205, controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result.
In some embodiments, step 205 may comprise:
sending the path planning result to the inspection robot;
controlling the inspection robot to detect the equipment to be inspected in real time to obtain corresponding inspection data;
And analyzing the patrol data to obtain the patrol result.
In some embodiments, after the path planning is completed, the obtained path planning result may be sent to the inspection robot, for example through wireless communication or another data transmission mode. After receiving the path planning result, the inspection robot can navigate itself according to it and inspect the equipment to be inspected along the planned path.
In some embodiments, once the inspection robot begins to move along the planned path, the various sensors and devices it carries can detect the equipment to be inspected and its surroundings in real time. These sensors may include temperature sensors, humidity sensors, vibration sensors, and the like, for monitoring device status and environmental parameters. The inspection robot detects the equipment to be inspected in real time according to a preset detection strategy and obtains the corresponding inspection data.
In some embodiments, the inspection data acquired by the inspection robot may be transmitted to a data processing system for analysis. In this step, the tour data may be processed, filtered, aggregated, and comprehensively analyzed in combination with the previous state prediction results and image recognition results. The analysis process may involve methods of data cleansing, feature extraction, anomaly detection, statistical analysis, etc., to obtain corresponding inspection results.
In some embodiments, the corresponding tour results may be obtained through analysis of the tour data. These results may include content in terms of state information of the device, environmental parameters, anomalies, security risks, and the like. The inspection results can be used for evaluating the health condition of equipment, discovering potential problems in advance, guiding maintenance and repair work, and monitoring and managing the running state of the whole data center.
In this way, the inspection robot can detect the equipment to be inspected in real time according to the path planning result, acquire inspection data, and analyze that data to obtain the corresponding inspection result, thereby comprehensively monitoring and managing the equipment state and operating condition of the data center.
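As one possible sketch of the analysis step (the patent leaves the concrete method open), a simple threshold check over patrol readings might look like this; all thresholds, field names, and readings are hypothetical:

```python
# Turn raw patrol readings into an inspection result by flagging every
# monitored metric that falls outside its allowed range.
THRESHOLDS = {"temperature_c": (10.0, 45.0), "humidity_pct": (20.0, 80.0)}

def analyze(readings: dict) -> dict:
    """Return {'device_id', 'anomalies': [...]} for one device's patrol data."""
    anomalies = []
    for metric, (low, high) in THRESHOLDS.items():
        value = readings.get(metric)
        if value is not None and not (low <= value <= high):
            anomalies.append({"metric": metric, "value": value, "range": [low, high]})
    return {"device_id": readings["device_id"], "anomalies": anomalies}

result = analyze({"device_id": "rack-07/server-03",
                  "temperature_c": 51.2, "humidity_pct": 44.0})
# -> one temperature anomaly, to be reported in step 206
```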
And 206, based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
In some embodiments, when the inspection robot completes real-time detection of the equipment to be inspected and analyzes the inspection data, the equipment with abnormality is identified. For example, the anomalies may include device temperature anomalies, humidity anomalies, vibration anomalies, network fluctuations anomalies, device software program anomalies, and so forth. The inspection robot may report the identification information and the anomaly type of the anomaly devices to a management system of the data center.
In some embodiments, after identifying the abnormal device, the inspection robot may send a remote operation request to a management center of the abnormal device. For example, the request may include a request for further diagnostic, repair, or maintenance operations on the abnormal device. The remote operation request may be sent to the management center of the device via a network communication protocol so that the relevant personnel or system can respond in time and take the necessary action.
In some embodiments, after receiving the remote operation request sent by the inspection robot, the abnormal device management center processes the request. For example, the processing means may include remote diagnosis of the status of the device, sending instructions to restart or repair the device, scheduling maintenance personnel to go to the field process, etc. The management center can take corresponding measures according to the actual situation and the emergency degree so as to ensure that the abnormal equipment can recover to the normal running state as soon as possible.
In summary, based on the inspection result, the inspection robot can identify abnormal equipment and send a remote operation request to the management center, prompting timely handling and maintenance of the abnormal equipment and thereby helping to ensure the stable and safe operation of the data center equipment.
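A hedged sketch of this reporting step follows; the endpoint URL, payload schema, and action name are assumptions for illustration, since the patent only specifies that a request is sent over a network communication protocol:

```python
# Report an abnormal device to its management center as a remote-operation request.
import json
import urllib.request

def send_remote_operation_request(result: dict, endpoint: str) -> None:
    if not result["anomalies"]:
        return  # nothing abnormal, nothing to report
    payload = {
        "device_id": result["device_id"],
        "anomalies": result["anomalies"],
        "requested_action": "remote_diagnosis",  # e.g. diagnose, restart, dispatch staff
    }
    req = urllib.request.Request(endpoint,
                                 data=json.dumps(payload).encode("utf-8"),
                                 headers={"Content-Type": "application/json"},
                                 method="POST")
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()  # management center acknowledges receipt

# send_remote_operation_request(result, "https://mgmt.example.com/api/remote-op")
```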
In some embodiments, the present invention may further comprise:
Optimizing parameters of the equipment state prediction model and parameters of the image recognition model according to the inspection result to obtain an updated state prediction model and an updated image recognition model;
Processing the state data according to the updated state prediction model to obtain an updated state prediction result;
processing the image according to the updated image recognition model to obtain an updated image recognition result;
And updating the path planning result according to the updated state prediction result and the updated image recognition result to obtain an updated path planning result.
In some embodiments, based on the tour results, a large amount of data may be collected, including device status data and image data, for example. With this data, parameters of the device state prediction model and the image recognition model can be optimized. For example, machine learning algorithms, such as back-propagation algorithms in deep learning, may be used to adjust the weights and bias of the model by training the model to enable the model to more accurately predict device states and identify images.
In some embodiments, the optimized device state prediction model may be used to process the most current state data to obtain updated state prediction results. These results reflect the most current state of the device and are more accurate and reliable than previous predictions.
In some embodiments, the updated image recognition result may be obtained by processing the updated image data through the updated image recognition model. These results better reflect the features and information in the image, thereby improving the accuracy and robustness of image recognition.
In some embodiments, the previous path planning result may be updated based on the updated state prediction result and the image recognition result. For example, the equipment status, environmental conditions, and priority of path selection may be re-evaluated and the path planning adjusted accordingly to ensure that the inspection robot can more effectively inspect the equipment and areas.
By the method, parameters of the equipment state prediction model and the image recognition model can be optimized according to the latest inspection results, updated prediction results are obtained, and the path planning is updated according to the results, so that more accurate and effective inspection of equipment to be inspected is realized.
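As an illustration of what "optimizing parameters according to the inspection result" could look like for the linear-Sigmoid state prediction model of step 202, here is one gradient-descent step on a binary cross-entropy loss; the learning rate, features, and label are invented (label y = 1.0 means the patrol confirmed the device really was abnormal):

```python
# One feedback step: adjust the state prediction model from a labelled patrol outcome.
import numpy as np

def update_weights(w: np.ndarray, b: float, x: np.ndarray, y: float, lr: float = 0.1):
    """One gradient-descent step for s = sigmoid(w.x + b) under cross-entropy loss."""
    s = 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))
    grad = s - y  # d(loss)/dz for sigmoid + binary cross-entropy
    return w - lr * grad * x, b - lr * grad

w, b = np.array([1.2, -0.4, 0.9]), -0.5  # current model parameters
x, y = np.array([0.8, 0.3, 0.5]), 1.0    # features and confirmed patrol outcome
w, b = update_weights(w, b, x, y)        # model nudged toward predicting abnormal here
```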
Referring to fig. 3, fig. 3 is a schematic structural diagram of an inspection system of a remote unmanned machine room device according to the present invention.
As shown in fig. 3, a patrol system for a remote unmanned machine room device according to an embodiment of the present invention includes:
the data acquisition module is used for acquiring state data and images of equipment to be patrolled in the remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
The state prediction module is used for processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
The image recognition module is used for processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
the path planning module is used for generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
the automatic inspection module is used for controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
And the patrol processing module is used for reporting, based on the patrol result, abnormal equipment with abnormality among the plurality of devices to be patrolled, and sending a remote operation request to a management center of the abnormal equipment.
Referring to fig. 4, fig. 4 is a schematic diagram of an embodiment of an electronic device according to an embodiment of the invention. As shown in fig. 4, an embodiment of the present invention provides an electronic device 400, including a memory 410, a processor 420, and a computer program 411 stored in the memory 410 and executable on the processor 420, wherein the processor 420 executes the computer program 411 to implement the following steps:
acquiring state data and images of equipment to be patrolled in a remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
Processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
Generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
Controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
Based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
Referring to fig. 5, fig. 5 is a schematic diagram of an embodiment of a computer readable storage medium according to an embodiment of the invention. As shown in fig. 5, the present embodiment provides a computer-readable storage medium 500 having stored thereon a computer program 411, which computer program 411, when executed by a processor, performs the steps of:
acquiring state data and images of equipment to be patrolled in a remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
Processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
Generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
Controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
Based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and for those portions of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create a system for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. A method for patrolling remote unmanned machinery room equipment, the method comprising:
acquiring state data and images of equipment to be patrolled in a remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
Processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
Generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
Controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
Based on the inspection result, reporting abnormal equipment with abnormality in the plurality of equipment to be inspected, and sending a remote operation request to a management center of the abnormal equipment.
2. The method for inspecting equipment in a remote unmanned machine room according to claim 1, wherein the processing the status data by using an equipment status prediction model to obtain a status prediction result of the equipment to be inspected comprises:
Performing coding processing on the state data to obtain corresponding input characteristics;
Acquiring a first weight corresponding to each input feature;
the equipment state prediction model obtains the state prediction result based on the input features and the first weights; the expression of the state prediction result is:
s = σ( Σ_{i=1}^{n} ω_i·x_i + b );
where n is the number of input features; x_i is the i-th input feature; ω_i is the first weight associated with the i-th input feature; b is a bias term; σ is the first activation function.
3. The method for patrolling remote unmanned machine room equipment according to claim 2, wherein the processing the image through the image recognition model to obtain the image recognition result of the equipment to be patrolled comprises the following steps:
Performing coding processing and feature extraction processing on each image to obtain a corresponding input image;
Carrying out convolution processing on the input image by combining the second weight of each convolution kernel through the convolution kernel of the image recognition model to obtain a corresponding image recognition result; each input image corresponds to each convolution kernel one by one;
the expression of the image recognition result is as follows:
r = g( Σ_{i=1}^{o} Conv(X_i)·W_i + b );
where o is the number of convolution kernels; X_i is the i-th channel or feature map of the input image; W_i is the second weight associated with the i-th convolution kernel; b is a bias term; Σ_{i=1}^{o} Conv(X_i)·W_i + b represents the linear combination obtained by convolving each convolution kernel with its corresponding input feature map and adding the bias term; g is the second activation function.
4. The method for patrolling remote unmanned machine room equipment according to claim 3, wherein the generating a path planning result for the inspection robot based on the state prediction result and the image recognition result comprises:
Determining target equipment and key areas for patrol of the patrol robot according to the state prediction result and the image recognition result;
determining a distance between the current position of the inspection robot and the target equipment, a terrain and obstacle influence factor and an equipment layout influence factor according to the image recognition result;
determining, according to the distance, the terrain and obstacle influence factor, and the device layout influence factor, a first duration for reaching the target device, and planning an initial path that satisfies the first duration;
and optimizing the initial path according to the state prediction result and the image recognition result to obtain the path planning result.
5. The method for patrolling remote unmanned machine room equipment according to claim 4, wherein the expression of the first duration is:
T = (d/v)·(1/f)·(1/l)·(1/s)·(1/r);
wherein T is the first duration; d is the distance between the current position of the inspection robot and the target device; v is the travel speed of the inspection robot; f is the terrain and obstacle influence factor; l is the device layout influence factor; s is the state prediction result; r is the image recognition result.
6. The method for patrolling remote unmanned machine room equipment according to claim 5, wherein the terrain and obstacle influence factor is expressed as:
f = ( Σ_{i=1}^{m} a_i ) / ( m · max(a_1, a_2, …, a_m) );
where m is the number of target terrains or obstacles; a_i is the influence factor of each target terrain or obstacle; Σ_{i=1}^{m} a_i represents the sum of the influence factors of all target terrains or obstacles; max(a_1, a_2, …, a_m) represents the largest influence factor among all target terrains or obstacles;
the device layout influence factor is expressed as:
l = ( Σ_{i=1}^{k} b_i ) / ( k · max(b_1, b_2, …, b_k) );
wherein k is the number of devices to be patrolled; b_i is the layout influence factor of each device to be patrolled; Σ_{i=1}^{k} b_i represents the sum of the influence factors of all device layouts; max(b_1, b_2, …, b_k) represents the largest influence factor among all device layouts.
7. The method for inspecting equipment in a remote unmanned machine room according to claim 6, wherein the controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result includes:
sending the path planning result to the inspection robot;
controlling the inspection robot to detect the equipment to be inspected in real time to obtain corresponding inspection data;
And analyzing the patrol data to obtain the patrol result.
8. The method of patrolling a remote unmanned machine room facility of claim 1, further comprising:
Optimizing parameters of the equipment state prediction model and parameters of the image recognition model according to the inspection result to obtain an updated state prediction model and an updated image recognition model;
Processing the state data according to the updated state prediction model to obtain an updated state prediction result;
processing the image according to the updated image recognition model to obtain an updated image recognition result;
And updating the path planning result according to the updated state prediction result and the updated image recognition result to obtain an updated path planning result.
9. The method for patrolling equipment in a remote unmanned machine room according to any one of claims 1 to 8, wherein the acquiring the status data and the image of the equipment to be patrolled in the remote unmanned machine room comprises:
Acquiring first state data through a sensor arranged on the equipment to be patrolled, and acquiring second state data through a remote management tool installed in the equipment to be patrolled;
Acquiring a first image of the equipment to be patrolled and a second image of the surrounding environment of the equipment to be patrolled through image acquisition equipment arranged in the remote unmanned machine room;
Acquiring a set of the first state data and the second state data to obtain the state data, and acquiring a set of the first image and the second image to obtain the image.
10. A patrol system for a remote unmanned machine room device, the system comprising:
the data acquisition module is used for acquiring state data and images of equipment to be patrolled in the remote unmanned machine room; the display content of the image comprises the equipment to be patrolled and the surrounding environment of the equipment to be patrolled;
The state prediction module is used for processing the state data through a device state prediction model to obtain a state prediction result of the device to be patrolled;
The image recognition module is used for processing the image through an image recognition model to obtain an image recognition result of the equipment to be patrolled;
the path planning module is used for generating a path planning result for the inspection robot based on the state prediction result and the image recognition result;
the automatic inspection module is used for controlling the inspection robot to inspect the equipment to be inspected according to the path of the path planning result to obtain a corresponding inspection result;
And the patrol processing module is used for reporting, based on the patrol result, abnormal equipment with abnormality among the plurality of devices to be patrolled, and sending a remote operation request to a management center of the abnormal equipment.
Priority application: CN202410218460.5A - Inspection method and system for remote unmanned machine room equipment - priority and filing date 2024-02-28.
Publication: CN118034343A (en), published 2024-05-14; status: pending.
Family ID: 90985644.
Country: CN (China).


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination