WO2020021589A1 - Information processing device, information processing system, method, and program - Google Patents

Information processing device, information processing system, method, and program

Info

Publication number
WO2020021589A1
Authority
WO
WIPO (PCT)
Prior art keywords
portable device
diagnosis
instruction
inspection
data
Prior art date
Application number
PCT/JP2018/027445
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
安章 後藤
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to DE112018007754.3T priority Critical patent/DE112018007754B4/de
Priority to JP2019508996A priority patent/JP6584721B1/ja
Priority to CN201880095740.4A priority patent/CN112424726B/zh
Priority to PCT/JP2018/027445 priority patent/WO2020021589A1/ja
Priority to TW108112786A priority patent/TW202008278A/zh
Publication of WO2020021589A1 publication Critical patent/WO2020021589A1/ja

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/20 Administration of product repair or maintenance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395 Quality analysis or management
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Definitions

  • The present invention relates to an information processing device, an information processing system, a method, and a program.
  • Visual inspection and hammering inspection are routine non-destructive inspections performed by workers. For example, a worker visually examines a pipe, or judges whether the pipe is damaged from the tapping sound produced when the pipe is struck.
  • Because visual inspection and hammering inspection are performed manually, a worker needs experience and skill to correctly determine whether an abnormality has occurred. For this reason, inexperienced workers have accompanied skilled workers to gain experience of the work. In addition, when no worker with skilled technique was available, inexperienced workers have had to work while reading a manual or receiving instructions by telephone from the control management room.
  • In the information processing system described in Patent Literature 1, a meter reading taken by an operator in a plant and an adjustment amount applied to an operation parameter of a production device are transmitted from a wearable computer carried by the operator to a host computer.
  • The host computer transmits the collected meter readings and the workers' operation-parameter adjustment amounts to wearable computers carried by other workers.
  • In Patent Literature 1, however, the workers merely share the meter readings and operation-parameter adjustment amounts of the other workers; in the end, each worker must determine for himself or herself, from the readings of the meters in his or her charge, whether an abnormality has occurred. Such a determination was difficult for inexperienced workers to make appropriately.
  • The present invention has been made in view of the above circumstances, and aims to diagnose a diagnosis target from inspection data and to present an appropriate inspection procedure to the worker.
  • The inspection data receiving means receives, from the portable device, inspection data collected by the portable device during inspection work on the diagnosis target.
  • The learning means inputs inspection data collected in the past to a neural network and causes it to learn whether the diagnosis target is damaged.
  • The diagnosing means inputs the inspection data received by the inspection data receiving means to the neural network and diagnoses, based on the output of the neural network, whether the diagnosis target is damaged.
  • The instruction means transmits to the portable device an instruction corresponding to the result of the diagnosis performed on the diagnosis target by the diagnosing means.
  • The information processing apparatus thus inputs the inspection data collected by the portable device during inspection work on the diagnosis target to the neural network, diagnoses damage to the diagnosis target from the output of the neural network, and transmits an instruction corresponding to the diagnosis result to the portable device. With this configuration, it is possible to diagnose the diagnosis target from the inspection data and to present an appropriate inspection procedure to the worker.
  • FIG. 1 is a block diagram illustrating a hardware configuration of an information processing system according to an embodiment.
  • FIG. 2 is a diagram illustrating an example of a configuration of a portable device according to the embodiment.
  • Functional block diagram of the information processing device according to the embodiment.
  • Diagram illustrating an example of the data of the treatment table according to the embodiment.
  • FIG. 3 is a diagram illustrating an example of an image displayed on a display panel of the portable device according to the embodiment.
  • FIG. 10 is a diagram illustrating another example of an image displayed on the display panel of the portable device according to the embodiment.
  • FIG. 5 is a diagram illustrating an example of an image indicating an instruction for the tapping sound inspection according to the embodiment.
  • FIG. 7 is a diagram illustrating an example of an image indicating an instruction for maintenance work according to the embodiment.
  • Diagram illustrating an example of an image of a map of the premises displayed on the display of the information processing device according to the embodiment.
  • Flowchart of the inspection work processing according to the embodiment.
  • Diagram illustrating an example of the data of the treatment table according to Modification 1.
  • As shown in FIG. 1, the information processing system 1 according to Embodiment 1 of the present invention includes an information processing apparatus 100 and a plurality of portable devices 200.
  • The information processing system 1 is a system that manages maintenance and inspection work of, for example, a chemical plant or a steel plant.
  • The information processing apparatus 100 diagnoses whether equipment in the plant is damaged based on inspection data of the equipment collected by the portable devices 200.
  • The information processing apparatus 100 is installed in the control management room of the plant.
  • Workers 50, who perform inspection work while patrolling the premises of the plant, each carry a portable device 200.
  • The portable device 200 is a wearable computer worn by the worker 50, as shown in FIG. 2. Specifically, the portable device 200 is integrally attached to the helmet 10 worn by the worker 50.
  • The information processing apparatus 100 and the portable device 200 communicate wirelessly.
  • The information processing apparatus 100 transmits instructions for the worker 50 to the portable device 200.
  • The portable device 200 transmits the acquired inspection data to the information processing device 100.
  • The inspection data is data indicating the state of a diagnosis target.
  • The inspection data includes sound data of tapping sounds from the hammering test performed by the worker 50, image data of the diagnosis target, and measured values of gas concentration.
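For concreteness, the following is a minimal sketch of how such an inspection record could be represented. The field names and types are assumptions made for illustration; they do not appear in the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InspectionData:
    """One inspection record sent from the portable device 200 to the
    information processing device 100 (field names are assumed)."""
    device_id: str                        # identifies the transmitting portable device
    timestamp: float                      # when the data was collected (seconds since epoch)
    position: Tuple[float, float, float]  # three-dimensional coordinates of the worker 50
    image: Optional[bytes] = None         # frame from camera 203a, e.g. JPEG-encoded
    tapping_sound: Optional[bytes] = None  # recording from microphone 203b
    gas_concentration: Optional[float] = None  # value from gas detection sensor 203c
```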
  • The information processing device 100 includes, as a hardware configuration, a memory 101 that stores various data, an input device 102 that detects input operations of a user, an output device 103 that outputs images to a display device, a wireless communication circuit 104 that performs wireless communication with other devices, and a processor 105 that controls the entire information processing apparatus 100.
  • The memory 101, the input device 102, the output device 103, and the wireless communication circuit 104 are all connected to the processor 105 via the bus 109 and communicate with the processor 105.
  • The memory 101 includes a volatile memory and a nonvolatile memory, and stores programs and various data.
  • The memory 101 is used as a work memory of the processor 105. The memory 101 also stores a program 1000 for realizing the inspection work processing in the information processing apparatus 100.
  • The input device 102 includes a keyboard, a mouse, a touch panel, and the like; it detects input operations of the user in the control management room and outputs signals indicating the detected operations to the processor 105.
  • The output device 103 includes a display, a touch panel, and the like, and displays images based on signals supplied from the processor 105.
  • For example, the output device 103 displays a map of the plant premises on the display in response to an operation by the user in the control management room.
  • The output device 103 also displays image data captured by the portable device 200 on the display.
  • The wireless communication circuit 104 has an antenna 104a and includes a network interface circuit for performing wireless communication with other devices.
  • The wireless communication circuit 104 converts data supplied from the processor 105 into an electric signal and outputs the electric signal on a radio wave.
  • The wireless communication circuit 104 also receives radio waves output from other devices, restores the electric signals carried on the radio waves to data, and outputs the data to the processor 105.
  • The processor 105 includes a CPU (Central Processing Unit), executes the various programs stored in the memory 101, and realizes the various functions of the information processing apparatus 100. The processor 105 is assumed to further include a dedicated processor for AI (Artificial Intelligence).
  • The portable device 200 is an AR (Augmented Reality) head-mounted display integrally attached to the helmet 10 worn by the worker 50.
  • The portable device 200 includes, as a hardware configuration, a memory 201 that stores various data, an output device 202 that presents information supplied from the information processing device 100 to the worker 50, a collection device 203 that collects data related to inspection, a wireless communication circuit 204 that performs wireless communication with other devices, and a processor 205 that controls the entire portable device 200.
  • The memory 201, the output device 202, the collection device 203, and the wireless communication circuit 204 are all connected to the processor 205 via the bus 209 and communicate with the processor 205.
  • The memory 201 includes a volatile memory and a nonvolatile memory, and stores programs and various data.
  • The memory 201 is used as a work memory of the processor 205.
  • The memory 201 stores a program 2000 for realizing the inspection data collection processing in the portable device 200.
  • The memory 201 is housed in the main unit 20 shown in FIG. 2.
  • The output device 202 has a display panel 202a and a speaker 202b.
  • The display panel 202a displays images received from the information processing device 100 under the control of the processor 205.
  • An image displayed on the display panel 202a is, for example, an image indicating a work instruction to the worker 50.
  • The display panel 202a is arranged so as to be located in front of one of the eyes of the worker 50.
  • The display panel 202a is large enough to cover one eye of the worker 50.
  • The speaker 202b outputs sounds received from the information processing apparatus 100 under the control of the processor 205.
  • A sound output from the speaker 202b is, for example, a sound indicating a work instruction to the worker 50.
  • The speaker 202b is housed in the main unit 20 shown in FIG. 2.
  • The collection device 203 includes a camera 203a, a microphone 203b, a gas detection sensor 203c, and a GPS (Global Positioning System) receiver 203d, and collects inspection data.
  • The camera 203a is arranged on the side of the worker 50's face so that its lens faces in the same direction as the worker 50's line of sight.
  • The camera 203a captures images of a diagnosis target located in the direction the worker 50 is facing.
  • The camera 203a shoots continuously while the power is on, and outputs the captured image data to the processor 205.
  • The microphone 203b is attached to the side of the worker 50's face, at a position between the helmet 10 and the camera 203a.
  • The microphone 203b collects sounds uttered by the worker 50, tapping sounds produced in the tapping sound inspection, and the like.
  • The microphone 203b outputs the collected sounds to the processor 205.
  • The gas detection sensor 203c draws in the surrounding air, measures the concentration of a specified gas in the air, and outputs the measured value to the processor 205. As shown in FIG. 2, the gas detection sensor 203c is connected to the main unit 20 via a cable 203e and outputs measured values to the processor 205 via the cable 203e. The cable allows the worker 50 to move the gas detection sensor 203c so that it can draw in the surrounding gas.
  • The GPS receiver 203d shown in FIG. 1 specifies the current position of the worker 50 from satellite radio waves received from GPS satellites, and outputs position data indicating the specified position to the processor 205.
  • The position data output from the GPS receiver 203d is represented by three-dimensional coordinates.
  • The GPS receiver 203d outputs information indicating the current position of the worker 50 to the processor 205 at predetermined intervals, for example, every minute.
  • The GPS receiver 203d is housed in the main unit 20 shown in FIG. 2.
  • The wireless communication circuit 204 illustrated in FIG. 1 includes an antenna 204a and performs wireless communication with the wireless communication circuits of other devices.
  • The wireless communication circuit 204 converts data supplied from the processor 205 into an electric signal and outputs the electric signal on a radio wave. The wireless communication circuit 204 also receives radio waves output from other devices, restores the electric signals carried on the radio waves to data, and outputs the data to the processor 205.
  • The wireless communication circuit 204 is housed in the main unit 20 shown in FIG. 2.
  • The processor 205 shown in FIG. 1 includes a CPU, executes the various programs stored in the memory 201, and realizes the various functions of the portable device 200.
  • The processor 205 executes the program 2000 to perform the inspection data collection processing.
  • The processor 205 transmits the image data output by the camera 203a to the information processing device 100 via the wireless communication circuit 204 at predetermined intervals.
  • When the microphone 203b outputs sound data, the processor 205 transmits the sound data to the information processing device 100 via the wireless communication circuit 204.
  • Likewise, when the gas detection sensor 203c outputs a measured value of the gas concentration, the processor 205 transmits the measured value to the information processing device 100 via the wireless communication circuit 204.
  • In addition to the inspection data, the processor 205 transmits the information indicating the current position of the worker 50 output from the GPS receiver 203d to the information processing apparatus 100 via the wireless communication circuit 204 at predetermined intervals.
  • The processor 205 is housed in the main unit 20 shown in FIG. 2.
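The collection and transmission behavior of the processor 205 described above can be summarized in a short sketch. This is illustrative only: the sensor objects and the send() callback are hypothetical stand-ins, not APIs from the disclosure, and the one-minute interval is the example period the description gives.

```python
import time

TRANSMIT_INTERVAL_S = 60  # example period: "every predetermined period, for example, every minute"

def collection_loop(camera, microphone, gas_sensor, gps, send) -> None:
    """Sketch of the inspection data collection processing of the portable
    device 200: gather data from camera 203a, microphone 203b, gas detection
    sensor 203c, and GPS receiver 203d, and pass it to send(), which stands
    in for transmission via the wireless communication circuit 204."""
    while True:
        send({"type": "image", "data": camera.capture()})           # continuous shooting
        send({"type": "sound", "data": microphone.read()})          # tapping sounds, voice
        send({"type": "gas", "data": gas_sensor.measure()})         # specified-gas concentration
        send({"type": "position", "data": gps.current_position()})  # worker 50's current position
        time.sleep(TRANSMIT_INTERVAL_S)
```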
  • The information processing apparatus 100 functionally includes a storage unit 110 that stores various data related to inspection and diagnosis, an inspection data receiving unit 120 that receives inspection data from the portable device 200, a learning unit 130 that performs learning, a diagnosis unit 140 that diagnoses whether the diagnosis target is damaged based on the inspection data received by the inspection data receiving unit 120 and the learning result of the learning unit 130, and an instruction unit 150 that determines the contents of instructions to the worker 50 and outputs the determined instructions to the portable device 200.
  • The inspection data receiving unit 120 is an example of the inspection data receiving means of the present invention.
  • The learning unit 130 is an example of the learning means of the present invention.
  • The diagnosis unit 140 is an example of the diagnosing means of the present invention.
  • The instruction unit 150 is an example of the instruction means of the present invention.
  • The information processing apparatus 100 uses a learned neural network to diagnose whether the diagnosis target is damaged.
  • The diagnosis result output from the information processing apparatus 100 indicates either that the diagnosis target is not damaged or that the diagnosis target is damaged.
  • The storage unit 110 stores various data related to inspection and diagnosis.
  • Specifically, the storage unit 110 stores an inspection manual 111 relating to inspection work, history data 112 relating to the execution of inspections, various reference data 113 for inspection, a learning model 114 defining a neural network, learning data 115, and a treatment table 116 storing data indicating the contents of treatments for the diagnosis target.
  • The function of the storage unit 110 is realized by the memory 101.
  • The treatment table 116 is an example of the treatment information storage unit of the present invention.
  • The inspection manual 111 includes information on the positions of inspection targets and the inspection methods.
  • The inspection manual 111 includes map data of the premises of the plant and position data indicating the positions of pipes, tanks, and the like to be diagnosed on the premises.
  • The position data of a diagnosis target is represented by three-dimensional coordinates.
  • The inspection manual 111 includes information indicating the positions at which a diagnosis target is to be hit during the tapping sound inspection.
  • The information indicating the positions to be hit is also represented by three-dimensional coordinates. For example, when the diagnosis target is a pipe extending across the premises, the tapping test for that single pipe may require hitting a plurality of locations. In such a case, the inspection manual 111 includes not only the positions at which the pipe is to be hit but also information indicating the order in which they are to be hit.
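A hypothetical entry of the inspection manual 111 might therefore look as follows; the keys and coordinate values are illustrative assumptions, not data from the disclosure.

```python
# Hypothetical inspection manual 111 entry for one pipe. The positions are
# three-dimensional coordinates, and the hitting positions carry the order
# in which the pipe is to be hit during the tapping sound inspection.
inspection_manual_entry = {
    "target_id": "pipe-1002",
    "target_position": (135.20, 34.70, 1.5),    # position of the diagnosis target
    "hitting_positions": [                       # ordered list for the tapping test
        {"order": 1, "position": (135.20, 34.70, 1.5)},
        {"order": 2, "position": (135.20, 34.90, 1.5)},
        {"order": 3, "position": (135.20, 35.10, 1.5)},
    ],
}
```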
  • Specifically, the history data 112 includes inspection history data containing the inspection data received from the portable device 200 and the date and time when each inspection was performed, information specifying the diagnosis target, and diagnosis result data.
  • The inspection data includes image data capturing the diagnosis target, sound data of tapping sounds from the hammering tests performed by workers, and measured values of gas concentration.
  • The information specifying the diagnosis target is, for example, information indicating the position of the diagnosis target.
  • The diagnosis result data is data indicating the diagnosis result obtained by the diagnosis unit 140 from the inspection data; for example, it indicates whether damage to the diagnosis target has occurred.
  • The reference data 113 includes a reference value indicating the appropriate loudness of the sound in the hammering test.
  • When the collected tapping sound is quieter than this reference value, the instruction unit 150 instructs the worker 50, via the portable device 200, to tap harder.
  • The reference data 113 also stores a threshold value of gas concentration that serves as the reference for determining whether a gas leak has occurred.
  • The learning model 114 includes information defining the shape and scale of the neural network. Specifically, the learning model 114 includes a mathematical expression representing the model, the total number of hidden layers, the number of neurons (nodes) in each layer, the weight coefficient of each neuron, and the like.
  • The storage unit 110 stores a learning model suitable for image recognition and a learning model suitable for speech recognition, because the diagnosis unit 140 inputs both image data capturing the diagnosis target and sound data of tapping sounds to learned neural networks and makes diagnoses from their outputs. The weight coefficients of the neurons of the learning models stored in the storage unit 110 are updated by the learning of the learning unit 130.
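As an illustration of what the learning model 114 defines, the sketch below builds a small network with two output neurons ("no damage" and "damage"), as in the embodiment. PyTorch and the layer sizes are assumptions made for the example; the disclosure does not specify a framework or network dimensions.

```python
import torch.nn as nn

def build_damage_model(n_features: int = 128) -> nn.Module:
    """Minimal sketch of a network whose shape and scale would be recorded
    in the learning model 114: hidden-layer count, neurons per layer, and
    (after training) the weight coefficient of each neuron."""
    return nn.Sequential(
        nn.Linear(n_features, 64),  # input layer -> first hidden layer
        nn.ReLU(),
        nn.Linear(64, 32),          # second hidden layer
        nn.ReLU(),
        nn.Linear(32, 2),           # output layer: neuron 0 = no damage, neuron 1 = damage
    )
```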
  • The learning data 115 includes inspection data collected in the past, each item classified, for the learning by the learning unit 130 described later, as relating to an inspection target with damage or to an inspection target without damage.
  • Specifically, the learning data 115 includes sets of image data of a damaged pipe, sound data of the tapping sound when the damaged pipe was hit, and the value indicating the presence of damage for each diagnosis target. The learning data 115 further includes sets of image data of an undamaged pipe, sound data of the tapping sound when the undamaged pipe was hit, and the value indicating the absence of damage for each diagnosis target.
  • The image data and sound data included in the learning data 115 may be data collected by the portable device 200 during past inspections, or data collected by other devices, for example, a separate camera or microphone.
  • The learning data 115 is grouped based on preconditions regarding the diagnosis target, such as the material of the diagnosis target, its size, and the gas flowing inside it.
  • The classification of whether inspection data included in the learning data 115 relates to a damaged or an undamaged inspection target is made, for example, based on diagnosis results determined by skilled workers from the image and sound data collected in past inspections.
  • The treatment table 116 is a table that defines, for each diagnosis target, the countermeasure to be taken by the worker 50 when the diagnosis target is diagnosed as damaged.
  • For example, when a gas leak is diagnosed, closing the valve is defined as the countermeasure. When a pipe is diagnosed as damaged, reporting is defined as the countermeasure: in this case the pipe needs to be replaced, and after the report to the control management room, pipe repair work, replacement work, and the like are performed.
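A minimal sketch of such a treatment table follows, using the two countermeasures named in the description; the dictionary layout and key names are assumptions.

```python
# Hypothetical representation of the treatment table 116: for each kind of
# diagnosis target, the countermeasure the worker 50 should take when the
# target is diagnosed as damaged.
TREATMENT_TABLE = {
    "valve": "turn the valve clockwise and close it",  # e.g. when a gas leak is diagnosed
    "pipe": "report to the control management room",   # repair or replacement work follows
}

def lookup_treatment(target_type: str) -> str:
    """Return the countermeasure defined for the given diagnosis target type."""
    return TREATMENT_TABLE[target_type]
```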
  • The inspection data receiving unit 120 receives inspection data relating to a diagnosis target from the portable device 200.
  • The function of the inspection data receiving unit 120 is realized by the wireless communication circuit 104 and the processor 105.
  • The inspection data receiving unit 120 outputs the inspection data received from the portable device 200 to the diagnosis unit 140 in association with the date and time of reception and the position data of the portable device 200. The inspection data receiving unit 120 also stores the received inspection data in the storage unit 110 in association with the date and time of reception and information identifying the transmitting portable device 200.
  • The learning unit 130 inputs the learning data 115 to a neural network having the configuration defined by the learning model 114 and adjusts the network, for example by the back propagation method, so that its output approaches the predetermined true value, thereby determining the learning model used for diagnosis by the diagnosis unit 140. In the embodiment, because the outcomes are "no damage" and "damage", the learning unit 130 arranges two neurons in the output layer. As the learning data, a plurality of sets are prepared, each pairing past inspection data with the number of the output-layer neuron indicating the presence or absence of damage to the inspection target.
  • The learning unit 130 supplies the inspection data of each set of learning data to the neural network and adjusts the weight coefficients of the neurons in the intermediate and output layers, using the back propagation method, so that the output-layer neuron indicated by the corresponding number fires. Through this learning, the weight coefficients of the learning model 114 are updated with the adjusted values. In this way, the neural network learns the relationship between input and output.
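The adjustment described above can be sketched with a standard training loop. This is not the patent's implementation: PyTorch, the optimizer, and the learning rate are assumptions, and learning_data is taken to be (feature tensor, label) pairs built from the learning data 115, with label 0 for "no damage" and 1 for "damage".

```python
import torch
import torch.nn as nn

def train(model: nn.Module, learning_data, epochs: int = 10) -> None:
    """Sketch of the learning unit 130: adjust the weight coefficients by
    back propagation so that the output-layer neuron whose number matches
    the label of each set of learning data fires."""
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for features, label in learning_data:
            optimizer.zero_grad()
            output = model(features.unsqueeze(0))          # forward pass
            loss = loss_fn(output, torch.tensor([label]))  # compare with the true value
            loss.backward()                                # back propagation
            optimizer.step()                               # update the weight coefficients
```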
  • The function of the learning unit 130 is realized by the processor 105.
  • The learning data 115 includes (a) sets of image data of a damaged pipe and the value indicating damage, (b) sets of image data of an undamaged pipe and the value indicating no damage, (c) sets of sound data of the tapping sound of a damaged pipe and the value indicating damage, and (d) sets of sound data of the tapping sound of an undamaged pipe and the value indicating no damage.
  • The neural network adopting the learning model suitable for image recognition is trained so that it receives the image data of (a) as input and outputs the value indicating that damage has occurred, and receives the image data of (b) as input and outputs the value indicating that no damage has occurred; the weight coefficients of the neurons in the intermediate layers are adjusted accordingly.
  • The neural network adopting the learning model suitable for speech recognition is trained so that it receives the sound data of (c) as input and outputs the value indicating that damage has occurred, and receives the sound data of (d) as input and outputs the value indicating that no damage has occurred; the weight coefficients of the neurons in the intermediate layers are adjusted accordingly.
  • The learning data 115 may include only some of the data in (a) to (d). Learning is possible in this case as well.
  • When new inspection data is input to the learned neural network, the network outputs a value indicating the presence or absence of damage to the diagnosis target. That is, the learned neural network can determine whether the diagnosis target is damaged. In this way, the learned neural network can diagnose, from the image data and sound data collected by the portable device 200, whether a pipe is damaged.
  • Note that the learning by the learning unit 130 must be completed before inspection data is supplied from the portable device 200.
  • The diagnosis unit 140 inputs the inspection data received from the portable device 200 to the neural network realizing the learning model learned by the learning unit 130, and diagnoses damage to the diagnosis target from its output.
  • The function of the diagnosis unit 140 is realized by the processor 105.
  • For example, the diagnosis unit 140 inputs a captured image of a pipe in the plant, taken by the portable device 200, to the neural network adopting the learned learning model, and diagnoses whether the diagnosis target is damaged based on the output value of the neural network.
  • The diagnosis unit 140 outputs the result of the diagnosis to the instruction unit 150.
  • Similarly, the diagnosis unit 140 inputs the tapping sound collected by the portable device 200 to the neural network adopting the learned learning model, and diagnoses whether the diagnosis target is damaged based on the output value of the neural network.
  • The diagnosis unit 140 outputs the result of the diagnosis to the instruction unit 150.
  • Further, when the measured value of the gas concentration exceeds the threshold value stored in the reference data 113, the diagnosis unit 140 determines that a gas leak has occurred.
  • The diagnosis unit 140 outputs the result of this diagnosis to the instruction unit 150 as well.
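Combining the three diagnoses described above could look like the following sketch. It assumes networks built and trained as in the earlier examples and preprocessed feature tensors; the combination rule mirrors the description, where any single positive result is treated as damage.

```python
import torch

def diagnose(image_model, sound_model, image_features, sound_features,
             gas_value: float, gas_threshold: float) -> bool:
    """Sketch of the diagnosis unit 140; returns True when the diagnosis
    target is diagnosed as damaged. gas_threshold is the reference value
    held in the reference data 113."""
    with torch.no_grad():
        image_damaged = image_model(image_features.unsqueeze(0)).argmax(dim=1).item() == 1
        sound_damaged = sound_model(sound_features.unsqueeze(0)).argmax(dim=1).item() == 1
    gas_leak = gas_value > gas_threshold  # exceeding the threshold means a gas leak
    # The target is reported damaged if any of the three diagnoses indicates damage.
    return image_damaged or sound_damaged or gas_leak
```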
  • The instruction unit 150 determines the contents of instructions for the inspection work and transmits instructions according to the determined contents to the portable device 200.
  • The inspection work instructions determined by the instruction unit 150 include (i) work instructions not based on a diagnosis result of the diagnosis unit 140 and (ii) work instructions based on a diagnosis result of the diagnosis unit 140.
  • As a work instruction not based on a diagnosis result, the instruction unit 150 instructs the worker 50 to move to the position of the inspection target.
  • To do so, the instruction unit 150 acquires the position data indicating the position of the worker 50 obtained by the GPS receiver 203d of the portable device 200, and the position data of the inspection target from the inspection manual 111.
  • The instruction unit 150 then transmits to the portable device 200 the current position data of the worker 50 on a map of the plant premises, the position data of the diagnosis target, and an instruction to move to the position of the diagnosis target.
  • In response, the processor 205 of the portable device 200 displays on the display panel 202a an image of a map of the plant premises, as shown in FIG. 4A, together with an image showing the position 30 of the diagnosis target.
  • The current position of the worker 50 is represented by a black human figure.
  • The position 30 of the diagnosis target is represented by a black star.
  • A route from the worker 50 to the position 30 of the diagnosis target is indicated by an arrow.
  • The instruction unit 150 may also transmit to the portable device 200 a signal indicating that the instruction is to be output by voice.
  • In this case, the processor 205 of the portable device 200 outputs from the speaker 202b a voice instructing the worker to move to the designated position 30.
  • While the worker 50 is moving, the instruction unit 150 may control the portable device 200 to display an image of an arrow indicating the route on the display panel 202a, as shown in FIG. 4B.
  • In this case, the processor 205 of the portable device 200 displays the arrow image on the display panel 202a, so that the actual passage on the premises and the arrow image are displayed superimposed.
  • When the instruction unit 150 determines, based on the position data received from the portable device 200 and the map data of the plant premises included in the inspection manual 111, that the worker 50 has reached the position 30 of the diagnosis target, it transmits an instruction for the inspection work to the portable device 200. More specifically, to show the worker 50 where to hit the diagnosis target, the instruction unit 150 transmits to the portable device 200 the information on the positions to be hit in the tapping sound inspection included in the inspection manual 111 and an instruction to hit those positions.
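The arrival determination could be sketched as a simple distance check between the worker's GPS position and the target position from the inspection manual 111; the tolerance radius is an assumption, since the disclosure does not state how closeness is judged.

```python
import math

ARRIVAL_RADIUS_M = 5.0  # assumed tolerance; not specified in the disclosure

def has_arrived(worker_position, target_position) -> bool:
    """Sketch of how the instruction unit 150 might decide that the worker 50
    has reached the position 30 of the diagnosis target, comparing the
    three-dimensional position data received from the portable device 200
    with the target position from the inspection manual 111."""
    return math.dist(worker_position, target_position) <= ARRIVAL_RADIUS_M
```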
  • In response, the processor 205 of the portable device 200 displays on the display panel 202a an image indicating the positions the worker 50 should hit and the instruction "hit", as shown in FIG. 5.
  • In the example shown in the figure, the pipe 1002 of the pipes 1001 and 1002 is the diagnosis target, and the hatched portion indicates the position to be hit. On the display panel 202a, the actual pipes 1001 and 1002, the "hit" instruction, and the image indicating the position to be hit are therefore displayed superimposed.
  • The instruction unit 150 may also transmit to the portable device 200 a signal indicating that the instruction is to be output by voice. In this case, the processor 205 of the portable device 200 outputs from the speaker 202b a voice instructing the worker to hit the diagnosis target.
  • When the tapping sound collected by the portable device 200 is quieter than the reference value, the instruction unit 150 transmits an instruction to hit harder to the portable device 200. As shown in FIG. 6, the processor 205 of the portable device 200 then displays the instruction to hit harder on the display panel 202a. In the illustrated example, "hit a little harder" is displayed. The instruction unit 150 may also transmit a signal indicating that the instruction is to be output by voice, in which case the processor 205 of the portable device 200 outputs from the speaker 202b a voice instructing the worker to hit harder.
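One plausible form of this loudness check is to compare the RMS amplitude of the recorded tapping sound with the reference value. RMS is an assumption made here; the description speaks only of the "loudness of the sound".

```python
import numpy as np

def tap_is_loud_enough(samples: np.ndarray, reference_rms: float) -> bool:
    """Sketch of the check against the reference value in the reference
    data 113: samples is the tapping-sound waveform received from the
    portable device 200."""
    rms = float(np.sqrt(np.mean(np.square(samples, dtype=np.float64))))
    return rms >= reference_rms
```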
  • The instruction unit 150 further transmits to the portable device 200 an instruction to detect the gas concentration. The processor 205 of the portable device 200 then displays on the display panel 202a an instruction to detect the gas concentration using the gas detection sensor 203c. The instruction unit 150 may also transmit a signal indicating that the instruction is to be output by voice, in which case the processor 205 of the portable device 200 outputs from the speaker 202b a voice instructing the worker to detect the gas concentration using the gas detection sensor 203c.
  • The instruction unit 150 issues work instructions based on the diagnosis result of the diagnosis unit 140 as follows. For example, when the diagnosis unit 140 determines that a gas leak has occurred, the instruction unit 150 transmits to the portable device 200 the treatment to be performed by the worker 50, based on the diagnosis result and the treatment table 116. Since the diagnosis result is the occurrence of a gas leak, the treatment to be performed is turning the valve clockwise, as shown in FIG. 3B. The instruction unit 150 transmits to the portable device 200 information indicating the position of the target valve and the direction in which the valve is to be turned.
  • In response, the processor 205 of the portable device 200 displays on the display panel 202a the instruction to "turn and close" the target valve and an image of an arrow indicating the direction in which the valve is to be turned. On the display panel 202a, the actual valve, the "turn and close" instruction, and the arrow image are therefore displayed superimposed. The instruction unit 150 may also transmit a signal indicating that the instruction is to be output by voice, in which case the processor 205 of the portable device 200 outputs from the speaker 202b a voice instructing the worker to close the valve.
  • The instruction unit 150 also displays screens such as those shown in FIGS. 4A and 8 on the display of the output device 103 so that a supervisor in the control management room can check the positions of the workers 50 on the premises.
  • In FIG. 4A, the positions of all the workers carrying portable devices 200 are displayed on a map of the plant premises.
  • A moving worker 50, represented by a broken line, and an arrow indicating the movement route are also displayed. The supervisor can thus check the movement of a worker 50 who has received a movement instruction.
  • The function of the instruction unit 150 is realized by the wireless communication circuit 104 and the processor 105.
  • The worker 50 is on the plant premises, wearing the helmet 10 to which the portable device 200 is attached, as shown in FIG. 4A.
  • The worker 50 carries a hammer for the hammering inspection.
  • The information processing apparatus 100 instructs the worker 50 to inspect the diagnosis target.
  • It is assumed that the learning unit 130 has completed the learning of the neural network and that the storage unit 110 stores the learned learning model.
  • The camera 203a, the microphone 203b, and the gas detection sensor 203c of the portable device 200 each acquire inspection data at predetermined timings.
  • The processor 205 of the portable device 200 transmits the inspection data to the information processing device 100.
  • The GPS receiver 203d also acquires position data at predetermined timings.
  • The processor 205 transmits the position data to the information processing device 100.
  • The instruction unit 150 transmits to the portable device 200 an instruction for the worker 50 to move to the position 30 of the diagnosis target (step S11). Specifically, the instruction unit 150 transmits to the portable device 200 a signal including the map data of the plant, coordinate values indicating the current position of the worker 50, coordinate values indicating the position 30 of the diagnosis target, and a movement instruction. In response, the portable device 200 displays on the display panel 202a an image of a map of the plant premises, as shown in FIG. 4A, and an instruction to move to the position. The processor 205 of the portable device 200 also outputs from the speaker 202b a voice instructing the worker to move to the position 30. It is assumed that the worker 50 moves to the designated position 30 as instructed.
  • The instruction unit 150 determines whether the worker 50 has arrived at the designated position 30 based on the position data received from the portable device 200 and the map data of the plant premises included in the inspection manual 111 (step S12). When determining that the worker 50 has arrived at the designated position 30 (step S12; Yes), the instruction unit 150 determines whether image data of the diagnosis target has been received from the portable device 200 (step S13). As described above, the camera 203a of the portable device 200 shoots continuously, and the portable device 200 transmits image data to the information processing apparatus 100 at predetermined intervals. When determining that the image data has been received (step S13; Yes), the instruction unit 150 stores the received image data in the storage unit 110 for the diagnosis processing. When determining that image data has not been received from the portable device 200 (step S13; No), the instruction unit 150 waits until image data is received.
  • Next, the instruction unit 150 transmits to the portable device 200 an instruction for the worker 50 to perform the tapping sound inspection (step S14).
  • Specifically, the instruction unit 150 transmits to the portable device 200 a signal including the information, contained in the inspection manual 111, indicating the places where the inspection target is to be hit.
  • Based on the signal received from the instruction unit 150, the portable device 200 displays on the display panel 202a an image indicating the positions to be hit, as shown in FIG. 5. The portable device 200 also outputs from the speaker 202b a voice instructing the worker to hit the diagnosis target. It is assumed that the worker 50 performs the hammering test as instructed. The portable device 200 accordingly transmits the sound data to the information processing device 100.
  • The instruction unit 150 determines whether sound data of the tapping sound has been received from the portable device 200 (step S15). Here, the instruction unit 150 determines whether the loudness of the received tapping sound is equal to or greater than the reference value, included in the reference data 113, indicating the appropriate loudness for the tapping sound inspection.
  • When the tapping sound is too quiet, the instruction unit 150 determines that sound data of a tapping sound of appropriate loudness has not been received (step S15; No). In this case, the instruction unit 150 issues the instruction for the tapping sound inspection again (step S14). Specifically, the instruction unit 150 transmits to the portable device 200 an instruction to hit the diagnosis target harder. In response, as shown in FIG. 6, the portable device 200 displays on the display panel 202a a message instructing the worker to hit harder, and outputs a corresponding voice from the speaker 202b. It is assumed that the worker 50 hits the diagnosis target harder as instructed.
  • When the instruction unit 150 determines that sound data of a sufficiently loud tapping sound has been received from the portable device 200 (step S15; Yes), it stores the sound data in the storage unit 110.
  • Next, the instruction unit 150 transmits to the portable device 200 an instruction to measure the gas concentration (step S16).
  • In response, the portable device 200 displays a message instructing measurement of the gas concentration on the display panel 202a.
  • The portable device 200 also outputs from the speaker 202b a voice instructing the worker to measure the gas concentration. It is assumed that the worker 50 measures the gas concentration using the gas detection sensor 203c as instructed. The portable device 200 accordingly transmits the measured value data to the information processing apparatus 100.
  • The instruction unit 150 determines whether the gas concentration measurement data has been received from the portable device 200 (step S17). When determining, for example after waiting a predetermined time, that the measurement data has been received (step S17; Yes), the instruction unit 150 stores the measurement data in the storage unit 110. When determining after waiting the predetermined time that the measurement data has not been received (step S17; No), the instruction unit 150 executes the processing of step S16 again.
  • Next, the diagnosis unit 140 diagnoses whether the diagnosis target is damaged based on the inspection data received by the inspection data receiving unit 120 (step S18).
  • Specifically, the diagnosis unit 140 performs the diagnosis processing as follows.
  • The diagnosis unit 140 inputs the image data received by the inspection data receiving unit 120 to the neural network adopting the learning model for image analysis learned by the learning unit 130, and diagnoses whether the diagnosis target is damaged.
  • The diagnosis unit 140 likewise inputs the sound data received by the inspection data receiving unit 120 to the neural network adopting the learning model for sound analysis learned by the learning unit 130, and diagnoses whether the diagnosis target is damaged.
  • The diagnosis unit 140 further determines whether a gas leak has occurred based on whether the measured gas concentration received by the inspection data receiving unit 120 exceeds the threshold value in the reference data 113. When determining that a gas leak has occurred, the diagnosis unit 140 diagnoses that the diagnosis target is damaged. The diagnosis unit 140 records the diagnosis results in the history data 112 of the storage unit 110.
  • Next, the instruction unit 150 presents the diagnosis result and a work instruction to the worker 50 (step S19).
  • When at least one of the diagnosis result based on the image data, the diagnosis result based on the sound data, and the diagnosis result based on the measured gas concentration indicates that the diagnosis target is damaged, the instruction unit 150 notifies the worker 50 that the diagnosis target is damaged.
  • Together with the diagnosis result, the instruction unit 150 transmits to the portable device 200 information indicating the coping method based on the treatment table 116.
  • For example, suppose the diagnosis target is a pipe and the diagnosis result based on the measured gas concentration indicates that the diagnosis target is damaged.
  • In this case, the instruction unit 150 transmits to the portable device 200 an instruction to close the valve of the pipe under diagnosis. The portable device 200 then displays on the display panel 202a the diagnosis result and the coping method received from the information processing apparatus 100.
  • When the diagnosis target is diagnosed as not damaged, the instruction unit 150 need not present a work instruction to the worker 50 in step S19.
  • Thereafter, the instruction unit 150 executes step S11 again and transmits to the portable device 200 an instruction to move to the position of the next diagnosis target.
  • The above is the inspection work processing performed by the information processing apparatus 100.
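The flow of steps S11 to S19 can be condensed into the following sketch. Every name on the hypothetical device object is invented for illustration; has_arrived(), tap_is_loud_enough(), and lookup_treatment() are the sketches given earlier, and diagnose_target() is taken to wrap the diagnose() sketch with the learned models and the gas threshold.

```python
import time

def inspection_work_process(device, target, reference_rms: float) -> None:
    """Sketch of the inspection work processing for one diagnosis target."""
    device.send_move_instruction(target.position)                  # S11: move instruction
    while not has_arrived(device.worker_position(), target.position):
        time.sleep(1.0)                                            # S12: wait for arrival
    image = device.wait_for_image()                                # S13: receive image data
    sound = device.request_tapping_test(target.hit_positions)      # S14: tapping instruction
    while not tap_is_loud_enough(sound, reference_rms):            # S15: loudness check
        sound = device.request_tapping_test(target.hit_positions)  # S14 again: hit harder
    gas = device.request_gas_measurement()                         # S16/S17: gas concentration
    if diagnose_target(image, sound, gas):                         # S18: diagnosis
        device.send_instruction(lookup_treatment(target.type))     # S19: present the treatment
```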
  • As described above, in the embodiment, an appropriate inspection procedure is presented to the worker 50, the diagnosis target is diagnosed from the inspection data, and the worker 50 is instructed on the operation to perform. Even an inexperienced worker 50 can therefore carry out the inspection following an appropriate procedure. Further, since the information processing apparatus 100 makes the diagnosis from the inspection data and presents a coping method to the worker 50 based on the diagnosis result, even an inexperienced worker 50 can perform the appropriate treatment.
  • In addition, a supervisor in the control management room no longer needs to explain working methods by telephone to an inexperienced worker 50, as in the conventional technique.
  • (Modification 1) In the embodiment, the diagnosis unit 140 diagnoses the presence or absence of damage to the diagnosis target, but the diagnosis is not limited to this. For example, levels indicating the degree of damage may be defined in advance, and when there is damage, the diagnosis unit 140 may also diagnose other items such as the level of the damage and/or the cause of the damage.
  • In this case, the learning unit 130 arranges the output layer of the neural network in accordance with the number of outcomes to be determined. For example, to determine among no damage and damage levels 1, 2, ..., n, it arranges (n + 1) neurons in the output layer. Likewise, to determine among no damage and damage causes 1, 2, ..., m, it arranges (m + 1) neurons in the output layer.
  • As the learning data, a plurality of sets are prepared, each pairing inspection data with the number of the output-layer neuron indicating the presence or absence of damage to the inspection target and the degree and/or cause of the damage.
  • The learning unit 130 supplies the inspection data of each set to the neural network and adjusts the weight coefficients of the neurons in the intermediate and output layers, using the back propagation method or the like, so that the output-layer neuron indicated by the corresponding number fires. That is, the learning unit 130 causes the neural network to learn the learning data.
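Under the assumptions of the earlier PyTorch sketches, extending the output layer for Modification 1 is a one-line change: (n + 1) output neurons for "no damage" plus damage levels 1 through n (the same shape would apply to causes 1 through m).

```python
import torch.nn as nn

def build_level_model(n_features: int, n_levels: int) -> nn.Module:
    """Sketch of Modification 1: the output layer has (n_levels + 1) neurons,
    neuron 0 for 'no damage' and neurons 1..n_levels for the damage levels.
    Layer sizes are illustrative assumptions."""
    return nn.Sequential(
        nn.Linear(n_features, 64),
        nn.ReLU(),
        nn.Linear(64, n_levels + 1),
    )
```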
  • The diagnosis unit 140 inputs the inspection data received from the portable device 200 to the learned neural network and diagnoses the degree and/or cause of damage to the diagnosis target from its output. Even when there is damage, if its degree and cause are minor, continued monitoring is required but prompt maintenance work may not be necessary.
  • The instruction unit 150 may therefore issue work instructions to the portable device 200 according to the degree of damage, based on a treatment table 116a such as that shown in FIG. 10. In the example shown in FIG. 10, the larger the level value, the greater the damage.
  • (Modification 2) In the embodiment, the information processing apparatus 100 has the treatment table 116 in which the contents of treatments are stored in advance, but the present invention is not limited to this.
  • The learning unit 130 may learn the contents of treatments according to diagnosis results, and the diagnosis unit 140 may determine not only whether the diagnosis target is damaged but also the contents of the treatment.
  • The method by which the diagnosis unit 140 diagnoses whether the diagnosis target is damaged is as described above.
  • As the learning data, a plurality of sets are prepared, each pairing diagnosis result data with the number of the output-layer neuron indicating the contents of the treatment.
  • The learning unit 130 supplies the diagnosis result data of each set to the neural network and adjusts the weight coefficients of the neurons by the back propagation method or the like.
  • As the learning data, for example, diagnosis results from past inspection work and the contents of the treatments determined by skilled workers at those times can be used.
  • The diagnosis unit 140 inputs a diagnosis result to the learned neural network and determines the contents of the treatment from its output.
  • In the embodiment, the information processing apparatus 100 issues instructions such as the tapping sound inspection and the closing of a valve to the worker 50, but the instructions to the worker 50 are not limited to these.
  • For example, the information processing apparatus 100 may present the position of a meter to the worker 50 and instruct the worker 50 to move to that position.
  • The information processing apparatus 100 may then have the worker 50 photograph the value indicated by the meter with the camera 203a, and instruct the worker 50 to perform an appropriate operation according to that value.
  • In the embodiment, the information processing apparatus 100 transmits a movement instruction to the portable device 200, and the hammering inspection and the imaging of the diagnosis target are performed at the location to which the worker 50 has moved, but the present invention is not limited to this. For example, while the worker 50 is patrolling the premises, the information processing apparatus 100 may determine from the position data received from the portable device 200 that the worker 50 is close to a diagnosis target, and transmit to the portable device 200 an instruction to perform the hammering test and photograph the diagnosis target.
  • In this case, if the inspection target has already been inspected within a certain past period, the information processing apparatus 100 need not instruct the worker 50 to inspect it.
  • The information processing apparatus 100 may also transmit the inspection history recorded in the history data 112 and past diagnosis results to the portable device 200 when the worker 50 arrives at the position of the diagnosis target. The portable device 200 then displays the received inspection history and past diagnosis results on the display panel 202a.
  • In the embodiment, the portable device 200 includes the camera 203a, the microphone 203b, the gas detection sensor 203c, and the GPS receiver 203d, but it need not include all of them. For example, some workers may carry portable devices 200 without the camera 203a and perform only the hammering test, while the remaining workers carry portable devices 200 without the microphone 203b and perform only the imaging of diagnosis targets. The portable device 200 may also include other sensors, for example a temperature sensor, a humidity sensor, or a pressure sensor.
  • In the embodiment, the processor 105 of the information processing apparatus 100 performs operations to realize the neural network.
  • However, the neural network need not be realized by software.
  • For example, the neural network may be realized by hardware, using a neurochip in which neurons are formed by LSI circuits or by SQUIDs (Superconducting Quantum Interference Devices).
  • As a computer-readable recording medium, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, a semiconductor memory, or a magnetic tape can be used.
  • 1 information processing system, 10 helmet, 20 main unit, 30 position, 50 worker, 100 information processing device, 101, 201 memory, 102 input device, 103, 202 output device, 104, 204 wireless communication circuit, 104a, 204a antenna, 105, 205 processor, 109, 209 bus, 110 storage unit, 111 inspection manual, 112 history data, 113 reference data, 114 learning model, 115 learning data, 116 treatment table, 120 inspection data receiving unit, 130 learning unit, 140 diagnosis unit, 150 instruction unit, 200 portable device, 202a display panel, 202b speaker, 203 collection device, 203a camera, 203b microphone, 203c gas detection sensor, 203d GPS receiver, 203e cable, 1000, 2000 programs, 1001, 1002 pipes

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Educational Technology (AREA)
  • Game Theory and Decision Science (AREA)
  • Testing And Monitoring For Control Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
PCT/JP2018/027445 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program WO2020021589A1 (ja)

Priority Applications (5)

Application Number Priority Date Filing Date Title
DE112018007754.3T 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program
JP2019508996A 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program
CN201880095740.4A 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program
PCT/JP2018/027445 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program
TW108112786A 2018-07-23 2019-04-12 Information processing device, information processing system, method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027445 2018-07-23 2018-07-23 Information processing device, information processing system, method, and program

Publications (1)

Publication Number Publication Date
WO2020021589A1 (ja) 2020-01-30

Family

ID=68095335

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/027445 WO2020021589A1 (ja) Information processing device, information processing system, method, and program

Country Status (5)

Country Link
JP (1) JP6584721B1 (ja)
CN (1) CN112424726B (zh)
DE (1) DE112018007754B4 (de)
TW (1) TW202008278A (zh)
WO (1) WO2020021589A1 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220083042A1 (en) * 2020-09-15 2022-03-17 Kabushiki Kaisha Toshiba Information processing device, information processing method, computer program product, and information processing system
WO2023053240A1 (ja) * 2021-09-29 2023-04-06 日本電気株式会社 Maintenance work support device, maintenance work support method, and recording medium
EP4354132A1 (de) * 2022-10-13 2024-04-17 Subaru Corporation Tapping inspection system and tapping inspection method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022158059A1 (ja) * 2021-01-21 2022-07-28 株式会社カネカ Information processing device, information processing method, and program
WO2023145250A1 (ja) * 2022-01-27 2023-08-03 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Wearable terminal, presentation method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10116113A (ja) * 1996-10-09 1998-05-06 Hitachi Ltd Plant monitoring control method and monitoring control device
JP2005077111A (ja) * 2003-08-29 2005-03-24 Tsukishima Techno Mente Service Kk Condition diagnosis support device for rotating equipment, program therefor, recording medium storing the program, and condition diagnosis support method
JP2018005661A (ja) * 2016-07-05 2018-01-11 株式会社サイバー・ラボ Inspection system
JP2018085016A (ja) * 2016-11-25 2018-05-31 Jfeエンジニアリング株式会社 Maintenance management system and deterioration determination device for steel structures

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10065954A1 (de) 2000-09-01 2002-03-14 Stefan Metzner Device for monitoring track installations
CN101509604B (zh) * 2009-03-20 2012-11-28 武汉大学 Method and device for detecting and evaluating deposits in metal pipes
JP6195234B2 (ja) 2013-04-10 2017-09-13 福岡県 Tapping sound inspection method and device for brazed articles
JPWO2017149587A1 (ja) * 2016-02-29 2018-03-08 三菱電機株式会社 Portable terminal device, server device, work recording system, work recording method, and work support program
JP6833364B2 (ja) * 2016-07-01 2021-02-24 株式会社東芝 IC card and IC card processing device
DE102016213584A1 (de) * 2016-07-25 2018-01-25 Deere & Company Method, portable device, and combination of a mobile work machine and a portable device for assisting in locating a position on a mobile work machine, or a device coupled to it, that is to be visited for troubleshooting, fault clearance, or maintenance work


Also Published As

Publication number Publication date
CN112424726A (zh) 2021-02-26
JPWO2020021589A1 (ja) 2020-07-30
DE112018007754T5 (de) 2021-03-11
TW202008278A (zh) 2020-02-16
JP6584721B1 (ja) 2019-10-02
DE112018007754B4 (de) 2023-08-31
CN112424726B (zh) 2022-01-25
DE112018007754T9 (de) 2021-08-05

Similar Documents

Publication Publication Date Title
WO2020021589A1 (ja) Information processing device, information processing system, method, and program
CN110212451B (zh) Electric power AR intelligent patrol inspection device
EP2558815B1 (de) Apparatus for ultrasonic frequency spectrum identification and imaging
CN107965674B (zh) Scanning-type gas leakage full-field early-warning system
CN110163485A (zh) Machine room inspection system
US20210311187A1 (en) Systems and methods for tagging and linking acoustic images
EP3130900A1 (de) Multi-sensor inspection for identification of leaking pressure tube defects
US20180136035A1 (en) Method for detecting vibrations of a device and vibration detection system
JP2002287815A (ja) Field support system
KR20030081441A (ko) Portable leak detection device
JP2019164751A (ja) Unmanned mobile unit and patrol inspection system
CN114093052A (zh) Intelligent inspection method and system suitable for machine room management
WO2019230687A1 (ja) Tapping sound inspection terminal, tapping sound inspection system, and tapping sound inspection data registration method
CN110440997A (zh) Leak detection module and method for checking the tightness of a test object using tracer gas
GB2576273A (en) Cloud-enabled testing of control systems
CN115939996A (zh) Automatic inspection system for an electric power inspection robot
WO2019176710A1 (ja) Unmanned mobile unit and patrol inspection system
CN106346475A (zh) Robot and robot control method
CN113206977B (zh) Inspection and monitoring method, device, and storage medium for gas transmission stations
JP3697816B2 (ja) Patrol inspection support system
CN115457331A (zh) Intelligent inspection method and system for construction sites
KR20210100319A (ko) Power equipment diagnosis device and method
US11663900B2 (en) Apparatus, method and storage medium for detecting when a viewed equipment is different from an equipment to be inspected
KR20190062874A (ko) Machine learning-based method and device for evaluating the soundness of ground and structures
CN113596387B (zh) Monitoring system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019508996

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927683

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18927683

Country of ref document: EP

Kind code of ref document: A1