CN116522283A - Fusion diagnosis method, device, equipment and medium based on vibration and video data - Google Patents

Fusion diagnosis method, device, equipment and medium based on vibration and video data

Info

Publication number
CN116522283A
CN116522283A
Authority
CN
China
Prior art keywords
fault
equipment
vibration
image
video data
Prior art date
Legal status
Pending
Application number
CN202310799872.8A
Other languages
Chinese (zh)
Inventor
徐驰
李惠军
Current Assignee
Leewell Intelligence Shenzhen Co ltd
Original Assignee
Leewell Intelligence Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Leewell Intelligence Shenzhen Co ltd filed Critical Leewell Intelligence Shenzhen Co ltd
Priority to CN202310799872.8A
Publication of CN116522283A
Legal status: Pending

Classifications

    • G - PHYSICS
        • G01 - MEASURING; TESTING
            • G01H - MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
                • G01H 17/00 - Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves, not provided for in the preceding groups
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00 - Pattern recognition
                    • G06F 18/20 - Analysing
                        • G06F 18/25 - Fusion techniques
                            • G06F 18/253 - Fusion techniques of extracted features
            • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N 3/00 - Computing arrangements based on biological models
                    • G06N 3/02 - Neural networks
                        • G06N 3/04 - Architecture, e.g. interconnection topology
                            • G06N 3/0464 - Convolutional networks [CNN, ConvNet]
            • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00 - Arrangements for image or video recognition or understanding
                    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
                        • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
                            • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
                                • G06V 10/806 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level, of extracted features
                        • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
                • G06V 20/00 - Scenes; Scene-specific elements
                    • G06V 20/40 - Scenes; Scene-specific elements in video content
                        • G06V 20/41 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
                        • G06V 20/46 - Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
        • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
            • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
                • G10L 15/00 - Speech recognition
                    • G10L 15/02 - Feature extraction for speech recognition; Selection of recognition unit
                    • G10L 15/08 - Speech classification or search
                        • G10L 15/16 - Speech classification or search using artificial neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a fusion diagnosis method, device, equipment and medium based on vibration and video data, applied to the technical field of equipment fault diagnosis. The method comprises the following steps: obtaining a vibration signal and video data generated while equipment to be diagnosed is operating, and performing image and audio classification processing on the video data to obtain image data and sound data; taking the vibration signal and the sound data as a characteristic data pair, and determining an equipment fault period according to the sound abnormality period and the vibration abnormality period in the characteristic data pair; performing fault feature extraction processing on the positions of equipment components in the image data according to the equipment fault period to obtain a fault position image; and marking the equipment fault position in the fault position image and performing fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position. The technical scheme of the invention can present the fault position and the fault degree of the equipment to technicians in an intuitive way.

Description

Fusion diagnosis method, device, equipment and medium based on vibration and video data
Technical Field
The invention relates to the technical field of equipment fault detection, in particular to a fusion diagnosis method, a device, equipment and a medium based on vibration and video data.
Background
Nowadays, advances in technology have brought more and more machine equipment into use in place of manual work. As the service life of equipment increases, its risk of failure rises correspondingly.
To detect equipment faults, a technician usually mounts a vibration sensor on the equipment, collects the equipment's vibration signal through the sensor, and judges whether the vibration signal is abnormal in order to determine whether the equipment has a fault.
However, such a conventional detection method can only conclude from the vibration signal whether the equipment is faulty; a technician still has to inspect the whole machine to determine the fault position and the degree of the fault.
Disclosure of Invention
The invention mainly aims to provide a fusion diagnosis method, device, equipment and medium based on vibration and video data, which aim to show technicians, in an intuitive way, where a fault has occurred in equipment and how severe the fault is.
In order to achieve the above object, the present invention provides a fusion diagnosis method based on vibration and video data, comprising:
obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
Optionally, before the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image, the fusion diagnosis method further includes:
performing image slicing processing on the image data to obtain each equipment picture;
carrying out graying processing on the equipment picture to obtain a gray picture;
the step of extracting and processing the fault characteristics of the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image comprises the following steps:
and carrying out fault feature extraction processing on the positions of the equipment components in each gray level picture according to the equipment fault time period to obtain a fault position image.
Optionally, the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image includes:
determining estimated fault time of the equipment to be diagnosed according to the equipment fault period;
and carrying out fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image.
Optionally, the fault feature extraction processing includes displacement detection and deformation detection, and the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image comprises the following steps:
performing displacement detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component displacement picture;
performing deformation detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component deformation picture;
and determining a fault position image according to the component displacement image and the component deformation image.
Optionally, before the step of taking the vibration signal and the sound data as the characteristic data pair, the fusion diagnosis method further includes:
filtering and denoising the vibration signal to obtain a vibration effective signal, and filtering and denoising the sound data to obtain a sound effective signal;
the step of using the vibration signal and the sound data as a characteristic data pair includes:
and taking the vibration effective signal and the sound effective signal as characteristic data pairs.
Optionally, the fault diagnosis processing includes fault range judgment and fault intervention processing, and the step of performing fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position comprises the following steps:
judging the fault range of the equipment fault position in the fault position image to obtain the estimated fault degree;
and performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree.
Optionally, the fault intervention processing includes fault information prompting and equipment shutdown processing, and the step of performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree comprises the following steps:
carrying out fault information prompt on the equipment to be diagnosed according to the estimated fault degree; or,
and performing equipment shutdown processing on the equipment to be diagnosed according to the estimated fault degree.
In addition, in order to achieve the above object, the present invention also provides a fusion diagnostic apparatus based on vibration and video data, comprising:
the signal splitting module is used for acquiring vibration signals and video data when the equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
a fault period determining module, configured to take the vibration signal and the sound data as a characteristic data pair, and determine a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
the fault position determining module is used for extracting fault characteristics of the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
and the fault diagnosis module is used for marking the equipment fault position in the fault position image and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
In addition, to achieve the above object, the present invention also provides a terminal device, comprising: a memory, a processor, and a fusion diagnostic program based on vibration and video data which is stored on the memory and can run on the processor, wherein the fusion diagnostic program based on vibration and video data, when executed by the processor, implements the steps of the fusion diagnostic method based on vibration and video data as described above.
In addition, in order to achieve the above object, the present invention also provides a storage medium having stored thereon a fusion diagnostic program based on vibration and video data, which when executed by a processor, implements the steps of the fusion diagnostic method based on vibration and video data as described above.
The invention provides a fusion diagnosis method, a device, equipment and a medium based on vibration and video data, wherein the fusion diagnosis method based on the vibration and the video data comprises the following steps: obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data; taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair; performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image; and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
The fusion diagnosis method works as follows. While the equipment to be diagnosed is operating, the vibration signal and video data it generates are obtained, and the images and audio in the video data are classified to obtain image data and sound data. The vibration signal and the sound data are then used as a characteristic data pair, and the equipment fault period, that is, the time period during which the equipment is faulty, is determined from the sound abnormality period and the vibration abnormality period in that pair. Fault feature extraction processing is performed on the equipment component positions recorded in the image data according to the equipment fault period to obtain a fault position image, the equipment fault position is marked in that image, and fault diagnosis processing is performed on the equipment to be diagnosed according to the equipment fault position.
Compared with the traditional approach of judging from the vibration signal alone whether equipment is faulty, the invention preliminarily determines the equipment fault period from the vibration signal together with the sound in the video data, uses the image data in the video data to detect whether any equipment component is faulty, obtains a fault position image from the faulty components, and marks the equipment fault position in that image. Performing fault diagnosis according to the marked equipment fault position allows the fault location and the degree of the fault to be conveyed intuitively to technicians.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic device architecture diagram of a hardware operating environment of a terminal device according to an embodiment of the present invention;
FIG. 2 is a flow chart of a first embodiment of a fusion diagnostic method based on vibration and video data according to the present invention;
FIG. 3 is a schematic diagram of a fault location image determination process according to an embodiment of a fusion diagnostic method based on vibration and video data of the present invention;
FIG. 4 is a schematic diagram of a fault feature extraction process according to an embodiment of the fusion diagnosis method based on vibration and video data;
fig. 5 is a functional block diagram of an embodiment of a fusion diagnostic device based on vibration and video data according to the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is evident that the described embodiments are only some, and not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
It should be noted that all directional indicators in the embodiments of the present invention (such as up, down, left, right, front, rear, etc.) are merely used to explain the relative positional relationship, movement, and the like between components in a particular posture (as shown in the drawings); if that posture changes, the directional indication changes accordingly.
In the present invention, unless expressly specified and limited otherwise, the terms "connected," "fixed," and the like are to be construed broadly. For example, "fixed" may mean fixedly connected, detachably connected, or integrally formed; it may mean mechanically or electrically connected; it may mean directly connected or indirectly connected through an intermediary; and it may mean an internal communication between two elements or an interaction between two elements. The specific meaning of these terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, descriptions such as "first" and "second" are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined by "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, provided that the combination can be realized by those skilled in the art; when a combination of technical solutions is contradictory or cannot be realized, it should be considered absent and outside the scope of protection claimed in the present invention.
As shown in fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment of a terminal device according to an embodiment of the present invention.
The terminal device according to the embodiment of the present invention may be a vibration and video data fusion diagnostic device that performs the vibration and video data fusion diagnostic method of the present application, or may be other terminals such as a data storage control terminal, a PC, or a portable computer that performs the vibration and video data fusion diagnostic method of the present application.
As shown in fig. 1, in a hardware operating environment of a terminal device, the terminal device may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, a communication bus 1002. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display, an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may further include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 1 is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and a fusion diagnostic program based on vibration and video data may be included in a memory 1005, which is a type of computer storage medium.
In the device shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server, and performing data communication with the background server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call a fusion diagnostic program based on vibration and video data stored in the memory 1005 and perform the following operations:
obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
Based on the above hardware structure, the overall concept of the embodiments of the fusion diagnosis method based on vibration and video data of the present invention is presented.
In the embodiments of the present invention, it is first noted that advances in technology have brought more and more machine equipment into use in place of manual work, and that as the service life of equipment increases, its risk of failure rises correspondingly.
To detect equipment faults, a technician usually mounts a vibration sensor on the equipment, collects the equipment's vibration signal through the sensor, and judges whether the vibration signal is abnormal in order to determine whether the equipment has a fault.
However, such a conventional detection method can only conclude from the vibration signal whether the equipment is faulty; a technician still has to inspect the whole machine to determine the fault position and the degree of the fault.
In view of the above problems, an embodiment of the present invention provides a fusion diagnosis method, apparatus, device and medium based on vibration and video data, where the fusion diagnosis method based on vibration and video data includes: obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data; taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair; performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image; and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
The fusion diagnosis method works as follows. While the equipment to be diagnosed is operating, the vibration signal and video data it generates are obtained, and the images and audio in the video data are classified to obtain image data and sound data. The vibration signal and the sound data are then used as a characteristic data pair, and the equipment fault period, that is, the time period during which the equipment is faulty, is determined from the sound abnormality period and the vibration abnormality period in that pair. Fault feature extraction processing is performed on the equipment component positions recorded in the image data according to the equipment fault period to obtain a fault position image, the equipment fault position is marked in that image, and fault diagnosis processing is performed on the equipment to be diagnosed according to the equipment fault position.
Compared with the traditional approach of judging from the vibration signal alone whether equipment is faulty, the invention preliminarily determines the equipment fault period from the vibration signal together with the sound in the video data, uses the image data in the video data to detect whether any equipment component is faulty, obtains a fault position image from the faulty components, and marks the equipment fault position in that image. Performing fault diagnosis according to the marked equipment fault position allows the fault location and the degree of the fault to be conveyed intuitively to technicians.
Based on the above-described general idea of the fusion diagnostic method based on vibration and video data of the present invention, various embodiments of the fusion diagnostic method based on vibration and video data of the present invention are presented.
In the following description, for convenience of explanation, the fusion diagnostic device refers to the fusion diagnostic device based on vibration and video data described above.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of the fusion diagnosis method based on vibration and video data according to the present invention. It should be noted that although a logical order is shown in the flowchart, in some cases the steps of the fusion diagnosis method based on vibration and video data of the present invention may be performed in an order different from that shown here.
In this embodiment, the fusion diagnosis method based on vibration and video data includes:
step S10: obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
in this embodiment, in order to obtain the vibration signal and video data of the running equipment, a technician installs a high-definition camera and a vibration sensor on the equipment to be diagnosed; the high-definition camera can capture both the sound and the images produced while the equipment is running. The fusion diagnosis device first acquires, through the high-definition camera, the video data generated while the equipment to be diagnosed is running, acquires the vibration signal generated during operation through the vibration sensor, and then performs image and audio classification processing on the video data to obtain image data and sound data.
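For illustration only, the following is a minimal sketch of how the image/audio split in step S10 could be implemented. The use of OpenCV for frame extraction, the ffmpeg command-line tool for audio extraction, and the function name split_video are assumptions made for this sketch and are not prescribed by the present embodiment.

```python
# Sketch: separate a recording into per-frame images and a sound track.
import subprocess
import cv2

def split_video(video_path: str, audio_path: str = "sound.wav"):
    # Audio track -> WAV file via the ffmpeg command-line tool (assumed to be installed).
    subprocess.run(
        ["ffmpeg", "-y", "-i", video_path, "-vn", "-acodec", "pcm_s16le", audio_path],
        check=True,
    )

    # Video track -> list of frames together with their timestamps.
    cap = cv2.VideoCapture(video_path)
    frames, timestamps = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
        timestamps.append(cap.get(cv2.CAP_PROP_POS_MSEC) / 1000.0)  # seconds from start
    cap.release()
    return frames, timestamps, audio_path
```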
Step S20: taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
in this embodiment, after obtaining the vibration signal, sound data and image data generated while the equipment to be diagnosed is running, the fusion diagnosis apparatus pairs, by acquisition time, the vibration signal with the sound data collected over the same period to form a characteristic data pair, and preliminarily determines when the equipment is faulty, that is, determines the equipment fault period, from the sound abnormality period and the vibration abnormality period recorded in the characteristic data pair.
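As a sketch of step S20, the abnormal periods of the two channels can be merged into an equipment fault period as follows. Treating the fault period as the combined span of the sound-abnormal and vibration-abnormal intervals is an assumption made here for illustration (it is consistent with the 10:30-12:00 example given later), and fault_period is a hypothetical helper name.

```python
# Sketch: derive the equipment fault period from both channels' abnormal intervals.
from typing import List, Tuple

Interval = Tuple[float, float]  # (start, end) on a common time axis

def fault_period(sound_abnormal: List[Interval],
                 vibration_abnormal: List[Interval]) -> List[Interval]:
    """Merge overlapping abnormal intervals from the sound and vibration channels."""
    merged: List[Interval] = []
    for start, end in sorted(sound_abnormal + vibration_abnormal):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))  # extend the current span
        else:
            merged.append((start, end))                            # start a new span
    return merged

# Example from the description, expressed as minutes since midnight:
# sound abnormal 11:30-12:00, vibration abnormal 10:30-12:00 -> fault period 10:30-12:00.
print(fault_period([(690, 720)], [(630, 720)]))  # [(630, 720)]
```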
Step S30: performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
the failure feature extraction process is to extract each piece/part information recorded in the image data and having a change.
In this embodiment, after preliminarily determining when the equipment failed, the fusion diagnosis apparatus checks, within the equipment fault period, whether the position information of each component or part of the equipment has changed in the image data, and extracts the changed parts as the fault position image.
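One possible realisation of this change detection, shown only as a sketch, is simple frame differencing against a reference frame captured before the fault period; the pixel threshold and the minimum region area below are assumed values, not requirements of this embodiment.

```python
# Sketch: find image regions that changed relative to a pre-fault reference frame.
import cv2
import numpy as np

def changed_regions(reference: np.ndarray, frame: np.ndarray, thresh: int = 30):
    """Return bounding boxes (x, y, w, h) of regions that differ from the reference."""
    ref_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(ref_gray, cur_gray)                       # per-pixel change
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 50]
```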
Step S40: and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
In this embodiment, after obtaining the fault location image, the fusion diagnosis apparatus marks the device fault location in the fault location image, and then performs fault diagnosis processing on the device to be diagnosed according to the device fault location.
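Marking the equipment fault position in the fault position image could, for example, be done by drawing boxes and labels as in the sketch below; the colour and label text are illustrative choices only.

```python
# Sketch: draw the detected fault regions so a technician can see them at a glance.
import cv2
import numpy as np

def mark_fault_locations(fault_image: np.ndarray, boxes) -> np.ndarray:
    marked = fault_image.copy()
    for (x, y, w, h) in boxes:
        cv2.rectangle(marked, (x, y), (x + w, y + h), (0, 0, 255), 2)      # red box
        cv2.putText(marked, "fault", (x, max(y - 5, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)          # label above the box
    return marked
```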
For example, assume that a technician has installed a high-definition camera and a vibration sensor on the equipment to be diagnosed, and that the camera can capture images and sound at the same time. The fusion diagnosis device obtains, through the high-definition camera, the video data generated while the equipment is running, classifies its images and audio, takes the images as the image data and the audio as the sound data, and obtains the vibration signal generated during operation through the vibration sensor. The fusion diagnosis device then associates, on the basis of acquisition time, the vibration signal with the sound data collected over the same period as a characteristic data pair, preliminarily determines the period during which the equipment to be diagnosed is faulty from the sound abnormality period and the vibration abnormality period recorded in that pair, and searches the image data over the equipment fault period, that is, detects whether each component of the equipment to be diagnosed has changed in the image data. A fault position image is obtained from the positions of the changed components, the equipment fault position is marked in the fault position image, and finally fault diagnosis processing is performed on the equipment to be diagnosed according to the equipment fault position. For instance, when the acquisition time of the vibration signal is 10:00-12:00 and the acquisition time of the sound data is 11:00-12:00, the fusion diagnosis device takes the vibration signal and the sound data from 11:00 onwards as the characteristic data pair; when the sound abnormality period is 11:30-12:00 and the vibration abnormality period is 10:30-12:00, the equipment fault period can be taken as 10:30-12:00.
In this embodiment, the invention preliminarily determines when the equipment to be diagnosed fails from the sound data and the vibration signal, checks in the image data whether any equipment component has changed during that period, and builds the fault position image from the changed components for fault diagnosis, so that the location of the fault can be conveyed intuitively to technicians. Furthermore, technicians can also judge the degree of the fault of the equipment to be diagnosed from the fault position image.
Further, based on the above-described first embodiment of the fusion diagnostic method based on vibration and video data of the present invention, a second embodiment of the fusion diagnostic method based on vibration and video data of the present invention is proposed.
Referring to fig. 3, fig. 3 is a schematic diagram of the fault position image determination flow in an embodiment of the fusion diagnosis method based on vibration and video data of the present invention. It should be noted that although a logical order is shown in the flowchart, in some cases the steps of determining the fault position image may be performed in an order different from that shown here.
In this embodiment, the above step S30, performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image, includes:
step S301: determining estimated fault time of the equipment to be diagnosed according to the equipment fault period;
in this embodiment, after obtaining the equipment fault period of the equipment to be diagnosed, the fusion diagnosis apparatus may use the start time of the equipment fault period as the estimated fault time of the equipment to be diagnosed, or may use the start time of the intersection time period of the serious abnormal sound period and the serious abnormal vibration period in the equipment fault period as the estimated fault time of the equipment to be diagnosed.
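A sketch of the two options described above follows; which option is used in practice is left open by this embodiment, and estimated_fault_time is a hypothetical helper name. Interval bounds are values on the same time axis as the fault period.

```python
# Sketch: estimated fault time = start of the fault period, or start of the overlap
# between the severe-sound and severe-vibration periods when both are available.
def estimated_fault_time(fault_period, severe_sound=None, severe_vibration=None):
    if severe_sound and severe_vibration:
        start = max(severe_sound[0], severe_vibration[0])
        end = min(severe_sound[1], severe_vibration[1])
        if start < end:                # the severe periods actually overlap
            return start
    return fault_period[0]             # fall back to the start of the fault period
```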
Step S302: and carrying out fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image.
In this embodiment, after obtaining the estimated fault time of the device to be diagnosed, the fusion diagnostic apparatus performs fault feature extraction processing on the device component positions recorded in the image data from the estimated fault time onwards, so as to obtain the fault position image of the device to be diagnosed.
For example, assuming that the equipment fault period obtained by the fusion diagnosis device is 11:00-12:00, the fusion diagnosis device may take 11:00 as the estimated fault time of the equipment to be diagnosed. Alternatively, assume that the equipment fault period is 11:00-12:00, that 11:15-12:00 is a period of severely abnormal sound (for example, extreme noise), and that 11:05-12:00 is a period of severely abnormal vibration; 11:05 can then also be taken as the estimated fault time of the equipment to be diagnosed. After determining the estimated fault time, the fusion diagnosis device examines each component of the equipment to be diagnosed in the image data from that time onwards, detects the components that have changed or become abnormal, and obtains the fault position image of the equipment to be diagnosed from those components.
Optionally, referring to fig. 4, fig. 4 is a schematic diagram of the fault feature extraction flow in an embodiment of the fusion diagnosis method based on vibration and video data according to the present invention. It should be noted that although a logical order is shown in the flowchart, in some cases the steps of the fault feature extraction may be performed in an order different from that shown here.
In this embodiment, the fault feature extraction processing includes displacement detection and deformation detection, and the above step S302, performing fault feature extraction processing on the device component position in the image data according to the estimated fault moment to obtain a fault position image, includes:
step S3021: performing displacement detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component displacement picture;
in this embodiment, the fault feature extraction processing includes displacement detection and deformation detection, and after obtaining the estimated fault time of the device to be diagnosed, the fusion diagnosis device performs displacement detection on the position of the device component in the image data according to the estimated fault time, detects the displaced device component, and obtains a component displacement picture according to the displaced component.
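For illustration, displacement detection could be realised by tracking the centroid of each component's image region, as in the sketch below; the assumption that per-component regions (masks) are already known, and the pixel tolerance, are choices made only for this sketch.

```python
# Sketch: flag components whose centroid moved between the pre-fault and post-fault images.
import numpy as np

def displaced_components(rois_before: dict, rois_after: dict, tol_px: float = 3.0) -> dict:
    """rois_*: mapping of component name -> non-empty binary mask (np.ndarray)."""
    moved = {}
    for name, before in rois_before.items():
        after = rois_after[name]
        cy_b, cx_b = np.argwhere(before).mean(axis=0)   # centroid before the estimated fault time
        cy_a, cx_a = np.argwhere(after).mean(axis=0)    # centroid after the estimated fault time
        shift = float(np.hypot(cx_a - cx_b, cy_a - cy_b))
        if shift > tol_px:                              # moved more than the tolerance
            moved[name] = shift
    return moved
```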
Step S3022: performing deformation detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component deformation picture;
in this embodiment, after obtaining the estimated fault time, the fusion diagnosis device performs deformation detection on the positions of each component in the image data according to the estimated fault time, so as to detect the component that generates deformation, and further obtains a component deformation picture according to the component that generates deformation.
Step S3023: and determining a fault position image according to the component displacement image and the component deformation image.
In this embodiment, the fusion diagnostic device obtains a component displacement picture by performing displacement detection on the image data and a component deformation picture by performing deformation detection on the image data, and then integrates the displaced components recorded in the component displacement picture with the deformed components recorded in the component deformation picture to obtain the fault position image.
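A corresponding sketch of the deformation detection and the integration of the two pictures follows; the use of contour shape matching (cv2.matchShapes) and the dissimilarity threshold are assumptions for illustration, not requirements of this embodiment.

```python
# Sketch: detect deformed components and combine them with the displaced ones.
import cv2

def deformed_components(contours_before: dict, contours_after: dict, max_diff: float = 0.1) -> dict:
    """contours_*: mapping of component name -> OpenCV contour of the component outline."""
    deformed = {}
    for name, before in contours_before.items():
        diff = cv2.matchShapes(before, contours_after[name], cv2.CONTOURS_MATCH_I1, 0.0)
        if diff > max_diff:            # outline shape changed noticeably
            deformed[name] = diff
    return deformed

def fault_components(moved: dict, deformed: dict) -> set:
    """Union of displaced and deformed components: what the fault position image marks."""
    return set(moved) | set(deformed)
```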
The displacement detection and deformation detection of the equipment member may be performed by big data analysis, or may be performed by historical data analysis, or may be performed by a machine learning algorithm model.
For example, assuming that the estimated fault time is 11:00, the fusion diagnosis apparatus performs big data analysis on the device components recorded in the image data after 11:00 to detect displacement, obtains a component displacement picture from the displaced components, and performs deformation detection on the device components recorded in the image data after 11:00 through a trained convolutional neural network model to detect the deformed components and obtain a component deformation picture. The fusion diagnosis equipment then integrates the abnormal components recorded in the component displacement picture and the component deformation picture to obtain the fault position image.
Optionally, in a possible embodiment, the fault diagnosis processing includes fault range judgment and fault intervention processing, and the above step S40, performing fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position, includes:
step S401: judging the fault range of the equipment fault position in the fault position image to obtain the estimated fault degree;
in this embodiment, after obtaining the fault location image, the fusion diagnosis apparatus performs fault range determination on the equipment fault location recorded in the fault location image, so as to obtain the estimated fault degree.
Step S402: and performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree.
In this embodiment, after obtaining the estimated fault degree of the device to be diagnosed, the fusion diagnostic device performs fault intervention processing on the device to be diagnosed, so as to improve the operation safety of the device to be diagnosed.
The fusion diagnosis device judges the fault range of the equipment fault position recorded in the fault position image to obtain displacement information of the component and deformation information of the component, compares the displacement information and the deformation information with a preset threshold value to obtain estimated fault degree, and then performs fault intervention processing on the operation of the equipment to be diagnosed according to the estimated fault degree to improve the operation safety of the equipment to be diagnosed. For example, if the displacement information is 0.5cm and the preset displacement threshold is 0.2cm, the estimated fault degree can be obtained according to the displacement information and the displacement threshold.
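As a sketch of this fault range judgment, the measured displacement and deformation can be compared with preset thresholds as follows. The grading into "general" and "severe" levels mirrors the examples given in this description, while the threshold ratios and the 0.2 cm defaults are assumptions.

```python
# Sketch: grade the fault by comparing measurements against preset thresholds.
def estimated_fault_degree(displacement_cm: float, deformation_cm: float,
                           disp_threshold: float = 0.2, deform_threshold: float = 0.2) -> str:
    worst = max(displacement_cm / disp_threshold, deformation_cm / deform_threshold)
    if worst <= 1.0:
        return "normal"                       # within tolerance
    return "general" if worst < 3.0 else "severe"

# Example from the description: displacement 0.5 cm against a 0.2 cm threshold.
print(estimated_fault_degree(0.5, 0.0))       # "general" under these assumed ratios
```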
Optionally, the fault intervention processing includes fault information prompting and equipment shutdown processing, and the above step S402, performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree, includes:
step S4021: carrying out fault information prompt on the equipment to be diagnosed according to the estimated fault degree;
in this embodiment, the fault intervention processing includes fault information prompting and equipment shutdown processing, and the fusion diagnosis device prompts the fault information of the equipment to be diagnosed according to the estimated fault degree so as to prompt technicians.
Step S4022: and performing equipment shutdown processing on the equipment to be diagnosed according to the estimated fault degree.
In this embodiment, the fusion diagnostic device further performs a device shutdown process on the device to be diagnosed according to the estimated fault level, and directly shuts down the device with serious fault, so as to ensure the safety of the device.
For example, the fusion diagnostic device may perform fault prompting or device shutdown processing on the device to be diagnosed according to the estimated fault degree. For example, when the estimated fault level is general, the fusion diagnosis device may edit a prompt message of "the device has general fault, and the fault location is marked in the fault location picture", and send the prompt message to a technician; when the estimated fault degree is serious, the fusion diagnosis equipment can directly shut down the equipment with serious fault so as to ensure the safety of the equipment.
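A minimal sketch of the resulting fault intervention logic; notify and shutdown are placeholder hooks standing in for whatever prompting and shutdown mechanisms the equipment actually provides.

```python
# Sketch: prompt the technician for a general fault, stop the machine for a severe one.
def intervene(fault_degree: str, notify, shutdown) -> None:
    if fault_degree == "general":
        notify("The device has a general fault; the location is marked in the fault location picture.")
    elif fault_degree == "severe":
        shutdown()                             # stop the equipment to keep it safe
```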
In this embodiment, the fault detection accuracy of the equipment to be diagnosed can be improved by determining the estimated fault time of the equipment to be diagnosed, determining the fault position image in the image data according to the estimated fault time, and further performing deformation and displacement detection in the fault position image to judge the fault degree of the equipment. In addition, the invention can improve the efficiency of fault maintenance and the safety of equipment operation by marking the fault position in the fault position image and prompting the technician.
Further, based on the first and second embodiments of the fusion diagnostic method based on vibration and video data of the present invention described above, a third embodiment of the fusion diagnostic method based on vibration and video data of the present invention is proposed.
In the present embodiment, before the above step S30 of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image, the fusion diagnosis method further includes:
step S50: performing image slicing processing on the image data to obtain each equipment picture;
the image slicing processing cuts the dynamic image data into pictures on a frame-by-frame basis and compares the cut pictures for similarity, so that pictures with high mutual similarity can be removed.
In this embodiment, the fusion diagnostic apparatus performs image slicing processing on the image data to convert the dynamic image data into individual slice images in units of frames, then extracts the similarity between the slice images, removes the slice images with high similarity, and uses the remaining, mutually dissimilar slice images as the device pictures.
Step S60: carrying out graying treatment on the equipment picture to obtain a gray picture;
in this embodiment, after obtaining each device picture, the fusion diagnostic apparatus performs the grayscale processing on the device picture, and records the image after performing the grayscale processing as a grayscale image.
For example, assuming the image video is 30 seconds long, in order to reduce the amount of image-processing computation the fusion diagnosis device may extract one frame per second from the video to obtain 30 slice images and then compare their similarity to remove highly similar ones; for instance, if the slice images in the first 5 seconds are almost identical, only one of them is retained. Assuming that 5 slice images remain after this removal, these 5 slice images are taken as 5 device pictures, which are then grayed to obtain 5 gray pictures.
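For illustration, steps S50-S60 could be sketched as below: roughly one frame per second is kept, near-duplicate frames are dropped, and the survivors are converted to grayscale. The mean-absolute-difference similarity measure and its threshold are assumptions made only for this sketch.

```python
# Sketch: slice the video into per-second frames, drop near-duplicates, convert to grayscale.
import cv2
import numpy as np

def slice_and_gray(frames, fps: float, sim_threshold: float = 5.0):
    kept_gray = []
    step = max(int(round(fps)), 1)                 # roughly one frame per second
    for frame in frames[::step]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if kept_gray and float(np.mean(cv2.absdiff(gray, kept_gray[-1]))) < sim_threshold:
            continue                               # too similar to the previously kept picture
        kept_gray.append(gray)
    return kept_gray
```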
Based on this, the above step S30, performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image, includes:
step S303: and carrying out fault feature extraction processing on the positions of the equipment components in each gray level picture according to the equipment fault time period to obtain a fault position image.
In this embodiment, after converting the image data into each gray-scale image, the fusion diagnostic device performs fault feature extraction processing on the device component positions recorded in each gray-scale image according to the device fault period to obtain a fault position image.
For example, assuming that the image data is converted into 240 grayscale images, the fusion diagnostic apparatus performs the failure feature extraction process on the positions of the apparatus members recorded in the 240 grayscale images according to the apparatus failure period to obtain the failure position image.
Optionally, in a possible embodiment, before the above step S20 of taking the vibration signal and the sound data as a characteristic data pair, the fusion diagnosis method further includes:
step S70: filtering and denoising the vibration signal to obtain a vibration effective signal, and filtering and denoising the sound data to obtain a sound effective signal;
In this embodiment, the fusion diagnosis device performs filtering and noise reduction on the vibration signal acquired by the vibration sensor to remove the noise in the vibration signal and obtain a vibration effective signal, and performs filtering and noise reduction processing on the sound data to remove the noise in the sound data and obtain a sound effective signal.
Illustratively, the fusion diagnostic device performs filtering and noise reduction processing on the vibration signal to remove interference information in the vibration signal to obtain a vibration effective signal, and similarly, the fusion diagnostic device also performs filtering and noise reduction processing on the sound data to remove the interference information in the sound data to obtain the sound effective signal.
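A sketch of the filtering and noise reduction in step S70 using a zero-phase band-pass filter; the pass band, the filter order and the use of SciPy are assumptions chosen only for illustration.

```python
# Sketch: zero-phase Butterworth band-pass filtering of the vibration signal and sound track.
import numpy as np
from scipy.signal import butter, filtfilt

def band_pass(signal: np.ndarray, fs: float,
              low: float = 10.0, high: float = 1000.0, order: int = 4) -> np.ndarray:
    """Filter a 1-D signal sampled at fs Hz, keeping the low-high Hz band."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

# Hypothetical usage, with sampling rates assumed for the two channels:
# vibration_effective = band_pass(vibration_signal, fs=25600)
# sound_effective = band_pass(sound_track, fs=48000)
```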
Based on this, the above step S20 of taking the vibration signal and the sound data as a characteristic data pair includes:
step S201: and taking the vibration effective signal and the sound effective signal as characteristic data pairs.
In this embodiment, the fusion diagnostic device uses the vibration effective signal and the sound effective signal as the pair of characteristic data after obtaining the vibration effective signal and the sound effective signal.
For example, after the fusion diagnosis device performs the filtering noise reduction process, a vibration effective signal and a sound effective signal are obtained, and then the vibration effective signal and the sound effective signal are integrated into a characteristic data pair based on the acquisition time. For example, a vibration effective signal and a sound effective signal of 12:00 are taken as one characteristic data pair.
In this embodiment, the present invention can reduce the amount of calculation for identifying the failure member by the fusion diagnostic device by slicing and graying the image data to obtain the grayscale image. In addition, the invention improves the effectiveness of the vibration signal and the sound data in representing equipment faults by filtering and noise reduction processing of the vibration signal and the sound data.
In addition, the embodiment of the invention also provides a fusion diagnosis device based on vibration and video data.
Referring to fig. 5, the fusion diagnosis device based on vibration and video data of the present invention includes:
the signal splitting module 10 is used for acquiring vibration signals and video data when the equipment to be diagnosed is operated, and performing image and audio classification processing on the video data to obtain image data and sound data;
a failure period determining module 20 for taking the vibration signal and the sound data as a characteristic data pair, and determining a device failure period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
a fault location determining module 30, configured to perform fault feature extraction processing on a location of a device component in the image data according to the device fault period to obtain a fault location image;
and a fault diagnosis module 40, configured to mark the device fault location in the fault location image and perform fault diagnosis processing on the device to be diagnosed according to the device fault location.
Optionally, the fusion diagnosis device based on vibration and video data further comprises:
the slicing module is used for carrying out image slicing processing on the image data to obtain each equipment picture;
the graying processing module is used for graying the equipment picture to obtain a gray picture;
based on this, the fault location determining module 30 is further configured to perform fault feature extraction processing on the location of the equipment component in each of the grayscale pictures according to the equipment fault period to obtain a fault location image.
Optionally, the fault location determining module 30 includes:
the fault moment determining unit is used for determining the estimated fault moment of the equipment to be diagnosed according to the equipment fault period;
and the fault image determining unit is used for carrying out fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image.
Optionally, the fault feature extraction processing includes displacement detection and deformation detection, and the above-mentioned fault image determining unit includes:
the displacement detection subunit is used for carrying out displacement detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component displacement picture;
the deformation detection subunit is used for carrying out deformation detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component deformation picture;
and the displacement deformation integration subunit is used for determining a fault position image according to the component displacement image and the component deformation image.
Optionally, the fusion diagnosis device based on vibration and video data further comprises:
the signal processing module is used for carrying out filtering noise reduction processing on the vibration signal to obtain a vibration effective signal, and carrying out filtering noise reduction processing on the sound data to obtain a sound effective signal;
based on this, the above-described malfunction period determination module 20 is also configured to take the vibration effective signal and the sound effective signal as a pair of characteristic data.
Optionally, the fault diagnosis processing includes fault range judgment and fault intervention processing, and the fault diagnosis module 40 includes:
the fault degree determining unit is used for judging the fault range of the fault position in the fault position image to obtain the estimated fault degree;
and the fault intervention unit is used for performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree.
Optionally, the fault intervention processing includes fault information prompting and equipment shutdown processing, and the fault intervention unit includes:
the fault prompting subunit is used for prompting fault information of the equipment to be diagnosed according to the estimated fault degree;
and the equipment shutdown subunit is used for carrying out equipment shutdown processing on the equipment to be diagnosed according to the estimated fault degree.
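An illustrative (assumed) intervention policy: treat the fraction of the fault position image covered by the marked region as the estimated fault degree, prompt fault information for small values, and request equipment shutdown above a threshold. The threshold and return convention are placeholders.

```python
# Hedged sketch of fault degree estimation and the resulting intervention decision.
import numpy as np

def estimated_fault_degree(fault_position_image: np.ndarray) -> float:
    """Fraction of the image covered by the marked fault region."""
    return float(np.count_nonzero(fault_position_image)) / fault_position_image.size

def fault_intervention(fault_position_image: np.ndarray, shutdown_ratio: float = 0.2):
    degree = estimated_fault_degree(fault_position_image)
    if degree >= shutdown_ratio:
        return "shutdown", degree   # request equipment shutdown processing
    if degree > 0:
        return "prompt", degree     # prompt fault information to the operator
    return "normal", degree
```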
The functional implementation of each module in the fusion diagnosis device based on vibration and video data corresponds to the steps in the embodiments of the fusion diagnosis method based on vibration and video data, and the functions and implementation processes of the modules are therefore not described again in detail herein.
In addition, the invention also provides a terminal device, which comprises: a memory, a processor, and a fusion diagnostic program based on vibration and video data that is stored in the memory and executable on the processor, wherein the fusion diagnostic program based on vibration and video data, when executed by the processor, implements the steps of the fusion diagnostic method based on vibration and video data described above.
The specific embodiment of the terminal device of the present invention is substantially the same as the embodiments of the fusion diagnosis method based on vibration and video data, and will not be described herein.
Furthermore, the present invention proposes a storage medium having stored thereon a fusion diagnostic program based on vibration and video data, which when executed by a processor, implements the steps of the fusion diagnostic method based on vibration and video data of the present invention as described above.
The specific embodiment of the storage medium of the present invention is substantially the same as the above embodiments of the fusion diagnosis method based on vibration and video data, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing embodiment numbers of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and may of course also be implemented by hardware, but in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disk) and comprising several instructions for causing a terminal device (which may be a vehicle-mounted computer, a smart phone, a computer, a server, or the like) to perform the methods described in the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the claims; any equivalent structure or equivalent process made using the description and drawings of the present application, or any direct or indirect application thereof in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (10)

1. A fusion diagnostic method based on vibration and video data, the fusion diagnostic method based on vibration and video data comprising:
obtaining vibration signals and video data when equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
taking the vibration signal and the sound data as a characteristic data pair, and determining a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
and marking the equipment fault position in the fault position image, and carrying out fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position.
2. The fusion diagnostic method based on vibration and video data according to claim 1, wherein, before the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image, the fusion diagnostic method further comprises:
performing image slicing processing on the image data to obtain each equipment picture;
carrying out graying processing on the equipment pictures to obtain grayscale pictures;
the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image comprises:
performing fault feature extraction processing on the position of the equipment component in each grayscale picture according to the equipment fault period to obtain the fault position image.
3. The fusion diagnostic method based on vibration and video data according to claim 1, wherein the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image comprises:
determining estimated fault time of the equipment to be diagnosed according to the equipment fault period;
and carrying out fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image.
4. A fusion diagnostic method based on vibration and video data according to claim 3, wherein the fault feature extraction processing comprises displacement detection and deformation detection, and the step of performing fault feature extraction processing on the position of the equipment component in the image data according to the estimated fault moment to obtain a fault position image comprises:
performing displacement detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component displacement picture;
performing deformation detection on the position of the equipment component in the image data according to the estimated fault moment to obtain a component deformation picture;
and determining the fault position image according to the component displacement picture and the component deformation picture.
5. The fusion diagnostic method based on vibration and video data according to claim 1, wherein before the step of taking the vibration signal and the sound data as a characteristic data pair, the fusion diagnostic method further comprises:
filtering and denoising the vibration signal to obtain a vibration effective signal, and filtering and denoising the sound data to obtain a sound effective signal;
the step of using the vibration signal and the sound data as a characteristic data pair includes:
and taking the vibration effective signal and the sound effective signal as the characteristic data pair.
6. The fusion diagnostic method based on vibration and video data according to claim 1, wherein the fault diagnosis processing comprises fault range determination and fault intervention, and the step of performing fault diagnosis processing on the equipment to be diagnosed according to the equipment fault position comprises:
judging the fault range of the equipment fault position in the fault position image to obtain an estimated fault degree;
and performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree.
7. The fusion diagnostic method based on vibration and video data according to claim 6, wherein the fault intervention processing comprises fault information prompting and equipment shutdown processing, and the step of performing fault intervention processing on the equipment to be diagnosed according to the estimated fault degree comprises:
carrying out fault information prompt on the equipment to be diagnosed according to the estimated fault degree; or,
and performing equipment shutdown processing on the equipment to be diagnosed according to the estimated fault degree.
8. A fusion diagnostic device based on vibration and video data, the fusion diagnostic device based on vibration and video data comprising:
the signal splitting module is used for acquiring vibration signals and video data when the equipment to be diagnosed operates, and performing image and audio classification processing on the video data to obtain image data and sound data;
a fault period determining module, configured to take the vibration signal and the sound data as a characteristic data pair, and determine a device fault period according to a sound abnormality period and a vibration abnormality period in the characteristic data pair;
The fault position determining module is used for extracting fault characteristics of the position of the equipment component in the image data according to the equipment fault period to obtain a fault position image;
the fault diagnosis module is used for marking the fault position of the equipment in the fault position image and carrying out fault diagnosis processing on the equipment to be diagnosed according to the fault position of the equipment.
9. A terminal device, characterized in that the terminal device comprises: a memory, a processor, and a vibration and video data based fusion diagnostic program stored on the memory and executable on the processor, wherein the fusion diagnostic program, when executed by the processor, implements the steps of the vibration and video data based fusion diagnostic method according to any one of claims 1 to 7.
10. A storage medium having stored thereon a vibration and video data based fusion diagnostic program which when executed by a processor implements the steps of the vibration and video data based fusion diagnostic method of any one of claims 1 to 7.
CN202310799872.8A 2023-07-03 2023-07-03 Fusion diagnosis method, device, equipment and medium based on vibration and video data Pending CN116522283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310799872.8A CN116522283A (en) 2023-07-03 2023-07-03 Fusion diagnosis method, device, equipment and medium based on vibration and video data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310799872.8A CN116522283A (en) 2023-07-03 2023-07-03 Fusion diagnosis method, device, equipment and medium based on vibration and video data

Publications (1)

Publication Number Publication Date
CN116522283A true CN116522283A (en) 2023-08-01

Family

ID=87403313

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310799872.8A Pending CN116522283A (en) 2023-07-03 2023-07-03 Fusion diagnosis method, device, equipment and medium based on vibration and video data

Country Status (1)

Country Link
CN (1) CN116522283A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110006672A (en) * 2019-04-09 2019-07-12 唐山百川智能机器股份有限公司 Rail vehicle fault monitoring method based on acoustic imaging technology
CN112082781A (en) * 2019-06-12 2020-12-15 比亚迪股份有限公司 Vehicle and fault detection method and fault detection device thereof
CN112991297A (en) * 2021-03-15 2021-06-18 广东电网有限责任公司清远供电局 Power transmission line fault diagnosis method and system, electronic equipment and storage medium
CN114236309A (en) * 2021-11-30 2022-03-25 国网北京市电力公司 Power transmission and transformation fault determination method
CN114414963A (en) * 2022-01-20 2022-04-29 西安交通大学 Acoustic imaging positioning system and method for intelligent monitoring of substation domain faults

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117422888A (en) * 2023-09-13 2024-01-19 长龙(杭州)航空维修工程有限公司 Aircraft performance evaluation method and system
CN117422888B (en) * 2023-09-13 2024-05-10 长龙(杭州)航空维修工程有限公司 Aircraft performance evaluation method and system

Similar Documents

Publication Publication Date Title
CN110031088B (en) Electronic equipment fault detection method, device, equipment and range hood
JP4272249B1 (en) Worker fatigue management apparatus, method, and computer program
CN116522283A (en) Fusion diagnosis method, device, equipment and medium based on vibration and video data
WO2022062775A1 (en) Monitoring processing method and system based on vehicle terminal system, and related device
JP5434448B2 (en) Vehicle failure detection device, electronic control unit, vehicle failure detection method
EP2840464A1 (en) Apparatus and method for enhancing system usability
US20180315131A1 (en) User-aware interview engine
CN113239913A (en) Noise source positioning method, device and system based on sound and image
CN114882509A (en) Media data adjusting method, device, equipment and medium
US20240046443A1 (en) Analysis apparatus, analysis system, analysis program, and analysis method
JP2019154613A (en) Drowsiness detection system, drowsiness detection data generation system, drowsiness detection method, computer program, and detection data
JP4840978B2 (en) IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
JP7375692B2 (en) Information processing device, information processing method, and information processing system
CN111783030B (en) Haptic experience assessment method, device and storage medium
US11763561B2 (en) Information processing apparatus, information processing method, and program
CN113593570B (en) Voice abnormality diagnosis method, device, equipment and storage medium
CN116012954A (en) Behavior recognition method, behavior recognition device, electronic equipment and computer readable storage medium
CN114155567A (en) Target detection method and device, storage medium and electronic equipment
CN116096301A (en) Program, information processing apparatus, and information processing method
US20210312654A1 (en) Computer system, object situation diagnostic method and program
CN105204638B (en) A kind of electronic equipment and information processing method
CN118072472B (en) Early warning method, device, equipment and storage medium for fall detection
TWI778428B (en) Method, device and system for detecting memory installation state
WO2022162817A1 (en) Failure diagnosis system, failure diagnosis method, and program
CN117115887A (en) Face recognition method and device under specific behaviors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230801