WO2020162425A1 - Analysis device, analysis method, and program - Google Patents

Analysis device, analysis method, and program

Info

Publication number
WO2020162425A1
Authority
WO
WIPO (PCT)
Prior art keywords
discriminator
analysis
production facility
data
detection result
Prior art date
Application number
PCT/JP2020/004042
Other languages
English (en)
Japanese (ja)
Inventor
康晴 大西
靖行 福田
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社
Priority to JP2020571198A (patent JP7188463B2/ja)
Priority to CN202080012631.9A (patent CN113383216A/zh)
Priority to KR1020217024139A (patent KR20210107844A/ko)
Publication of WO2020162425A1 (fr)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01H MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H1/00 Measuring characteristics of vibrations in solids by using direct conduction to the detector
    • G01H1/12 Measuring characteristics of vibrations in solids by using direct conduction to the detector of longitudinal or not specified vibrations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01H MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H17/00 Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves, not provided for in the preceding groups
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M99/00 Subject matter not provided for in other groups of this subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Definitions

  • the present invention relates to an analysis device, an analysis method, and a program, and more particularly to an analysis device, an analysis method, and a program for analyzing data of a sensor that monitors the state of equipment.
  • In recent years, vibration and acoustic sensors have attracted attention for manufacturing quality control of production materials processed by mechanical equipment. For example, by acquiring the vibration data generated by a processing machine while it processes production materials and stopping the processing when abnormal vibrations are detected, production loss can be avoided. Monitoring the operating status of production equipment in the form of vibration in this way is attracting attention as a technology that improves the production efficiency of the manufacturing industry, makes maintenance more efficient, and helps find optimal operating conditions for extending the life of the equipment.
  • Patent Document 1 discloses a method in which a sensor is attached to a facility to be monitored and the facility is monitored based on time series data measured by the sensor.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique for efficiently and accurately monitoring the state of production equipment.
  • the first aspect relates to an analysis device.
  • The first analysis device includes: image processing means for imaging the detection result of a vibration sensor provided in a production facility; generating means for generating a discriminator by subjecting the imaged data to machine learning processing; and analysis means for performing state analysis processing of the production facility using the discriminator.
  • The second analysis device includes: determination means for performing abnormality determination processing of the production facility using a first discriminator based on the detection result of a vibration sensor provided in the production facility; image processing means for imaging the detection result that the first discriminator could not discriminate as normal or abnormal; generating means for generating a second discriminator by subjecting the imaged data to machine learning processing; and analysis means for performing state analysis processing of the production facility using the second discriminator.
  • The second aspect relates to an analysis method executed by at least one computer.
  • In the first analysis method according to the second aspect, the analysis device images the detection result of a vibration sensor provided in a production facility, generates a discriminator by subjecting the imaged data to machine learning processing, and performs state analysis processing of the production facility using the discriminator.
  • In the second analysis method according to the second aspect, the analysis device performs abnormality determination processing of the production facility using a first discriminator based on the detection result of a vibration sensor provided in the production facility, images the detection result that the first discriminator could not discriminate as normal or abnormal, generates a second discriminator by subjecting the imaged data to machine learning processing, and performs state analysis processing of the production facility using the second discriminator.
  • Another aspect of the present invention may be a program that causes at least one computer to execute the method of the second aspect, or a computer-readable recording medium on which such a program is recorded.
  • the recording medium includes a non-transitory tangible medium.
  • the computer program includes a computer program code that, when executed by a computer, causes the computer to perform the analysis method on an analysis device.
  • The various constituent elements of the present invention do not necessarily have to exist independently of one another: a plurality of constituent elements may be formed as a single member, one constituent element may be formed from a plurality of members, a certain constituent element may be a part of another constituent element, a part of one constituent element may overlap a part of another constituent element, and so on.
  • the order of description does not limit the order in which the plurality of procedures are executed. Therefore, when carrying out the method and computer program of the present invention, the order of the plurality of procedures can be changed within a range that does not hinder the contents.
  • the plurality of procedures of the method and computer program of the present invention are not limited to being executed at different timings. For this reason, another procedure may occur during the execution of a certain procedure, the execution timing of a certain procedure and the execution timing of another procedure may partially or entirely overlap, and the like.
  • FIG. 5 is a diagram showing time-series data of measurement data before imaging. FIG. 6 is a diagram showing image data imaged by the image processing unit. FIG. 7 is a flowchart showing an example of the operation of the analysis device.
  • FIG. 17 is a flowchart showing an example of a discriminator update processing procedure using the data extracted in step S405 of FIG. 16. FIG. 18 is a flowchart showing an example of a processing procedure when it is determined to be normal in step S401 of the state analysis processing of FIG. 16. FIG. 19 is a flowchart showing an example of the procedure of the defect model construction processing of the analysis device.
  • FIG. 20 is a flowchart for explaining the analysis device of the first embodiment.
  • FIG. 21 is a flowchart for explaining the analysis device of the second embodiment.
  • FIG. 1 is a diagram conceptually showing a system configuration of an equipment monitoring system 1 using an analysis device according to an embodiment of the present invention.
  • the equipment monitored by the equipment monitoring system 1 is the production equipment 10, and in the present embodiment, a belt conveyor will be described as an example.
  • a plurality of sensors 12 for monitoring the belt conveyor are installed at a plurality of locations along the moving direction of the belt conveyor.
  • Each sensor 12 is, for example, a vibration sensor.
  • When a vibration sensor that detects vibration in only one direction is used, a plurality of vibration sensors may be installed at one location in order to detect vibration in multiple directions.
  • the measurement data output from the vibration sensor is time series data indicating a vibration waveform.
  • the vibration sensor measures the vibration generated in the production facility 10 to be monitored.
  • the vibration sensor may be a uniaxial acceleration sensor that measures acceleration in the uniaxial direction, a triaxial acceleration sensor that measures acceleration in the triaxial directions, or any other type.
  • the plurality of vibration sensors may be the same kind of vibration sensor, or a plurality of kinds of vibration sensors may be mixed.
  • the analysis device 100 is connected to the GW (GateWay) 5 via the network 3 and receives detection results from the plurality of sensors 12 provided in the production facility 10.
  • the analysis device 100 is connected to the storage device 20.
  • the storage device 20 stores vibration data analyzed by the analysis device 100.
  • the storage device 20 may be a device separate from the analysis device 100, a device included in the analysis device 100, or a combination thereof.
  • FIG. 2 is a diagram showing an example of a data structure of the vibration data 22 and the facility information 24 stored in the storage device 20 of this embodiment.
  • the vibration data 22 is associated with time information and measurement data for each sensor ID that identifies the vibration sensor.
  • the facility information 24 is associated with the sensor ID of at least one vibration sensor for each facility ID that identifies the facility.
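  • As a rough illustration only, the two tables described above could be represented in Python as follows; the class and field names are hypothetical and simply mirror the sensor ID/time/measurement and facility ID/sensor ID associations of the vibration data 22 and the facility information 24.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VibrationRecord:                 # one entry of the vibration data 22
        sensor_id: str                     # sensor ID identifying the vibration sensor
        timestamp: float                   # time information
        measurement: List[float]           # measured vibration waveform samples

    @dataclass
    class FacilityRecord:                  # one entry of the facility information 24
        facility_id: str                   # facility ID identifying the production facility
        sensor_ids: List[str] = field(default_factory=list)   # vibration sensors attached to the facility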
  • FIG. 3 is a block diagram illustrating the hardware configuration of each device of this embodiment.
  • Each device has a processor 50, a memory 52, an input/output interface (I/F) 54, a peripheral circuit 56, and a bus 58.
  • the peripheral circuit 56 includes various modules.
  • the processing device may not have the peripheral circuit 56.
  • the bus 58 is a data transmission path for the processor 50, the memory 52, the peripheral circuit 56, and the input/output interface 54 to mutually transmit data.
  • the processor 50 is an arithmetic processing device such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
  • the memory 52 is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory).
  • the input/output interface 54 includes an interface for acquiring information from an input device, an external device, an external server, a sensor, etc., an interface for outputting information to an output device, an external device, an external server, etc.
  • the input device is, for example, a keyboard, a mouse, a microphone, or the like.
  • the output device is, for example, a display, a speaker, a printer, a mailer, or the like.
  • the processor 50 can issue a command to each module and perform a calculation based on the calculation result.
  • Each component of the analysis device 100 of this embodiment, shown later in FIG. 4, is realized by an arbitrary combination of the hardware and software of the computer shown in FIG. 3. It will be understood by those skilled in the art that there are various modified examples of the realizing method and the apparatus.
  • the functional block diagram showing the analysis device of each embodiment described below shows blocks of logical functional units, not of hardware units.
  • By executing this computer program, each function of each unit of the analysis device 100 in FIG. 4 can be realized.
  • The computer program of the present embodiment is written so as to cause a computer (the processor 50 in FIG. 3) that realizes the analysis device 100 to execute a procedure for imaging the detection result of the vibration sensor 12 provided in the production facility 10, a procedure for generating a discriminator by subjecting the imaged data to machine learning processing, and a procedure for performing abnormality determination processing of the production facility 10 using the discriminator.
  • the computer program of this embodiment may be recorded in a computer-readable recording medium.
  • the recording medium is not particularly limited, and various forms are conceivable.
  • the program may be loaded from the recording medium into the memory 52 (FIG. 3) of the computer, or may be downloaded to the computer through the network and loaded into the memory 52.
  • a recording medium for recording a computer program includes a non-transitory tangible computer-usable medium, and a computer-readable program code is embedded in the medium.
  • When the computer program is executed on the computer, it causes the computer to execute the analysis method of the present embodiment that realizes the analysis device 100.
  • FIG. 4 is a functional block diagram showing a logical configuration of the analysis device 100 of this embodiment.
  • the analysis device 100 includes an image processing unit 102, a generation unit 104, and an analysis unit 106.
  • the image processing unit 102 visualizes the detection result of the vibration sensor provided in the production facility 10.
  • the generation unit 104 generates the discriminator 110 by using the imaged data as the target of machine learning processing.
  • the analysis unit 106 uses the discriminator 110 to perform a state analysis process of the production facility 10.
  • Because the measurement data of the vibration sensor is time-series data containing a plurality of vibration waveforms expressed by a plurality of parameters, its characteristics cannot be captured by machine learning as it is, and an appropriate learning model cannot be generated. Therefore, in the present embodiment, the measurement data is imaged by the image processing unit 102.
  • FIG. 5 is a diagram showing time-series data of measurement data before imaging.
  • FIG. 6 is a diagram showing image data imaged by the image processing unit 102.
  • The image processing unit 102 acquires the detection result of the sensor 12 of the production facility 10 (hereinafter also referred to as vibration data). The image processing unit 102 then applies FFT processing and frequency division to the acquired vibration data and images the resulting vibration spectrum data to obtain image data. These processes compress the data volume, which speeds up both the machine learning processing and the discrimination processing.
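  • A minimal sketch of this imaging step is shown below, assuming NumPy; the function name, frame length, and the choice of a simple magnitude spectrogram as the "image" are illustrative assumptions, not the specific processing of the image processing unit 102.

    import numpy as np

    def image_vibration(waveform, frame=1024, hop=512):
        """Frame the waveform, FFT each frame, and stack the magnitude spectra
        into a 2-D array that can be handled as an image."""
        frames = [waveform[i:i + frame]
                  for i in range(0, len(waveform) - frame + 1, hop)]
        window = np.hanning(frame)
        spectra = [np.abs(np.fft.rfft(f * window)) for f in frames]
        img = np.stack(spectra, axis=0)        # shape: (time frames, frequency bins)
        return img / (img.max() + 1e-12)       # normalized "image" of the vibration spectrum

    The resulting 2-D array is far smaller than the raw time series, which is consistent with the data-compression effect described above.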
  • Here, "acquisition" includes at least one of the device itself fetching data or information stored in another device or a storage medium (active acquisition), and the device itself receiving data or information output to it from another device (passive acquisition). Examples of active acquisition include requesting or querying another device and receiving the reply, and accessing another device or a storage medium and reading from it. An example of passive acquisition is receiving distributed (or transmitted, push-notified, etc.) information. Furthermore, "acquisition" may mean selecting and acquiring from received data or information, or selecting and receiving distributed data or information.
  • the discriminator 110 is generated by the generation unit 104 using the imaged data as the target of machine learning processing.
  • the analysis unit 106 uses the discriminator 110 to perform a state analysis process of the production facility 10.
  • the state of the production facility 10 determined by the discriminator 110 is, for example, normal or any other state.
  • states other than normal are also referred to as “Unknown”.
  • the discriminator 110 does not discriminate an abnormal state.
  • the machine learning process of the discriminator 110 will be described in detail in an embodiment described later.
  • the discrimination result of the discriminator 110 may be output so that the operator can refer to it.
  • The discrimination result may be displayed on the display of the analysis device 100, printed out from the analysis device 100 on a printer, or transmitted to another device (for example, an operation terminal) via a communication line.
  • FIG. 7 is a flowchart showing an example of the operation of the analysis device 100 of this embodiment.
  • the image processing unit 102 visualizes the detection result of the vibration sensor provided in the production facility 10 (step S101).
  • the generation unit 104 generates the discriminator 110 by using the imaged data as the target of the machine learning process (step S103).
  • the analysis unit 106 uses the discriminator 110 to perform a state analysis process of the production facility 10 (step S105).
  • FIG. 8 is a flowchart showing an example of a detailed flow of the imaging process of step S101 of FIG.
  • the image processing unit 102 subjects the measurement data of the sensor 12 to FFT processing and frequency division (step S113), images the obtained vibration spectrum data, and outputs the image data (step S115).
  • the image data obtained in step S115 is subjected to state analysis processing of the production facility 10 using the discriminator 110 by the analysis unit 106 in step S103 of FIG.
  • As described above, in this embodiment, the generation unit 104 generates the discriminator 110 by subjecting the imaged image data to machine learning processing.
  • The analysis unit 106 uses the discriminator 110 to perform state analysis processing of the production facility 10.
  • Because imaging the vibration waveform data that is the target of the machine learning processing compresses the amount of data, the processing load is reduced and the processing speed can be increased.
  • FIG. 9 is a functional block diagram showing a logical configuration of the analysis device 100 of this embodiment.
  • the analysis device 100 includes an image processing unit 102 similar to the analysis device 100 of FIG. 4, a generation unit 104, and an analysis unit 106, and further includes an abnormality determination unit 120.
  • the abnormality determination unit 120 uses the first discriminator 124 to perform abnormality determination processing of the production facility 10 based on the detection result of the vibration sensor provided in the production facility 10.
  • the abnormality determination unit 120 outputs the result of the abnormality determination process using the first discriminator 124.
  • the output result may be used for the monitoring process of the production facility 10 in the facility monitoring system 1.
  • The image processing unit 102 of this embodiment differs from the image processing unit 102 of FIG. 4 in that the detection result to be imaged is a detection result that the first discriminator 124 could not discriminate as normal or abnormal.
  • the generation unit 104 generates the second discriminator 126 for performing the machine learning process on the measurement data which cannot be discriminated as normal or abnormal by the threshold analysis.
  • the vibration measured by the vibration sensor contains multiple vibration waveforms consisting of multiple factors.
  • In general, frequency analysis is performed on the measurement data detected by a vibration sensor by applying a Fast Fourier Transform (FFT) process.
  • A characteristic frequency (peak) is detected, and an abnormality can be diagnosed by using a threshold value to determine whether the detected peak level is normal or abnormal.
  • the generation unit 104 generates the second discriminator 126 by using the measurement data, which cannot be discriminated as normal or abnormal by the threshold analysis, as the target of the machine learning process.
  • the second discriminator 126 corresponds to the discriminator 110 in FIG.
  • The case where the first discriminator 124 cannot discriminate means, for example, the case where the vibration waveform pattern registered in the first discriminant model 128 and the detection result do not match with a likelihood equal to or higher than a predetermined value, that is, where a reliability exceeding the standard required for equipment diagnosis (for example, a detection rate of 90% or more) is not obtained.
  • the analysis unit 106 uses the second discriminator 126 to perform a state analysis process of the production facility 10.
  • the analysis unit 106 uses the second discriminator 126 to analyze the state of the production facility 10 with respect to the data obtained by imaging the detection result in which the first discriminator 124 cannot discriminate between normal and abnormal.
  • the first discriminator 124 uses the first discriminant model 128 to discriminate between normal and abnormal.
  • the first discriminant model 128 is, for example, a model using pattern matching of vibration spectrum or threshold determination.
  • For example, the first discriminator 124 calculates, from the frequency distribution obtained by applying FFT processing to the measurement data of the vibration sensor, at least one of the peak level of a specific frequency, the ratio of the maximum to average peak levels, the S/N ratio (Signal-to-Noise ratio), and the integrated value of the peak levels within a range of specific frequencies, sets a threshold value for the calculated value, and performs the abnormality determination processing based on whether the value falls within the threshold range. If the value is within the threshold range, the state is determined to be normal; if it is outside the threshold range, it is determined to be abnormal.
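  • The sketch below illustrates, under assumed feature definitions and threshold values, how such quantities (peak level at a specific frequency, maximum-to-average peak ratio, S/N ratio, integrated level over a frequency range) could be computed from an FFT magnitude spectrum and checked against thresholds.

    import numpy as np

    def spectrum(waveform, fs):
        """Return the frequency axis and FFT magnitude spectrum of one measurement."""
        mag = np.abs(np.fft.rfft(waveform))
        freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
        return freqs, mag

    def features(freqs, mag, f0, band):
        """Compute the kinds of values mentioned above for one specific frequency f0
        and one frequency band (lo, hi)."""
        peak = mag[np.argmin(np.abs(freqs - f0))]             # peak level of the specific frequency
        ratio = mag.max() / (mag.mean() + 1e-12)              # ratio of maximum to average peak level
        snr = peak / (np.median(mag) + 1e-12)                 # rough S/N ratio (noise floor ~ median)
        lo, hi = band
        integrated = mag[(freqs >= lo) & (freqs <= hi)].sum() # integrated level over the band
        return {"peak": peak, "ratio": ratio, "snr": snr, "integrated": integrated}

    def within_thresholds(feats, thresholds):
        """thresholds maps a feature name to a (low, high) range; normal only if all are in range."""
        return all(lo <= feats[name] <= hi for name, (lo, hi) in thresholds.items())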
  • The first discriminator 124 may perform the above-described determination processing using a plurality of specific frequencies. In that case, the first discriminator 124 may output a determination result for each specific frequency. The first discriminator 124 may determine that there is an abnormality when even one of the plurality of specific frequencies is outside its threshold, may determine an abnormality when a predetermined number or more of the specific frequencies have values outside their thresholds, may determine an abnormality only when all of the specific frequencies are outside their thresholds, or may determine the state to be normal when all of the specific frequencies are within their threshold ranges.
  • Furthermore, a combination of values at a plurality of specific frequencies may be modeled for each fault event and registered in the first discriminant model 128, and the first discriminator 124 may determine the fault event by pattern matching against the first discriminant model 128.
  • the abnormality determination unit 120 may notify the operator of the corresponding malfunction event item.
  • The threshold value may be set automatically or may be set manually by an operator.
  • For example, for the vibration characteristic pattern (frequency distribution) registered in the first discriminant model 128, the generation unit 104 may calculate at least one of the peak level of the specific frequency, the ratio of the maximum to average peak levels, the S/N ratio (Signal-to-Noise ratio), and the integrated value of the peak levels within the range of the specific frequency, detect the boundary range between the normal and abnormal states, and set the threshold value accordingly.
  • Alternatively, the generation unit 104 may present to the operator at least one of the peak level of the specific frequency, the ratio of the maximum to average peak levels, the S/N ratio (Signal-to-Noise ratio), and the integrated value of the peak levels within the range of the specific frequency for the vibration characteristic pattern (frequency distribution) registered in the first discriminant model 128, and accept threshold values for the first discriminator 124 set by the operator.
  • The determination result may be displayed on the display of the analysis device 100, printed out from the analysis device 100 on a printer, or transmitted to another device (for example, an operation terminal) via a communication line.
  • The second discriminator 126 corresponds to the discriminator 110 generated by the generation unit 104 in the above embodiment.
  • the generator 104 machine-learns only the data discriminated as normal by the second discriminator 126, normalizes the normal state, and updates the second discriminant model 130.
  • the normalization is to model the pattern of the imaging data of the measurement data of the vibration sensor at the normal time.
  • The second discriminator 126 extracts data outside the range of the second discriminant model 130 as, for example, "Unknown". That is, for a detection result that the first discriminator 124 could not discriminate, the analysis unit 106 uses the second discriminator 126 to further determine, based on the second discriminant model 130, whether the state of the production facility 10 is normal or "Unknown".
  • the second discriminator 126 does not discriminate an abnormal state of the production facility 10, and discriminates a state other than normal as “Unknown”.
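  • One possible realization of a discriminator that learns only from normal images and reports everything outside the learned range as "Unknown" is a simple statistics-based outlier check, sketched below; this is an illustrative stand-in, not the actual second discriminant model 130.

    import numpy as np

    class NormalOnlyDiscriminator:
        """Learns per-pixel mean and spread of normal images; inputs far from that
        distribution are reported as 'Unknown' rather than 'abnormal'."""

        def __init__(self, k=4.0):
            self.k = k            # how many standard deviations still count as normal
            self.mean = None
            self.std = None

        def fit(self, normal_images):
            flat = np.stack([img.ravel() for img in normal_images])
            self.mean = flat.mean(axis=0)
            self.std = flat.std(axis=0) + 1e-12

        def discriminate(self, image):
            z = np.abs(image.ravel() - self.mean) / self.std
            return "normal" if z.mean() <= self.k else "Unknown"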
  • the data discriminated as “Unknown” by the discriminator 110 may be referred to and analyzed by the operator.
  • the result analyzed by the operator may be reflected in the threshold value of the abnormality determination process of the detection result.
  • a method of using the data determined as “Unknown” will be described in detail in an embodiment described later.
  • FIG. 10 is a flowchart showing an example of the operation of the analysis device 100 of this embodiment.
  • the abnormality determination unit 120 uses the first discriminator 124 to perform abnormality determination processing of the production facility 10 based on the detection result of the vibration sensor provided in the production facility 10 (step S121).
  • The image processing unit 102 images the detection result that the first discriminator 124 could not discriminate as normal or abnormal (NO in step S123) (step S125). If the result can be discriminated (YES in step S123), this processing ends.
  • the generation unit 104 generates the second discriminator 126 by using the imaged data as the target of the machine learning process (step S127).
  • the analysis unit 106 uses the second discriminator 126 to perform a state analysis process of the production facility 10 (step S129).
  • The computer program of the present embodiment is written so as to cause a computer (the processor 50 in FIG. 3) that realizes the analysis device 100 to execute a procedure for performing abnormality determination processing of the production facility 10 using the first discriminator 124 based on the detection result of the vibration sensor 12 provided in the production facility 10, a procedure for imaging the detection result that the first discriminator 124 could not discriminate as normal or abnormal, a procedure for generating the second discriminator 126 by subjecting the imaged data to machine learning processing, and a procedure for performing state analysis processing of the production facility 10 using the second discriminator 126.
  • As described above, in this embodiment, when the abnormality determination unit 120 cannot obtain a determination result in the abnormality determination processing using the first discriminator 124 based on the detection result of the vibration sensor, the detection result of the vibration sensor at that time is imaged by the image processing unit 102, and the second discriminator 126 discriminates whether the state is normal or "Unknown".
  • The second discriminator 126 performs machine learning only on the detection results of the vibration sensor that correspond to the normal state, thereby normalizing the normal state. Even in cases where the manufacturing conditions are constantly changing, such as in small-lot, large-variety or variable production, and it is difficult to accumulate information on failure events, the accuracy of the determination can therefore be improved.
  • the amount of data can be compressed by imaging the vibration waveform data that is the target of the machine-learning process, so that the processing load is reduced. At the same time, the processing speed can be increased.
  • FIG. 11 is a flowchart showing an example of a procedure of abnormality determination processing in the analysis device 100 of this embodiment.
  • the present embodiment is the same as the analysis apparatus 100 in FIG. 9 except that it has a configuration for outputting the result of the abnormality determination processing of the production facility 10 using the first discriminator 124.
  • The abnormality determination unit 120 acquires vibration data from the vibration sensor provided in the production facility 10 (step S301). The abnormality determination unit 120 then executes abnormality determination processing on the acquired vibration data using the first discriminator 124 (step S303). Here, after the vibration data is subjected to FFT processing, the first discriminator 124 determines, using the first discriminant model 128, whether the vibration data is normal or abnormal (step S305).
  • Specifically, for at least one of the peak level of the specific frequency, the ratio of the maximum to average peak levels, the S/N ratio (Signal-to-Noise ratio), and the integrated value of the peak levels within the range of the specific frequency, the abnormality determination unit 120 uses the threshold value to determine whether the value is normal or abnormal (step S305). If the value is within the threshold range (NO in step S305), the first discriminator 124 determines that the production facility 10 is in a normal state (step S307), and the abnormality determination unit 120 outputs a determination result indicating that the production facility 10 is normal (step S311).
  • If the value is outside the threshold range (YES in step S305), the first discriminator 124 determines that the production facility 10 is in an abnormal state (step S309), and a determination result indicating that the production facility 10 is in an abnormal state is output (step S311).
  • If the first discriminator 124 cannot determine whether the state is normal or abnormal (determination not possible in step S305), the abnormality determination unit 120 outputs the vibration data to the image processing unit 102 (step S313), and the processing proceeds to step S125 of FIG. 10.
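  • Putting steps S301 to S313 together, the three-way branch (normal, abnormal, or undetermined, with undetermined data handed to the image processing unit) might be organized as in the following sketch; treating values near a threshold boundary as "undetermined" is an assumption made only for illustration.

    def abnormality_determination(feats, thresholds, margin=0.1):
        """Return 'normal', 'abnormal', or 'undetermined' for one set of feature values.
        Values near a threshold boundary (within `margin` of the range width) are
        treated as undetermined and would be passed to the image processing unit
        (step S313) for the second discriminator."""
        verdicts = []
        for name, (lo, hi) in thresholds.items():
            value, width = feats[name], hi - lo
            if lo + margin * width <= value <= hi - margin * width:
                verdicts.append("normal")
            elif value < lo - margin * width or value > hi + margin * width:
                verdicts.append("abnormal")
            else:
                verdicts.append("undetermined")
        if "undetermined" in verdicts:
            return "undetermined"
        return "abnormal" if "abnormal" in verdicts else "normal"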
  • the first discriminator 124 is used for the abnormality determination processing of the production facility 10, and the second discriminator 126 is not used for the abnormality determination processing.
  • the discrimination result of the second discriminator 126 is used to update the second discriminant model 130 of the first discriminator 124, and this configuration will be described in an embodiment described later.
  • As described above, in this embodiment, the abnormality determination unit 120 performs the abnormality determination processing of the production facility 10 using the first discriminator 124, and the second discriminator 126 is not used for the abnormality determination processing of the production facility 10.
  • the first discriminator 124 can be updated using the discrimination result of the second discriminator 126 updated by the generation unit 104. According to this configuration, since the abnormality determination process can be performed using the first discriminator 124 that reflects the discrimination result of the second discriminator 126, the accuracy of the determination result can be improved.
  • FIG. 12 is a functional block diagram showing a logical configuration of the image processing unit 102 of the analysis device 100 of this embodiment.
  • the analysis apparatus 100 of the present embodiment is the same as the above-described embodiment except that the image processing unit 102 has a configuration of performing noise removal from measurement data and then performing imaging processing.
  • the configuration of this embodiment may be combined with the configuration of any other embodiment.
  • the image processing unit 102 includes a noise removal unit 112 and an imaging processing unit 114.
  • the noise removal unit 112 performs noise removal processing on the detection result.
  • the imaging processing unit 114 images the detection result after the noise removal processing by the noise removal unit 112 is performed. By removing noise, the vibration waveform of measurement data becomes clear.
  • the noise removal processing includes, for example, measuring and storing environmental noise from a plurality of arranged vibration sensors in advance and performing difference processing based on the noise data.
  • the noise removal process may be performed by another method.
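  • A hedged sketch of the difference-based approach described above: an environmental noise spectrum measured in advance is subtracted from the measurement spectrum (spectral subtraction). The function name and the flooring of negative differences at zero are illustrative choices.

    import numpy as np

    def remove_noise(measurement, noise_reference, fs=10_000):
        """Subtract a pre-measured environmental noise spectrum from the
        measurement spectrum and return (frequencies, cleaned magnitude)."""
        meas = np.abs(np.fft.rfft(measurement))
        noise = np.abs(np.fft.rfft(noise_reference, n=len(measurement)))
        cleaned = np.clip(meas - noise, 0.0, None)     # floor negative differences at zero
        freqs = np.fft.rfftfreq(len(measurement), d=1.0 / fs)
        return freqs, cleaned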
  • FIG. 13 is a flowchart showing an example of the operation of the image processing unit 102 of the analysis device 100 of this embodiment.
  • the flowchart of FIG. 13 includes step S111 in addition to steps S113 and S115 of the flowchart of FIG. 8 described in the above embodiment.
  • In step S111, the noise removal unit 112 performs noise removal processing on the measurement data.
  • The imaging processing unit 114 applies FFT processing and frequency division processing to the data from which noise was removed in step S111 (step S113), and images the obtained vibration spectrum data to output the image data (step S115).
  • the image data obtained in step S115 is subjected to state analysis processing of the production facility 10 using the discriminator 110 by the analysis unit 106 in step S103 of FIG.
  • the measurement data of the sensor 12 that has been subjected to the noise removal processing by the noise removal unit 112 is imaged by the imaging processing unit 114.
  • As a result, the vibration waveform of the measurement data becomes clear through the noise removal, the accuracy of the FFT processing and the imaging processing is improved, and the accuracy and reliability of the analysis results of the measurement data are improved.
  • FIG. 14 is a functional block diagram showing a logical configuration of the analysis device 100 of this embodiment.
  • The analysis device 100 of the present embodiment differs from the above-described embodiments in that it extracts data determined to be abnormal ("Unknown") by the state analysis processing of the production facility 10 and, based on that data, updates the first discriminant model 128 of the first discriminator 124; otherwise the configuration is the same as in the above embodiments.
  • the analysis device 100 includes an image processing unit 102 similar to the analysis device 100 of FIG. 9, a generation unit 104, an analysis unit 106, and an abnormality determination unit 120, and further includes an extraction unit 140.
  • the extraction unit 140 extracts data that is determined to be abnormal (“Unknown”) in the state analysis processing using the second discriminator 126.
  • the generator 104 receives the correction information based on the extracted data and updates the first discriminant model 128 of the first discriminator 124.
  • the correction information includes information in which the malfunction event and the vibration characteristic are associated with each other.
  • Here, the extracted data includes the raw vibration waveform data received from the vibration sensor before the imaging processing by the image processing unit 102, corresponding to the imaged data determined as "Unknown", and the time information of that data.
  • the vibration indicated by the vibration data determined to be “Unknown” may be caused by a failure event of the production facility 10 that has not been specified yet.
  • The operator manually analyzes the extracted vibration data together with its time information, using the operation information, status information, workpiece information, and the like of the production facility 10 held by the facility monitoring system 1, and identifies the failure event that caused the vibration.
  • the analysis device 100 outputs the data extracted by the extraction unit 140 and presents it to the operator.
  • The data may be displayed on the display of the analysis device 100, printed out from the analysis device 100 on a printer, or transmitted via a communication line to another device (for example, an operation terminal).
  • the operator inputs the correction information 30 of FIG. 15 in which the vibration characteristic information of the corresponding vibration is associated with the defect information specified by the operator to the analysis device 100 using an operation screen or the like.
  • the generation unit 104 receives the input correction information 30 and updates the first discriminant model 128 of the first discriminator 124.
  • FIG. 16 is a flowchart showing an example of a detailed flow of the discrimination processing by the analysis unit 106.
  • the analysis unit 106 applies the image data output in step S115 of FIG. 13 to the second discriminator 126 to perform a state analysis process of the production facility 10 (step S401).
  • the data determined to be normal by the second discriminator 126 (“normal” in step S401) is passed to the generation unit 104, and machine learning processing is performed as normal data (step S403).
  • the analysis unit 106 extracts the data outside the normalization range by the second discriminator 126 (“Unknown” in step S401) (step S405).
  • The image data output in step S115 of FIG. 8 may be processed in the same way using the flow of FIG. 16.
  • FIG. 17 is a flowchart showing an example of a discriminator update processing procedure using the data extracted in step S405 of FIG.
  • the extraction unit 140 outputs the data (vibration data and time information) extracted in step S405 (step S411).
  • The operator analyzes the vibration data output in step S411 together with its time information, using the operation information, status information, workpiece information, and the like of the production facility 10 held by the facility monitoring system 1, and identifies the faulty event that caused the vibration.
  • The operator then creates information in which the identified malfunction event and the corresponding vibration characteristic information are associated with each other, and inputs it via the operation screen of the analysis device 100 as correction information for the threshold of the first discriminator 124.
  • the generation unit 104 receives the input correction information (step S413), and updates the first discriminator 124 using the received correction information (step S415).
  • Specifically, the generation unit 104 registers the vibration characteristic information and the malfunction event included in the received correction information in the first discriminant model 128, calculates at least one of the peak level of the specific frequency, the ratio of the maximum to average peak levels, the S/N ratio (Signal-to-Noise ratio), and the integrated value of the peak levels within the range of the specific frequency, determines the boundary range between the normal and abnormal states, and sets the threshold value.
  • the operator may set thresholds based on the respective values obtained from the vibration characteristic information corresponding to the identified malfunction event, and input the thresholds using the operation screen.
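  • As an illustration of deriving a threshold from the boundary between normal and abnormal feature values, a minimal sketch follows; placing the boundary midway between the nearest normal and abnormal values is an assumption, and the operator may adjust the resulting threshold as described above.

    import numpy as np

    def boundary_threshold(normal_values, abnormal_values):
        """Derive a (low, high) threshold range for one feature from labeled values.
        Each bound is placed midway between the most extreme normal value and the
        nearest abnormal value on that side, if any abnormal value exists there."""
        normal = np.asarray(normal_values, dtype=float)
        abnormal = np.asarray(abnormal_values, dtype=float)
        below = abnormal[abnormal < normal.min()]
        above = abnormal[abnormal > normal.max()]
        low = (normal.min() + below.max()) / 2 if below.size else -np.inf
        high = (normal.max() + above.min()) / 2 if above.size else np.inf
        return low, high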
  • As described above, in this embodiment, the extraction unit 140 extracts the data discriminated as "Unknown" by the second discriminator 126 and presents it to the operator, and the generation unit 104 receives correction information in which the result of the operator's analysis is associated with the corresponding vibration characteristics and updates the first discriminator 124 based on that correction information.
  • the measurement data that the first discriminator 124 cannot discriminate is further discriminated by the second discriminator 126, and the data that is “Unknown” is extracted. Since the operator analyzes and reflects the result in the first discriminator 124, the accuracy of the abnormality determination processing can be improved.
  • Furthermore, because the second discriminator 126 machine-learns only information on the normal state, the first discriminant model 128 can be updated even in small-lot, large-variety or variable production, where the manufacturing conditions are constantly changing and information on defective events is difficult to accumulate, so the accuracy of determining the abnormal state of the production facility 10 can be improved.
  • the analysis apparatus 100 of FIG. 14 has a configuration in which the extraction unit 140 is provided in the configuration of the analysis apparatus 100 of FIG. As a modification thereof, the analysis device 100 of FIG. 4 may be provided with the extraction unit 140.
  • the analysis device 100 further includes an abnormality determination unit 120 and an extraction unit 140.
  • the abnormality determination unit 120 performs the characteristic analysis process of the vibration of the vibration sensor indicated by the detection result, and performs the abnormality determination process of the production facility 10 using the threshold value.
  • the extraction unit 140 extracts data that is determined by the state analysis process that the production facility 10 is not in a normal state.
  • the generation unit 104 receives the correction information based on the extracted data and updates the threshold value.
  • the correction information includes information in which the fault event and the vibration characteristic are associated with each other.
  • FIG. 18 is a flowchart showing an example of a processing procedure when it is determined as normal in step S401 of the state analysis processing of FIG.
  • This embodiment is the same as the above embodiments except that a detection result determined to be normal by the second discriminator 126 is used as teacher data for the second discriminator 126.
  • The generation unit 104 updates the second discriminant model 130 of the second discriminator 126 (step S501) using the detection results determined to be normal by the state analysis processing with the second discriminator 126 ("normal" in step S401 of FIG. 16) as teacher data for the normal state of the production facility 10.
  • As described above, in this embodiment, the second discriminant model 130 is updated by the generation unit 104 using the detection results determined to be normal by the state analysis processing with the second discriminator 126 as teacher data of the normal state of the production facility 10.
  • According to this configuration, teacher data for the normal state can be generated from measurement data that could not be discriminated in the abnormality determination processing by the first discriminator 124, and the second discriminant model 130 can be updated, so the determination accuracy can be improved.
  • the analysis device 100 may include a third discriminator (not shown).
  • FIG. 19 is a flowchart showing an example of a processing procedure in which the analysis device 100 constructs a defect model using the third discriminator and identifies a defect event.
  • the third discriminator acquires the measurement data discriminated as normal by the first discriminator 124 (step S601). Furthermore, the third discriminator acquires the measurement data discriminated as Unknown by the second discriminator 126 (step S603). Then, the third discriminator machine-learns these data and constructs a model in which defective events are classified (step S605).
  • the noise removal processing and the imaging processing described in the above embodiment may be performed on each measurement data machine-learned by the third discriminator.
  • In step S607, the third discriminator uses the constructed model to determine whether the measurement data of the sensor 12 of the production facility 10 is normal or abnormal and, if it is abnormal, to identify the defective event.
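  • A sketch of how such a classification model might be built from the data the first discriminator 124 judged normal and the data the second discriminator 126 flagged as "Unknown" is given below; the use of scikit-learn's RandomForestClassifier, and the assumption that the operator has labeled the "Unknown" data with defect events, are illustrative choices rather than the patented method.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def build_classification_model(normal_images, unknown_images, unknown_labels):
        """normal_images: data the first discriminator judged normal.
        unknown_images: data the second discriminator flagged as 'Unknown'.
        unknown_labels: defect-event labels assigned to those images (assumed here
        to come from the operator's analysis)."""
        normal_images = list(normal_images)
        unknown_images = list(unknown_images)
        X = np.stack([img.ravel() for img in normal_images + unknown_images])
        y = ["normal"] * len(normal_images) + list(unknown_labels)
        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X, y)
        return model        # model.predict(new_image.ravel()[None, :]) classifies a new measurement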
  • Furthermore, the position information of each of the plurality of sensors 12 may be stored in the facility information 24, the relationship between the vibration data and the position information may be further machine-learned, and the result may be reflected in at least one of the classification model 202, the first discriminant model 128, and the second discriminant model 130.
  • information such as measurement conditions (equipment type, environment (temperature, humidity), etc.) of measurement data of a plurality of sensors 12 may be stored in the equipment information 24.
  • In addition, the measurement data may be grouped with measurement data obtained under similar measurement conditions, operating conditions included in the operation information of the production facility 10, and the like, the measurement data may be machine-learned for each group, and the result may be reflected in the classification model 202, the first discriminant model 128, and the second discriminant model 130.
  • FIG. 20 is a flow chart for explaining the analyzing apparatus of the first embodiment.
  • the abnormality determination unit 120 performs the FFT processing in the first discriminator 124 (step S11).
  • the vibration characteristic is specified by the pattern matching process using the first discrimination model 128.
  • the first discriminator 124 determines that the vibration characteristics are normal when the vibration characteristics are within the threshold range, and determines that the vibration characteristics are abnormal when the vibration characteristics are outside the threshold range.
  • the abnormality determination unit 120 outputs this result to the equipment monitoring system 1 as an abnormality determination result of the production equipment 10 (not shown).
  • The abnormality determination unit 120 extracts the data that the first discriminator 124 could not discriminate as normal or abnormal in step S11 (step S13), and passes it to the analysis unit 106 as the target of the machine learning processing of the second discriminator 126.
  • the analysis unit 106 performs noise removal processing on the measurement data of the sensor 12 that cannot be discriminated by the first discriminator 124 (step S15), frequency-analyzes it, and forms an image (step S17).
  • The analysis unit 106 determines the imaged data using the second discriminator 126 (step S19), and determines whether the state of the production facility 10 is normal (step S21). When it is determined to be normal (YES in step S21), the generation unit 104 machine-learns the measurement data determined to be normal, and updates the second discriminant model 130 of the second discriminator 126 (step S23).
  • When it is determined not to be normal (NO in step S21), the extraction unit 140 extracts and outputs the measurement data determined to be "Unknown" (step S31).
  • the operator refers to the extracted unknown data and analyzes it together with the operation information of the production facility 10 and the like, and identifies the defective phenomenon. Then, the correction information in which the fault event and the vibration characteristic are associated with each other is input to the analysis device 100 (step S33).
  • the generation unit 104 receives the input correction information, and updates the first discrimination model 128 and the threshold value based on the received correction information (step S35).
  • As described above, the analysis device 100 causes the second discriminator 126 to machine-learn the measurement data for which abnormality could not be determined by the first discriminator 124, thereby extracting data that is not normal. The operator analyzes the extracted data together with the operation information of the production facility 10 to identify the defective event, and inputs correction information in which the vibration characteristics and the defective event are associated with each other into the analysis device 100, so that the first discriminant model 128 and the threshold value can be updated.
  • FIG. 21 is a flow chart for explaining the analyzing apparatus of the second embodiment.
  • the analysis apparatus 100 of this embodiment includes a third discriminator 200 in addition to the first discriminator 124 and the second discriminator 126.
  • the third discriminator 200 performs machine learning using the measurement data discriminated as normal by the first discriminator 124 and the measurement data discriminated as Unknown by the second discriminator 126 (step S41).
  • The third discriminator 200 constructs a classification model 202 for normal and abnormal states by this machine learning.
  • the classification model 202 classifies defective events into classes.
  • the third discriminator 200 can discriminate whether the measurement data is normal or abnormal, and can further discriminate and specify the faulty event.
  • 1. An analysis device comprising: image processing means for imaging the detection result of a vibration sensor provided in a production facility; generating means for generating a discriminator by subjecting the imaged data to machine learning processing; and analysis means for performing state analysis processing of the production facility using the discriminator.
  • 2. The analysis device described in 1., further comprising: determination means for performing characteristic analysis processing of the vibration of the vibration sensor indicated by the detection result and performing abnormality determination processing of the production facility using a threshold value; and extraction means for extracting data determined by the state analysis processing to indicate that the production facility is not in a normal state, wherein the generating means receives correction information based on the extracted data and updates the threshold value, and the correction information includes information in which a fault event and a vibration characteristic are associated with each other.
  • 3. An analysis device comprising: determination means for performing abnormality determination processing of a production facility using a first discriminator based on the detection result of a vibration sensor provided in the production facility; image processing means for imaging the detection result that the first discriminator could not discriminate as normal or abnormal; generating means for generating a second discriminator by subjecting the imaged data to machine learning processing; and analysis means for performing state analysis processing of the production facility using the second discriminator.
  • 4. The analysis device described in 3., further comprising extraction means for extracting data determined by the state analysis processing to indicate that the production facility is not in a normal state, wherein the generating means receives correction information based on the extracted data and updates the first discriminator, and the correction information includes information in which a fault event and a vibration characteristic are associated with each other.
  • 5. The analysis device described above, wherein the generating means uses the data determined to be normal by the state analysis processing as teacher data for the machine learning processing.
  • 6. The analysis device described in any one of 1. to 5., further comprising processing means for performing noise removal processing on the detection result, wherein the image processing means images the detection result after the noise removal processing is performed by the processing means.
  • 7. The analysis device described in any one of 1. to 6., wherein the production facility is a belt conveyor, and the vibration sensor is a plurality of vibration sensors provided on the belt conveyor.
  • 8. An analysis method in which an analysis device images the detection result of a vibration sensor provided in a production facility, generates a discriminator by subjecting the imaged data to machine learning processing, and performs state analysis processing of the production facility using the discriminator.
  • 9. The analysis method described in 8., in which the analysis device further performs characteristic analysis processing of the vibration of the vibration sensor indicated by the detection result, performs abnormality determination processing of the production facility using a threshold value, extracts data determined by the state analysis processing to indicate that the production facility is not in a normal state, receives correction information based on the extracted data, and updates the threshold value, the correction information including information in which a fault event and a vibration characteristic are associated with each other.
  • 10. An analysis method in which an analysis device performs abnormality determination processing of a production facility using a first discriminator based on the detection result of a vibration sensor provided in the production facility, images the detection result that the first discriminator could not discriminate as normal or abnormal, generates a second discriminator by subjecting the imaged data to machine learning processing, and performs state analysis processing of the production facility using the second discriminator.
  • 11. The analysis method described in 10., in which the analysis device further extracts data determined by the state analysis processing to indicate that the production facility is not in a normal state, receives correction information based on the extracted data, and updates the first discriminator, the correction information including information in which a fault event and a vibration characteristic are associated with each other.
  • 12. The analysis method described above, in which the analysis device further uses the data determined to be normal by the state analysis processing as teacher data for the machine learning processing.
  • 13. The analysis method described in any one of 8. to 12., in which the analysis device further performs noise removal processing on the detection result and images the detection result after the noise removal processing is performed.
  • 14. The analysis method described in any one of 8. to 13., wherein the production facility is a belt conveyor, and the vibration sensor is a plurality of vibration sensors provided on the belt conveyor.
  • 15. A program for causing a computer to execute: a procedure for imaging the detection result of a vibration sensor provided in a production facility; a procedure for generating a discriminator by subjecting the imaged data to machine learning processing; and a procedure for performing abnormality determination processing of the production facility using the discriminator.
  • 16. The program described in 15., further causing the computer to execute: a procedure for performing abnormality determination processing of the production facility using a threshold value; a procedure for extracting data determined by the state analysis processing to indicate that the production facility is not in a normal state; and a procedure for receiving correction information based on the extracted data and updating the threshold value, the correction information including information in which a fault event and a vibration characteristic are associated with each other.
  • The program described above, further causing the computer to execute a procedure for performing noise removal processing on the detection result, wherein the imaging procedure images the detection result after the noise removal processing is performed.
  • 21. The program described in 15., wherein the production facility is a belt conveyor, and the vibration sensor is a plurality of vibration sensors provided on the belt conveyor.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)
  • Testing Of Devices, Machine Parts, Or Other Structures Thereof (AREA)

Abstract

An analysis device (100) comprises: an image processing unit (102) that converts a detection result of a vibration sensor provided in a production facility (10) into an image; a generation unit (104) that generates a discriminator (110) by subjecting the imaged data to machine learning processing; and an analysis unit (106) that performs state analysis processing of the production facility (10) using the discriminator (110).
PCT/JP2020/004042 2019-02-05 2020-02-04 Dispositif d'analyse, procédé d'analyse et programme WO2020162425A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020571198A JP7188463B2 (ja) 2019-02-05 2020-02-04 解析装置、解析方法、およびプログラム
CN202080012631.9A CN113383216A (zh) 2019-02-05 2020-02-04 分析装置、分析方法和程序
KR1020217024139A KR20210107844A (ko) 2019-02-05 2020-02-04 분석 장치, 분석 방법, 및 프로그램

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019019068 2019-02-05
JP2019-019068 2019-02-05

Publications (1)

Publication Number Publication Date
WO2020162425A1 (fr)

Family

ID=71947695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/004042 WO2020162425A1 (fr) 2019-02-05 2020-02-04 Dispositif d'analyse, procédé d'analyse et programme

Country Status (5)

Country Link
JP (1) JP7188463B2 (fr)
KR (1) KR20210107844A (fr)
CN (1) CN113383216A (fr)
TW (1) TW202045898A (fr)
WO (1) WO2020162425A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022064769A1 (fr) * 2020-09-24 2022-03-31 国立大学法人大阪大学 Système et procédé de prédiction d'état de dégradation
WO2022158073A1 (fr) * 2021-01-25 2022-07-28 株式会社日本製鋼所 Programme informatique, procédé de détection d'anomalie, dispositif de détection d'anomalie, système de machine de moulage et procédé de génération de modèle d'apprentissage

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008232708A (ja) * 2007-03-19 2008-10-02 Jfe Steel Kk 劣化度診断方法、劣化度診断装置、および劣化診断プログラム
JP2018178810A (ja) * 2017-04-10 2018-11-15 株式会社デンソーテン ノック制御装置、ノック適合方法およびノック適合プログラム
WO2018216258A1 (fr) * 2017-05-25 2018-11-29 日本電気株式会社 Dispositif de traitement, procédé de traitement et programme

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3170076B2 (ja) * 1992-12-18 2001-05-28 株式会社小野測器 転がり軸受故障診断装置
JP2000321176A (ja) * 1999-05-17 2000-11-24 Mitsui Eng & Shipbuild Co Ltd 異常検知方法および装置
JP2011107093A (ja) 2009-11-20 2011-06-02 Jx Nippon Oil & Energy Corp 振動体の異常診断装置及び異常診断方法
CN102494882B (zh) * 2011-11-30 2013-11-06 中国神华能源股份有限公司 矿用振动筛弹簧在线监测与故障诊断装置及其方法
JP6877978B2 (ja) 2016-12-06 2021-05-26 日本電気通信システム株式会社 学習装置、学習方法およびプログラム
CN107560849B (zh) * 2017-08-04 2020-02-18 华北电力大学 多通道深度卷积神经网络的风电机组轴承故障诊断方法
CN108896296A (zh) * 2018-04-18 2018-11-27 北京信息科技大学 一种基于卷积神经网络的风电齿轮箱故障诊断方法

Also Published As

Publication number Publication date
CN113383216A (zh) 2021-09-10
KR20210107844A (ko) 2021-09-01
JPWO2020162425A1 (ja) 2021-12-09
JP7188463B2 (ja) 2022-12-13
TW202045898A (zh) 2020-12-16

Similar Documents

Publication Publication Date Title
US11521105B2 (en) Machine learning device and machine learning method for learning fault prediction of main shaft or motor which drives main shaft, and fault prediction device and fault prediction system including machine learning device
RU2704073C2 (ru) Способ и система для стадии обучения акустического или вибрационного анализа машины
US20180264613A1 (en) Abnormality detection apparatus and machine learning apparatus
JP6837848B2 (ja) 診断装置
CN109397703B (zh) 一种故障检测方法及装置
US6970804B2 (en) Automated self-learning diagnostic system
EP3206103A1 (fr) Surveillance d'un système basée sur un modèle
JP6200833B2 (ja) プラントと制御装置の診断装置
WO2020162425A1 (fr) Dispositif d'analyse, procédé d'analyse et programme
CN111964909A (zh) 滚动轴承运行状态检测方法、故障诊断方法及系统
JP2019067197A (ja) 故障予兆検知手法
EP2135144B1 (fr) Surveillance de l'état d'une machine à l'aide de règles à motifs
JP7006282B2 (ja) 設備異常診断装置
US20210149387A1 (en) Facility failure prediction system and method for using acoustic signal of ultrasonic band
JP4417318B2 (ja) 設備診断装置
KR102545672B1 (ko) 기계고장 진단 방법 및 장치
US20140058615A1 (en) Fleet anomaly detection system and method
JP6898607B2 (ja) 異常予兆検出システムおよび異常予兆検出方法
JP2002090266A (ja) 余寿命予測装置
KR20220102364A (ko) 진동 센서를 통한 설비 예지 보전 시스템
JP2004020484A (ja) 異常監視装置および異常監視プログラム
Tastimur et al. Defect diagnosis of rolling element bearing using deep learning
WO2020189245A1 (fr) Système de détection de symptôme de dysfonctionnement et procédé de détection de symptôme de dysfonctionnement
CN113591984A (zh) 设备运行事件的检测方法、装置、电子设备和存储介质
CN114286931B (zh) 异常部分检测装置、异常部分检测方法及计算机可读取的记录介质

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20751935; Country of ref document: EP; Kind code of ref document: A1)
ENP  Entry into the national phase (Ref document number: 20217024139; Country of ref document: KR; Kind code of ref document: A)
ENP  Entry into the national phase (Ref document number: 2020571198; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 20751935; Country of ref document: EP; Kind code of ref document: A1)