EP2323105A1 - Monitoring of machines - Google Patents

Monitoring of machines

Info

Publication number
EP2323105A1
EP2323105A1
Authority
EP
European Patent Office
Prior art keywords
data
modes
machine
mode
noise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP09290794A
Other languages
German (de)
French (fr)
Other versions
EP2323105B1 (en)
Inventor
Patricia Scanlon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcatel Lucent SAS
Original Assignee
Alcatel Lucent SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcatel Lucent SAS filed Critical Alcatel Lucent SAS
Priority to EP20090290794 priority Critical patent/EP2323105B1/en
Publication of EP2323105A1 publication Critical patent/EP2323105A1/en
Application granted granted Critical
Publication of EP2323105B1 publication Critical patent/EP2323105B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C3/00Registering or indicating the condition or the working of machines or other apparatus, other than vehicles
    • G07C3/08Registering or indicating the production of the machine either with or without registering working or idle time

Abstract

A method of machine monitoring comprising measuring, over time, two different modes of data relating to the machine and fusing the data in a manner which takes into account environmental factors, in order to monitor one or more parameters relating to the machine's performance.

Description

    Field of the Invention
  • This invention relates to the monitoring of machines. In particular, it relates to monitoring the performance of machinery, especially rotating machines, and to predicting failure time.
  • Background
  • Rotating machinery, such as fans, is often mission critical and it can be very important to be able to predict the lifetime and probable time-to-failure of such machinery in order that planned maintenance and replacement may take place with minimal downtime.
  • Presently, failure times are predicted by measuring at least one parameter of a machine in order to predict a time at which failure of the machine is anticipated based on the measured parameter. The parameters measured can be acquired from a number of different types of data, such as airborne acoustic noise or structure-borne acoustic emissions (ie vibrations). However, the parameters acquired from these different types of data can be negatively affected by the presence of external noise. This external noise may be background acoustic noise, external vibrations, noise from nearby machinery or vehicles, or other externally generated noise; it may be acoustic or electromagnetic. Such external noise can lead to confusion in the system and poor results, degrading system performance.
  • No previously proposed systems are known which factor in the presence of external noise.
  • The present invention arose in an attempt to provide an improved method and system for predicting failure times of rotating machinery.
  • Summary of the Invention
  • According to the present invention, in a first aspect, there is provided a method of machine monitoring comprising measuring, over time, at least two different modes of data relating to the machine and fusing the data in a manner which takes into account environmental factors, in order to provide an indication of wear of the machine.
  • Preferably, the method is used for predicting time to failure (or remaining useful lifetime (RUL)). The modes of data may be selected from various modes, such as acoustic noise, structure-borne noise (vibration) or temperature, for example.
  • Most preferably, a decision level adaptive fusion approach using multiple modes of data is used in order to predict the failure time of the machine. This applies weightings to the fusion which take into account the different effects that environmental factors, such as external noise, have upon the different modes.
  • Preferably, the fusion method takes into account confusion of the output probabilities of each classifier, whereby a relatively high confusion is likely to represent a relatively low signal to noise ratio of the monitored data for a mode and a relatively low confusion is likely to represent a relatively high signal to noise ratio of the monitored data for a mode.
  • A fusion approach is known from 'Adaptive fusion of acoustic and visual sources for automatic speech recognition', Speech Communication 26(1-2), 149-161, 1998, where it is used for automatic speech recognition. In some embodiments of the present invention, techniques of this type are used in a very different environment, to predict the failure time of rotating or other machinery.
  • According to the present invention in a further aspect, there is provided a method of machine monitoring comprising measuring, over time, two different modes of data relating to the machine and fusing the data in a manner which takes into account environmental factors, in order to monitor one or more parameters relating to the machine's performance.
  • The individual parameters measured may be used to obtain individual and independent classifiers, which can then be combined using decision fusion techniques in order to make a determination as to RUL or other parameters relating to functioning of a machine.
  • A measure of confusion may be used to determine how each different modality (parameter) is being affected by external noise factors. For example, measures of the entropy or variance of classifier output scores can be used to determine confusion levels of each mode.
  • In preferred embodiments, weighting is used to weight the contribution of each mode before decision level multi-modal fusion, whereby modes that are relatively noise free are weighted more heavily than those modes that are more affected by external noise factors.
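By way of illustration only, the following minimal Python sketch (not part of the original disclosure; the function names are ours) shows how the entropy or variance of classifier output scores could serve as such a confusion measure:

```python
import math

def entropy(probs):
    """Shannon entropy of a classifier's output distribution.
    High entropy means the scores are spread evenly, i.e. the mode is 'confused'."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def score_variance(probs):
    """Variance of the output scores.
    Low variance means the scores are all similar, again indicating confusion."""
    mean = sum(probs) / len(probs)
    return sum((p - mean) ** 2 for p in probs) / len(probs)

# A confident classifier versus a confused one (three wear classes):
confident = [0.6, 0.3, 0.1]
confused = [0.33, 0.33, 0.34]
print(entropy(confident) < entropy(confused))                 # True: lower entropy, less confusion
print(score_variance(confident) > score_variance(confused))   # True: higher variance, less confusion
```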
  • Description of Drawings
  • Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
    • Figure 1 shows schematically a machine and sensors for measuring two parameters;
    • Figure 2 shows schematically a machine and sensors for measuring three parameters; and
    • Figure 3 is a chart showing the main steps in a decision making method.
    Description of Some Embodiments of the Invention
  • Referring to Figure 1, a machine 1 may comprise a rotating machine such as a fan, or possibly a plurality of machines.
  • At least two modes of data (modalities), ie different parameters relating to the functioning and operation of the machine, are measured. These may relate to, for example, airborne acoustic noise, structure-borne noise (ie vibration), temperature (either at a point on the machine itself or in its vicinity) or other modes of data. Figure 1 shows two such modes, in this case acoustic data (airborne noise) 2 and vibration data 3, which measures vibration at the machine. However, any number of different modes of data may be measured.
  • Figure 2 shows an alternative arrangement comprising sensors for measuring acoustic noise 2, vibrations 3 and temperature 4.
  • Sensors 2 to 4 in the figure may be any convenient type of sensor for measuring the relevant mode of data. For example, microphones or other transducers may be used to measure acoustic signals; temperature sensors to measure temperature; or other sensors may be used to measure other parameters. Outputs from these are then applied to a computational unit 5 for further processing. This may be a single unit, or it may comprise a number of different units, such as individual units for extracting relevant features and classifiers from each of the modes of data and a further unit for combining these.
  • Data is obtained from the various sensors at fixed or variable sampling rates over time while the machine is operating, or alternatively during an initial testing phase.
  • During an initial classification process, the various modalities (ie acoustic input, vibration input, temperature input, and so on) are processed independently to generate respective independent classifiers.
  • Figure 3 shows a typical method for determining remaining useful lifetime (RUL) of a rotating machine.
  • Referring to the figure, two sets of data are obtained in this example. As described, however, more than two sets may be obtained, ie three or more types of data may be obtained.
  • In the example, the sets of data are acoustic data and vibration data. The acoustic data is obtained from a suitable sensor 2 (eg a microphone) as raw acoustic samples 5. These samples may be obtained at periodic intervals, such as every second, every minute, every hour or any other time interval, which interval may vary. The samples will of course contain, in addition to acoustic noise generated by the machine itself, environmental and external noise generated by external sources such as motor vehicles, external machinery and others.
  • Similarly, vibration data is obtained from vibration sensor 3 and used to obtain raw vibration samples 6, again at the same or different sampling rates. The vibrations from the machine will also be affected by external perturbations, and these might arise, for example, from air currents, the operation of other machinery or plant in the vicinity, movements of a vehicle in which the machine is installed, or many other external factors which will affect the vibrations of the machine.
  • Each of the samples 5 and 6 is applied to a separate and independent feature extraction engine 7, 8 respectively. Feature extraction is in itself a known technique and a set of representative features is obtained from each of the independent feature extraction engines 7 and 8. These are applied to respective classifiers for the acoustic data 9 and for the vibration data 10.
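Purely as an illustrative sketch (the FFT band-energy features below are an assumption, not something specified by the patent), feature extraction engines 7 and 8 might compute, for example, log band energies from each window of raw sensor data:

```python
import numpy as np

def band_energy_features(window, n_bands=8):
    """Toy feature extraction for one window of raw acoustic or vibration samples:
    split the magnitude spectrum into n_bands and return the log energy of each band."""
    spectrum = np.abs(np.fft.rfft(window))
    bands = np.array_split(spectrum, n_bands)
    return np.array([np.log(np.sum(b ** 2) + 1e-12) for b in bands])

# Simulated one-second windows of acoustic (5) and vibration (6) samples.
rng = np.random.default_rng(0)
acoustic_features = band_energy_features(rng.normal(size=1000))
vibration_features = band_energy_features(rng.normal(size=1000))
# These feature vectors would be passed to classifiers 9 and 10 respectively.
```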
  • Classifiers are also well-known in themselves. A classifier essentially represents a mapping from a discrete or continuous feature space to a discrete set of labels. The classification may be, for example, an indication of wear, such that classification A means that the machine is barely worn, classification B means that the machine is beginning to wear and classification C means that the machine is nearly worn out. In quantitative terms, on a simplistic level, classification A may mean that the machine is zero to one third worn, classification B from one third to two thirds worn and classification C greater than two thirds worn.
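The following sketch is illustrative only (the use of scikit-learn and logistic regression is our assumption): it shows how a separate classifier could be trained per mode to map feature vectors to the wear classes A to C described above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def wear_class(wear_fraction):
    """Map a wear fraction in [0, 1] to the labels A (barely worn),
    B (beginning to wear) or C (nearly worn out), as in the text."""
    if wear_fraction < 1 / 3:
        return "A"
    if wear_fraction < 2 / 3:
        return "B"
    return "C"

# Hypothetical labelled training data for one mode (e.g. acoustic features
# recorded during a run-to-failure test, with the wear fraction known).
rng = np.random.default_rng(1)
features = rng.normal(size=(300, 8))
labels = [wear_class(w) for w in rng.uniform(0.0, 1.0, size=300)]

acoustic_classifier = LogisticRegression(max_iter=1000).fit(features, labels)
class_probabilities = acoustic_classifier.predict_proba(features[:1])  # one row per sample
# An independent classifier of the same form would be trained on the vibration features.
```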
  • The outputs from the independent classifiers 9 and 10 are then combined, using a decision fusion technique, at a decision level fusion engine 11 in order to determine a more accurate remaining useful life (RUL) prediction 12.
  • The decision fusion may be a multiplicative, additive or other process, including combinations of processes.
  • In a simple fusion technique, the outputs of the two classifiers 9 and 10 each represent a list of all possible classes/outcomes, ie worn, partially worn, not worn, and their associated scores (probabilities). Thus, there may be a probability Pa of a first wear state, Pb of a second wear state and Pc of a third wear state. The scores are simply multiplied in the decision level fusion 11, or their log scores are added, and the most likely output is determined.
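A minimal sketch of this simple decision-level fusion (illustrative Python, not the patent's own implementation):

```python
import math

CLASSES = ["not worn", "partially worn", "worn"]

def fuse_by_product(p_acoustic, p_vibration):
    """Multiply the per-class scores of the two classifiers (equivalently,
    add their log scores) and return the most likely class."""
    fused = {c: pa * pv for c, pa, pv in zip(CLASSES, p_acoustic, p_vibration)}
    fused_log = {c: math.log(pa) + math.log(pv)
                 for c, pa, pv in zip(CLASSES, p_acoustic, p_vibration)}
    best = max(fused, key=fused.get)
    assert best == max(fused_log, key=fused_log.get)  # both forms give the same decision
    return best, fused

decision, scores = fuse_by_product([0.2, 0.5, 0.3], [0.1, 0.6, 0.3])
print(decision)  # 'partially worn'
```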
  • A mechanism may also be provided which provides information on the reliability of each modality (ie acoustic and vibration input in the example of Figure 3) and thereby the weightings that should be applied to each. That is, if it is found that acoustic data is more reliable in determining wear (or other parameters) than vibration data, then greater emphasis and weighting may be placed upon the results from the acoustic classification than upon the vibration classification.
  • Adaptive weightings are weightings that are dynamically adjusted during the fusion process. In one example of a fusion scheme, the process is multiplicative using rules of probability. Such a scheme initially selects a candidate that maximises the product of the N-best output probabilities of the modalities.
  • In more detail, a scheme may use the following equation:

    $$P_{AV}(k \mid \chi) = \max_i \; P_V(l_i \mid \chi)^{\lambda} \times P_A(l_i \mid \chi)^{1-\lambda}$$

    where

    $$\lambda = \frac{\sigma_v}{\sigma_v + \sigma_a}$$
    • Pav is the joint probability of the acoustic and vibrational modes for each class k, given the feature set χ.
    • l is the number of possible classes; k is one of the l classes.
    • χ is the set of features, acoustic or vibrational.
    • σa and σv are the variances of the acoustic and vibrational modalities' N-best output probabilities, respectively. The vibrational N-best output probabilities are weighted using λ and the acoustic N-best output probabilities using 1 - λ.
  • The N-best output probabilities of the acoustic and vibration modes are weighted according to the dispersion, variance or entropy of their N-best output probabilities. They will therefore account to some extent for the confusability of the N-best probabilities or outputs from each classifier, which can be affected by external noise factors. The weighting scheme thus accounts for the reliability of each of these modes.
  • The contribution of each mode is then adaptively weighted before performing decision level multi-modal fusion. In this adaptive weighting process, each mode is evaluated and the mode that is considered to be least affected by external noise factors is weighted more heavily than the mode that is more affected.
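An illustrative Python rendering of this adaptive weighting (a sketch under our own assumptions, not the patent's code): λ is computed from the variances of the two modes' N-best output probabilities and applied as an exponent to the vibration probabilities, with 1 - λ applied to the acoustic probabilities:

```python
import numpy as np

def adaptive_fusion(p_acoustic, p_vibration):
    """Decision-level adaptive fusion of two modes.

    lambda_ = sigma_v / (sigma_v + sigma_a), where sigma_a and sigma_v are the
    variances of the acoustic and vibration N-best output probabilities.
    Each class i is scored P_V(i)**lambda_ * P_A(i)**(1 - lambda_) and the
    class with the maximum score is selected, as in the equation above.
    """
    p_a = np.asarray(p_acoustic, dtype=float)
    p_v = np.asarray(p_vibration, dtype=float)
    sigma_a, sigma_v = p_a.var(), p_v.var()
    lambda_ = sigma_v / (sigma_v + sigma_a)
    scores = p_v ** lambda_ * p_a ** (1.0 - lambda_)
    return int(np.argmax(scores)), lambda_, scores
```

With this weighting, a confused (low-variance) vibration output yields a small λ, so the vibration mode contributes little to the fused score.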
  • An example is given below.
  • In the example shown in Figure 3, the two modes are acoustic and vibration data. A separate classifier 9 and 10 respectively is trained for each type of data or mode. For each sample presented to the classifier (ie from the respective feature extraction module 7 or 8) a list of probabilities for each potential class is provided. In one example, there are three possible classes, classes A to C. In some embodiments, there may be more or fewer than three classes. As described, the classes may represent the degree of wear, or other parameters relating to the functioning and RUL of the machine. In one embodiment, the output probabilities for each class A to C given by the acoustic classifier 9 are:
    • Probability of Class A : 0.6
    • Probability of Class B : 0.3
    • Probability of Class C : 0.1
  • The output probabilities for each class A to C given by the vibration classifier 10 are:
    • Probability of Class A : 0.33
    • Probability of Class B : 0.33
    • Probability of Class C : 0.34
  • It is seen from this that the vibration classifier 10 is confused in that it provides broadly similar probabilities for all three classes. This could well be due to vibration data being corrupted with noise, which confuses the classifier. Alternatively, the vibration data may not provide enough information to clearly separate the classes A to C.
    Thus, a relatively high 'confusion' for a mode of data is considered to represent a low signal to noise ratio for that mode, suggesting the mode is more prone to noise interference. A low 'confusion' on the other hand indicates a higher signal to noise ratio, suggesting less noise interference.
  • In real world terms, this may be envisaged as follows: if there were no environmental factors at all, then increased vibration from a machine might normally be taken to mean that the machine is wearing more. However, if there are external factors, then an external factor which causes an extensive vibration might, at any time, overwhelm the inherently small vibration of the machine.
  • From the above probabilities, it is seen that the acoustic classifier is much less confused, as it assigns a much higher probability to Class A, a lower probability to Class B and a significantly lower probability still to Class C. Thus, a clear decision can be made from classifier 9.
  • Thus, in the decision level fusion 11, the acoustic output can be weighted higher than the vibration output. The acoustic output is considered to be much less affected by external environmental factors than the vibrational classifier. By weighting the acoustic output higher than the vibration output, the effect of external vibration or noise on the overall multi-modal system decision is minimised.
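Applying the weighting scheme above to the example probabilities gives the expected result; a self-contained numerical sketch (illustrative only):

```python
import numpy as np

# Acoustic classifier 9 (confident) and vibration classifier 10 (confused),
# for classes A, B and C as in the example.
p_a = np.array([0.6, 0.3, 0.1])
p_v = np.array([0.33, 0.33, 0.34])

sigma_a, sigma_v = p_a.var(), p_v.var()   # approx. 0.042 and 0.000022
lam = sigma_v / (sigma_v + sigma_a)       # approx. 0.0005: vibration mode heavily downweighted
fused = p_v ** lam * p_a ** (1.0 - lam)
print(["A", "B", "C"][int(np.argmax(fused))])  # 'A': the confident acoustic decision dominates
```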
  • The machine or machines upon which the process may be used might be a fan or other rotating machine operated in a factory, data centre or wind farm for example.
  • The description and drawings merely illustrate the principles of the invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the invention and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended expressly to be only for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
  • Embodiments may be used to provide indication of RUL of a machine (or group of machines) or other parameters or indications relating to wear, such as an indication of when a machine or a component is likely to need servicing or maintenance, or when it is likely to be X% worn, or other indications.
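As a final illustrative sketch (the thresholds and wording below are assumptions, not specified by the patent), fused wear-class probabilities could be mapped to such a servicing indication:

```python
def service_indication(fused_probs, classes=("A", "B", "C")):
    """Turn fused wear-class probabilities into a coarse maintenance hint.
    A = barely worn, B = beginning to wear, C = nearly worn out; thresholds are illustrative."""
    best = max(range(len(classes)), key=lambda i: fused_probs[i])
    if classes[best] == "C" and fused_probs[best] > 0.5:
        return "schedule maintenance or replacement soon"
    if classes[best] == "B":
        return "monitor more frequently"
    return "no action needed"

print(service_indication([0.62, 0.30, 0.08]))  # 'no action needed'
```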

Claims (15)

  1. A method of machine monitoring comprising measuring, over time, at least two different modes of data relating to the machine and fusing the data in a manner which takes into account environmental factors, in order to provide an indication of wear of the machine.
  2. A method as claimed in Claim 1, comprising extracting one or more parameters from each mode of data and using each one as an input to a respective classifier for that mode, wherein outputs from the classifiers are used to determine the parameters relating to wear.
  3. A method as claimed in Claim 1, used for estimating the remaining useful lifetime (RUL) of the machine.
  4. A method as claimed in Claim 1, 2 or 3, wherein the two different modes of data are used to obtain independent classifiers which are combined using decision fusion.
  5. A method as claimed in Claim 4, wherein a measure of confusion is used to determine the effect of external noise factors on each individual modality of data.
  6. A method as claimed in Claim 4 or 5, wherein the fusion method takes into account confusion of output probabilities of each classifier, whereby a relatively high confusion is likely to represent a relatively low signal to noise ratio of the monitored data for a mode and a relatively low confusion is likely to represent a relatively high signal to noise ratio of the monitored data for a mode.
  7. A method as claimed in Claim 5 or 6, wherein the measure of confusion is the entropy or variance of classifier scores.
  8. A method as claimed in any preceding claim, wherein the contribution of each mode of measured data is weighted and modes that are considered to be relatively noise free are weighted more heavily than those modes that are more affected by external noise factors.
  9. A method as claimed in any preceding claim, wherein a candidate that maximises the product of the N-best output probabilities of the modes of data is selected.
  10. A method as claimed in Claim 9, wherein the joint probability Pav of the modes of data for each class k, given the feature set χ, is determined according to:

    $$P_{AV}(k \mid \chi) = \max_i \; P_V(l_i \mid \chi)^{\lambda} \times P_A(l_i \mid \chi)^{1-\lambda}$$

    where $\lambda = \dfrac{\sigma_v}{\sigma_v + \sigma_a}$;

    Pav is the joint probability of the acoustic and vibrational modes for each class k, given the feature set χ; l is the number of possible classes and k is one of the l classes; χ is the set of features, acoustic or vibrational; σa and σv are the variances of the acoustic and vibrational modalities' N-best output probabilities, respectively.
  11. A method as claimed in any preceding claim, wherein the two or more modalities of data are used to generate respective classifiers and the classifiers are used to determine the relative effects of environmental noise on the measurement of each respective mode of data, and wherein weightings are applied accordingly.
  12. Apparatus for measuring one or more parameters relating to functioning of a machine, comprising monitoring, over time, two or more modes of data independently and combining the result with a weighting according to the effect environmental noise has on each mode of data to determine said parameter.
  13. Apparatus as claimed in Claim 12, wherein the parameter is remaining useful lifetime (RUL).
  14. Apparatus as claimed in Claim 12 or Claim 13, including sensors for sensing at least two modes of data and computational means adapted to receive inputs from the sensors and to determine RUL.
  15. Apparatus as claimed in claim 14, wherein the sensors are for measuring, respectively, sound and vibration modes of data.
EP20090290794 2009-10-16 2009-10-16 Monitoring of machines Active EP2323105B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20090290794 EP2323105B1 (en) 2009-10-16 2009-10-16 Monitoring of machines

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP20090290794 EP2323105B1 (en) 2009-10-16 2009-10-16 Monitoring of machines

Publications (2)

Publication Number Publication Date
EP2323105A1 true EP2323105A1 (en) 2011-05-18
EP2323105B1 EP2323105B1 (en) 2014-12-03

Family

ID=41682446

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20090290794 Active EP2323105B1 (en) 2009-10-16 2009-10-16 Monitoring of machines

Country Status (1)

Country Link
EP (1) EP2323105B1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6490543B1 (en) * 1999-07-13 2002-12-03 Scientific Monitoring Inc Lifeometer for measuring and displaying life systems/parts
US20030101019A1 (en) 2000-02-17 2003-05-29 Markus Klausner Method and device for determining the remaining serviceable life of a product
US6490545B1 (en) 2000-03-06 2002-12-03 Sony Corporation Method and apparatus for adaptive co-verification of software and hardware designs
GB2430039A (en) * 2005-09-07 2007-03-14 Motorola Inc Product age monitoring device and method of use of the device
EP1906273A1 (en) * 2006-09-29 2008-04-02 Siemens Aktiengesellschaft Method for operating a large scale plant and control system for a large scale plant
EP1930855A2 (en) * 2006-12-08 2008-06-11 General Electric Company Method and system for estimating life of a gearbox

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Adaptive fusion of acoustic and visual sources for automatic speech recognition", SPEECH COMMUNICATION, vol. 26, no. 1-2, 1998, pages 149 - 161

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2476935C1 (en) * 2011-08-26 2013-02-27 Учреждение Российской академии наук Санкт-Петербургский институт информатики и автоматизации РАН (СПИИРАН) Apparatus for determining values of operational characteristics of article
US20150222495A1 (en) * 2014-02-04 2015-08-06 Falkonry Inc. Operating behavior classification interface
US10037128B2 (en) * 2014-02-04 2018-07-31 Falkonry, Inc. Operating behavior classification interface

Also Published As

Publication number Publication date
EP2323105B1 (en) 2014-12-03

Similar Documents

Publication Publication Date Title
US6687654B2 (en) Techniques for distributed machinery monitoring
JP6140331B1 (en) Machine learning device and machine learning method for learning failure prediction of main shaft or motor driving main shaft, and failure prediction device and failure prediction system provided with machine learning device
Camci et al. Feature evaluation for effective bearing prognostics
CN108475052B (en) Diagnostic device, computer program and diagnostic system
Boutros et al. Detection and diagnosis of bearing and cutting tool faults using hidden Markov models
EP1927830B1 (en) Device for overall machine tool monitoring and corresponding method therefor
Siegel et al. Methodology and framework for predicting helicopter rolling element bearing failure
US7555407B2 (en) Anomaly monitoring device and method
EP2538210A2 (en) Acoustic diagnostic of fielded turbine engines
EP1892505A2 (en) Anomaly monitoring device
EP2461222B1 (en) Method and system for detection of machine operation state for monitoring purposes
Wang et al. An evolving fuzzy predictor for industrial applications
US10359401B2 (en) Malfunction diagnosing apparatus, malfunction diagnosing method, and recording medium
KR102321607B1 (en) Rotating machine fault detecting apparatus and method
EP2208981B1 (en) Monitoring of rotating machines
JP4417318B2 (en) Equipment diagnostic equipment
EP3913453B1 (en) Fault detection system and method for a vehicle
CN111964909A (en) Rolling bearing operation state detection method, fault diagnosis method and system
EP2323105B1 (en) Monitoring of machines
Sun et al. Evaluation of transducer signature selections on machine learning performance in cutting tool wear prognosis
JP2018018507A (en) Diagnostic device, program, and diagnostic system
KR102433483B1 (en) System for Predicting Flaw of Facility Using Vibration Sensor
Pathiran et al. Performance and predict the ball bearing faults using wavelet packet decomposition and ANFIS
JP4513796B2 (en) Abnormality monitoring device
Fezari et al. Noise emission analysis a way for early detection and classification faults in rotating machines

Legal Events

PUAI  Public reference made under article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
AK  Designated contracting states: kind code of ref document: A1; designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR
AX  Request for extension of the European patent: extension state: AL BA RS
17P  Request for examination filed: effective date 20111118
17Q  First examination report despatched: effective date 20111223
RAP1  Party data changed (applicant data changed or rights of an application transferred): owner name: ALCATEL LUCENT
111Z  Information provided on other rights and legal means of execution: AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR; effective date 20130410
GRAP  Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
INTG  Intention to grant announced: effective date 20140604
RAP1  Party data changed (applicant data changed or rights of an application transferred): owner name: ALCATEL LUCENT
GRAS  Grant fee paid (ORIGINAL CODE: EPIDOSNIGR3)
GRAA  (Expected) grant (ORIGINAL CODE: 0009210)
D11X  Information provided on other rights and legal means of execution (deleted)
AK  Designated contracting states: kind code of ref document: B1; designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR
REG  Reference to a national code: GB: FG4D
REG  Reference to a national code: AT: REF, ref document number 699750, kind code T, effective date 20141215; CH: EP
REG  Reference to a national code: IE: FG4D
REG  Reference to a national code: DE: R096, ref document number 602009028080, effective date 20150115
REG  Reference to a national code: NL: VDEP, effective date 20141203
REG  Reference to a national code: AT: MK05, ref document number 699750, kind code T, effective date 20141203
PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: FI, LT, ES, NL effective 20141203; NO effective 20150303
REG  Reference to a national code: LT: MG4D
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: GR effective 20150304; LV, AT, CY, HR, SE effective 20141203
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: CZ, SK, EE, RO effective 20141203; PT effective 20150403
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: IS effective 20150403; PL effective 20141203
REG  Reference to a national code: DE: R097, ref document number 602009028080
PLBE  No opposition filed within time limit (ORIGINAL CODE: 0009261)
STAA  Information on the status of an EP patent application or granted EP patent: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT
REG  Reference to a national code: FR: PLFP, year of fee payment 7
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: DK effective 20141203
26N  No opposition filed: effective date 20150904
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: IT effective 20141203
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SI effective 20141203
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: BE effective 20141203; LU effective 20151016
REG  Reference to a national code: CH: PL
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MC effective 20141203
REG  Reference to a national code: IE: MM4A
PG25  Lapsed in a contracting state: non-payment of due fees: CH, LI effective 20151031
REG  Reference to a national code: FR: PLFP, year of fee payment 8
PG25  Lapsed in a contracting state: non-payment of due fees: IE effective 20151016
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SM, BG effective 20141203; HU invalid ab initio, effective 20091016
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MT, TR effective 20141203
REG  Reference to a national code: FR: PLFP, year of fee payment 9
PG25  Lapsed in a contracting state: failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MK effective 20141203
REG  Reference to a national code: DE: R082, ref document number 602009028080, representative METACOM LEGAL RECHTSANWAELTE, DE / METACOM LEGAL, DE; R081, new owner WSOU INVESTMENTS, LLC, LOS ANGELES, US (former owner: ALCATEL LUCENT, BOULOGNE BILLANCOURT, FR); R082, representative BARKHOFF REIMANN VOSSIUS, DE
REG  Reference to a national code: GB: 732E, registered between 20200924 and 20200930
REG  Reference to a national code: DE: R082, ref document number 602009028080, representative METACOM LEGAL RECHTSANWAELTE, DE / METACOM LEGAL, DE
PGFP  Annual fee paid to national office [announced via postgrant information from national office to epo]: FR: payment date 20230328, year of fee payment 14
PGFP  Annual fee paid to national office: GB: payment date 20230327, year of fee payment 14; DE: payment date 20230329, year of fee payment 14
P01  Opt-out of the competence of the Unified Patent Court (UPC) registered: effective date 20230606