CN115428046A - Vehicle health monitor

Vehicle health monitor

Info

Publication number
CN115428046A
Authority
CN
China
Prior art keywords
vehicle
sensor
component
data
examples
Legal status
Pending
Application number
CN202180029738.9A
Other languages
Chinese (zh)
Inventor
M·H·A·克莱森斯
P·舒安
Current Assignee
Zoox Inc
Original Assignee
Zoox Inc
Priority claimed from US16/856,733 (published as US11842580B2)
Priority claimed from US16/856,597 (published as US11482059B2)
Application filed by Zoox Inc
Publication of CN115428046A

Classifications

    • G05B23/0283: Predictive maintenance, e.g. involving the monitoring of a system and, based on the monitoring results, taking decisions on the maintenance schedule of the monitored system; estimating remaining useful life [RUL]
    • G05B23/0232: Qualitative history assessment based on qualitative trend analysis, e.g. system evolution
    • G16Y40/40: IoT characterised by the purpose of the information processing; maintenance of things
    • G05B2219/2637: Program-control systems; PC applications; vehicle, car, auto, wheelchair
    • G05B23/0221: Preprocessing measurements, e.g. data collection rate adjustment; standardization of measurements; time series or signal analysis, e.g. frequency analysis or wavelets; trustworthiness of measurements; measurements using easily measured parameters to estimate parameters difficult to measure; virtual sensor creation; de-noising; sensor fusion; unconventional preprocessing inherently present in specific fault detection methods such as PCA-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computing Systems (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
  • Traffic Control Systems (AREA)

Abstract

Techniques for monitoring and predicting vehicle health are disclosed. In some examples, sensor data (e.g., audio data) may be used to create sensor features associated with vehicle components. During the life of a vehicle component, the sensor feature may be compared to one or more second sensor features associated with the vehicle component to determine a change in an operating state associated with the vehicle component. In some examples, a machine learning model may be trained to identify vehicle components and/or operating states of the vehicle components based on sensor data input into the machine learning model. In this manner, sensor data may be input to the machine learning model, and the machine learning model may output a corresponding vehicle component and/or operating state associated with the component.

Description

Vehicle health monitor
Cross Reference to Related Applications
This PCT international application claims priority to U.S. Application Serial No. 16/856,597, filed on April 23, 2020, and U.S. Application Serial No. 16/856,733, filed on April 23, 2020, both of which are incorporated by reference herein in their entirety.
Background
Vehicles include a wide range of individual components or systems that may wear out, fail, or otherwise require repair or replacement throughout the life of the vehicle. Today, many vehicles rely on periodic service to diagnose and detect component wear. Some vehicles may alert the user when periodic maintenance is due. In addition, some vehicles notify the user when individual components fail or wear out. However, servicing a component only at that point may be inconvenient or may result in vehicle downtime. Furthermore, the component may have suffered irreparable damage, resulting in higher costs to replace it. Moreover, failure of a component may also result in damage and/or failure of other components of the vehicle, further increasing repair costs and/or the amount of labor required to repair and/or replace the component.
Drawings
The detailed description is described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference symbols in different drawings indicates similar or identical items or features.
FIG. 1 is a schematic illustration of an example vehicle including a plurality of sensors.
FIG. 2 is a schematic view of an example vehicle interior including a plurality of sensors.
FIG. 3 is an exploded view of an example drive assembly of a vehicle including multiple components.
FIG. 4 is a block diagram of an example system for implementing the techniques described herein.
FIG. 5 is a flow chart illustrating an example method for monitoring vehicle health and/or vehicle component health.
FIG. 6 is a flow chart illustrating an example method that includes additional details of processing sensor data.
FIG. 7 is a flow diagram illustrating an example method for monitoring vehicle health and/or vehicle component health using a machine learning model.
Detailed Description
As noted above, a vehicle may include a wide range of individual components or systems that may wear out, fail, or otherwise require repair or replacement throughout the life of the vehicle. Existing methods of alerting users to repair their vehicles or notifying users when a component fails are insufficient because it may be inconvenient for the user to repair the component, the component may have suffered irreparable damage, and/or the failure of the component may have caused damage to other components of the vehicle.
Systems and methods for monitoring vehicle health are described. In some examples, the vehicle may include one or more sensors (e.g., microphones, Inertial Measurement Units (IMUs), temperature sensors, image sensors, piezoelectric sensors, pressure sensors, accelerometers, air quality sensors, voltage sensors, current sensors, etc.) that continuously or periodically collect sensor data associated with one or more components of the vehicle throughout the life of the vehicle. The sensor data may include sensor characteristics associated with one or more components of the vehicle. Further, the vehicle and/or a computer monitoring system associated with the vehicle can determine changes in sensor characteristics associated with vehicle components over time. Based on determining the change in the sensor characteristic of a component, an operating state associated with the component of the vehicle may be determined. By way of example and not limitation, an "operating state" may include an indication of wear associated with a component of the vehicle, such as a percentage of useful life and/or a percentage of remaining useful life of the component (e.g., 50% useful life, 75% remaining useful life, etc.), a time of failure associated with the component, such as an amount of time and/or distance the vehicle may travel until the component may fail (e.g., 10 hours until the component fails, 100 miles until the component fails, etc.), or an indication of an anomaly associated with the component, such as one or more fault conditions. In this way, vehicle health and/or vehicle component health may be more accurately monitored throughout the life of the vehicle, and changes in the sensor characteristics of components may be used to detect wear and/or predict failure, thereby allowing repairs to be made prior to component failure and at the most convenient time. Further, in some examples, the techniques may be implemented using sensors already present on the vehicle such that no additional hardware is required to implement at least some of the techniques described herein.
In some examples, a first (e.g., baseline, initial, raw, etc.) sensor characteristic may be determined for a vehicle component. The first sensor characteristic may include acoustic features captured by one or more microphones, inertial measurements captured by one or more IMUs, temperature measurements captured by one or more temperature sensors, images from one or more image sensors, and/or combinations of these and/or other sensors. The first sensor characteristic may be determined during bench testing of the component or based on sensor data captured by another vehicle that previously experienced a failure and/or anomaly of a component of the same type (e.g., same or equivalent part number, brand, model, category, etc.) as the component. Additionally or alternatively, the first sensor characteristic may be determined at a first time (e.g., during a "test mode" associated with the vehicle when the vehicle is first commissioned or the component is put into service). At the first time, the vehicle may perform one or more operations associated with activating the component under one or more conditions, such that sensor data associated with the component may be captured. For example, in the case of a fan of a heating, ventilation, and air conditioning (HVAC) system, the vehicle may run the fan through all of its available settings (e.g., high, medium, low, etc.). In some examples, the vehicle may also run the fan through all of its available settings under a number of different conditions (e.g., when the temperature setting of the HVAC system is set to a number of different temperatures, when the vehicle is moving at different speeds, when one or more doors or windows are open and closed, etc.) in order to isolate the sensor features associated with the component under various operating conditions. The sensor data may then be used to determine the first sensor characteristic of the component. Further, a second (e.g., progressive, real-time, current, etc.) sensor characteristic may be determined for the vehicle component at a second time after the first time and/or after the test mode. The second sensor characteristic may then be compared to the first sensor characteristic to determine an operating state associated with the component. The vehicle may perform the techniques described herein to determine sensor characteristics of any or all components of the vehicle. In this manner, component and/or vehicle health may be determined in real time prior to component failure. Further, these techniques enable determination of component failures and/or anomalies without human intervention (e.g., in the case of unmanned and/or autonomous vehicles, when an occupant of the vehicle may not be present, to determine when a component is worn, malfunctioning, or otherwise experiencing an anomaly).
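As a concrete illustration of the test-mode sweep described above, the following is a minimal Python sketch of capturing a baseline sensor feature per fan setting. The `set_fan` and `capture_audio` interfaces are hypothetical stand-ins (no such API is named in this disclosure), and the synthetic microphone data merely keeps the example self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)

def set_fan(setting: str) -> None:
    """Hypothetical actuator call; a real system would command the HVAC fan."""

def capture_audio(seconds: float, rate: int = 16_000) -> np.ndarray:
    """Stand-in for a microphone read; returns synthetic samples here."""
    return rng.normal(size=int(seconds * rate))

def record_baseline(settings=("low", "medium", "high")) -> dict:
    """Sweep the fan through each available setting and store one baseline
    feature per setting, so later diagnostics compare like with like."""
    baseline = {}
    for setting in settings:
        set_fan(setting)
        samples = capture_audio(seconds=2.0)
        # A deliberately trivial placeholder feature: mean absolute amplitude.
        baseline[setting] = float(np.mean(np.abs(samples)))
    return baseline

print(record_baseline())
```

In a real system, the loop body would also pin down the other operating conditions mentioned above (vehicle speed, door state, etc.) so that each baseline is tied to a known operating point.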
As used herein, "sensor characteristics" may include data representing a series of measurements of one or more sensors over time. The sampling rate and characteristics of the sensor data may vary depending on the type of sensor data used. In some examples, the sensor features may include raw sensor data, while in other examples, the sensor data may be filtered (e.g., to remove noise), compressed, or otherwise processed to obtain the sensor features. In some examples, the sensor features may be represented as a vector or matrix representing one or more features of the sensor data. In some examples, for example in the context of audio data, the sensor features may include digital audio data stored in known audio coding formats, such as MP3, advanced Audio Coding (AAC), opus, vorbis, and the like. In at least some examples, the sensor features may include information derived from raw sensor data, such as, but not limited to, fourier transforms, laplace transforms, principal component analysis, harmonic decomposition, and/or any other method of determining relevant features.
Depending on the type of sensor data used, the format of the sensor features, and/or the criteria used to compare the sensor features, a variety of different comparison techniques may be used to compare multiple sensor features (e.g., the first sensor feature and the second sensor feature). For example, in the context of audio data, the criteria to be compared may include frequency (including a set of frequencies), amplitude, pitch, visual appearance of a waveform, and the like. By way of example and not limitation, sensor features may be compared based on their similarity in the time domain (with and/or without a shift), their similarity in the frequency domain (with and/or without a shift), and/or the similarity of energy or power. In some examples, the comparison may be performed based on a weighted representation of any or all of these or other criteria. Comparisons may be made over the entire correlation vector to measure the overall correlation and/or only over values in the correlation vector that exceed a threshold (e.g., to filter out noise, echoes, etc.).
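One plausible realization of such a comparison (cosine distance in the frequency domain with a noise-floor mask; the threshold and feature values below are invented for the example) might look like this sketch:

```python
import numpy as np

def feature_change(baseline: np.ndarray, current: np.ndarray,
                   noise_floor: float = 1e-3) -> float:
    """Score the change between two sensor features on a 0..1 scale.

    Uses cosine distance over the vector entries that exceed a noise
    floor; 0.0 means identical direction, 1.0 means orthogonal.
    """
    mask = (np.abs(baseline) > noise_floor) | (np.abs(current) > noise_floor)
    b, c = baseline[mask], current[mask]
    cos = np.dot(b, c) / (np.linalg.norm(b) * np.linalg.norm(c) + 1e-12)
    return 1.0 - float(cos)

baseline = np.array([1.0, 0.8, 0.1, 0.0])
worn     = np.array([1.0, 0.4, 0.6, 0.2])   # energy shifting into new bands
THRESHOLD = 0.05  # assumed tuning value
if feature_change(baseline, worn) > THRESHOLD:
    print("operating state change detected")
```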
In some examples, a machine learning model may be trained to determine one or more components of the vehicle associated with the sensor data and/or sensor features. For example, based on a particular type of sensor data (e.g., audio data, IMU data, image data, etc.) and/or sensor features, the machine learning model may determine the vehicle component that produced the particular sensor data and/or sensor features. Additionally or alternatively, the machine learning model may be trained to detect changes in sensor characteristics of individual components and/or to determine operating states associated with vehicle components. For example, sensor data and/or sensor features may be input into a machine learning model, which in response may determine and/or output respective operating states associated with the components that generated the sensor data and/or sensor features. In some examples, the machine learning model may be trained to determine an estimated amount of time until a component of the vehicle is predicted to experience a fault or require repair. In this manner, repairs to the vehicle and/or components can be made at a convenient time prior to failure, thereby avoiding costly downtime and avoiding damage to other components. Further, in some examples, periodic preventative maintenance may be avoided, and maintenance may be performed only when necessary. This may avoid unnecessary downtime and allow a vehicle component to reach its maximum expected life rather than being prematurely replaced and/or repaired for preventative reasons. The techniques described herein may be configured for monitoring virtually any vehicle component by using real-time sensor data (e.g., audio data, accelerometer data, voltage measurements, current measurements, imaging data, piezoelectric sensor data, pressure data, temperature data, etc.), and then evaluating these sensor characteristics over time to accurately monitor vehicle health.
By way of example and not limitation, a method of monitoring vehicle health according to the present application may include: a component of a vehicle is activated at a first time, and first data associated with the component of the vehicle is received from a sensor of the vehicle at the first time. In this manner, a first sensor characteristic associated with the component of the vehicle at the first time may be determined and stored based at least in part on the first data. The first data may include first sensor data, such as audio data, inertial measurements, image data, and the like. In some cases, the first sensor characteristic may comprise a baseline sensor characteristic. In additional or alternative examples, a first sensor characteristic indicative of an operating state associated with a component of the vehicle may be stored. In such examples, the first sensor characteristic may be based on bench-test sensor data associated with a component similar to the component, or based on stored log data captured by one or more sensors of another vehicle that experienced a failure and/or other anomaly of the similar component. For example, in the case of a braking system of a vehicle, brake pads typically include a wear indicator that causes the brake pad to squeal after the brake pad experiences a threshold amount of wear (e.g., 80%, 85%, 90%, etc.). During bench testing, brake pads that have experienced a threshold amount of wear (e.g., by use on another vehicle, manual machining, etc.) may be used to establish a first sensor signature (e.g., a baseline acoustic signature) for use by the system. In at least some examples, such first sensor characteristics may be associated with operating conditions (or parameters) of the vehicle (and/or components, subassemblies, etc.) during testing. By way of non-limiting example, the first sensor characteristic may be associated with engine speed, operating state of other components (e.g., HVAC temperature and/or fan speed), brake pressure, and the like.
As used herein, "first time," "second time," "third time," and the like may include a particular point in time and/or may include a period of time. In some examples, the first time may correspond to a test state performed when the vehicle is first commissioned. Additionally, or alternatively, the test condition may be performed when the vehicle component is placed into service (e.g., replaced, repaired, etc.). In some examples, the second time, the third time, etc. may correspond to a diagnostic status performed at a time subsequent to the first time to monitor vehicle health. The second and subsequent times may be periodic (e.g., daily, weekly, monthly, etc.) and/or may be triggered by one or more events (e.g., when the vehicle stops servicing, when the vehicle is charging, when the vehicle is parked, when the vehicle is in transit, etc.).
In some examples, the method may include: a second sensor characteristic associated with the vehicle component is determined and/or stored at a second time. In some cases, the second sensor characteristic may include a progressive sensor characteristic, and the second time may be after the first time. Further, determining the second sensor characteristic may be based at least in part on the second data. In at least one example, to determine the second sensor characteristic, the method may include causing the vehicle to activate the component at a second time. In this way, second data associated with the vehicle component may be received from the sensor at a second time. The second data may include second sensor data, such as audio data, inertial measurements, image data, and the like.
In some examples, the method may include determining whether there is a change and/or correlation between the first sensor characteristic and the second sensor characteristic. If there is a change and/or correlation, the method may include determining and/or outputting an operating state associated with the component based at least in part on the change between the first sensor characteristic and the second sensor characteristic. Additionally, or alternatively, the method may include determining whether the change is greater than a threshold change, and determining and/or outputting an operating state associated with the component based at least in part on the change being greater than the threshold change. In some examples, the method may include determining an estimated time to failure associated with a component of the vehicle (e.g., an estimated number of miles before the component may fail, a number of hours until repair is needed, etc.) based at least in part on the first sensor characteristic, the second sensor characteristic, and/or the change between the first sensor characteristic and the second sensor characteristic. This estimated time to failure may additionally be output, recorded, and/or transmitted to a remote monitoring system associated with the vehicle. In some examples, a change and/or correlation between the first sensor characteristic and the second sensor characteristic may be determined based at least in part on a comparison of frequencies and/or amplitudes of the respective sensor characteristics. Additionally or alternatively, a change and/or correlation between the first sensor characteristic and the second sensor characteristic may be determined based at least in part on a tonal analysis of the respective sensor characteristics (e.g., which frequencies are more prominent compared to background noise) and/or operating conditions associated with the vehicle and/or component (e.g., revolutions per minute (RPM), speed, steering angle, temperature, etc.). In at least some examples, such a comparison may be based on, for example, operating parameters of the vehicle and/or other components. In examples where a current operating parameter differs from the parameter under which the first sensor characteristic was captured, interpolation or extrapolation may be used to adjust the first sensor characteristic and/or the threshold difference used for comparison.
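As one hedged illustration of estimating a time to failure from the drift between sensor characteristics, a linear trend could be fit to logged change scores and extrapolated to an assumed failure threshold. The threshold and logged data below are invented for the example:

```python
import numpy as np

def estimated_time_to_failure(times_h, changes, fail_at=0.5):
    """Fit a linear trend to feature-change scores logged over operating
    hours and extrapolate when the trend crosses a failure threshold."""
    slope, intercept = np.polyfit(times_h, changes, deg=1)
    if slope <= 0:
        return float("inf")  # no degradation trend observed
    return (fail_at - intercept) / slope

# Change scores logged at 0, 100, 200, and 300 operating hours.
hours = np.array([0.0, 100.0, 200.0, 300.0])
drift = np.array([0.01, 0.08, 0.16, 0.25])
print(f"predicted failure at ~{estimated_time_to_failure(hours, drift):.0f} h")
```

A production system would likely prefer a model that accounts for nonlinear wear, but the sketch shows the basic extrapolation idea.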
In some examples, a sensor characteristic (e.g., the first sensor characteristic and/or the second sensor characteristic) may be determined based at least in part on processing the sensor data (e.g., the first data and/or the second data). For example, if the sensor data includes audio data, the audio data may include acoustic features associated with the vehicle component as well as background noise. Accordingly, the audio data may be processed (e.g., filtered) to remove at least some background noise from the audio data. In this way, the portion of the audio signature attributable to the component may be isolated and/or the quality of the acoustic signature of the audio data may be improved to better monitor vehicle health. Further, processing the sensor data may include setting a limit on the signal-to-noise ratio of the sensor data, setting a target frequency of the sensor data, performing a Fast Fourier Transform (FFT) on the sensor data to transform the sensor data from the time domain to the frequency domain, and the like.
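A minimal spectral-subtraction sketch along these lines is shown below; it assumes a separate noise recording taken with the component inactive, which is one possible way (not necessarily the disclosed way) to isolate the component's acoustic signature:

```python
import numpy as np

def spectral_subtract(samples: np.ndarray, noise: np.ndarray) -> np.ndarray:
    """Subtract an estimated background-noise magnitude spectrum from a
    clip, keep the clip's phase, and rebuild the time-domain signal."""
    clip_fft = np.fft.rfft(samples)
    noise_mag = np.abs(np.fft.rfft(noise[: len(samples)]))
    cleaned_mag = np.maximum(np.abs(clip_fft) - noise_mag, 0.0)
    cleaned_fft = cleaned_mag * np.exp(1j * np.angle(clip_fft))
    return np.fft.irfft(cleaned_fft, n=len(samples))

rate = 16_000
t = np.linspace(0.0, 1.0, rate, endpoint=False)
noise = 0.2 * np.random.randn(rate)                                 # component inactive
clip = np.sin(2 * np.pi * 300.0 * t) + 0.2 * np.random.randn(rate)  # component active
print(spectral_subtract(clip, noise).shape)  # (16000,)
```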
In some examples, when a component of the vehicle is activated at a first time and a second time to receive the first data and the second data as described above, the method may further include controlling operation of one or more other components of the vehicle at the first time or the second time according to an operating parameter. For example, if the activated component is a fan of an HVAC system, the vehicle may run the fan through various settings while controlling operation of other components according to operating parameters (e.g., when the vehicle is moving at different speeds, when the vehicle is stopped, when one or more doors or windows are open and closed, etc.). In this manner, activated components may be isolated under various operating conditions to record sensor characteristics. Further, controlling the operation of other components as a function of operating parameters may allow for more consistent measurements associated with sensor characteristics. In further examples, the operating parameters may include a speed of the vehicle, a speed of the component (e.g., translational or rotational), a steering angle, an ambient air temperature, an ambient humidity, ambient weather conditions (e.g., rain, snow, sun, etc.), a component temperature, a time of day, an air pressure (e.g., altitude), and/or the like.
In some examples, the method may further include causing the vehicle and/or the component to be serviced. Causing the vehicle and/or component to be serviced may be based at least in part on an operating state of the component, an estimated time to failure of the component, a sensor characteristic of the component, and/or the like. Further, causing the vehicle and/or component to be serviced may include outputting an indication that the vehicle needs to be serviced, scheduling a time for the vehicle to be serviced, autonomously navigating the vehicle to a service location, and/or combinations thereof.
In some examples, determining whether the change between the first sensor characteristic and the second sensor characteristic is greater than the threshold change may be based at least in part on the third sensor characteristic. The third sensor characteristic may be associated with a faulty operating state of a component of the vehicle or of the same or similar component of another vehicle.
In various examples, actual conditions associated with a vehicle component may be measured over time to generate a damage model associated with the vehicle component. For example, measurements related to actual conditions (e.g., force, pressure, current, temperature, etc.) experienced by vehicle components may be made over time. Over time, these measurements may be recorded and evaluated to create a wear and/or damage model associated with the component, and the model may be compared to known operating limits (e.g., fatigue and/or stress conditions of the metal, number of cycles, number of revolutions, number of operating hours, etc.) to determine when to repair and/or replace the component, when the component may encounter a fault and/or abnormal condition, and so forth. In at least one example, a pressure sensor may be used as a proxy for force to determine a load input on the vehicle body. Further, the stress-strain relationship of the vehicle component may be evaluated (e.g., using Miner's rule and/or a cumulative damage model).
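For instance, Miner's rule accumulates damage as the sum, over load levels, of cycles experienced divided by cycles to failure at that level, with failure predicted as the sum approaches 1.0. A small sketch (the load history and S-N data are invented for the example) follows:

```python
def miners_damage(cycle_counts: dict, cycles_to_failure: dict) -> float:
    """Miner's rule: damage = sum over load levels of
    (cycles experienced) / (cycles to failure at that level).
    The component is predicted to fail as the sum approaches 1.0."""
    return sum(n / cycles_to_failure[level] for level, n in cycle_counts.items())

# Assumed load history and fatigue limits for a suspension component.
observed = {"low_load": 2_000_000, "high_load": 50_000}
limits   = {"low_load": 10_000_000, "high_load": 200_000}
print(f"accumulated damage fraction: {miners_damage(observed, limits):.2f}")  # 0.45
```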
As described herein, a machine learning model may be trained to predict an operating state associated with a component of a vehicle. By way of example and not limitation, a method of training a machine learning model according to the present application may include: stored sensor data previously captured by one or more sensors of the vehicle is received. The sensor data may include audio data, Inertial Measurement Unit (IMU) data, temperature data, image data, voltage measurements, current measurements, and the like. In some examples, the operating state of the vehicle component generating the sensor data may be determined or known. Additionally, or alternatively, an identification of the vehicle component generating the sensor data may be determined or known. In at least one example, the sensor data may include training data. The training data may be tagged to include a designation of a ground truth operating state of the component at the time the training data was captured (e.g., an indication of wear associated with the vehicle component, a time of failure associated with the component, an indication of an anomaly associated with the component, etc.). Additionally, or alternatively, the training data may be tagged to include a designation of the identity of the component that the training data represents. In various examples, the training data may include second sensor data captured by one or more other sensors of another vehicle. The second sensor data may additionally be associated with a component of the second vehicle. The designation of ground truth may be generated manually or may be automatically generated based on historical service log data associated with the component.
In some examples, the method may include inputting the sensor data into a machine learning model, and receiving a predicted operating state associated with the vehicle component from the machine learning model. By way of example and not limitation, the machine learning model may include and/or utilize a penalized linear regression model, a decision tree, a logistic regression model, a Support Vector Machine (SVM), a naive Bayes model, a k-nearest neighbors (KNN) model, a k-means model, a neural network, or other logic, models, or algorithms, alone or in combination.
In some examples, a difference between the ground truth operating state (e.g., a measured or actual operating state) and the predicted operating state output by the machine learning model may be determined. In that case, one or more parameters of the machine learning model may be changed and/or adjusted based at least in part on the difference to obtain a trained machine learning model that is capable of accurately predicting the operating state of the vehicle component in question.
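The following sketch illustrates that training flow with synthetic labeled data; the random forest is only a stand-in for any of the model families listed above, and the feature values, labels, and shapes are fabricated for the example:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic training set: spectral feature vectors labeled with a ground
# truth operating state ("nominal" vs. "worn") taken from service logs.
nominal = rng.normal(loc=0.0, scale=0.1, size=(200, 32))
worn    = rng.normal(loc=0.3, scale=0.1, size=(200, 32))
features = np.vstack([nominal, worn])
labels   = np.array(["nominal"] * 200 + ["worn"] * 200)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, labels)

# A held-out sample; the prediction is compared against ground truth, and
# the model is retrained or its parameters adjusted when the two disagree.
sample = rng.normal(loc=0.3, scale=0.1, size=(1, 32))
print(model.predict(sample))  # likely ["worn"]
```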
In various examples, sensor data may be processed and/or filtered prior to being input into the machine learning model. For example, where the sensor data includes audio data, the audio data may include at least acoustic features associated with vehicle components and background noise. Accordingly, the method may include identifying background noise in the audio data and processing at least a portion of the audio data including the background noise to generate processed audio data having less background noise. Additionally, processing the sensor data may include setting a limit on the signal-to-noise ratio of the sensor data, setting a target frequency of the sensor data, performing a Fast Fourier Transform (FFT) on the sensor data to convert the sensor data from the time domain to the frequency domain, and so forth. Thus, in at least some examples, the processed audio data may be input into the machine learning model.
In some examples, the machine learning model may predict the location of the component that generated the sensor data. For example, in examples where the one or more sensors of the vehicle include an array of microphones, the machine learning model may predict the location of the component generating the audio sensor data. For example, if the component generates audio data, a first audio signal strength at a first microphone of the array may be greater than a second audio signal strength at a second microphone of the array (and so on), based in part on the first microphone being closer to the component than the second microphone. In this manner, based on the respective signal strengths, the location of the vehicle component associated with generating the audio data may be predicted. Additionally, the machine learning model may predict an identity of the component associated with generating the audio data based at least in part on the predicted location of the component.
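A rough signal-strength localization sketch consistent with this idea (the microphone positions and strengths are invented for the example, and a real system might use time-of-arrival methods instead) is:

```python
import numpy as np

# Assumed microphone positions in a vehicle-body frame (metres).
MIC_POSITIONS = np.array([
    [ 1.5,  0.8],   # front-left
    [ 1.5, -0.8],   # front-right
    [-1.5,  0.8],   # rear-left
    [-1.5, -0.8],   # rear-right
])

def locate_source(signal_strengths: np.ndarray) -> np.ndarray:
    """Rough source position: the strength-weighted centroid of the
    microphone positions. A stronger signal implies a closer microphone."""
    weights = signal_strengths / signal_strengths.sum()
    return weights @ MIC_POSITIONS

# Loudest at the front-left microphone, so the estimate lands in the
# front-left quadrant.
print(locate_source(np.array([0.9, 0.3, 0.2, 0.1])))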
In various examples, the trained machine learning model may be used in an inference mode (e.g., during vehicle operation) to predict an operating state of a vehicle component. For example, second sensor data may be captured by one or more sensors of the vehicle and input into the trained machine learning model continuously, periodically, and/or in response to one or more conditions. In this manner, the trained machine learning model may output an operating state associated with the vehicle component (e.g., a predicted operating state that matches the actual and/or measured operating state of the component).
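Continuing the training sketch above (reusing its `model` and `rng`, both of which are illustrative assumptions), inference during operation might reduce to:

```python
# Feature captured from a live sensor during normal operation (synthetic
# here); the trained model maps it to a predicted operating state.
live_feature = rng.normal(loc=0.0, scale=0.1, size=(1, 32))
state = model.predict(live_feature)[0]
if state != "nominal":
    print("schedule service for the associated component")
else:
    print("component operating normally")
```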
Example techniques described herein, including the example methods described above, may be implemented as methods performed by a vehicle, a vehicle computing device, a robot, a remote computing system associated with a vehicle, a machine learning model, and/or the like. Additionally or alternatively, the example techniques described herein may be implemented as a system comprising one or more processors and one or more computer-readable media storing instructions that, when executed by the one or more processors, cause the system to perform various operations of one or more of the example techniques. Additionally or alternatively, the example techniques described herein may be implemented as a non-transitory computer-readable storage medium storing instructions that, when executed by a computing device (e.g., a processor, a vehicle computing device, a remote computing system, etc.), cause the computing device to perform one or more of the various operations of the example techniques.
Although some examples described herein relate to using audio data to detect an operating state of a vehicle component, in some examples, other types of sensor data may be used in addition to or in place of audio data to monitor an operating state of a vehicle and/or vehicle component. For example, the output of any other type of sensor described herein may be used to generate sensor features, as input to a machine learning model, etc., in addition to or in place of audio data.
These and other aspects will be further described below with reference to the accompanying drawings. The drawings are exemplary implementations only and should not be construed as limiting the scope of the claims. For example, while the example vehicle is shown and described as an autonomous vehicle capable of navigating between different locations without manual control or intervention, the techniques described herein are also applicable to non-autonomous and/or semi-autonomous vehicles. Further, while the example vehicle is illustrated with a passenger car type seat, other seat configurations are also contemplated. According to this invention, the vehicle may be configured to accommodate any number of passengers (e.g., 0, 1, 2, 3, 4, 5, 6, 7, 8, etc.). Further, while the illustrated example includes a passenger cabin, in other examples, the vehicle may not have a passenger cabin (e.g., for a freight vehicle, a transportation vehicle, a construction vehicle, etc.). Further, the techniques described herein are applicable to land vehicles, air vehicles, water vehicles, robots, and the like.
Example vehicle architecture
FIG. 1 is a schematic diagram of an example vehicle 100 that includes a plurality of sensors 102A-102N (hereinafter collectively referred to as "sensors 102"), where N represents any number greater than or equal to 1. In this example, the vehicle 100 is a bi-directional autonomous vehicle capable of navigating between locations without manual control or intervention. As used herein, a bi-directional vehicle is a vehicle that is configured to switch between traveling in a first direction along the vehicle and in an opposite second direction along the vehicle. In other words, the vehicle 100 does not have a fixed "front" or "rear". Rather, whichever longitudinal end of the vehicle 100 is leading becomes the "front," and the opposite longitudinal end becomes the "rear." However, as noted above, the techniques described herein may be applied to vehicles other than bi-directional vehicles, including autonomous, semi-autonomous, or manually driven vehicles, as well as robots, and the like.
As shown in FIG. 1, the sensors 102 may be located outside of the vehicle 100 and in various locations. In the illustrated example, the sensors 102 include a first sensor 102A, a second sensor 102B, and a third sensor 102C, the first sensor 102A, the second sensor 102B, and the third sensor 102C being disposed in an array over a door opening on a first lateral side (the side shown in FIG. 1) of the vehicle 100. The fourth sensor 102D and the fifth sensor 102E are disposed on the first lateral side near the roof of the vehicle at the front end and the rear end of the vehicle, respectively. Sixth sensor 102F, seventh sensor 102G, eighth sensor 102H, and ninth sensor 102N are disposed at an elevation below the fourth sensor 102D and the fifth sensor 102E along the first side. In this example, the fourth and sixth sensors 102D, 102F are generally disposed at a first corner of the vehicle, while the fifth and ninth sensors 102E, 102N are disposed at a second corner of the vehicle 100. In this example, the seventh sensor 102G is disposed outside the first door panel 104A, and the eighth sensor 102H is disposed outside the second door panel 104B. Although not shown in FIG. 1, additional sensors may be provided at similar locations on the opposite side of the vehicle 100. By way of example and not limitation, the sensors 102 may include one or more microphones, Surface Acoustic Wave (SAW) sensors, Inertial Measurement Units (IMUs), temperature sensors, image sensors (e.g., cameras), lidar sensors, radar sensors, time-of-flight (TOF) sensors, sonar sensors, pressure sensors, strain gauges, humidity sensors, geolocation sensors (e.g., GPS sensors), environmental sensors, piezoelectric sensors, accelerometers, air quality sensors, voltage sensors, current sensors, and so forth. Further, the sensors 102 may include a plurality of different types of sensors, e.g., a first sensor may include a first sensor type (e.g., a microphone), a second sensor may include a second sensor type (e.g., an IMU), and so on. In some examples, a sensor may include one or more sensor types (e.g., a sensor that is both a microphone and an image sensor). The vehicle 100 may include more or fewer sensors than shown in FIG. 1. For example, although depicted in FIG. 1 as including multiple sensors 102, in some examples, the vehicle 100 may include only a single sensor. Further, the locations of the sensors 102 on the vehicle 100 are for illustration purposes, and it is contemplated that the sensors 102 may be disposed in different locations of the vehicle 100 than shown in FIG. 1, including inside and/or outside of the vehicle 100.
The sensors 102 of the vehicle may be used for a variety of purposes. For example, at least some of the sensors 102 (e.g., image sensors, lidar sensors, radar sensors, etc.) may provide sensor data to a perception system of the vehicle to allow the vehicle to detect and classify objects in the vehicle environment and/or to locate the position of the vehicle in the environment. As another example, at least some of the sensors 102 (e.g., GPS sensors, IMUs, etc.) may provide location and/or motion data of the vehicle to a perception system, navigation system, or other system of the vehicle. Additionally or alternatively, at least some of the sensors 102 may be used to capture input from a vehicle user (e.g., verbal, touch, and/or gesture commands provided by a vehicle occupant) to control or interact with the vehicle. In some examples, any or all of these sensors may also be used to capture data related to the operation of the vehicle components. As such, in some examples, the operating state of the vehicle or vehicle components may be monitored using existing sensors of the vehicle. In some examples, the sensors 102 may also include additional sensors specifically designed and positioned to capture data associated with the operation of the vehicle components.
In some examples, an array of sensors may be used, such as sensors 102A, 102B, and 102C. In such an example, the sensors 102 in the array may be used to determine the directionality associated with the sensor data. For example, if the sensor array includes a microphone array for capturing audio data produced by a vehicle component, the audio data captured by the microphone array may be used to determine the location of the component producing the audio data and/or to distinguish between different components to which the audio data may be attributed. For example, if a first microphone of the array is closer to the component than a second microphone of the array, the first audio data captured by the first microphone may comprise a stronger audio signal than the second audio data captured by the second microphone, and so on. In this way, based on the different signal strengths in the first audio data, the second audio data, the third audio data, etc., the direction and/or position of the component producing the audio data may be determined. In additional or alternative examples, sensors may be provided in each of the four quadrants of the vehicle 100 to locate components. For example, a first microphone may be located in a first quadrant (e.g., front left quadrant) of the vehicle 100, a second microphone may be located in a second quadrant (e.g., front right quadrant) of the vehicle 100, a third microphone may be located in a third quadrant (e.g., rear left quadrant) of the vehicle 100, and a fourth microphone may be located in a fourth quadrant (e.g., rear right quadrant) of the vehicle 100. In the context of these quadrant designations, the term "front" refers to a first end of the vehicle (which may be either longitudinal end in the case of a bi-directional vehicle) and "rear" refers to the second end of the vehicle opposite the front/first end.
In some examples, one or more sensors, such as sensors 102G and 102H, may be located on the vehicle 100 near one or more components. In the example of FIG. 1, sensors 102G and 102H are disposed on door panels 104A and 104B, respectively. As such, in at least one example, the sensor 102G and/or the sensor 102H may capture data indicating whether the door panels 104A and 104B are operating properly. For example, the sensors 102G and/or 102H may capture sensor data indicating that one or more of the door panels 104A and/or 104B is experiencing vibrations. Thus, this sensor data indicative of the vibration may be used to determine that one or more of the door panels 104A and/or 104B is not properly closed (e.g., one or both of the door panels are half-open, off their tracks, blocked by an object, etc.). Additionally or alternatively, the sensor data may indicate that one or more of the door and/or window seals 106A, 106B, and 106C is not sealing properly (e.g., one of the seals is worn or has failed). Additionally or alternatively, the sensor data may include audio from other nearby components (e.g., wheels of the vehicle), which may indicate a condition of the nearby components (e.g., tire wear condition, brake condition, etc.).
FIG. 2 is a schematic view of an example interior of a vehicle, such as vehicle 100, including a plurality of interior sensors 202A-202M (collectively "interior sensors 202", where M is any integer greater than or equal to 1). In the illustrated example, the interior sensors 202 include a first sensor 202A, a second sensor 202B, and a third sensor 202C disposed in a roof or trim of the vehicle 100. In some examples, the first sensor 202A, the second sensor 202B, and the third sensor 202C may be arranged in an array longitudinally along a centerline of the vehicle 100. The fourth sensor 202D and the fifth sensor 202E are disposed on a first side of the vehicle at the front and rear ends, respectively, proximate a top of the vehicle passenger compartment. Although not shown, similar sensors may be provided on a second, opposite side of the vehicle. Sixth sensor 202F and seventh sensor 202G are disposed at the front and rear ends, respectively, within the passenger compartment, approximately at the head or shoulder height of a passenger. In this example, the eighth sensor 202H is disposed on the interior of the first door panel 104A and the ninth sensor 202J is disposed on the interior of the second door panel 104B. Tenth and eleventh sensors 202K and 202M are disposed proximate to one or more mechanical systems or components of the vehicle. Although not shown in FIG. 2, additional sensors may be provided at similar locations on the opposite side of the vehicle 100. In some examples, the interior of the vehicle 100 may include more or fewer sensors than shown in FIG. 2. For example, although depicted in FIG. 2 as including multiple sensors 202, in some examples, the vehicle 100 may include only a single sensor. Additionally, the locations of the sensors 202 within the vehicle 100 are for illustrative purposes, and it should be understood that the sensors 202 may be disposed at different locations of the vehicle 100 than shown in FIG. 2.
In this example, the vehicle 100 includes removable drive assemblies 204A and 204B (collectively, "drive assemblies 204"). In some examples, vehicle 100 may be configured such that substantially all of the major systems and/or components of the vehicle are located on drive assembly 204. For example, each drive assembly 204 may include some or all of the following: propulsion systems, power systems (including batteries, fuel cells, internal combustion engines, etc.) and associated electronics, steering systems, braking systems, suspension systems, HVAC systems, wheel and tire assemblies, and associated controls and actuators for such systems. In some examples, the drive assembly 204 may also include exterior lighting, body panels, dashboards, and/or sensors, such as sensors 202K and 202M. In other examples that do not include a detachable drive assembly, the sensor 202 and any or all of these systems or assemblies may be coupled to the frame or body of the vehicle.
In some examples, the sensors 202 may capture sensor data associated with one or more components of the respective drive assemblies 204. The sensor data may include one or more types of sensor data, such as audio data, video data, IMU data, temperature data, or data associated with any other type of sensor described herein. The sensor data may be representative of or associated with one or more different components of the vehicle. For example, the sensor data may include audio data indicating that the brake system is squeaking, IMU data indicating that a Constant Velocity (CV) joint of the vehicle is vibrating or knocking, audio data indicating that a compressor bearing of the HVAC system is failing, and so forth. In some examples, the sensors 202 may include an array of sensors, such as sensors 202A, 202B, and 202C. As such, the sensor data may be used to determine which drive assembly (e.g., drive assembly 204A or drive assembly 204B) is associated with generating the sensor data.
The drive assembly 204 in this example is removable and can be easily removed and replaced with a different drive assembly in the event of wear or failure of a component. Components of the drive assembly may then be serviced based on sensor characteristics determined from sensor data associated with those components. In some examples, based on determining the operating states associated with components of the vehicle 100 as described herein, the drive assembly may be replaced with a drive assembly that incorporates a new brake system, a new power unit, a new HVAC system, a new sensor system, and/or the like. In one example, based at least in part on determining that the brake system of the drive assembly has a degraded operating state (e.g., based on an audio sensor characteristic corresponding to brake squeal), a new brake system may be installed in the drive assembly or the existing brake system may be repaired. In another example, sensor characteristics (e.g., based on voltage sensors, current sensors, temperature sensors, and/or other sensors) may indicate that a battery is malfunctioning (e.g., not fully charging, overcharging, over-temperature, etc.) and requires replacement.
As discussed above with reference to FIG. 1, the vehicle 100 includes door panels 104A and 104B (collectively, "door panels 104") and door and/or window seals 106A, 106B, and 106C (collectively, "door/window seals 106"). The door/window seals 106 seal the door and/or window seams of the vehicle 100 so that external elements, such as rain, water, snow, mud, gas, odors, etc., do not pass through the door and/or window seams. Additionally, the door/window seals 106 may insulate the door and/or window seams such that noise, cold air, warm air, etc., do not enter the interior of the vehicle 100 through the door and/or window seams. For example, the door/window seals 106 may include rubber, plastic, silicone, fabric, composites thereof, or other materials to fill the seams between the door panels 104 and the vehicle body and/or between the vehicle windows and the door panels.
In some examples, the sensors 202 may capture sensor data associated with the door/window seals 106 to determine a condition and/or an operating state associated with the seals 106. As the vehicle 100 travels over a road, the sensors 202 may capture audio data representing ambient noise associated with the interior of the vehicle 100. Based on the audio data, a progressive (e.g., second) sensor characteristic may be determined for the ambient noise inside the vehicle 100. The progressive sensor characteristic can be compared to a baseline (e.g., first) sensor characteristic associated with ambient noise inside the vehicle to determine an operating state associated with the door/window seals 106 (e.g., whether the seals 106 are sealing properly). If the progressive sensor characteristic differs from the baseline characteristic by a threshold amount (e.g., a change in amplitude, frequency, and/or other characteristic), the operating state may indicate that one or more of the door/window seals 106 has failed, is damaged, or requires maintenance. Further, in examples where an array of sensors 202 is used, the sensor characteristics may also indicate the particular door panel 104 or door/window seal 106 associated with the faulty operating state. In a variation of this example, the sensors 202 may additionally or alternatively include pressure sensors located inside and/or outside the vehicle, and the baseline and progressive sensor characteristics may be based at least in part on pressure measurements taken inside and/or outside the vehicle. Likewise, if the progressive sensor characteristic differs from the baseline characteristic by a threshold amount (e.g., a change in absolute internal pressure, a change in the pressure difference between the vehicle interior and exterior, etc.), the operating state may indicate that one or more of the door/window seals 106 has failed, is damaged, or requires repair.
FIG. 3 is an exploded view of an example drive assembly of a vehicle, such as drive assembly 204 of vehicle 100, including a plurality of components and one or more sensors. In the illustrated example, the drive assembly 204 includes a first sensor 300A and a second sensor 300B (collectively "sensors 300") located near various components or systems of the drive assembly 204. In the illustrated example, drive assembly 204 includes a propulsion system (including drive motor 302, gearbox 304, axle 306), a braking system (including rotor disk 308 and caliper 310), a power system (including inverter 312), an HVAC system (including air conditioning compressor 314), a coolant system (including coolant pump 322), a wheel assembly 316, and a sensor system (including sensor 300).
In an example, the sensors 300 (and/or the sensors 102 and/or the sensors 202) may capture sensor data associated with any of the various components of the drive assembly 204 and/or other components of the vehicle 100. By way of example and not limitation, at least one of the sensors 300 (and/or the sensors 102 and/or the sensors 202) may include an audio sensor (e.g., a microphone) that captures audio data representing sound waves 320. In some examples, the sound waves 320 may be emitted by the braking system when the braking system is activated to decelerate the vehicle 100. Further, in some examples, the sensors 300 (and/or the sensors 102 and/or the sensors 202) may capture inertial sensor data indicating that one of the components of the propulsion system (e.g., the drive motor 302, the gearbox 304, and/or the axle 306) is vibrating, which may indicate wear or failure of the respective component. In some examples, the sensors 300 (and/or the sensors 102 and/or the sensors 202) may include an image sensor to capture image data indicating that one or more components of the propulsion system are misaligned and in need of adjustment, replacement, or other repair. As yet another example, the sensors 300 (and/or the sensors 102 and/or the sensors 202) may capture sensor data (e.g., audio data, temperature data, pressure data, combinations of these, and/or other data) associated with components of the HVAC system, such as the air conditioning compressor 314, a condenser, a heat exchanger, and/or a ventilation fan (not shown). The foregoing are just a few examples of sensors and sensor data that may be used to monitor operating states of vehicle components in accordance with the techniques described herein.
Example System architecture
Fig. 4 is a block diagram of an example system 400 for implementing techniques described herein. In some examples, the system 400 may include one or more features, components, and/or functions of the examples described herein with reference to other figures (e.g., fig. 1, 2, 3, etc.).
The system 400 may include a vehicle 402. In some examples, the vehicle 402 may include some or all of the features, components, and/or functions described above with respect to the vehicle 100. For example, the vehicle 402 may include a bi-directional vehicle. As shown in fig. 4, the vehicle 402 may also include a vehicle computing device 404, one or more sensor systems 406, one or more transmitters 408, one or more communication connections 410, one or more direct connections 412, and/or one or more drive components 414.
In some examples, the vehicle computing device 404 can include one or more processors 416 and memory 418 communicatively coupled with the one or more processors 416. In the example shown, the vehicle 402 is an autonomous vehicle; however, the vehicle 402 may be any other type of vehicle (e.g., car, truck, bus, airplane, ship, train, etc.), or any other system (e.g., robotic system, automated assembly/manufacturing system, etc.) having components such as those shown in fig. 4. In an example, the one or more processors 416 can execute instructions stored in the memory 418 to perform one or more operations on behalf of the one or more vehicle computing devices 404.
The memory 418 of the one or more vehicle computing devices 404 stores a positioning component 420, a perception system 422, a planning component 424, one or more system controllers 426, a mapping component 428, a monitoring component 430, a filtering component 432, one or more sensor features 434 associated with one or more components of the vehicle 402, and/or sensor data 436. Although depicted in fig. 4 as residing in memory 418 for illustrative purposes, it is contemplated that the positioning component 420, perception system 422, planning component 424, one or more system controllers 426, mapping component 428, monitoring component 430, filtering component 432, one or more sensor features 434, and/or sensor data 436 may additionally or alternatively be accessible to the vehicle 402 (e.g., stored on, or otherwise accessible from, a memory remote from the vehicle 402, such as memory 444 of one or more computing devices 440).
In at least one example, the positioning component 420 can include functionality to receive data from the sensor system 406 to determine a position and/or orientation (e.g., x-, y-, z-position, roll, pitch, or yaw) of the vehicle 402. For example, the positioning component 420 may include and/or request/receive a map of the environment, and may continuously determine the location and/or orientation of the autonomous vehicle within the map. In some cases, the positioning component 420 can utilize SLAM (simultaneous localization and mapping), CLAMS (calibration, localization and mapping, simultaneously), relative SLAM, bundle adjustment, non-linear least squares optimization, etc., based on image data, lidar data, radar data, IMU data, GPS data, wheel encoder data, etc., captured by the one or more sensor systems 406 or received from one or more other devices (e.g., computing device 440), to accurately determine the location of the autonomous vehicle. In some cases, the positioning component 420 may provide data to various components of the vehicle 402 to determine an initial position of the autonomous vehicle for generating a trajectory and/or for determining whether to retrieve map data.
In some cases, the perception system 422 may include functionality to perform object tracking, detection, segmentation, and/or classification. In some examples, the perception system 422 may provide processed sensor data that indicates the presence of an entity near the vehicle 402 and/or a classification of the entity as an entity type (e.g., car, pedestrian, bicyclist, animal, building, tree, road surface, curb, sidewalk, unknown, etc.). In additional and/or alternative examples, the perception system 422 may provide processed sensor data indicative of one or more characteristics associated with a detected entity (e.g., a tracked object) and/or the environment in which the entity is located. In some examples, the characteristics associated with the entity may include, but are not limited to, an x-position (global and/or local position), a y-position (global and/or local position), a z-position (global and/or local position), an orientation (e.g., roll, pitch, yaw), an entity type (e.g., a classification), a velocity of the entity, an acceleration of the entity, an extent (size) of the entity, and the like. The characteristics associated with the environment may include, but are not limited to, the presence of another entity in the environment, the state of another entity in the environment, a time of day, a day of the week, a season, a weather condition, an indication of darkness/light, and the like.
In general, the planning component 424 can determine a path for the vehicle 402 to follow through the environment. For example, the planning component 424 can determine various routes and trajectories at various levels of detail. For example, the planning component 424 may determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location). For purposes of this discussion, a route may be a sequence of waypoints for traveling between two locations. By way of example, waypoints may include streets, intersections, Global Positioning System (GPS) coordinates, and the like. Further, the planning component 424 may generate instructions for guiding the autonomous vehicle along at least a portion of the route from the first location to the second location. In at least one example, the planning component 424 can determine how to guide the autonomous vehicle from a first waypoint in the sequence of waypoints to a second waypoint in the sequence of waypoints. In some examples, the instructions may be a trajectory or a portion of a trajectory. In some examples, multiple trajectories may be generated substantially simultaneously (e.g., within technical tolerances) in accordance with a receding horizon technique, wherein one of the multiple trajectories is selected for the vehicle 402 to navigate. In some examples, the planning component 424 may generate instructions to control the vehicle 402 to perform one or more operations during a test mode in which sensor features are generated for one or more components of the vehicle. For example, the planning component 424 may direct the vehicle to travel at a constant speed for a period of time while a component is activated, such that an acoustic signature of the component may be captured while the vehicle is traveling at the constant speed.
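The trajectory-selection step can be illustrated with a short sketch. The array-of-waypoints representation and the cost function below are assumptions made for illustration only; the disclosure does not prescribe either.

    import numpy as np

    def select_trajectory(candidates, cost_fn):
        """Return the lowest-cost trajectory from a batch of candidates."""
        costs = [cost_fn(traj) for traj in candidates]
        return candidates[int(np.argmin(costs))]

    # Toy usage: candidates are (N, 2) arrays of (x, y) waypoints, and the
    # cost penalizes lateral deviation from a straight reference path.
    reference = np.linspace([0.0, 0.0], [10.0, 0.0], num=20)
    candidates = [reference + np.array([0.0, dy]) for dy in (-0.5, 0.0, 0.5)]
    best = select_trajectory(candidates, lambda t: float(np.abs(t[:, 1]).sum()))

In a receding horizon scheme, a sketch like this would run repeatedly: only the first portion of the selected trajectory is executed before new candidates are generated from the updated vehicle state.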
In at least one example, the vehicle computing device 404 can include one or more system controllers 426, which can be configured to control steering, propulsion, braking, safety, transmitters, communications, components, and other systems of the vehicle 402. These system controllers 426 may communicate with and/or control respective systems of the drive assembly 414 and/or other components of the vehicle 402.
In some examples, the vehicle 402 may control operation of one or more components to monitor their condition, using one or more sensors of the sensor system 406 (e.g., sensors 102, 202, and/or 300 of the vehicle 100) to generate sensor signatures associated with the components. In some examples, the vehicle computing device 404 may implement a "test mode" at a first time (e.g., when the vehicle is first commissioned or when a new component is put into service). At the first time, the vehicle 402 may perform one or more operations associated with activating a component under one or more conditions so that sensor data associated with the component can be captured by the one or more sensor systems 406. For example, in the case of a fan of a heating, ventilation, and air conditioning (HVAC) system, the system controller 426 can run the fan through all of its available settings (e.g., high, medium, low, etc.) while the planning component 424 instructs one or more other system controllers 426 to control the vehicle to traverse the environment at different speeds, open and close one or more doors or windows, and so forth, in order to isolate sensor features associated with the component under various operating conditions. At a second time after the first time, the planning component 424 may again cause the one or more system controllers 426 to control the vehicle to implement the test mode while the sensor system 406 captures sensor data, in order to generate a second (e.g., progressive, real-time, current, etc.) sensor signature for the component. The second sensor signature may then be compared to the first sensor signature to determine an operating state associated with the component.
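As a rough sketch of such a test-mode sweep, consider the following; the setting names, test speeds, and controller hooks are all hypothetical placeholders for the system controllers and sensor system described above.

    import itertools

    FAN_SETTINGS = ("low", "medium", "high")   # hypothetical discrete settings
    TEST_SPEEDS_MPH = (0, 10, 25)              # hypothetical constant test speeds

    def run_test_mode(set_fan, hold_speed, record_audio, seconds=5.0):
        """Capture one audio clip per (fan setting, speed) condition.

        `set_fan`, `hold_speed`, and `record_audio` are placeholder
        callables standing in for system controllers and audio sensors.
        """
        signatures = {}
        for fan, speed in itertools.product(FAN_SETTINGS, TEST_SPEEDS_MPH):
            set_fan(fan)
            hold_speed(speed)
            signatures[(fan, speed)] = record_audio(seconds)
        return signatures

Sweeping the full cross-product of conditions is what lets a later capture under any one condition be compared against a baseline taken under the same condition.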
The memory 418 may further include a mapping component 428 to maintain and/or update one or more maps (not shown) that may be used by the vehicle 402 to navigate within the environment. For purposes of this discussion, a map may be any number of data structures modeled in two, three, or N dimensions that are capable of providing information about an environment, such as, but not limited to, topology (e.g., intersections), streets, mountains, roads, terrain, and the environment in general. In some cases, a map may include, but is not limited to: texture information (e.g., color information such as RGB color information, Lab color information, or HSV/HSL color information), intensity information (e.g., lidar information, radar information, etc.), spatial information (e.g., image data projected onto a mesh, or individual "bins" (e.g., polygons associated with individual colors and/or intensities)), and reflectivity information (e.g., specular reflectivity information, retroreflectivity information, BRDF information, BSSRDF information, etc.). In one example, the map may include a three-dimensional mesh of the environment. In some cases, the map may be stored in a tiled format, such that individual tiles of the map represent discrete portions of the environment and may be loaded into working memory as needed. In at least one example, the one or more maps can include at least one map (e.g., an image and/or a mesh). In some examples, the vehicle 402 may be controlled based at least in part on a map. That is, the map can be used in conjunction with the positioning component 420, the perception system 422, and/or the planning component 424 to determine a location of the vehicle 402, identify objects in the environment, and/or generate a route and/or trajectory for navigating through the environment.
In some examples, the one or more maps may be stored on a remote computing device (e.g., computing device 440) accessible via network 438. In some examples, multiple maps may be stored based on, for example, characteristics (e.g., entity type, time of day, day of the week, season of the year, etc.). Storing multiple maps may have similar memory requirements, but may increase the speed at which data in a map can be accessed.
The monitoring component 430 is configured to monitor the condition or operational status of the vehicle as a whole and/or of various components or systems of the vehicle. The monitoring component 430 receives sensor data from the one or more sensor systems 406 and uses the sensor data to estimate current and/or predicted future operating states associated with the vehicle 402 and/or one or more components of the drive component 414. The monitoring component 430 can monitor the condition of virtually any component of the vehicle. Some illustrative examples of components and systems that may be monitored by the monitoring component 430 include propulsion systems (e.g., motors, gearboxes, drive trains, etc.), energy storage systems (e.g., batteries, fuel cells, internal combustion engines, etc.), braking systems, steering systems, door seals, HVAC systems, cooling systems, computing systems, and so forth. As described above, the operating state may include an indication of wear associated with a vehicle component, such as a percentage of the component's used life and/or remaining life (e.g., 50% of useful life consumed, 75% of useful life remaining, etc.), a time to failure associated with the component, such as an amount of time and/or a distance the vehicle may travel until the component is likely to fail (e.g., 10 hours until the component fails, 100 miles until the component fails, etc.), or an indication of an anomaly associated with the component, such as one or more fault conditions. For example, the monitoring component 430 can receive audio data from sensor data 436 captured by an audio sensor (e.g., a microphone) of the sensor system 406 and predict a component of the vehicle 402 associated with the audio data, a location of the component, and/or an operating state associated with the component. The prediction may be based on a single audio recording or on multiple audio recordings.
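A minimal sketch of what the monitoring component's output might look like follows, assuming a trained model with a scikit-learn-style `predict` method; the field names, units, and output layout are assumptions, not the disclosure's API.

    from dataclasses import dataclass

    @dataclass
    class OperatingState:
        component: str
        wear_pct: float          # e.g., 50.0 means 50% of useful life consumed
        hours_to_failure: float  # predicted time until the component may fail
        anomalous: bool

    def estimate_state(model, features, component="brake_system"):
        """Map filtered sensor features to an operating state estimate.

        `model` is assumed to be a trained multi-output regressor whose
        prediction row is (wear, hours_to_failure, anomaly_score).
        """
        wear, hours, anomaly = model.predict([features])[0]
        return OperatingState(component, float(wear), float(hours),
                              bool(anomaly > 0.5))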
The memory 418 of the vehicle computing device 404 may further include a filtering component 432 to filter and/or process sensor data captured by the sensor system 406. As an example, the filtering component 432 may process audio data captured by an audio sensor of the vehicle 402 to remove background noise associated with the audio data. For example, if the audio data includes audio features associated with a component of the vehicle 402 and background noise, the filtering component 432 may identify the audio features associated with the component to filter out at least some of the background noise. Additionally or alternatively, the filtering component 432 can identify background noise of the audio data relative to the audio features of the component and remove at least some of the background noise from the audio data. Although the filtering component 432 is described with respect to filtering audio signals, the filtering component 432 may filter any type of sensor data received from various sensors of the vehicle 402, such as image sensor data, inertial sensor data, temperature sensor data, pressure sensor data, environmental sensor data, and so forth.
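One common way to remove stationary background noise from audio is spectral subtraction; the sketch below is purely illustrative of the kind of processing the filtering component might perform, not the algorithm the disclosure prescribes. It assumes 1-D float arrays and a noise profile at least one frame long.

    import numpy as np

    def reduce_background_noise(audio, noise_profile, frame=1024):
        """Crude spectral subtraction: subtract an averaged noise magnitude
        spectrum from each frame of the recording, keeping the phase.
        """
        noise_mag = np.abs(np.fft.rfft(noise_profile[:frame]))
        out = np.zeros_like(audio)
        for start in range(0, len(audio) - frame + 1, frame):
            spec = np.fft.rfft(audio[start:start + frame])
            mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # floor at zero
            out[start:start + frame] = np.fft.irfft(
                mag * np.exp(1j * np.angle(spec)), n=frame)
        return out

The same frame-wise structure would apply to other modalities (e.g., high-pass filtering IMU data); only the per-frame transform would change.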
In some examples, the memory 418 may store one or more sensor features 434 associated with components of the vehicle 402. Additionally or alternatively, the sensor features 434 may be associated with components of vehicles other than the vehicle 402. The sensor features 434 may include various types of sensor features for various components of the vehicle 402. For example, the sensor features 434 may include one or more audio sensor features associated with a particular vehicle component (e.g., a braking system) and one or more image sensor features associated with the same component or a different component (e.g., an HVAC system). In some examples, the sensor features 434 may include one or more baseline sensor features associated with components of the vehicle 402, and may additionally or alternatively include one or more progressive sensor features associated with components of the vehicle 402 (e.g., sensor features that are continually updated over the respective lives of the components).
In some examples, the sensor features 434 may be transmitted to the computing device 440 via the network 438. Additionally or alternatively, the sensor features 434 may be received from a computing device 440 via a network 438.
The memory 418 may also store sensor data 436 captured by one or more sensors of the sensor system 406 of the vehicle 402. The sensor data 436 may include raw sensor data (e.g., sensor data captured by sensors of the vehicle 402) and/or processed sensor data (e.g., sensor data processed by the filtering component 432 after being captured by the sensors). The sensor data 436 may be used to determine one or more of the sensor characteristics 434 described above. Additionally or alternatively, the sensor data 436 may be used to determine an operating state associated with a component of the vehicle 402. In some examples, the sensor data 436 may be sent to the computing device 440 via the network 438 for use as log data 448 and/or training data 452.
In some cases, aspects of some or all of the memory-resident components discussed herein may include any models, algorithms, and/or machine learning algorithms. For example, in some cases, the components in memory 418 (and memory 444, discussed in further detail below), such as the perception system 422, the planning component 424, the monitoring component 430, and the filtering component 432, may be implemented as neural networks. For example, the monitoring component 430 can include a machine learning model (e.g., a neural network) that has been trained to predict an operating state of one or more components of the vehicle 402.
As described herein, an exemplary neural network is a biologically inspired algorithm that passes input data through a series of connected layers to produce an output. Each layer in a neural network may also comprise another neural network, or may comprise any number of layers (whether convolutional or not). As can be appreciated in the context of the present disclosure, a neural network may utilize machine learning, which may refer to a broad class of such algorithms in which an output is generated based on learned parameters.
Although discussed in the context of neural networks, any type of machine learning consistent with the present disclosure may be used. For example, machine learning algorithms may include, but are not limited to: regression algorithms (e.g., ordinary least squares regression (OLSR), linear regression, logistic regression, stepwise regression, multivariate adaptive regression splines (MARS), locally estimated scatterplot smoothing (LOESS)), instance-based algorithms (e.g., ridge regression, least absolute shrinkage and selection operator (LASSO), elastic net, least-angle regression (LARS)), decision tree algorithms (e.g., classification and regression trees (CART), iterative dichotomiser 3 (ID3), chi-squared automatic interaction detection (CHAID), decision stumps, conditional decision trees), Bayesian algorithms (e.g., naive Bayes, Gaussian naive Bayes, multinomial naive Bayes, averaged one-dependence estimators (AODE), Bayesian belief networks (BNN), Bayesian networks), clustering algorithms (e.g., k-means, k-medians, expectation maximization (EM), hierarchical clustering), artificial neural network algorithms (e.g., perceptron, backpropagation, Hopfield networks, radial basis function networks (RBFN)), deep learning algorithms (e.g., deep Boltzmann machines (DBM), deep belief networks (DBN), convolutional neural networks (CNN), stacked autoencoders), dimensionality reduction algorithms (e.g., principal component analysis (PCA), principal component regression (PCR), partial least squares regression (PLSR), Sammon mapping, multidimensional scaling (MDS), projection pursuit, linear discriminant analysis (LDA), mixture discriminant analysis (MDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA)), ensemble algorithms (e.g., boosting, bootstrap aggregation (bagging), AdaBoost, stacked generalization (blending), gradient boosting machines (GBM), gradient boosted regression trees (GBRT), random forests), SVM (support vector machines), supervised learning, unsupervised learning, semi-supervised learning, and the like. Other examples of architectures include neural networks such as ResNet-50, ResNet-101, VGG, DenseNet, PointNet, and the like.
In at least one example, the sensor system 406 may include lidar sensors, radar sensors, ultrasonic transducers, sonar sensors, position sensors (e.g., GPS, compass, etc.), inertial sensors (e.g., inertial measurement units (IMUs), accelerometers, magnetometers, gyroscopes, etc.), image sensors (e.g., cameras, RGB, IR, intensity, depth, etc.), audio sensors (e.g., microphones), wheel encoders, environmental sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors, etc.), temperature sensors (e.g., for measuring temperatures of vehicle components), and/or the like. The sensor system 406 may include multiple instances of each of these or other types of sensors. In some cases, the locations of the various sensors of the sensor system 406 may correspond to the locations of the external sensors 102A-102N of FIG. 1 and/or the internal sensors 202A-202M of FIG. 2, respectively. For example, the lidar sensors may include individual lidar sensors located at the corners, front, rear, sides, and/or top of the vehicle 402. As another example, the image sensors may include multiple image sensors disposed at various locations around the exterior and/or interior of the vehicle 402. As a further example, the audio sensors may include multiple audio sensors disposed at various locations around the exterior and/or interior of the vehicle 402. Additionally, the audio sensors may include an array of multiple audio sensors for determining the directionality of audio data. The sensor system 406 may provide input to the vehicle computing device 404. Additionally or alternatively, the sensor system 406 can send sensor data to one or more computing devices 440 via one or more networks 438 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, and the like.
The vehicle 402 may also include one or more emitters 408 for emitting light and/or sound. The emitters 408 in this example may include interior audio and visual emitters for communicating with occupants of the vehicle 402. By way of example and not limitation, the interior emitters may include speakers, lights, signs, display screens, touch screens, haptic emitters (e.g., vibration and/or force feedback), mechanical actuators (e.g., seat belt tensioners, seat positioners, headrest positioners, etc.), and the like. The emitters 408 in this example may also include exterior emitters. By way of example and not limitation, the exterior emitters in this example include lights for signaling a direction of travel or other indicators of vehicle motion (e.g., indicator lights, signs, light arrays, etc.) and one or more audio emitters (e.g., speakers, speaker arrays, horns, etc.) for audibly communicating with pedestrians or other nearby vehicles, one or more of which may include acoustic beam steering technology.
The vehicle 402 may also include one or more communication connections 410 that enable communication between the vehicle 402 and one or more other local or remote computing devices. For example, the communication connections 410 may facilitate communication with other local computing devices on the vehicle 402 and/or on the drive assembly 414. Also, the communication connections 410 may allow the vehicle 402 to communicate with other nearby computing devices (e.g., other nearby vehicles, traffic lights, etc.). The communication connections 410 also enable the vehicle 402 to communicate with a remote computing device or other remote services.
The communication connection 410 may include a physical and/or logical interface for connecting the vehicle computing device 404 to another computing device (e.g., computing device 440) and/or a network, such as network 438. For example, the communication connection 410 may enable Wi-Fi-based communication, such as via frequencies defined by the IEEE 802.11 standards, short-range wireless frequencies (e.g., Bluetooth), cellular communication (e.g., 2G, 3G, 4G, 4G LTE, 5G, etc.), or any suitable wired or wireless communication protocol that enables the respective computing device to interface with other computing devices.
In at least one example, the direct connection 412 of the vehicle 402 may provide a physical interface to couple one or more drive assemblies 414 with the body of the vehicle 402. For example, the direct connection 412 may allow the transfer of energy, fluids, air, data, etc. between the drive assembly 414 and the vehicle 402. In some cases, the direct connection 412 may further releasably secure the drive assembly 414 to the body of the vehicle 402.
In at least one example, the vehicle 402 may include one or more drive assemblies 414. In some examples, the vehicle 402 may have a single drive assembly 414. In at least one example, if the vehicle 402 has multiple drive assemblies 414, individual drive assemblies 414 may be positioned at opposite longitudinal ends of the vehicle 402 (e.g., a front end and a rear end, etc.). In at least one example, the drive assembly 414 may include one or more sensor systems to detect conditions of the drive assembly 414 and/or the environment surrounding the vehicle 402. For example, the sensor systems may include one or more wheel encoders (e.g., rotary encoders) to sense rotation of the wheels of the drive assembly, inertial sensors (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, etc.) to measure orientation and acceleration of the drive assembly, image sensors, ultrasonic sensors for acoustic detection of objects around the drive assembly, lidar sensors, radar sensors, audio sensors, and so forth. Some sensors (e.g., the wheel encoders) may be unique to the drive assembly 414. In some cases, the sensor systems on the drive assembly 414 may overlap or supplement corresponding systems of the vehicle 402 (e.g., the sensor system 406).
The drive assembly 414 may include many of the vehicle systems, including a high voltage battery, a motor for propelling the vehicle, an inverter for converting direct current from the battery into alternating current for use by other vehicle systems, a steering system including a steering motor and steering rack (which may be electric), a braking system including hydraulic or electric actuators, a suspension system including hydraulic and/or pneumatic components, a stability control system for distributing braking force to mitigate loss of traction and maintain control, an HVAC system, lighting (e.g., headlights/taillights for illuminating the environment outside the vehicle), and one or more other systems (e.g., a cooling system, safety systems, an onboard charging system, other electrical components such as DC/DC converters, high voltage connectors, high voltage cables, a charging system, charging ports, etc.). Additionally, the drive assembly 414 may include a drive assembly controller that may receive and preprocess data from the sensor systems and control the operation of the various vehicle systems. In some cases, the drive assembly controller may include one or more processors and a memory communicatively coupled to the one or more processors. The memory may store one or more systems to perform various functions of the drive assembly 414. In addition, the drive assembly 414 may also include one or more communication connections that enable the respective drive assembly to communicate with one or more other local or remote computing devices.
In at least one example, the memory storage components discussed herein may process sensor data, as described above, and may send their respective outputs to one or more computing devices 440 over one or more networks 438. In at least one example, the memory storage components discussed herein can send their respective outputs to the one or more computing devices 440 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, and/or the like.
In some examples, the vehicle 402 may send the sensor data 436 to one or more computing devices 440 via the network 438. In some examples, the vehicle 402 may send raw sensor data to the computing device 440. In other examples, the vehicle 402 may send processed sensor data and/or representations of the sensor data to the computing device 440. In some examples, the vehicle 402 may send the sensor data to the computing device 440 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, or the like. In some cases, the vehicle 402 may send the sensor data 436 (raw or processed) to the computing device 440 as one or more log files.
Computing device 440 may include one or more processors 442 and memory 444, which may be communicatively coupled to the one or more processors 442. The memory 444 may store a training component 446, log data 448, a machine learning component 450, training data 452, one or more sensor characteristics 454 associated with a vehicle component (e.g., a component of the vehicle 402), and/or a fault log 456.
Log data 448 may include historical and/or pre-recorded sensor data obtained from the computing systems of one or more vehicles (e.g., vehicle 100 and/or other vehicles, etc.) that is captured and stored during operation. The log data 448 can include raw sensor data and/or processed sensor data. In some examples, the log data 448 may include fused perception data captured by multiple sensor systems on a vehicle, such as image sensors, lidar sensors, radar sensors, TOF sensors, sonar sensors, global positioning system sensors, audio sensors, IMUs, and/or any combination of these. The log data 448 may additionally or alternatively include classification data, which includes classifications of objects (e.g., pedestrian, vehicle, building, road surface, etc.) and/or components represented in the sensor data, and/or tracking data corresponding to the motion through the environment of objects classified as dynamic objects. Over time, the tracking data may include multiple trajectories of multiple different objects.
Training component 446 may generate training data 452 using log data 448. For example, the training component 446 may tag sensor data associated with the vehicle component with one or more measured parameters and/or characteristics of the vehicle component associated with the sensor data. The sensor data and/or measured parameters or characteristics can be obtained from log data 448, sensor signatures 454, and/or fault logs 456. The tags may include an indication of an operational state (e.g., normal, fault, time to failure, etc.) associated with a vehicle component (e.g., a braking system, HVAC system, door/window seal, etc.) and/or any other characteristic of the vehicle component at the time the sensor data is captured and/or one or more times after the sensor data is captured. For example, the tag may indicate that the component represented in the sensor data failed later (e.g., 100 hours after the sensor data was captured). The training component 446 may then train the machine learning component 450 using the training data 452 to predict a current and/or future operating state associated with the vehicle component based at least in part on the sensor data received as input. Further, the training component 446 may use the training data 452 to train the machine learning component 450 to predict any other characteristic of the vehicle component based on receiving the sensor data input (e.g., identify the vehicle component associated with the sensor data, a location of the component in the vehicle, an indication of an amount of component wear, a remaining life of the component, etc.).
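A sketch of that labeling step follows; the record and fault-log shapes are assumptions, since the disclosure does not specify data formats.

    def build_training_examples(log_data, fault_log):
        """Pair logged sensor captures with labels derived from the fault log.

        `log_data` is assumed to be an iterable of dicts with `component`,
        `features`, and `timestamp` keys; `fault_log` is assumed to map a
        component identifier to the time at which it eventually failed.
        """
        examples = []
        for record in log_data:
            failed_at = fault_log.get(record["component"])
            if failed_at is None:
                label = {"state": "normal"}
            else:
                label = {"state": "failed_later",
                         "hours_to_failure": failed_at - record["timestamp"]}
            examples.append((record["features"], label))
        return examples

The key property this illustrates is that labels can be assigned retroactively: a capture that looked unremarkable at recording time becomes a "failed 100 hours later" example once the fault log records the eventual failure.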
Memory 444 of the computing device 440 may additionally store one or more sensor features 454 associated with vehicle components, such as components of the vehicle 402 and/or of another vehicle. The sensor features 454 may include various types of sensor features for various components of a vehicle. For example, the sensor features 454 may include one or more audio/acoustic sensor features associated with a vehicle component and one or more image sensor features associated with the same component and/or a different component (e.g., an HVAC system). The sensor features may be based on sensor data of a single sensor modality (e.g., audio data, image data, or IMU data, etc.), or they may be based on data of multiple different sensor modalities (e.g., audio data, image data, IMU data, and/or other sensor data). In some examples, the sensor features 454 may include one or more initial or baseline sensor features associated with one or more vehicle components, and may additionally or alternatively include one or more subsequent or progressive sensor features associated with the components, acquired over the components' life cycles. In that case, the subsequent sensor features may be compared to the initial sensor features to determine changes in the operating states of the various components. In this manner, the sensor features 454 may be used to determine an operating state associated with components of the vehicle 402 or another vehicle. In some examples, the sensor features 454 may be transmitted to the vehicle computing device 404 via the network 438.
In some examples, the memory 444 may include a fault log 456 in which faults or anomalies associated with one or more vehicles, such as the vehicle 100 and/or the vehicle 402, may be recorded. The fault log 456 may include indications of detected faults or abnormal measurements and identifiers of the components/systems involved (e.g., sensor features related to a faulty operating state of a vehicle component). The fault log 456 may also store a snapshot of the operating conditions leading up to a fault or abnormal measurement (e.g., a series of progressive sensor signatures over at least a portion of the life of a vehicle component). Although depicted in fig. 4 as residing in the memory of the computing device 440, in at least some examples the fault log 456 may be stored locally at the vehicle 402. In some examples, the fault log 456 can be used to label the log data 448 for use in training the machine learning component 450 and/or a machine learning model of the monitoring component 430 to predict the operating states of components. Additionally, in at least some examples, the fault log 456 may be reported by the vehicle 402 to the computing device 440. The reporting may be periodic (e.g., daily, hourly, etc.) or upon the occurrence of certain events (e.g., detecting a collision, visiting a service location, detecting a change in the operating state of a component, a change in a sensor feature associated with a component, etc.).
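For illustration only, a single fault-log entry might carry fields like the following; every name here is an assumption rather than a schema from the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class FaultLogEntry:
        component_id: str     # e.g., "drive_assembly/brake_caliper_front"
        detected_at: float    # timestamp of the abnormal measurement, seconds
        measurement: dict     # the fault or abnormal measurement itself
        # Progressive sensor signatures leading up to the fault, oldest first.
        signature_history: list = field(default_factory=list)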
The processor 416 of the vehicle 402 and the processor 442 of the computing device 440 may be any suitable processor capable of executing instructions to process data and perform operations as described herein. By way of example, and not limitation, processors 416 and 442 may include one or more Central Processing Units (CPUs), graphics Processing Units (GPUs), or any other device or portion of a device that processes electronic data to convert that electronic data into other electronic data that may be stored in registers and/or memory. In some examples, integrated circuits (e.g., ASICs, etc.), gate arrays (e.g., FPGAs, etc.), and other hardware devices may also be considered processors, so long as they are configured to implement the coded instructions.
Memories 418 and 444 are examples of non-transitory computer-readable media. Memories 418 and 444 may store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems. In various embodiments, the memory may be implemented using any suitable memory technology, such as Static Random Access Memory (SRAM), synchronous Dynamic RAM (SDRAM), non-volatile/flash type memory, or any other type of memory capable of storing information. The architectures, systems, and individual elements described herein may include many other logical, procedural, and physical components, of which those shown in the figures are merely examples relevant to the discussion herein.
As can be appreciated, the components discussed herein are described as being partitioned for illustrative purposes. However, the operations performed by the various components may be combined or performed in any other component.
It should be noted that while fig. 4 is illustrated as a distributed system, in alternative examples, components of the vehicle 402 may be associated with the computing device 440, and/or components of the computing device 440 may be associated with the vehicle 402. That is, the vehicle 402 may perform one or more of the functions associated with the computing device 440, and vice versa. Further, aspects of the machine learning component 450 can be executed on any of the devices discussed herein.
Example method
Fig. 5, 6, and 7 are flow diagrams illustrating example methods relating to monitoring vehicle component health, processing sensor data, and using machine learning models. For convenience and ease of understanding, the methods illustrated in fig. 5, 6, and 7 are described with reference to one or more of the vehicles and/or computing devices illustrated in fig. 1-4. However, the methods illustrated in fig. 5, 6, and 7 are not limited to being performed using the vehicles and/or computing devices illustrated in fig. 1-4, and may be implemented using any other vehicles, computing devices, computing systems, and/or drive components described in the present application, as well as vehicles, computing devices, computing systems, and/or drive components other than those described herein. Further, the vehicles, computing devices, computing systems, and/or drive assembly modules described herein are not limited to performing the methods illustrated in fig. 5, 6, and 7.
FIG. 5 is a flow chart illustrating an example method 500 for monitoring vehicle health and/or vehicle component health. At operation 502, the method 500 includes receiving a first sensor characteristic indicative of an operating state associated with a vehicle component. In some examples, the sensor characteristics may include log data captured by a sensor of another vehicle experiencing a fault or other anomaly associated with the component. Additionally or alternatively, the first sensor signature may be generated and/or simulated on the test stand. In at least one example, the first sensor characteristic may be determined during a test mode of the vehicle by activating a component of the vehicle and capturing sensor data associated with the component via one or more sensors of the vehicle.
At operation 504, the method 500 includes activating a component of the vehicle. For example, if the vehicle component to be activated includes a braking system, the vehicle may accelerate to a certain speed (e.g., 10 MPH, 25 MPH, etc.) and then apply the braking system to slow and/or stop the vehicle. As another example, if the vehicle component to be activated includes an HVAC system, the vehicle may turn various HVAC system components, such as an air conditioning compressor, a ventilation fan, and the like, on or off. As yet another example, if the vehicle component to be activated includes a door/window seal, the vehicle may accelerate to and/or maintain a certain speed (e.g., 25 MPH) so that audio data corresponding to ambient noise within the passenger compartment of the vehicle may be captured. Further, in some examples, one or more other systems or components of the vehicle may be varied while the vehicle component is activated, in order to isolate the portion of the sensor data attributable to the activated component. Further, in some examples, multiple components may be activated in combination to identify interactions or relationships between the various components. For example, in the case of testing a door/window seal, the HVAC system can be turned on and off to determine whether a change in passenger cabin pressure affects the ability of the door/window seal to seal the passenger cabin.
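A sketch of one such activation routine for the brake-system example follows; the controller and sensor hooks are hypothetical callables, not APIs from the disclosure, and the default speed simply mirrors the example above.

    def brake_signature_test(accelerate_to, apply_brakes, record_audio,
                             speed_mph=25, clip_seconds=3.0):
        """Reach a target speed, apply the brakes, and record the result.

        `accelerate_to`, `apply_brakes`, and `record_audio` stand in for
        the system controllers and audio sensor of the vehicle.
        """
        accelerate_to(speed_mph)
        apply_brakes()                    # decelerate while the microphone listens
        return record_audio(clip_seconds)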
At operation 506, the method 500 includes receiving data associated with a component of the vehicle from a sensor of the vehicle. In some examples, the sensor may include an audio sensor (e.g., a microphone), an image sensor (e.g., a camera), an inertial sensor (e.g., an IMU), a temperature sensor, a motion sensor, an accelerometer, or any other sensor type described herein. Further, the data may include audio data, image data, inertial data, temperature data, pressure data, voltage measurements, current measurements, or any other type of sensor data described herein. In some examples, the data may be received via one or more wired and/or wireless communication channels.
At operation 508, the method 500 includes processing the data. In some examples, the data may include raw sensor data. Accordingly, the sensor data may be processed to generate processed sensor data. In some examples, and as described in more detail below with reference to fig. 6, processing the sensor data may include removing and/or reducing portions of the data (e.g., background noise) from the sensor data and/or isolating portions of the sensor data associated with particular components.
At operation 510, the method 500 includes determining a second sensor characteristic associated with the vehicle component. The second sensor characteristic may be determined based at least in part on the received data. In some examples, the second sensor characteristic may be determined based on a single capture of sensor data, and in additional or alternative examples, the second sensor characteristic may be determined based on multiple instances of captured sensor data. In some examples, the second sensor characteristic may include a progressive sensor characteristic for comparison with previous sensor data and/or sensor characteristics (e.g., the first sensor characteristic) to determine an operating state of the vehicle component. In at least one example, the second sensor characteristic may comprise a progressive sensor characteristic that is monitored continuously, periodically (e.g., hourly, daily, weekly, monthly, etc.) throughout the life of the vehicle component, or upon the occurrence of a triggering event (e.g., a collision with an object, charging, component repair, the vehicle being offline or not currently in use, etc.).
At operation 512, the method 500 includes determining a correlation between the first sensor characteristic and the second sensor characteristic. In some examples, determining the correlation may include determining at least one of a change or a similarity between the first sensor characteristic and the second sensor characteristic, e.g., by comparing the two characteristics. At operation 514, the method 500 includes determining whether a correlation exists (e.g., whether the change is greater than a threshold change, whether the similarity is within a threshold range, etc.). If no correlation exists between the first sensor characteristic and the second sensor characteristic, or if the correlation is insignificant or not within the threshold range, the method 500 may continue to operation 516. However, if a correlation does exist between the first sensor characteristic and the second sensor characteristic, the method 500 may proceed to operation 518.
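Operations 512 and 514 amount to computing a distance between two signature vectors and thresholding it. A minimal sketch, with the metric and the threshold value chosen arbitrarily for illustration:

    import numpy as np

    CHANGE_THRESHOLD = 0.15  # assumed relative-change threshold

    def signature_change(baseline, current):
        """Relative L2 change between two signature vectors (operation 512)."""
        baseline, current = np.asarray(baseline), np.asarray(current)
        return float(np.linalg.norm(current - baseline)
                     / np.linalg.norm(baseline))

    def change_is_significant(baseline, current):
        """Operation 514: does the change exceed the threshold?"""
        return signature_change(baseline, current) > CHANGE_THRESHOLD

Any other comparison (spectral distance, cosine similarity, a learned metric) could be substituted; the branching structure of the method is unchanged.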
At operation 516, the method 500 may include performing a first action. In some examples, performing the first action may include sending a notification to the remote computing system that the operational state of the vehicle component has changed but that the change is not significant. Additionally or alternatively, the first action may include recording or storing the second sensor characteristic and/or the second data in a memory of the vehicle (e.g., memory 418 of the vehicle computing device 404) or in a memory of a computing system and/or device remote from and accessible by the vehicle (e.g., memory 444 of computing device 440).
At operation 518, the method 500 may include performing a second action. In some examples, the second action may include determining and/or outputting an operating state associated with the vehicle component. As described above, the operating status may include a wear indication associated with the vehicle component, such as a percentage of the component's useful life and/or remaining life (e.g., 50% useful life, 75% remaining life, etc.), a time of failure associated with the component, such as an amount of time and/or distance the vehicle may travel until the component may fail (e.g., 10 hours until the component fails, 100 miles until the component fails, etc.), or an anomaly indication associated with the component, such as one or more fault conditions. In some examples, the operating state and/or estimated time until failure may be determined based at least in part on the data, the second sensor characteristic, the correlation between the first sensor characteristic and the second sensor characteristic, the amount of time that a component of the vehicle has been in service, how many miles the vehicle has traveled with the vehicle component installed, and/or the like. For example, a look-up table may be stored that associates respective ones of the sensor characteristics with respective ones of the operating states, mean time to failure, etc. of the components. In this manner, the sensor characteristics of the lookup table may be identified based at least in part on the second sensor characteristics, and corresponding operating states, failure times, and the like may be determined. In at least one example, performing the second action can include transmitting the operating state to the remote computing system. In some examples, the remote computing system may be associated with a remote monitoring system that monitors a fleet of autonomous vehicles.
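The lookup-table idea might be sketched as follows, using the change metric from the previous sketch; the quantization levels, state names, and failure times are invented for illustration.

    # Hypothetical lookup table keyed by a quantized signature-change level;
    # values pair an operating state with a mean time to failure in hours.
    STATE_TABLE = {
        "low":    ("normal",       None),
        "medium": ("worn_50_pct",  500.0),
        "high":   ("failure_soon",  10.0),
    }

    def lookup_state(change):
        """Map a relative signature change to (operating state, hours left)."""
        level = "low" if change < 0.15 else "medium" if change < 0.40 else "high"
        return STATE_TABLE[level]

In practice the table rows could also be keyed per component and adjusted by service time or mileage, as the text notes.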
Fig. 6 is a flow diagram illustrating an example method 600, which provides additional details of processing sensor data at operation 508. As described above, at operation 508, the method 500 may include processing the sensor data. In some examples, the sensor data may include raw sensor data. Accordingly, the sensor data may be processed to generate processed sensor data. Further, the processed sensor data may be used to determine one or more characteristics associated with the sensor data. For example, the processed sensor data may be used to determine a type of the sensor data (e.g., audio data, image data, etc.).
At operation 602, the method 600 may include determining that the sensor data includes audio data. As shown, in some examples, the audio data may represent at least acoustic features associated with vehicle components and background noise. By way of example and not limitation, the audio data may represent acoustic features associated with brake system squeal and background noise related to, for example, other vehicle components, ambient noise, noise of pedestrians near the vehicle, noise of other vehicles, and the like.
At operation 604, the method 600 may include identifying at least one of a background noise represented by the audio data or an acoustic feature associated with the vehicle component. In some examples, identifying the background noise and/or the acoustic features may be based at least in part on a condition associated with the vehicle. As an example, if the vehicle is activating a braking system to slow the vehicle, the vehicle computing device may know that the audio data may include acoustic features associated with the braking system. In this way, the vehicle computing device may use this information to identify the acoustic features of the audio data so that the audio data may be processed.
At operation 606, the method 600 may include processing the audio data to generate processed audio data that includes less background noise than the raw sensor data. In some examples, processing the audio data may include processing at least a portion of the audio data that includes at least one of background noise and/or acoustic characteristics of the component. Additionally or alternatively, processing the audio data may include processing a portion of the audio data that includes only background noise or acoustic features.
At operation 608, the method 600 may include determining whether to continue processing audio data. For example, the audio data may be additionally processed one or more times to generate processed audio data. In some examples, processing the audio data each time may result in the processed audio data having less background noise than previously processed audio data. Accordingly, at operation 608, the method 600 may continue processing the audio data (e.g., re-processing the audio data) by repeating operation 606. However, if the processed audio data is sufficient (e.g., the processed audio data has been processed a threshold number of times, the processed audio data contains less than a threshold amount of background noise, etc.), method 600 may proceed to operation 610.
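The loop formed by operations 606 and 608 could look like the following sketch, where the per-pass filter and the residual-noise estimator are placeholders and the stopping constants are assumptions.

    MAX_PASSES = 5       # assumed pass budget (threshold number of times)
    NOISE_FLOOR = 0.05   # assumed acceptable residual background-noise energy

    def iterative_denoise(audio, denoise_pass, noise_energy):
        """Re-run the noise-reduction pass (operation 606) until the residual
        background noise is small enough or the pass budget is exhausted.

        `denoise_pass` and `noise_energy` are placeholders for whatever
        filter and noise estimator the filtering component actually uses.
        """
        for _ in range(MAX_PASSES):
            if noise_energy(audio) < NOISE_FLOOR:
                break
            audio = denoise_pass(audio)
        return audio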
At operation 610, the method 600 may include providing the processed audio data to a computing device of the vehicle. Additionally or alternatively, the processed audio data may be provided to a remote computing device/system. In this manner, the computing device of the vehicle and/or the remote computing device/system may utilize the processed audio data to determine sensor characteristics associated with the processed audio data and/or to determine an operating state associated with the vehicle component.
FIG. 7 is a flow diagram illustrating an example method 700 for monitoring vehicle health using a machine learning model. At operation 702, the method 700 includes receiving sensor data indicative of an operational state associated with a vehicle component. In some examples, the sensor data may include log data captured by sensors of the vehicle experiencing a fault or other anomaly associated with the component. Additionally or alternatively, sensor data may be generated and/or simulated on the test stand. In at least one example, sensor data may be captured during a test mode of a vehicle by activating a component of the vehicle and capturing sensor data associated with the component via one or more sensors of the vehicle. In various examples, the sensor data may include audio data, IMU data, accelerometer data, image data, pressure data, temperature data, voltage measurements, current measurements, and the like. The sensor data may be associated with a component of the vehicle, such as an HVAC system, a braking system, a window/door seal, a rotating machine of the vehicle, or any other component of the vehicle. The sensor data may be received at a computing device of the vehicle (e.g., vehicle computing device 404) and/or at a remote computing device (e.g., computing device 440). Additionally or alternatively, the sensor data may be stored in a memory of the vehicle computing device 404 and then uploaded to the computing device 440.
At operation 704, the method 700 includes determining a vehicle component (e.g., a braking system, an HVAC system, etc.) associated with generating sensor data, and at operation 706, the method 700 may include determining an operating state associated with the component. In some examples, the components and/or operating states of the vehicle may be determined by a human annotator to generate annotated training data to train the machine learning model. Additionally or alternatively, the components and/or operating states of the vehicle may be automatically determined based on previously stored data, such as log data 448 and/or fault log 456, which includes one or more instances of sensor data associated with the component during operation, as well as an indication of a fault or failure of the component at a later time.
In at least one example, sensor data may be used as training data. The training data may include a preconfigured specification of an operational state of the component represented by the training data. The preconfigured designation of operational status may be any operational status indication described herein, and may be based on previously stored log data and/or fault logs associated with the component. Additionally or alternatively, the training data may include a preconfigured designation of an identity of a component represented by the training data. In various examples, the training data may include second sensor data captured by one or more other sensors of another vehicle. The second sensor data may additionally be associated with a component of a second vehicle.
At operation 506, the method 700 includes processing the sensor data. In some examples, the sensor data may include raw sensor data. Accordingly, the sensor data may be processed to generate processed sensor data. In some examples, and as described above with reference to fig. 6, processing the sensor data may include removing and/or reducing portions of the data (e.g., background noise) from the sensor data and/or isolating portions of the sensor data attributable to the component of interest.
At operation 708, the method 700 includes inputting sensor data into the machine learning model. In some examples, the sensor data may include raw sensor data. Additionally or alternatively, the sensor data may include processed sensor data (e.g., sensor data that has been filtered according to one or more filtering algorithms). By way of example and not limitation, the machine learning model may include and/or utilize a penalized linear regression model, a decision tree, a logistic regression model, a Support Vector Machine (SVM), a naive bayes model, a k-nearest neighbor (KNN) model, a k-Means model, a neural network, or other separate logic, model, or algorithm, or a combination thereof.
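Since the text names k-nearest neighbors among the candidate models, a toy end-to-end example using scikit-learn follows; the two-dimensional features (e.g., band energies of a filtered audio clip) and the labels are fabricated for illustration.

    from sklearn.neighbors import KNeighborsClassifier

    # Fabricated training set: feature vectors with operating-state labels.
    # Real training data would be derived from log data 448 and fault log 456.
    train_features = [[0.1, 0.0], [0.2, 0.1], [0.9, 0.8], [1.0, 0.9]]
    train_states = ["normal", "normal", "fault", "fault"]

    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(train_features, train_states)
    predicted_state = model.predict([[0.85, 0.75]])[0]  # -> "fault"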
At operation 710, the method 700 includes receiving, from the machine learning model, a predicted operating state associated with the vehicle component. In some examples, the predicted operating state may be predicted by the machine learning model based at least in part on one or more previous inputs to the machine learning model. Additionally or alternatively, the predicted operating state may be predicted based at least in part on one or more changes made to parameters of the machine learning model following previous incorrect predictions by the model. In some examples, the predicted operating state may include an indication of wear associated with a component of the vehicle, a predicted and/or estimated time to failure associated with the component, and/or an indication of an anomaly associated with the component.
At operation 712, the method 700 may include determining whether there is a discrepancy between the operating state (e.g., the actual or measured operating state determined at 706) and the predicted operating state. In some examples, the differences may be determined by a human operator who is training the machine learning model. Additionally or alternatively, the difference may be determined by a computer that is training the machine learning model.
In some examples, there may be no difference between the operating state and the predicted operating state (e.g., the machine learning model made a correct prediction). In that case, at operation 714, the method 700 may determine to return to operation 702 to continue training the machine learning model with additional sensor data. For example, the machine learning model may not yet have output more than a threshold number of correctly predicted operating states. Additionally or alternatively, if there is no difference, the machine learning model may be determined to be sufficiently trained and training may be stopped.
However, if there is a difference between the operating state and the predicted operating state, at operation 714 the method 700 may determine to proceed to operation 716. At operation 716, the method 700 may include changing one or more parameters of the machine learning model to minimize the difference and obtain a trained machine learning model. For example, parameters of the machine learning model's algorithm may be adjusted to obtain incrementally and/or more frequently correct predictions from the machine learning model. After adjusting the one or more parameters of the machine learning model, in some examples, the method 700 may proceed to operation 708, where the sensor data may be re-input into the machine learning model. However, the method 700 may instead proceed to other operations, such as operations 702, 704, 706, and so on. In this way, the machine learning model may be retrained using the same sensor data to determine whether the machine learning model now makes a correct prediction.
In some examples, the trained machine learning model may reside in memory stored locally to the vehicle, such as on a vehicle computing device, and/or may reside in memory stored remotely from the vehicle, such as on a server computing device or a computing device associated with a remote monitoring system of the vehicle.
The methods 500, 600, and 700 described above are illustrated as a collection of blocks in a logical flow graph, which represent a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and so forth that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the processes. In some embodiments, one or more blocks of the process may be omitted entirely. Further, the methods 500, 600, and 700 may be combined with each other in whole or in part or with other methods.
The various techniques described herein may be implemented in the context of computer-executable instructions or software, such as program modules, stored in computer-readable memory and executed by processors of one or more computers or other devices, such as those illustrated in the figures. Generally, program modules include routines, programs, objects, components, data structures, etc. and define the operating logic for performing particular tasks or implement particular abstract data types.
Other architectures can be used to implement the described functionality and are intended to fall within the scope of the present disclosure. Further, although a particular allocation of responsibilities is defined above for purposes of discussion, the various functions and responsibilities may be allocated and divided in different ways depending on the situation.
Similarly, software may be stored and distributed in a variety of ways and using different instrumentalities, and the particular software storage and execution configurations described above may be varied in many different ways. Thus, software implementing the techniques described above may be distributed on various types of computer-readable media, and is not limited to the form of memory specifically described.
Example clauses
Any of the example clauses in this section may be used with any of the other example clauses and/or any of the other examples or embodiments described herein.
A. A system, comprising: one or more processors; and one or more computer-readable media storing instructions that, when executed by the one or more processors, cause the system to perform operations comprising: activating a vehicle component at a first time; receiving first audio data associated with a vehicle component from an audio sensor of a vehicle at a first time; determining a first sensor characteristic associated with the vehicle component based at least in part on processing the first audio data; storing the first sensor characteristic; determining a second sensor characteristic associated with the vehicle component at a second time after the first time based at least in part on the second audio data; determining that a change between the first sensor characteristic and the second sensor characteristic is greater than a threshold change; and outputting an operating state associated with the component based at least in part on the change being greater than the threshold change.
B. The system of paragraph A, wherein determining a second sensor characteristic associated with the vehicle component comprises: activating the vehicle component; receiving second audio data associated with the vehicle component from the audio sensor; and determining the second sensor characteristic associated with the vehicle component based at least in part on the processing of the second audio data.
C. The system of any of paragraphs A or B, the operations further comprising: controlling operation of another component of the vehicle at the first time in accordance with an operating parameter; and controlling operation of the other component at the second time in accordance with the operating parameter, wherein the operating parameter includes at least one of a speed, a steering angle, a braking condition, a position, a temperature, or a time of day.
D. The system of any of paragraphs A through C, wherein at least a portion of the first audio data comprises audio data attributed to the component and background noise, and wherein processing the first audio data comprises filtering the first audio data to remove the background noise.
E. The system of any of paragraphs A through D, wherein the change between the first sensor characteristic and the second sensor characteristic is determined to be greater than the threshold change based at least in part on at least one of a frequency, an amplitude, or a pitch of the first sensor characteristic and the second sensor characteristic.
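Purely as an illustration of clauses A through E, a sensor characteristic might be reduced to a dominant frequency and amplitude, and the change between two characteristics compared against threshold values. The function names and thresholds below are assumptions, not part of the clauses:

```python
import numpy as np

def sensor_characteristic(audio, sample_rate):
    """Reduce an audio clip (1-D float array) to a coarse spectral signature:
    the dominant frequency in Hz and its amplitude (cf. clause E)."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    peak = int(np.argmax(spectrum))
    return freqs[peak], spectrum[peak]

def change_exceeds_threshold(first, second,
                             freq_threshold_hz=50.0, amp_ratio_threshold=2.0):
    """Return True when the change between two characteristics is greater
    than a threshold change (cf. clauses A and E)."""
    (f1, a1), (f2, a2) = first, second
    freq_changed = abs(f2 - f1) > freq_threshold_hz
    amp_changed = max(a1, a2) / max(min(a1, a2), 1e-9) > amp_ratio_threshold
    return freq_changed or amp_changed
```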
F. A method, comprising: receiving a first sensor characteristic indicative of an operating state associated with a vehicle component; activating the component of the vehicle; receiving data associated with the component from a sensor of the vehicle; determining a second sensor characteristic associated with the component based at least in part on the data; determining a correlation between the first sensor characteristic and the second sensor characteristic; and outputting an operating state associated with the component based at least in part on the correlation.
G. The method of paragraph F, further comprising: receiving second data associated with another component of the same type as the component; and determining a first sensor characteristic based at least in part on the second data.
H. The method of any of paragraphs F or G, further comprising: while activating a vehicle component, controlling operation of another component of the vehicle in accordance with an operating parameter, wherein the operating parameter includes at least one of a speed, a steering angle, a braking condition, a position, a temperature, or a time of day.
I. The method of any of paragraphs F through H, wherein the sensor comprises a microphone, the data comprises audio data associated with the component and background noise, and wherein determining the second sensor characteristic further comprises processing the data to remove the background noise.
J. The method of any of paragraphs F through I, further comprising: determining an operating state based at least in part on the second sensor characteristic, wherein the operating state includes at least one of a wear indication associated with the component, a predicted time to failure associated with the component, or an anomaly indication associated with the component.
K. The method of any of paragraphs F through J, further comprising: causing at least one of the vehicle or the component to be serviced based at least in part on the operating state.
L. The method of any of paragraphs F through K, wherein at least one of the first sensor characteristic or the second sensor characteristic comprises a time series of data measured from the sensor over time and is associated with one or more operating parameters associated with the vehicle.
M. The method of any of paragraphs F through L, wherein outputting the operating state associated with the component comprises transmitting data indicative of the operating state to a remote monitoring system associated with the vehicle.
N. The method of any of paragraphs F through M, further comprising: storing the data in a local memory of the vehicle; and transmitting the data over a network to a remote computing system associated with the vehicle.
O. The method of any of paragraphs F through N, wherein the sensor comprises a microphone, an Inertial Measurement Unit (IMU), an accelerometer, or a piezoelectric sensor.
P. The method of any of paragraphs F through O, wherein the sensor comprises one or more microphones to locate sound associated with the vehicle component.
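Clause P contemplates locating sound with one or more microphones. One conventional approach is a time-difference-of-arrival estimate from cross-correlation; the sketch below assumes two equally long, time-aligned recordings, and the function and constant names are illustrative only:

```python
import numpy as np

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def arrival_time_difference(mic_a, mic_b, sample_rate):
    """Estimate how much earlier a sound reaches microphone A than microphone B
    by cross-correlating two equally long, time-aligned recordings."""
    a = mic_a - mic_a.mean()
    b = mic_b - mic_b.mean()
    corr = np.correlate(a, b, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(b) - 1)
    # Multiply the result by SPEED_OF_SOUND_M_S for a path-length difference.
    return lag_samples / sample_rate
```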
Q. One or more non-transitory computer-readable storage media storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving a first sensor characteristic indicative of an operating state associated with a vehicle component; activating the vehicle component; receiving data related to the component from a sensor of the vehicle; determining a second sensor characteristic associated with the component based at least in part on the data; and outputting an operating state associated with the component based at least in part on a correlation between the first sensor characteristic and the second sensor characteristic.
R. The one or more non-transitory computer-readable storage media of paragraph Q, wherein the sensor comprises a microphone, the data comprises audio data associated with the component and background noise, and wherein determining the second sensor characteristic further comprises processing the data to remove the background noise.
S. The one or more non-transitory computer-readable storage media of any of paragraphs Q or R, the operations further comprising: determining the operating state based at least in part on the second sensor characteristic, wherein the operating state includes at least one of a wear indication associated with the component, a predicted time to failure associated with the component, or an anomaly indication associated with the component.
T. The one or more non-transitory computer-readable storage media of any of paragraphs Q through S, the operations further comprising: receiving second data associated with another component of the same type as the component; and determining the first sensor characteristic based at least in part on the second data.
U. A system, comprising: one or more processors; and one or more computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: receiving sensor data captured by one or more sensors of a vehicle, the sensor data including audio data indicative of an operating state associated with a vehicle component; determining the vehicle component associated with generating the audio data; determining an operating state associated with the vehicle component; inputting the audio data into a machine learning model; receiving a predicted operating state associated with the vehicle component from the machine learning model; determining a difference between the predicted operating state associated with the component and the operating state associated with the component; and based at least in part on the difference, changing one or more parameters of the machine learning model to minimize the difference to obtain a trained machine learning model that is trained to predict an operating state of the vehicle component.
V. The system of paragraph U, wherein the machine learning model includes at least one of a penalized linear regression model or a decision tree.
W. The system of any of paragraphs U or V, wherein the audio data represents at least an acoustic feature associated with the component and background noise, the operations further comprising: identifying the background noise in the audio data; processing at least a portion of the audio data to remove at least a portion of the background noise to generate processed audio data; and inputting the processed audio data into the machine learning model.
X. The system of any of paragraphs U through W, the operations further comprising: receiving additional sensor data captured by the one or more sensors of the vehicle, the additional sensor data including additional audio data; inputting the additional audio data into the trained machine learning model; and receiving an operating state associated with the component from the trained machine learning model based at least in part on the additional audio data.
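To make clauses V and W concrete, the sketch below pairs a crude spectral-subtraction filter with a decision tree, one of the model types named in clause V. The function names, the use of scikit-learn, and the example labels are assumptions for illustration, not the disclosed implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def remove_background(audio, background):
    """Crude spectral subtraction (cf. clause W): subtract the background's
    magnitude spectrum from the clip's spectrum, keeping the clip's phase."""
    n = len(audio)
    spec = np.fft.rfft(audio)
    bg_mag = np.abs(np.fft.rfft(background, n=n))
    cleaned_mag = np.maximum(np.abs(spec) - bg_mag, 0.0)
    return np.fft.irfft(cleaned_mag * np.exp(1j * np.angle(spec)), n=n)

# Clause V names a decision tree as one possible model type; the labels are
# hypothetical operating states.
model = DecisionTreeClassifier(max_depth=5)
# model.fit(feature_matrix, state_labels)  # e.g. "nominal" / "worn" / "failing"
```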
Y. A method, comprising: receiving sensor data indicative of an operating state associated with a vehicle component; determining an operating state associated with the component based at least in part on the sensor data; inputting the sensor data into a machine learning model; receiving a predicted operating state associated with the vehicle component from the machine learning model; and changing parameters of the machine learning model based at least in part on the predicted operating state to obtain a trained machine learning model.
Z. The method of paragraph Y, wherein the sensor data comprises stored log data from a second vehicle that experienced a failure of another component of the same type as the component.
AA. The method of any of paragraphs Y or Z, wherein the predicted operating state comprises at least one of a wear indication associated with the component, a predicted time to failure associated with the component, or an anomaly indication associated with the component.
BB. The method of any of paragraphs Y through AA, further comprising: determining a mean time to failure associated with the component of the vehicle based at least in part on sensor data of a plurality of components of the same type as the component; receiving, from the machine learning model, an estimated time to failure associated with the vehicle component; and wherein the parameters of the machine learning model are further varied based at least in part on a difference between the mean time to failure and the estimated time to failure.
CC. The method of any of paragraphs Y through BB, wherein the sensor data comprises audio data representing at least an acoustic feature associated with the component and background noise, the method further comprising: identifying the background noise in the audio data; processing the audio data to remove at least a portion of the background noise to generate processed audio data; and inputting the processed audio data into the machine learning model.
DD. The method of any of paragraphs Y through CC, wherein the sensor data comprises a series of measurements of at least one of audio data or Inertial Measurement Unit (IMU) data.
EE. The method of any of paragraphs Y through DD, wherein the sensor data comprises first audio data from a first acoustic sensor and second audio data from a second acoustic sensor, the method further comprising: determining one or more of the component or a position of the component based at least in part on the first audio data and the second audio data.
FF. The method of any of paragraphs Y through EE, further comprising: receiving second sensor data captured by one or more sensors of the vehicle; inputting the second sensor data into the trained machine learning model; and receiving an operating state associated with the vehicle component from the trained machine learning model.
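The mean-time-to-failure comparison of clause BB reduces to simple arithmetic. In the hypothetical sketch below, a fleet-derived mean and the model's estimate differ by 1,150 hours, a gap that could feed the parameter update; all numbers and names are made up:

```python
import numpy as np

def mttf_gap(failure_times_hours, model_estimate_hours):
    """Difference between a fleet-derived mean time to failure for a component
    type and the model's estimated time to failure (cf. clause BB)."""
    return abs(float(np.mean(failure_times_hours)) - model_estimate_hours)

# Hypothetical usage with made-up failure times for three components of the
# same type: the mean is 10,050 hours, so the gap from an 8,900-hour
# estimate is 1,150 hours.
# mttf_gap([9_900, 10_200, 10_050], model_estimate_hours=8_900.0)  # -> 1150.0
```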
GG. One or more non-transitory computer-readable storage media storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving sensor data indicative of an operating state associated with a vehicle component; determining an operating state associated with the component based at least in part on the sensor data; inputting the sensor data into a machine learning model; receiving a predicted operating state associated with the vehicle component from the machine learning model; and based at least in part on the predicted operating state, changing parameters of the machine learning model to obtain a trained machine learning model.
HH. The one or more non-transitory computer-readable storage media of paragraph GG, wherein the sensor data includes stored log data from a second vehicle that experienced a failure of another component of the same type as the component.
II. The one or more non-transitory computer-readable storage media of any of paragraphs GG or HH, wherein the predicted operating state comprises at least one of a wear indication associated with the component, a predicted time to failure associated with the component, or an anomaly indication associated with the component.
JJ. The one or more non-transitory computer-readable storage media of any of paragraphs GG through II, the operations further comprising: determining a mean time to failure associated with a plurality of components having the same type as the component; receiving, from the machine learning model, an estimated time to failure associated with the vehicle component; and wherein the parameters of the machine learning model are further varied based at least in part on a difference between the mean time to failure and the estimated time to failure.
KK. The one or more non-transitory computer-readable storage media of any of paragraphs GG through JJ, wherein the sensor data comprises a series of measurements over time of at least one of audio data or Inertial Measurement Unit (IMU) data.
LL. The one or more non-transitory computer-readable storage media of any of paragraphs GG through KK, wherein the sensor data comprises audio data representing at least an acoustic feature associated with the component and background noise, the operations further comprising: identifying the background noise in the audio data; processing the audio data to remove at least a portion of the background noise to generate processed audio data; and inputting the processed audio data into the machine learning model.
MM. The one or more non-transitory computer-readable storage media of any of paragraphs GG through LL, wherein the sensor data comprises first audio data from a first acoustic sensor and second audio data from a second acoustic sensor, the operations further comprising: determining one or more of the component or a position of the component based at least in part on the first audio data and the second audio data.
NN. The one or more non-transitory computer-readable storage media of any of paragraphs GG through MM, the operations further comprising: receiving second sensor data captured by one or more sensors of the vehicle; inputting the second sensor data into the trained machine learning model; and receiving an operating state associated with the vehicle component from the trained machine learning model.
While the above example clauses are described with respect to one particular implementation, it should be understood that, in the context of this document, the contents of the example clauses may also be implemented by a method, apparatus, system, computer-readable medium, and/or another implementation. Furthermore, any of examples A through NN may be implemented alone or in combination with any other one or more of examples A through NN.
Conclusion
While one or more examples of the technology described herein have been described, various modifications, additions, permutations and equivalents thereof are included within the scope of the technology described herein.
In the description of the examples, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific examples of the claimed subject matter. It is to be understood that other examples may be used and that changes or alterations, such as structural changes, may be made. Such examples, changes, or variations do not necessarily depart from the scope of the claimed subject matter. Although the steps herein may be presented in a particular order, in some cases the order may be changed to provide particular inputs at different times or in a different order without changing the functionality of the systems and methods described. The disclosed procedures may also be performed in a different order. In addition, the various calculations herein need not be performed in the order disclosed, and other examples using alternative orderings of calculations may be readily implemented. In addition to reordering, a computation may be decomposed into sub-computations with the same result.

Claims (15)

1. A method, comprising:
receiving a first sensor characteristic indicative of an operating state associated with a component of a vehicle;
activating the component of the vehicle;
receiving data related to the component from a sensor of the vehicle;
determining a second sensor characteristic associated with the component based at least in part on the data;
determining a correlation between the first sensor characteristic and the second sensor characteristic; and
outputting an operating state associated with the component based at least in part on the correlation.
2. The method of claim 1, further comprising:
receiving second data associated with another component of the same type as the component; and
determining the first sensor characteristic based at least in part on the second data.
3. The method of claim 1 or 2, further comprising: while activating a component of the vehicle, controlling operation of another component of the vehicle as a function of an operating parameter, wherein the operating parameter includes at least one of speed, steering angle, braking condition, position, temperature, or time of day.
4. The method of any of claims 1-3, wherein the sensor comprises a microphone and the data comprises audio data associated with the component and background noise, and wherein determining the second sensor characteristic further comprises processing the data to remove the background noise.
5. The method of any of claims 1 to 4, further comprising:
determining the operating state based at least in part on the second sensor characteristic,
wherein the operating state comprises at least one of a wear indication associated with the component, a predicted time to failure associated with the component, or an anomaly indication associated with the component.
6. The method of any of claims 1-5, further comprising causing at least one of the vehicle or the component to be serviced based at least in part on the operating state.
7. The method of any one of claims 1 to 6, wherein at least one of the first sensor characteristic or the second sensor characteristic comprises a time series of data measured from the sensor over time and is associated with one or more operating parameters associated with the vehicle.
8. The method of any one of claims 1-7, wherein outputting the operating state associated with the component includes transmitting data indicative of the operating state to a remote monitoring system associated with the vehicle.
9. The method of any of claims 1 to 7, further comprising:
storing the data in a local memory of the vehicle; and
transmitting the data over a network to a remote computing system associated with the vehicle.
10. The method of any of claims 1-9, wherein the sensor comprises a microphone, an Inertial Measurement Unit (IMU), an accelerometer, or a piezoelectric sensor.
11. The method of any one of claims 1-9, wherein the sensor comprises one or more microphones to locate sound associated with a component of the vehicle.
12. The method of any one of claims 1 to 11, wherein:
the first sensor characteristic is based at least in part on audio data associated with the component at a first time; and
the second sensor characteristic is based at least in part on audio data associated with the component at a second time subsequent to the first time.
13. The method of any of claims 1-12, wherein determining the correlation between the first sensor characteristic and the second sensor characteristic comprises:
determining that a change between the first sensor characteristic and the second sensor characteristic is greater than a threshold change; and
outputting the operating state associated with the component based at least in part on the change being greater than the threshold change.
14. One or more non-transitory computer-readable storage media storing instructions that, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 1-13.
15. A system, comprising:
one or more processors; and
one or more non-transitory computer-readable storage media storing instructions that, when executed by the one or more processors, cause the system to perform the method of any of claims 1-13.
CN202180029738.9A 2020-04-23 2021-04-12 Vehicle health monitor Pending CN115428046A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US16/856,597 2020-04-23
US16/856,733 US11842580B2 (en) 2020-04-23 2020-04-23 Predicting vehicle health
US16/856,733 2020-04-23
US16/856,597 US11482059B2 (en) 2020-04-23 2020-04-23 Vehicle health monitor
PCT/US2021/026906 WO2021216313A1 (en) 2020-04-23 2021-04-12 Vehicle health monitor

Publications (1)

Publication Number Publication Date
CN115428046A (en) 2022-12-02

Family

ID=78270167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180029738.9A Pending CN115428046A (en) 2020-04-23 2021-04-12 Vehicle health monitor

Country Status (4)

Country Link
EP (1) EP4139907A4 (en)
JP (1) JP2023523187A (en)
CN (1) CN115428046A (en)
WO (1) WO2021216313A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116130721B (en) * 2023-04-11 2023-06-16 杭州鄂达精密机电科技有限公司 Status diagnostic system and method for hydrogen fuel cell

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2814841B1 (en) * 2000-09-29 2003-08-22 Renault METHOD AND SYSTEM FOR DIAGNOSING THE WEAR OF A VEHICLE
DE102005023359A1 (en) * 2005-05-20 2006-11-23 Robert Bosch Gmbh Control residual life-span information obtaining method for e.g. commercial vehicle, involves evaluating actual on-board voltage, and comparing actual on-board voltage with averaged on-board voltage for receiving difference value
RU2615806C1 (en) * 2015-11-10 2017-04-11 Федеральное государственное автономное образовательное учреждение высшего образования "Санкт-Петербургский государственный электротехнический университет "ЛЭТИ" им. В.И. Ульянова (Ленина)" Method for remote diagnostics of motor vehicle
WO2018020475A1 (en) * 2016-07-29 2018-02-01 Ather Energy Pvt. Ltd. A method and system for determining an operational condition of a vehicle component
DE102016114048A1 (en) * 2016-07-29 2018-02-01 Claas Tractor Sas Method and system for determining a load condition of a, in particular agricultural, vehicle
DE102017111901A1 (en) * 2017-05-31 2018-12-06 Voith Patent Gmbh Maintenance of a commercial vehicle
US11120650B2 (en) * 2017-09-14 2021-09-14 Ford Global Technologies, Llc Method and system for sending vehicle health report

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220382272A1 (en) * 2021-05-28 2022-12-01 Skygrid, Llc Predictive maintenance of an unmanned aerial vehicle

Also Published As

Publication number Publication date
WO2021216313A1 (en) 2021-10-28
EP4139907A4 (en) 2024-05-29
EP4139907A1 (en) 2023-03-01
JP2023523187A (en) 2023-06-02

Similar Documents

Publication Publication Date Title
US11842580B2 (en) Predicting vehicle health
US11482059B2 (en) Vehicle health monitor
JP7440587B2 (en) Detecting errors in sensor data
CN111902731B (en) Automatic detection of sensor calibration errors
US11521438B2 (en) Using sound to determine vehicle health
EP3838702A1 (en) Prevention, detection and handling of tire blowouts on autonomous trucks
US11254323B2 (en) Localization error monitoring
WO2021133810A1 (en) Sensor degradation monitor
CN115428046A (en) Vehicle health monitor
US20220244061A1 (en) Vehicle consumption monitoring system and method
US11884311B2 (en) Route inspection system
US11055624B1 (en) Probabilistic heat maps for behavior prediction
CN117980212A (en) Planning system based on optimization
US11590969B1 (en) Event detection based on vehicle data
US11680824B1 (en) Inertial measurement unit fault detection
CN115249066A (en) Quantile neural network
US11794811B2 (en) Determining estimated steering data for a vehicle
US20220348187A1 (en) Determining vehicle ride height using a ball joint sensor
US11919526B1 (en) Model-based vehicle drive system monitors
US11754471B1 (en) Techniques for testing a deflection of a structural section of a vehicle
CN117545674A (en) Technique for identifying curbs
US11794759B2 (en) Automatic testing of autonomous vehicles
US12003929B1 (en) Microphone cleaning and calibration
US12038348B1 (en) Vehicle component monitoring
US11780471B1 (en) System for determining a state of an object using thermal data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination