WO2020053148A1 - Removing noise caused by vehicular movement from sensor signals using deep neural networks - Google Patents


Info

Publication number
WO2020053148A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
vehicle
motion
neural network
sensor device
Prior art date
Application number
PCT/EP2019/073993
Other languages
French (fr)
Inventor
Hans-Peter Beise
Steve DIAS DA CRUZ
Udo Schröder
Original Assignee
Iee International Electronics & Engineering S.A.
Priority date
Filing date
Publication date
Application filed by Iee International Electronics & Engineering S.A.
Publication of WO2020053148A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816: Measuring devices for examining respiratory frequency
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893: Cars
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7207: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • A61B5/721: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts using a separate sensor to detect motion or using motion information derived from signals other than the physiological signal to be measured
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/93: Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931: Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133: Distances to prototypes
    • G06F18/24143: Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449: Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451: Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454: Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/59: Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597: Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B2503/04: Babies, e.g. for SIDS detection
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20: Workers
    • A61B2503/22: Motor vehicles operators, e.g. drivers, pilots, captains
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00: Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02: Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7253: Details of waveform analysis characterised by using transforms
    • A61B5/7257: Details of waveform analysis characterised by using transforms using Fourier transforms
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/044: Recurrent networks, e.g. Hopfield networks
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients

Definitions

  • the invention relates to a method of operating a sensor device in an interior of a vehicle for obtaining vehicle passenger-related physical quantities.
  • Occupancy sensors based on radar technology offer advantages in comparison to other occupancy detection methods as their operation is contact-free and unnoticeable for vehicle occupants. Moreover, radar sensors can easily be integrated in the vehicle interior, for example behind plastic covers and textiles.
  • Vehicle seat occupancy detection systems are nowadays widely used in vehicles, in particular in passenger cars, for instance for detection of left-behind pets and/or children, vital sign monitoring, vehicle seat occupancy detection for seat belt reminder (SBR) systems, or anti-theft alarm.
  • vehicle seat occupancy detection systems can be employed for providing a seat occupancy signal for various appliances, for instance for the purpose of a seat belt reminder (SBR) system or an activation control for an auxiliary restraint system (ARS).
  • An output signal of a seat occupant detection and/or classification system is usually transferred to an electronic control unit of the vehicle to serve, for instance, as a means of assessing a potential activation of an installed vehicle passenger restraint system, such as an airbag.
  • ADAS: Advanced Driver Assistance Systems
  • WO 2013/037399 A1 describes a system and method for detecting a vital-related signal pattern of a seated person in a vehicle seat.
  • the seat comprises a substantially horizontal base and a substantially vertical backrest having a front surface accommodating the back of the seated person when in use, and a rear surface.
  • the system further comprises at least one Doppler radar arranged behind the front surface of the backrest, in such a way that a main radiation lobe of the emitter/receiver of the Doppler radar is focused towards the front surface of the backrest, by which a movement created by a vital sign of the person can be detected and a vital-related signal can be obtained.
  • the system comprises a module for detecting, based on a radar signal obtained by the Doppler radar, a vital-related signal pattern of the seated person.
  • Operating sensor-based devices that include at least one such sensor becomes more challenging, as a vehicle naturally undergoes some vertical movement caused, for instance, by driving on a road with an uneven road profile, by vibrations from a running car engine, or by wind gusts. In such scenarios, the vehicle motion induces a movement of the sensor device and of the passengers. Consequently, a sensor-based device such as an interior RADAR device with a radar sensor not only measures the motions that it is desired to observe, such as a breathing motion or a heart rate, but also captures unwanted passenger motions, which are induced by the vehicle motions.
  • the sensor device itself may undergo a different motion than the passengers do, which, for example in the case of RADAR-based devices, causes complex relative motions in the observations. This leads to noisy sensor measurements, in which it is more difficult to detect and monitor e.g. vital signs.
  • A de-noising, i.e. a removal of a motion-induced portion from such signals, becomes challenging when using conventional de-noising techniques.
  • WO 2016/038148 A1 (US 2017/0282828 A1 ) describes a method for sensing an occupancy status within an automotive vehicle.
  • the method uses a radar sensor system having an antenna system, at least one sensor and processing circuitry.
  • the method comprises a step of illuminating, using the antenna system, at least one occupiable position within the vehicle with an outgoing radar signal; a step of receiving, using the at least one sensor, at least one sensor signal reflected as a result of the outgoing radar signal; a step of obtaining accelerometer data value from at least one accelerometer, wherein the accelerometer data contain information regarding vibration or motion of the automotive vehicle and a step of supplying the accelerometer data to the processing circuitry; and a step of operating the processing circuitry for generating, based on the at least one sensor signal and on the accelerometer data, one or more occupancy status signals, wherein the occupancy status signal indicates a property related to the at least one occupiable position.
  • The method supplies accelerometer data to a classification software of the radar sensor system and is therefore able to compensate for motion or vibration of the vehicle.
  • the information regarding vibration or motion can be taken into consideration when a classification (interior human detection) algorithm needs to classify. This information can help to filter out exterior influences that might falsify the classification (passing traffic, wind shakes, various vibrations of the engine or any exterior event leading to a vehicle movement).
  • A typical scenario for conventional detection and monitoring of, for instance, vital signs of one or several persons 32 seated in an interior 18 of a vehicle 16 by radar is illustrated in Fig. 1 in a side view.
  • a radar-based sensor device RS with one radar sensor is shown being installed in an interior 18 of the vehicle 16, which is designed as a sedan passenger car. While driving on a roadway 42, the vehicle 16 is undergoing some motion, for example, due to an uneven vertical road profile 44, which is shown in Fig. 1 in an exaggerated manner for clarity purposes, or even by a running engine of the vehicle 16. The roughness of the vertical road profile 44 causes vertical motion 38 of vehicle wheels 22.
  • The vertical motion 38 of the wheels 22 is transferred to a vehicle body 24 via a vehicle suspension system 26, generating forced vibrations of the seat 20 and of the person 32 occupying the seat 20, who together form a seat/person system 36.
  • Additional forced vibrations of the seat 20 and the person 32 are induced by the mechanical vibrations 40 of the running vehicle engine.
  • The vehicle motion induces a movement of the radar-based sensor device RS and also causes the passenger(s) 32 to move away from a rest position. This additional motion introduces noise into the sensor measurement signal.
  • information about the motion of the vehicle body 24 can be provided, for instance by using an accelerometer device ACC.
  • a breathing motion 34 of the person 32 is superimposed by the forced vibrations of the seat 20 and the person 32 that are mainly induced by the vertical road profile 44 and the vehicle engine vibrations 40.
  • Other exterior sources that may as well induce forced vibrations of the seat 20 and the person 32 are, for instance, strong winds or heavy oncoming traffic passing in a close distance.
  • a physical model can be introduced for simulating a relation of motions of the wheels 22 of the vehicle 16, the suspension system 26 connecting the wheels 22 to the vehicle body 24, the vehicle body 24, the seat 20, the radar-based sensor device RS and the accelerometer device ACC that are fixedly attached to the vehicle body 24.
  • a possible embodiment of a mathematical model is described, which takes the motion of the vehicle body 24 as an input and then describes the resulting motion of the seat 20.
  • the corresponding differential equation describing this specific model can be expressed as
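The equation itself is not reproduced in this excerpt. Purely as an illustration (not the patent's actual formula), a single-degree-of-freedom mass-spring-damper model with seat displacement $z_s$, vehicle-body displacement $z_b$, effective seat/person mass $m$, damping constant $d$ and stiffness $k$ would take the form:

```latex
m \ddot{z}_s(t) + d\left(\dot{z}_s(t) - \dot{z}_b(t)\right) + k\left(z_s(t) - z_b(t)\right) = 0
```

Here $z_b(t)$ acts as the input (motion of the vehicle body) and $z_s(t)$ as the resulting output (motion of the seat).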
  • the accelerometer data contain information regarding vibration or motion of the vehicle body 24.
  • the coupled motion between the seat 20 and the radar-based sensor device RS can be described and solved in a mathematically simple way, for instance by applying the simple Euler method for solving differential equations.
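As a concrete sketch of this approach, the following forward-Euler integration of a hypothetical mass-spring-damper seat model computes the seat motion from a sampled vehicle-body motion. All parameter values and the 1.5 Hz body-motion input are illustrative assumptions, not taken from the patent.

```python
import numpy as np

# Illustrative (assumed) parameters: seat/person mass, damping, stiffness
m, d, k = 80.0, 1500.0, 40000.0

def seat_motion(z_body, dt):
    """Forward-Euler integration of m*zs'' + d*(zs' - zb') + k*(zs - zb) = 0,
    where z_body holds samples of the vertical motion of the vehicle body."""
    zb_dot = np.gradient(z_body, dt)   # body velocity estimated from samples
    zs, vs = z_body[0], 0.0            # seat starts at the body position, at rest
    out = np.empty_like(z_body)
    for i in range(len(z_body)):
        out[i] = zs
        acc = -(d * (vs - zb_dot[i]) + k * (zs - z_body[i])) / m
        vs += dt * acc                 # Euler step for the seat velocity
        zs += dt * vs                  # Euler step for the seat position
    return out

dt = 0.001
t = np.arange(0.0, 2.0, dt)
z_body = 0.01 * np.sin(2 * np.pi * 1.5 * t)   # 1.5 Hz road-induced body motion
z_seat = seat_motion(z_body, dt)
```

For the stiffly damped parameters above, the explicit Euler scheme is stable at this step size; a stiffer model would call for a smaller step or an implicit solver.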
  • CN 105 769 173 A describes a method for removing vehicle motion noise in a three-lead ECG signal.
  • the method uses a three-axis acceleration sensor to collect motion data as a neural network input sample, and uses a format conversion and normalization method before the neural network training.
  • The dynamic ECG data of the human body during stationary and moving phases are preprocessed, and the deviation between the two is used as the supervision signal for the neural network.
  • The artificial neural network learning algorithm is used to establish the noise model, optimized by means of a conjugate gradient method. De-noising is accomplished by subtracting the corresponding motion noise from the dynamic ECG data.
  • the method can effectively remove vehicle motion noise during the ECG monitoring process, and obtain accurate ECG data.
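The subtraction step of such a scheme can be sketched as follows. All signals are synthetic, and a deliberately imperfect stand-in predictor replaces the trained network:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 2000)
ecg_clean = np.sin(2 * np.pi * 1.2 * t)            # stand-in for the resting ECG
motion_noise = 0.5 * np.sin(2 * np.pi * 4.0 * t)   # stand-in for vehicle-motion noise
ecg_noisy = ecg_clean + motion_noise

# A trained network would predict the motion-noise component from accelerometer
# input; here an imperfect stand-in (95 % of the true noise) is used instead.
predicted_noise = 0.95 * motion_noise
ecg_denoised = ecg_noisy - predicted_noise

rmse_before = float(np.sqrt(np.mean((ecg_noisy - ecg_clean) ** 2)))
rmse_after = float(np.sqrt(np.mean((ecg_denoised - ecg_clean) ** 2)))
```

Even an imperfect noise estimate reduces the residual error substantially, which is the point of subtractive de-noising.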
  • US 2016/354027 A1 describes an occupant support system that includes a vehicle seat and an electronics system for the vehicle seat.
  • the electronics system includes a sensor system configured to obtain sensor data, and a computer coupled to the sensor system to process the sensor data and perform a predetermined action using the sensor data (abstract).
  • A ballistocardiogram (BCG) signal can be measured by a radar system operating at very high frequencies.
  • An option of employing a technique for detecting the BCG signal is described that comprises the use of neural network algorithms on a field-programmable gate array (FPGA) to detect heartbeats from the BCG signal.
  • the object is achieved by a method of operating a sensor device in an interior of a vehicle.
  • the sensor device has at least one sensor that is sensitive to a relative motion to parts of and objects within the interior of the vehicle, wherein the at least one sensor is configured for detecting at least one vehicle passenger-related physical quantity.
  • the proposed method comprises at least steps of
  • the combined deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one sensor and raw or processed data sensed by the at least one motion sensor device on the one side and at least one specific vehicle passenger- related physical quantity on the other side, and wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger- related physical quantity are known a priori,
  • Non-limiting examples of vehicle passenger-related physical quantities are vital signs of one or several passengers in the vehicle interior, a number of breathing motions present in the vehicle interior, individual breathing frequencies, a presence of a baby, a mass of the passenger, and so forth.
  • Artificial neural networks are known to comprise a plurality of interconnected artificial neurons and to have an input side and an output side.
  • Each artificial neuron of the plurality of interconnected artificial neurons (also called nodes) receives inputs from other neurons or from the input side.
  • The output of each artificial neuron may be calculated using a non-linear function of the sum of its inputs.
  • During learning, the weights of the non-linear function are usually adjusted.
  • a complex task may be learned by determining a set of weights for the artificial neurons such that the output signal of the artificial neural network is close to a desired output signal, which is performed when the artificial neural network is trained.
  • supervised learning a function is learned that maps an input to an output based on exemplary input-output pairs.
  • An artificial neural network that has been submitted to a learning scheme is often called a "trained" artificial neural network.
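Training, i.e. adjusting weights so that the output approaches the desired output on known input-output pairs, can be illustrated with a deliberately minimal single linear neuron fitted by gradient descent. The target function y = 2x + 1 is a hypothetical toy example, not the patent's network:

```python
import numpy as np

# Toy supervised learning: known input-output pairs sampled from y = 2x + 1
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 2.0 * x + 1.0

w, b = 0.0, 0.0   # the neuron's weight and bias, initially untrained
lr = 0.1          # learning rate
for _ in range(500):
    pred = w * x + b
    grad_w = 2.0 * np.mean((pred - y) * x)   # gradient of the mean squared error
    grad_b = 2.0 * np.mean(pred - y)
    w -= lr * grad_w                         # weight update step
    b -= lr * grad_b
```

After training, `w` and `b` approximate the target slope 2 and intercept 1; a real network repeats the same update over many neurons and layers via backpropagation.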
  • The term "vehicle", as used in this application, shall particularly be understood to encompass passenger cars, trucks, semi-trailer tractors and buses.
  • By means of the proposed at least one artificial neural network, efficient methods can be learned which take into account information about the motion of the vehicle in order to perform a de-noising of interior vehicle sensor measurement data, or which extract desired information from the noisy sensor measurement data directly.
  • the at least one artificial neural network can learn the physical model and/or pre-processing techniques necessary to efficiently remove a portion of a sensor signal that is generated by the sensor due to being moved by vehicular movements and/or can directly extract desired information, such as a vehicle passenger-related physical quantity, for a system it is implemented in.
  • The motion sensor device may include at least one motion sensor, which may be formed by an accelerometer sensor or a gyroscope sensor. It is known in the art to employ gyroscope sensors (gyrometers) in passenger cars for measuring a directional change of the car. In combination with a measurement of the traveled distance, a determination of the car position is facilitated, even in regions without GPS signal, by extrapolation.
  • the step of carrying out a combined deep learning scheme further comprises a preceding step of carrying out a first deep learning scheme with a first artificial neural network, wherein the first deep learning scheme comprises a plurality of exemplary pairs of raw or processed data of the at least one sensor and at least one specific vehicle passenger-related physical quantity, and further comprises a preceding step of carrying out a second deep learning scheme with a second artificial neural network, wherein the second deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one motion sensor device and at least one specific motion of the vehicle body.
  • the step of providing the raw or processed data generated in the detection scenario comprises providing an output of the first artificial neural network in response to the generated raw or processed data of the at least one sensor in the detection scenario within the vehicle interior as one input to the artificial neural network trained by the combined deep learning scheme, and providing an output of the second artificial neural network in response to the generated raw or processed data of the at least one motion sensor device in the detection scenario within the vehicle interior as another input to the artificial neural network trained by the combined deep learning scheme.
  • the artificial neural networks can learn transformations of the sensor data and data of the at least one motion sensor device that are necessary to output sensor data with a removed motion-induced signal portion or, instead, can extract desired information from the noisy sensor measurement data directly, for instance vital signs, properties of the vital sign(s) like the number of breathing motions present in the vehicle interior, frequencies of the breathing motion, or an individual mass of the passengers.
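The two-stage arrangement described above (a first network for sensor data, a second for motion-sensor data, and a combined network consuming both outputs) can be sketched structurally. Weights are random here, and the layer sizes and two-value output are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(dims):
    """Randomly initialised fully connected network (structure sketch only)."""
    return [(rng.standard_normal((i, o)) * 0.1, np.zeros(o))
            for i, o in zip(dims[:-1], dims[1:])]

def forward(net, x):
    for W, b in net:
        x = np.tanh(x @ W + b)   # tanh as the non-linear activation
    return x

first_net = mlp([256, 64, 16])       # processes radar (sensor) data
second_net = mlp([64, 32, 8])        # processes motion-sensor data
combined_net = mlp([16 + 8, 16, 2])  # fused network, e.g. amplitude + frequency

radar_frame = rng.standard_normal(256)   # hypothetical processed radar snapshot
accel_frame = rng.standard_normal(64)    # hypothetical accelerometer window

fused_in = np.concatenate([forward(first_net, radar_frame),
                           forward(second_net, accel_frame)])
output = forward(combined_net, fused_in)
```

The point of the sketch is the data flow: the combined network never sees raw inputs, only the learned representations produced by the two pre-trained branches.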
  • the exemplary pairs of the first deep learning scheme are known a priori
  • the exemplary pairs of the second deep learning scheme are known a priori
  • the step of generating raw or processed data of the at least one sensor and raw or processed data of the at least one motion sensor device in a detection scenario is a step of generating processed data of at least one out of the at least one sensor and the at least one motion sensor device and includes applying a fast Fourier transform on raw data of at least one out of the at least one sensor and the at least one motion sensor device.
  • The deep learning can then be extended to include the frequency domain, which can enable an easier and faster learning process.
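A minimal sketch of such frequency-domain pre-processing follows; the sampling rate, window length and 0.25 Hz breathing frequency are assumed values:

```python
import numpy as np

fs = 20.0                                   # assumed sampling rate in Hz
t = np.arange(0.0, 32.0, 1.0 / fs)
breathing = np.sin(2 * np.pi * 0.25 * t)    # 0.25 Hz = 15 breaths per minute
rng = np.random.default_rng(0)
signal = breathing + 0.3 * rng.standard_normal(t.size)  # noisy measurement

# Fast Fourier transform of the raw sensor samples; the magnitude spectrum
# (rather than the raw time series) could then be fed to the network.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
```

With a 32 s window the frequency resolution is 1/32 Hz, so the 0.25 Hz breathing component falls exactly on a bin and dominates the spectrum despite the noise.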
  • the step of deriving an output representing one or more vehicle passenger-related physical quantity or quantities comprises using the derived output of the artificial neural network trained with the combined deep learning scheme for directly deriving the output representing one or more vehicle passenger-related physical quantity or quantities, and/or for determining and applying parameters for de-noising the signal of the at least one sensor by removing a portion of the sensor signal that is generated by the at least one sensor due to being moved by vehicular movements.
  • a flexible provision of an output can be enabled that represents one or more vehicle passenger-related physical quantity or quantities, based on the carried out combined deep learning scheme.
  • a sensor device for operation in an interior of a vehicle comprises at least one sensor and an evaluation and control unit.
  • the at least one sensor is sensitive to a relative motion to parts of and objects within the interior of the vehicle, wherein the at least one sensor is configured for detecting a vehicle passenger-related physical quantity.
  • the evaluation and control unit comprises at least one artificial neural network and is at least configured for
  • the combined deep learning scheme may comprise a plurality of exemplary pairs of raw or processed data of the at least one sensor and raw or processed data of the at least one motion sensor device on the one side and at least one specific vehicle passenger-related physical quantity on the other side, wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger-related physical quantity are known a priori.
  • At least one artificial neural network is formed by a deep neural network (DNN), for instance a recurrent neural network (RNN).
  • A DNN is an artificial neural network with multiple hidden layers of artificial neurons between the input and the output side. DNNs are known to be able to model complex non-linear relationships.
  • An RNN is a DNN whose connections between nodes form a directed graph along a sequence. RNNs can show dynamic temporal behavior for a time sequence. Both types of artificial neural network are beneficially employable in the proposed sensor device for operation in an interior of a vehicle.
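A minimal Elman-style recurrent step makes the "directed graph along a sequence" concrete; the dimensions and random weights are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
in_dim, hid_dim = 4, 8
Wx = rng.standard_normal((in_dim, hid_dim)) * 0.1   # input-to-hidden weights
Wh = rng.standard_normal((hid_dim, hid_dim)) * 0.1  # recurrent hidden-to-hidden weights
b = np.zeros(hid_dim)

def rnn(sequence):
    """Elman-style recurrence: the hidden state carries temporal context
    from one time step to the next."""
    h = np.zeros(hid_dim)
    for x_t in sequence:
        h = np.tanh(x_t @ Wx + h @ Wh + b)
    return h

seq = rng.standard_normal((10, in_dim))   # 10 time steps of sensor features
h_final = rnn(seq)
```

The recurrent weight matrix `Wh` is what gives the network its dynamic temporal behavior: the final hidden state depends on the entire sequence, not just the last sample.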
  • the sensor device is formed as a radar sensor system and the at least one sensor is formed by a radar sensor.
  • the radar sensor includes a radar transmitting unit having at least one radar transmitting antenna and being configured for transmitting radar signals towards at least a portion of the vehicle interior.
  • The radar sensor further comprises a radar receiving unit having at least one radar receiving antenna and being configured for receiving radar signals that have been transmitted by the radar transmitting unit and have been reflected by parts of and objects within the interior of the vehicle.
  • the evaluation and control unit is configured for
  • the evaluated radar signal information may be formed by Doppler radar signal information, which is well known in the art.
  • The evaluated radar signal information comprises angular information, i.e. azimuthal and/or elevational quantities.
  • the sensor device further includes at least one motion sensor device that is configured for providing the motion sensor data.
  • the at least one motion sensor device is an integral part of the sensor device.
  • A sensor device, in particular a radar sensor system, with a compact design can be provided.
  • a systematic error for the motion sensor data due to a spatial separation of the motion sensor device and the sensor device can be avoided.
  • the sensor device and the at least one motion sensor device forming an integral part of the sensor device are rigidly attached to the vehicle body. In this way, any motion of the vehicle body can properly be detected.
  • the at least one motion sensor device comprises at least one accelerometer sensor that is designed as a micro-electromechanical system (MEMS).
  • The use of the disclosed sensor device, including at least one radar sensor as the at least one sensor, in an automotive vehicle interior sensing system for detection of vital sign characteristics is proposed.
  • the benefits described in context with the disclosed method of operating a sensor device in an interior of a vehicle apply to the use of the disclosed sensor device for operation in an interior of a vehicle for vital sign detection to the full extent.
  • Fig. 1 schematically shows a conventional radar sensor system for detecting passenger vital signs in an interior of a vehicle in a side view, installed in the vehicle,
  • Fig. 2 is a mechanical equivalent of the configuration pursuant to Fig. 1
  • Fig. 3 schematically shows a possible embodiment of a sensor device in accordance with the invention, formed as a radar sensor system and being installed in a vehicle, in a side view, and
  • Fig. 4 schematically illustrates an evaluation and control unit of the radar sensor system pursuant to Fig. 3, and further illustrates steps of a method of operating the radar sensor system in accordance with the invention for detecting vehicle passenger-related physical quantities.
  • Fig. 3 schematically shows, in a side view, a possible embodiment of a sensor device in accordance with the invention.
  • the sensor device, which is formed as a radar sensor system 10, is installed in an interior 18 of a vehicle 16, which is identically designed to the vehicle 16 pursuant to Fig. 1.
  • the sensor device is configured for operation in the interior 18 of the vehicle 16, and for use in an automotive vehicle interior sensing system for vital sign detection.
  • the person 32 in the interior 18 of the vehicle 16 is the driver, who is located at and is occupying a seat 20 of the vehicle 16, namely the driver’s seat, thus forming a seat/person system 36.
  • the vehicle 16 is shown to be driving on a roadway 42 having a vertical road profile 44, which is shown in Fig. 3 in an exaggerated manner for clarity purposes.
  • the roughness of the vertical road profile 44 causes vertical motion 38 of vehicle wheels 22.
  • the vertical motion 38 of the wheels 22 is transferred to a vehicle body 24 via a vehicle suspension system (not shown), which is identical to the suspension system 26 shown in Fig. 1, generating forced vibrations of the seat 20 and the person 32 occupying the seat 20. Additional forced vibrations of the seat/person system 36 are induced by mechanical vibrations 40 of a running engine of the vehicle 16.
  • the radar sensor system 10 includes a radar sensor 12 that is arranged in front of the person 32 at an inside of a roof 28 of the vehicle 16, and that is configured for detecting vehicle passenger-related physical quantities that are given by a breathing motion 34, characterized by an amplitude and a breathing frequency.
  • the radar sensor 12 includes a radar transmitting unit and a radar transceiver antenna that is directed backwards towards the vehicle interior 18 and is configured for transmitting radar signals towards the vehicle interior 18.
  • the radar sensor 12 is sensitive to a relative motion between the radar transceiver antenna and the person's chest, and is further configured for receiving radar signals that have been transmitted by the radar transmitting unit and have in particular been reflected by the person's chest.
  • the radar sensor 12 is also sensitive to a relative motion of the radar transceiver antenna to parts within the interior 18 of the vehicle 16.
  • the radar sensor system 10 moreover comprises an evaluation and control unit 46, shown in detail in Fig. 4, which is configured for evaluating Doppler information from the radar signals received by the radar receiving unit in a detection scenario.
  • the evaluation and control unit 46 comprises a processor unit and a digital data memory unit (not shown) to which the processor unit has data access.
  • the evaluation and control unit 46 further includes a plurality of three artificial neural networks, which are formed as deep neural networks 48, 50, 52.
  • the evaluated Doppler information is also a superposition of Doppler information generated by the person’s breathing motion 34 and Doppler information generated by the forced vibrations of the seat/person system 36 that are mainly induced by the vertical road profile 44 and the vehicle engine vibrations 40.
  • Other exterior sources that may as well induce forced vibrations of the seat/person system 36 are, for instance, strong winds or heavy oncoming traffic passing in close distance.
  • the radar sensor system 10 includes a motion sensor device, which is formed by an accelerometer device 14 that is arranged at the inside of the vehicle roof 28 (Fig. 1).
  • the accelerometer device 14 comprises a three-axis accelerometer sensor that is designed as an on-chip micro-electromechanical system (MEMS).
  • a digital data link (wireless or by wire connection), indicated in Fig. 3 by a dashed line, is provided between the accelerometer device 14 and the evaluation and control unit 46.
  • the accelerometer device 14 is configured to provide digital accelerometer data to the evaluation and control unit 46 via the digital data link.
  • the accelerometer data contain information regarding vibration or motion of the vehicle body 24 in the detection scenario.
  • the evaluation and control unit 46 is configured to receive digital accelerometer data from the accelerometer device 14 via the digital data link.
  • the accelerometer device 14 is arranged near a top of the inside of the vehicle roof 28, spaced apart from the rest of the radar sensor system 10.
  • the accelerometer device 14 may be an integral part of the radar sensor system, and, in this way, may be arranged close to the radar transceiver antenna.
  • Fig. 4 schematically illustrates the evaluation and control unit 46 of the radar sensor system 10 pursuant to Fig. 3, and further illustrates steps of the method of operating the radar sensor system 10 in accordance with the invention for detecting vehicle passenger-related physical quantities.
  • the evaluation and control unit 46 is equipped with a software module.
  • the method steps to be conducted are converted into a program code of the software module.
  • the program code is implemented in the digital data memory unit of the evaluation and control unit 46 and is executable by the processor unit of the evaluation and control unit 46.
  • in a first step 70 of the method, processed data sensed by the radar sensor 12 is provided to a first deep neural network (DNN) 48 of the plurality of three DNNs 48, 50, 52.
  • a data connection is provided within the evaluation and control unit 46 from an output port of the processor unit to an input side 54 of the first DNN 48.
  • the processed data sensed by the radar sensor 12 are given by the evaluated Doppler information from the radar signals received by the radar receiving unit in a detection scenario.
  • in a step 72 of the method, which may be executed before, simultaneously with or after the first step 70, processed data sensed by the accelerometer device 14 is provided to a second DNN 50 of the plurality of three DNNs 48, 50, 52.
  • a data connection is provided from an output port of the accelerometer device 14 to an input side 56 of the second DNN 50.
  • a combined supervised learning scheme is carried out in another step 74 of the method.
  • This step comprises a preceding step 76 of carrying out a first supervised learning scheme with the first DNN 48 and another preceding step 78 of carrying out a second supervised learning scheme with the second DNN 50.
  • the first learning scheme comprises a plurality of exemplary pairs of processed data sensed by the radar sensor 12 and specific vehicle passenger-related physical quantities, which are given by the breathing amplitude and the breathing frequency.
  • the exemplary pairs of the first learning scheme are known a priori.
  • the second learning scheme comprises a plurality of exemplary pairs of processed data sensed by the accelerometer device 14 and a specific motion of the vehicle body 24.
  • the exemplary pairs of the second learning scheme are known a priori.
  • Both the first DNN 48 and the second DNN 50 use the data obtained from the respective exemplary pairs in order to learn a function that will be used to map a future input to an output.
  • the output 60 of the first DNN 48 will contain main features in the processed data sensed by the radar sensor 12 occurring in the event of a certain combination of the breathing amplitude and the breathing frequency of the vehicle passenger 32.
  • the output 62 of the second DNN 50 will contain main features in the processed data sensed by the accelerometer device 14 occurring in the event of a certain motion of the vehicle body 24.
  • in a next step 80 for carrying out the combined supervised learning scheme with the third DNN 52, the output 60 of the first DNN 48 is provided to an input side 58 of the third DNN 52 of the plurality of three DNNs 48, 50, 52 as one input.
  • the output 62 of the second DNN 50 is provided to the input side 58 of the third DNN 52 as another input.
  • the step 80 of providing the output 60 of the first DNN 48 and providing the output 62 of the second DNN 50 to the input side 58 of the third DNN 52 is carried out by a step 82 of combining the output 60 of the first DNN 48 and the output 62 of the second DNN 50.
  • the combined supervised learning scheme comprises the plurality of exemplary pairs of processed data sensed by the radar sensor 12 and processed data sensed by the accelerometer device 14 on the one side and the specific vehicle passenger-related physical quantities, which are given by the breathing amplitude and the breathing frequency, on the other side.
  • the exemplary pairs of processed data and the specific vehicle passenger-related physical quantities are known a priori.
  • the radar sensor system 10 is ready for operation in an actual detection scenario.
  • the data flow scheme shown in Fig. 4 is the same for carrying out the supervised learning and for processing data in the actual detection scenario.
  • processed data of the radar sensor 12 and processed data of the accelerometer device 14 are generated in another step 84 of the method.
  • the processed data of the radar sensor 12 and the processed data of the accelerometer device 14 are provided as input to the third DNN 52 that is trained by the combined supervised learning scheme.
  • an output is derived directly by the third DNN 52, based on the carried out combined supervised learning scheme, that represents the breathing amplitude, the breathing frequency or the number of passengers 32 present in the interior 18 of the vehicle 16.
  • the directly derived output may represent a vital sign motion or other vital sign characteristics.
  • the output 64 is formatted as an output vector 66 and can be expressed as (number n, breathing amplitude 1, breathing frequency 1, breathing amplitude 2, breathing frequency 2, ..., breathing amplitude n, breathing frequency n), wherein placeholders for more than one vehicle passenger 32 may be filled up with zeros if not detected.
  • the number n is equal to or less than five.
  • the output 64 of the third DNN 52 may contain a plurality of parameters for de-noising the processed radar signal sensed by the radar sensor 12 for removing a portion of the processed radar signal that has been generated by the radar sensor 12 due to being moved by vehicular movements.
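By way of illustration, the layout of the output vector 66 described above can be sketched as follows. This is a hypothetical sketch (the function names and the use of Python lists are not part of the disclosure); it only encodes the stated layout: the number n first, then amplitude/frequency pairs, zero-padded placeholders for undetected passengers, with n equal to or less than five.

```python
# Hypothetical sketch of the output vector 66 format:
# (n, amp_1, freq_1, ..., amp_5, freq_5), zero-padded.

MAX_PASSENGERS = 5  # "the number n is equal to or less than five"

def pack_output_vector(passengers):
    """passengers: list of (breathing_amplitude, breathing_frequency) pairs."""
    if len(passengers) > MAX_PASSENGERS:
        raise ValueError("at most five passengers are represented")
    vec = [float(len(passengers))]          # number n
    for amp, freq in passengers:
        vec.extend([amp, freq])
    # placeholders for undetected passengers are filled up with zeros
    vec.extend([0.0] * (2 * (MAX_PASSENGERS - len(passengers))))
    return vec

def unpack_output_vector(vec):
    """Recover the (amplitude, frequency) pairs of the n detected passengers."""
    n = int(vec[0])
    return [(vec[1 + 2 * i], vec[2 + 2 * i]) for i in range(n)]
```

For two detected passengers, `pack_output_vector` yields an 11-element vector whose trailing six entries are zeros.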

Abstract

A method of operating a sensor device (10) in an interior (18) of a vehicle (16) for detecting at least one vehicle passenger-related physical quantity comprises steps of - providing (70, 72) data sensed by the at least one sensor (12) and data sensed by at least one motion sensor device (14) that provides information regarding motion of a body (24) of the vehicle (16) as input data to at least one artificial neural network (48, 50, 52), - carrying out (74) a combined deep learning scheme with the at least one artificial neural network (52), wherein the combined deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one sensor and raw or processed data sensed by the at least one motion sensor device on the one side and at least one specific vehicle passenger-related physical quantity on the other side, and wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger-related physical quantity are known a priori, - generating (84) data of the at least one sensor (12) and data of the at least one motion sensor device (14) in a detection scenario, - providing (86, 88) the data generated in the detection scenario as an input to at least one artificial neural network (52) trained by the combined deep learning scheme, and - by operating at least the artificial neural network (52) for processing the provided input data, deriving (90) an output representing one or more vehicle passenger-related physical quantity or quantities, based on the carried out combined deep learning scheme.

Description

Removing noise caused by vehicular movement from sensor signals
using Deep Neural Networks
Technical field
[0001] The invention relates to a method of operating a sensor device in an interior of a vehicle for obtaining vehicle passenger-related physical quantities.
Background of the Invention
[0002] It is known in the art to use radar technology for automotive seat occupant detection systems. Occupancy sensors based on radar technology offer advantages in comparison to other occupancy detection methods as their operation is contact-free and unnoticeable for vehicle occupants. Moreover, radar sensors can easily be integrated in the vehicle interior, for example behind plastic covers and textiles.
[0003] Vehicle seat occupancy detection systems are nowadays widely used in vehicles, in particular in passenger cars, for instance for detection of left-behind pets and/or children, vital sign monitoring, vehicle seat occupancy detection for seat belt reminder (SBR) systems, or anti-theft alarm. Such vehicle seat occupancy detection systems can be employed for providing a seat occupancy signal for various appliances, for instance for the purpose of a seat belt reminder (SBR) system or an activation control for an auxiliary restraint system (ARS).
[0004] An output signal of a seat occupant detection and/or classification system is usually transferred to an electronic control unit of the vehicle to serve, for instance, as a means of assessing a potential activation of an installed vehicle passenger restraint system, such as an airbag.
[0005] Further valuable information, usable as important input for Advanced Driver Assistance Systems (ADAS) could be provided by monitoring a vital sign of the detected person, which has been proposed in the art.
[0006] For instance, WO 2013/037399 A1 describes a system and method for detecting a vital-related signal pattern of a seated person in a vehicle seat. The seat comprises a substantially horizontal base and a substantially vertical backrest having a front surface accommodating the back of the seated person when in use, and a rear surface. The system further comprises at least one Doppler radar arranged behind the front surface of the backrest, in such a way that a main radiation lobe of the emitter/receiver of the Doppler radar is focused towards the front surface of the backrest, by which a movement created by a vital sign of the person can be detected and a vital-related signal can be obtained. Moreover, the system comprises a module for detecting, based on a radar signal obtained by the Doppler radar, a vital-related signal pattern of the seated person.
[0007] Due to their operation principle, many sensors, in particular for automotive applications, generate undesired signals in a moving environment such as a moving vehicle. An application of sensor-based devices that include at least one such sensor becomes more challenging, as a vehicle is naturally undergoing some vertical movement caused by, for instance, driving on a road with an uneven road profile, by vibrations from a running car engine or by wind gusts. In such scenarios, the vehicle motion induces a movement of the sensor device and the passengers. Consequently, a sensor-based device such as an interior RADAR device with a radar sensor not only measures the motions that are desired to be observed, such as a breathing motion or a heart rate, but also captures unwanted passenger motions, which are induced by the vehicle motions. Further, the sensor device itself may undergo a different motion than the passengers do, which, for example in the case of RADAR-based devices, causes complex relative motions in the observations. This leads to noisy sensor measurements, in which it is more difficult to detect and monitor, e.g., vital signs. A de-noising, i.e. a removal of a motion-induced portion from such signals, becomes challenging when using conventional de-noising techniques.
[0008] It has therefore been proposed in the art to employ additional sensors that are sensitive to vehicle movements for distinguishing between a vital signal portion and a signal portion induced by vehicle movement.
[0009] By way of example, WO 2016/038148 A1 (US 2017/0282828 A1) describes a method for sensing an occupancy status within an automotive vehicle. The method uses a radar sensor system having an antenna system, at least one sensor and processing circuitry. The method comprises a step of illuminating, using the antenna system, at least one occupiable position within the vehicle with an outgoing radar signal; a step of receiving, using the at least one sensor, at least one sensor signal reflected as a result of the outgoing radar signal; a step of obtaining accelerometer data values from at least one accelerometer, wherein the accelerometer data contain information regarding vibration or motion of the automotive vehicle, and a step of supplying the accelerometer data to the processing circuitry; and a step of operating the processing circuitry for generating, based on the at least one sensor signal and on the accelerometer data, one or more occupancy status signals, wherein the occupancy status signal indicates a property related to the at least one occupiable position.
[0010] The method supplies accelerometer data to classification software of the radar sensor system and is therefore able to compensate for motion or vibration of the vehicle. The information regarding vibration or motion can be taken into consideration when a classification (interior human detection) algorithm needs to classify. This information can help to filter out exterior influences that might falsify the classification (passing traffic, wind shakes, various vibrations of the engine or any exterior event leading to a vehicle movement).
[0011] A typical scenario for conventional detection and monitoring of, for instance, vital signs of one or several persons 32 seated in an interior 18 of a vehicle 16 by radar is illustrated in Fig. 1 in a side view. A radar-based sensor device RS with one radar sensor is shown being installed in an interior 18 of the vehicle 16, which is designed as a sedan passenger car. While driving on a roadway 42, the vehicle 16 is undergoing some motion, for example, due to an uneven vertical road profile 44, which is shown in Fig. 1 in an exaggerated manner for clarity purposes, or even by a running engine of the vehicle 16. The roughness of the vertical road profile 44 causes vertical motion 38 of vehicle wheels 22. The vertical motion 38 of the wheels 22 is transferred to a vehicle body 24 via a vehicle suspension system 26, generating forced vibrations of the seat 20 and the person 32 occupying the seat 20, forming a seat/person system 36. In addition, forced vibrations of the seat 20 and the person 32 are induced by the mechanical vibrations 40 of the running vehicle engine. In other words, the vehicle motion induces a movement of the radar-based sensor device RS and also causes the passenger(s) 32 to move away from a rest position. This additional motion is going to introduce noise to the sensor measurement signal. In order to remove this noise from the sensor measurement signal, information about the motion of the vehicle body 24 can be provided, for instance by using an accelerometer device ACC. Thus, a breathing motion 34 of the person 32 is superimposed by the forced vibrations of the seat 20 and the person 32 that are mainly induced by the vertical road profile 44 and the vehicle engine vibrations 40. Other exterior sources that may as well induce forced vibrations of the seat 20 and the person 32 are, for instance, strong winds or heavy oncoming traffic passing at a close distance.
[0012] As shown in Fig. 1, a physical model can be introduced for simulating a relation of motions of the wheels 22 of the vehicle 16, the suspension system 26 connecting the wheels 22 to the vehicle body 24, the vehicle body 24, the seat 20, the radar-based sensor device RS and the accelerometer device ACC that are fixedly attached to the vehicle body 24.
[0013] In Fig. 2, a possible embodiment of a mathematical model is described, which takes the motion of the vehicle body 24 as an input and then describes the resulting motion of the seat 20. The corresponding differential equation describing this specific model can be expressed as
$M\ddot{x}(t) + D\dot{x}(t) + s\,x(t) = D\dot{y}(t) + s\,y(t)$, wherein x denotes a displacement of the seat 20 and the chest of the person 32, respectively (Fig. 1), y denotes a displacement of the interior 18 of the vehicle 16 and parameter M denotes the mass of the seat/person system 36. Further parameters are damping coefficient D and stiffness coefficient s (Fig. 2), representing a damping member and a resilient member between the seat 20 and the vehicle body 24, respectively, which are taken to be constant over time.
[0014] The accelerometer data contain information regarding vibration or motion of the vehicle body 24. In this way, the coupled motion between the seat 20 and the radar-based sensor device RS can be described and solved in a mathematically simple way, for instance by applying the simple Euler method for solving differential equations.
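The "simple Euler method" mentioned above can be sketched as follows for the differential equation M x''(t) + D x'(t) + s x(t) = D y'(t) + s y(t). The parameter values for M, D and s and the sampled body displacement are hypothetical placeholders (the description notes that these constants are a priori unknown in practice); the sketch only illustrates the numerical integration itself.

```python
# Sketch of explicit Euler integration of the seat/person model
# M*x'' + D*x' + s*x = D*y' + s*y, with hypothetical parameter values.

import math

def simulate_seat_motion(y, dt, M=80.0, D=500.0, s=20000.0):
    """y: sampled vehicle-body displacement; returns seat displacement samples x."""
    x, v = 0.0, 0.0                 # seat displacement and velocity
    xs = []
    for k in range(len(y) - 1):
        ydot = (y[k + 1] - y[k]) / dt                # body velocity (finite difference)
        a = (D * (ydot - v) + s * (y[k] - x)) / M    # M*x'' = D*(y'-x') + s*(y-x)
        x += dt * v                                  # Euler step for displacement
        v += dt * a                                  # Euler step for velocity
        xs.append(x)
    return xs

# example: sinusoidal body displacement, e.g. from an uneven road profile
dt = 0.001
body = [0.01 * math.sin(2 * math.pi * 2.0 * k * dt) for k in range(2000)]
seat = simulate_seat_motion(body, dt)
```

The forced vibration of the seat follows the body excitation with an amplitude and phase determined by M, D and s, which is exactly the coupled motion referred to above.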
[0015] However, all the constants in the above differential equation are a priori unknown, and it is quite complex to determine them in practice. Furthermore, it is not clear whether the differential equation is able to describe the physical system accurately enough in the first place.
[0016] CN 105 769 173 A describes a method for removing vehicle motion noise in a three-lead ECG signal. The method uses a three-axis acceleration sensor to collect motion data as a neural network input sample, and uses a format conversion and normalization method before the neural network training. The dynamic ECG data of the human body during stationary and motion are preprocessed and the deviation between the two is used as the supervision of the neural network. Then the artificial neural network learning algorithm is used to establish the noise model in the optimized way of proportional conjugate gradient. De-noising is accomplished by subtracting the corresponding motion noise from the dynamic ECG data. The method can effectively remove vehicle motion noise during the ECG monitoring process, and obtain accurate ECG data.
[0017] US 2016/354027 A1 describes an occupant support system that includes a vehicle seat and an electronics system for the vehicle seat. The electronics system includes a sensor system configured to obtain sensor data, and a computer coupled to the sensor system to process the sensor data and perform a predetermined action using the sensor data (abstract). In one embodiment, a ballistocardiogram (BCG) signal can be measured by radar system waves on very high frequencies. An option of employing a technique for detecting the BCG signal is described that comprises the use of neural network algorithms on a field-programmable gate array (FPGA) to detect heartbeats from the BCG signal.
Object of the invention
[0018] It is therefore an object of the invention to provide a method that is capable of removing a portion of a sensor signal that is generated by the sensor due to being moved by vehicular movements from a signal portion that is generated by the sensor due to motions that are desired to be observed, such as a vital sign of a person or pet, and/or that is capable of directly extracting desired information from a sensor signal comprising a signal portion generated due to a sensor being moved by vehicular movements and a signal portion generated due to motions that are desired to be observed, in particular vital signs of a person or pet.
General Description of the Invention
[0019] In one aspect of the present invention, the object is achieved by a method of operating a sensor device in an interior of a vehicle. The sensor device has at least one sensor that is sensitive to a relative motion to parts of and objects within the interior of the vehicle, wherein the at least one sensor is configured for detecting at least one vehicle passenger-related physical quantity.
[0020] The proposed method comprises at least steps of
providing raw or processed data sensed by the at least one sensor and raw or processed data sensed by at least one motion sensor device that is configured to provide information regarding vibration or motion of a body of the vehicle as input data to at least one artificial neural network,
carrying out a combined deep learning scheme with the at least one artificial neural network, wherein the combined deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one sensor and raw or processed data sensed by the at least one motion sensor device on the one side and at least one specific vehicle passenger-related physical quantity on the other side, and wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger-related physical quantity are known a priori,
generating raw or processed data of the at least one sensor and raw or processed data of the motion sensor device in a detection scenario within the vehicle interior,
providing the raw or processed data generated in the detection scenario as an input to at least one artificial neural network trained by the combined deep learning scheme, and
by operating at least the artificial neural network trained with the combined deep learning scheme for processing the provided input data, deriving an output representing one or more vehicle passenger-related physical quantity or quantities, based on the carried out combined deep learning scheme.
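The sequence of steps above can be sketched end to end. The following toy example substitutes a linear least-squares model for the artificial neural network and uses synthetic data throughout; it only illustrates the flow from exemplary pairs known a priori, through training, to an output derived in a detection scenario.

```python
# Toy stand-in for the combined deep learning scheme: a linear model
# replaces the neural network; all data are synthetic placeholders.

import numpy as np

rng = np.random.default_rng(0)

# exemplary pairs known a priori: sensor data plus motion-sensor data
# (inputs) and a specific passenger-related quantity (target)
radar_features = rng.normal(size=(200, 8))    # raw/processed sensor data
motion_features = rng.normal(size=(200, 4))   # motion sensor device data
X = np.hstack([radar_features, motion_features])
true_w = rng.normal(size=X.shape[1])
target_quantity = X @ true_w                  # e.g. a breathing frequency

# "training": fit the stand-in model on the exemplary pairs
w, *_ = np.linalg.lstsq(X, target_quantity, rcond=None)

# detection scenario: newly generated data are fed to the trained model,
# which derives the passenger-related output directly
x_new = np.hstack([rng.normal(size=8), rng.normal(size=4)])
estimate = float(x_new @ w)
```

Because both input streams enter the model jointly, the learned mapping can exploit the motion-sensor data to discount the vehicle-induced signal portion, which is the point of the combined scheme.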
[0021] Non-limiting examples of vehicle passenger-related physical quantities are vital signs of one or several passengers in the vehicle interior, a number of breathing motions present in the vehicle interior, individual breathing frequencies, a presence of a baby, a mass of the passenger, and so forth.
[0022] Artificial neural networks are known to comprise a plurality of interconnected artificial neurons and to have an input side and an output side. As is well known in the field of artificial neural networks, each artificial neuron of the plurality of interconnected artificial neurons (also called nodes) can transmit a signal to another artificial neuron connected to it, and the received signal can further be processed and transmitted to the next artificial neuron. The output of each artificial neuron may be calculated using a non-linear function of the sum of its inputs. In a learning process, weights of the non-linear function are usually adjusted. A complex task may be learned by determining a set of weights for the artificial neurons such that the output signal of the artificial neural network is close to a desired output signal, which is performed when the artificial neural network is trained. Multiple methods for training an artificial neural network are known in the art. In supervised learning, a function is learned that maps an input to an output based on exemplary input-output pairs. An artificial neural network that has been submitted to a learning scheme is often called a "trained" artificial neural network.
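A minimal sketch of the neuron model described above, assuming a sigmoid as the non-linear function (any non-linear activation could serve in its place):

```python
# One artificial neuron: a non-linear function of the weighted sum
# of its inputs, here with a sigmoid non-linearity (an assumption).

import math

def neuron(inputs, weights, bias):
    """Output of a single artificial neuron."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid of the weighted sum
```

Supervised learning would then adjust `weights` and `bias` so that, over the exemplary input-output pairs, the network output approaches the desired output.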
[0023] The term “vehicle”, as used in this application, shall particularly be understood to encompass passenger cars, trucks, semi-trailer tractors and buses.
[0024] The phrase “being configured to”, as used in this application, shall in particular be understood as being specifically programmed, laid out, furnished or arranged.
[0025] It is an insight of the present invention that, by using the proposed at least one artificial neural network, efficient methods can be learned which can take into account information about the motion of the vehicle in order to perform a de-noising of interior vehicle sensor measurement data or can extract desired information from the noisy sensor measurement data directly. Instead of defining a mechanical model, for which the physical parameters are a priori unknown, and/or instead of defining pre-processing techniques and corresponding best parameters by hand, the at least one artificial neural network can learn the physical model and/or pre-processing techniques necessary to efficiently remove a portion of a sensor signal that is generated by the sensor due to being moved by vehicular movements and/or can directly extract desired information, such as a vehicle passenger-related physical quantity, for a system it is implemented in.
[0026] Since the above-mentioned physical models and parameters are different for each specific type of vehicle, with the proposed method an approximation of the true physical relation of the motions can be learned from sensor data and data of at least one motion sensor device of each vehicle, and the pre-processing techniques necessary to successfully extract the desired information from the sensor measurement can be adapted accordingly.
[0027] Preferably, the motion sensor device may include at least one motion sensor, which may be formed by an accelerometer sensor or a gyroscope sensor. It is known in the art to employ gyroscope sensors (gyrometers) in passenger cars for measuring a car directional change. In combination with a measurement traveled distance, a determination of a car position is facilitated even in regions without GPS signal by extrapolation.
[0028] In a preferred embodiment of the method, the step of carrying out a combined deep learning scheme further comprises a preceding step of carrying out a first deep learning scheme with a first artificial neural network, wherein the first deep learning scheme comprises a plurality of exemplary pairs of raw or processed data of the at least one sensor and at least one specific vehicle passenger-related physical quantity, and further comprises a preceding step of carrying out a second deep learning scheme with a second artificial neural network, wherein the second deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one motion sensor device and at least one specific motion of the vehicle body. Further, the step of providing the raw or processed data generated in the detection scenario comprises providing an output of the first artificial neural network in response to the generated raw or processed data of the at least one sensor in the detection scenario within the vehicle interior as one input to the artificial neural network trained by the combined deep learning scheme, and providing an output of the second artificial neural network in response to the generated raw or processed data of the at least one motion sensor device in the detection scenario within the vehicle interior as another input to the artificial neural network trained by the combined deep learning scheme.
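The preferred embodiment above can be sketched structurally. The layer sizes and the random, untrained weights below are illustrative assumptions only; the sketch shows a first network for the sensor data and a second network for the motion-sensor data, whose outputs are combined into the input of a third network.

```python
# Structural sketch of the two feature networks feeding a third,
# combined network. Shapes and weights are hypothetical; the networks
# are untrained and serve only to show the data flow.

import numpy as np

rng = np.random.default_rng(1)

def mlp(sizes):
    """Multilayer perceptron with random weights; returns a forward function."""
    layers = [(rng.normal(size=(m, n)) * 0.1, np.zeros(n))
              for m, n in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for W, b in layers:
            x = np.tanh(x @ W + b)   # non-linear activation per layer
        return x
    return forward

first_net = mlp([8, 16, 6])    # sensor data -> feature output
second_net = mlp([4, 16, 6])   # motion-sensor data -> feature output
third_net = mlp([12, 16, 11])  # combined input -> final output

sensor_in = rng.normal(size=8)
motion_in = rng.normal(size=4)
combined = np.concatenate([first_net(sensor_in), second_net(motion_in)])
output = third_net(combined)   # e.g. passenger-related quantities
```

In training, the first two networks would be fitted on their respective exemplary pairs before the combined scheme is carried out on the third.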
[0029] It is noted herewith that the terms "first", "second", etc. are used in this application for distinction purposes only, and are not meant to indicate or anticipate a sequence or a priority in any way.
[0030] In this way, instead of defining the physical model and the parameters by hand, the artificial neural networks can learn transformations of the sensor data and data of the at least one motion sensor device that are necessary to output sensor data with a removed motion-induced signal portion or, instead, can extract desired information from the noisy sensor measurement data directly, for instance vital signs, properties of the vital sign(s) like the number of breathing motions present in the vehicle interior, frequencies of the breathing motion, or an individual mass of the passengers.
[0031] Preferably, the exemplary pairs of the first deep learning scheme are known a priori, and the exemplary pairs of the second deep learning scheme are known a priori.
[0032] Preferably, the step of generating raw or processed data of the at least one sensor and raw or processed data of the at least one motion sensor device in a detection scenario is a step of generating processed data of at least one out of the at least one sensor and the at least one motion sensor device and includes applying a fast Fourier transform on raw data of at least one out of the at least one sensor and the at least one motion sensor device. In this way, the deep learning can be extended to include the frequency domain, which can enable an easier and faster learning process.
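As an illustration of this preprocessing step, the following sketch applies a fast Fourier transform to a window of raw time-domain samples and locates the dominant frequency. The sampling rate, window length and test tone are assumptions chosen for illustration, not values from this application:

```python
import numpy as np

def preprocess_fft(raw_samples, sample_rate_hz):
    """Turn a window of raw time-domain sensor samples into a magnitude
    spectrum usable as frequency-domain input features."""
    window = raw_samples * np.hanning(len(raw_samples))  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(raw_samples), d=1.0 / sample_rate_hz)
    return freqs, spectrum

# Synthetic raw data: a 0.3 Hz "breathing" tone sampled at 20 Hz for 30 s
fs = 20.0
t = np.arange(0, 30, 1.0 / fs)
raw = np.sin(2 * np.pi * 0.3 * t)

freqs, spectrum = preprocess_fft(raw, fs)
peak_hz = freqs[np.argmax(spectrum)]  # dominant frequency of the window
```

Such a magnitude spectrum could be fed to the network instead of, or alongside, the raw time-domain samples.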
[0033] Preferably, the step of deriving an output representing one or more vehicle passenger-related physical quantity or quantities comprises using the derived output of the artificial neural network trained with the combined deep learning scheme for directly deriving the output representing one or more vehicle passenger-related physical quantity or quantities, and/or for determining and applying parameters for de-noising the signal of the at least one sensor by removing the portion of the sensor signal that is generated by the at least one sensor due to being moved by vehicular movements. Depending on the application and on whether de-noised data of the at least one sensor and/or the at least one motion sensor device are required, this enables a flexible provision of an output that represents one or more vehicle passenger-related physical quantity or quantities, based on the combined deep learning scheme carried out.
[0034] In another aspect of the invention, a sensor device for operation in an interior of a vehicle is provided that comprises at least one sensor and an evaluation and control unit.

[0035] The at least one sensor is sensitive to a relative motion to parts of and objects within the interior of the vehicle, wherein the at least one sensor is configured for detecting a vehicle passenger-related physical quantity.
[0036] The evaluation and control unit comprises at least one artificial neural network and is at least configured for
evaluating signals received by the at least one sensor in a detection scenario,
receiving motion sensor data from at least one motion sensor device, wherein the motion sensor data contain information regarding vibration or motion of a body of the vehicle in the detection scenario,
providing the evaluated signal information and the information regarding vibration or motion of a body of the vehicle as input data to the at least one artificial neural network, and
operating the at least one artificial neural network that has been trained with a combined deep learning scheme for processing the provided input data to derive an output representing one or more vehicle passenger-related physical quantity or quantities, based on the combined deep learning scheme.
[0037] The benefits described in context with the disclosed method of operating a sensor device in an interior of a vehicle apply to the sensor device for operation in an interior of a vehicle for detecting a vehicle passenger-related physical quantity to the full extent.
[0038] The combined deep learning scheme may comprise a plurality of exemplary pairs of raw or processed data of the at least one sensor and raw or processed data of the at least one motion sensor device on the one side and at least one specific vehicle passenger-related physical quantity on the other side, wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger-related physical quantity are known a priori.
[0039] Preferably, at least one artificial neural network is formed by a deep neural network (DNN), for instance a recurrent neural network (RNN). A DNN is an artificial neural network with multiple hidden layers of artificial neurons between the input and the output side. DNNs are known to be able to model complex non-linear relationships. An RNN is a DNN whose connections between nodes form a directed graph along a sequence. RNNs can show dynamic temporal behavior for a time sequence. Both types of artificial neural network are beneficially employable in the proposed sensor device for operation in an interior of a vehicle.
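For illustration only, a minimal feed-forward DNN with multiple hidden layers between input and output can be sketched as follows; the layer sizes and random initialization are arbitrary choices, not taken from this application:

```python
import numpy as np

def relu(x):
    # rectified-linear activation, a common choice for hidden layers
    return np.maximum(0.0, x)

class TinyDNN:
    """Minimal feed-forward network with multiple hidden layers of
    artificial neurons between the input and the output side."""

    def __init__(self, layer_sizes, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = [rng.normal(0.0, 0.1, (m, n))
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(n) for n in layer_sizes[1:]]

    def forward(self, x):
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            x = relu(x @ W + b)                          # hidden layers
        return x @ self.weights[-1] + self.biases[-1]    # linear output layer

net = TinyDNN([16, 32, 32, 4])  # two hidden layers make the network "deep"
out = net.forward(np.ones(16))
```

An RNN would additionally feed part of each output back into the next time step; the sketch above shows only the feed-forward case.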
[0040] In preferred embodiments of the sensor device, the sensor device is formed as a radar sensor system and the at least one sensor is formed by a radar sensor. The radar sensor includes a radar transmitting unit having at least one radar transmitting antenna and being configured for transmitting radar signals towards at least a portion of the vehicle interior. The radar sensor further comprises a radar receiving unit having at least one radar receiving antenna and being configured for receiving radar signals that have been transmitted by the radar transmitter unit and have been reflected by parts of and objects within the interior of the vehicle. The evaluation and control unit is configured for
evaluating information from radar signals received by the at least one radar sensor in the detection scenario, and for
providing the evaluated radar signal information and the information regarding vibration or motion of a body of the vehicle as input data to the at least one artificial neural network.
[0041] In this way, contactless and reliable detection of a vehicle passenger-related physical quantity can beneficially be enabled.
[0042] By way of example, the evaluated radar signal information may be formed by Doppler radar signal information, which is well known in the art. However, it is also contemplated within the scope of the invention that the evaluated radar signal information comprises angular information; i.e. azimuthal and/or elevational quantities.
[0043] In preferred embodiments, the sensor device further includes at least one motion sensor device that is configured for providing the motion sensor data. By that, a short signal path can be established for providing the motion sensor data without data sharing, and a fast signal processing can be accomplished.
[0044] In preferred embodiments of the sensor device, the at least one motion sensor device is an integral part of the sensor device. In this way, a sensor device, in particular a radar sensor system, with a compact design can be provided. Moreover, a systematic error for the motion sensor data due to a spatial separation of the motion sensor device and the sensor device can be avoided.
[0045] Preferably, the sensor device and the at least one motion sensor device forming an integral part of the sensor device are rigidly attached to the vehicle body. In this way, any motion of the vehicle body can properly be detected.
[0046] Preferably, the at least one motion sensor device comprises at least one accelerometer sensor that is designed as a micro-electromechanical system (MEMS). In this way, data of the motion sensor device can readily be provided in an economic and part-saving manner.
[0047] In yet another aspect of the invention, the use of the disclosed sensor device, including at least one radar sensor as the at least one sensor, in an automotive vehicle interior sensing system for detection of vital sign characteristics is proposed. The benefits described in context with the disclosed method of operating a sensor device in an interior of a vehicle apply to the use of the disclosed sensor device for operation in an interior of a vehicle for vital sign detection to the full extent.
[0048] These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
[0049] It shall be pointed out that the features and measures detailed individually in the preceding description can be combined with one another in any technically meaningful manner and show further embodiments of the invention. The description characterizes and specifies the invention in particular in connection with the figures.
Brief Description of the Drawings
[0050] Further details and advantages of the present invention will be apparent from the following detailed description of non-limiting embodiments with reference to the attached drawings, wherein:
Fig. 1 schematically shows a conventional radar sensor system for detecting passenger vital signs in an interior of a vehicle in a side view, installed in the vehicle,
Fig. 2 is a mechanical equivalent of the configuration pursuant to Fig. 1,

Fig. 3 schematically shows a possible embodiment of a sensor device in accordance with the invention, formed as a radar sensor system and being installed in a vehicle, in a side view, and
Fig. 4 schematically illustrates an evaluation and control unit of the radar sensor system pursuant to Fig. 3, and further illustrates steps of a method of operating the radar sensor system in accordance with the invention for detecting vehicle passenger-related physical quantities.
Description of Preferred Embodiments
[0051] Fig. 3 schematically shows, in a side view, a possible embodiment of a sensor device in accordance with the invention. The sensor device, which is formed as a radar sensor system 10, is installed in an interior 18 of a vehicle 16, which is identically designed to the vehicle 16 pursuant to Fig. 1. The sensor device is configured for operation in the interior 18 of the vehicle 16, and for use in an automotive vehicle interior sensing system for vital sign detection.
[0052] The person 32 in the interior 18 of the vehicle 16 is the driver, who is located at and is occupying a seat 20 of the vehicle 16, namely the driver’s seat, thus forming a seat/person system 36. The vehicle 16 is shown to be driving on a roadway 42 having a vertical road profile 44, which is shown in Fig. 3 in an exaggerated manner for clarity purposes. The roughness of the vertical road profile 44 causes vertical motion 38 of vehicle wheels 22. The vertical motion 38 of the wheels 22 is transferred to a vehicle body 24 via a vehicle suspension system (not shown), which is identical to the suspension system 26 shown in Fig. 1, generating forced vibrations of the seat 20 and the person 32 occupying the seat 20. Additional forced vibrations of the seat/person system 36 are induced by mechanical vibrations 40 of a running engine of the vehicle 16.
[0053] The radar sensor system 10 includes a radar sensor 12 that is arranged in front of the person 32 at an inside of a roof 28 of the vehicle 16, and that is configured for detecting vehicle passenger-related physical quantities that are given by a breathing motion 34, characterized by an amplitude and a breathing frequency. The radar sensor 12 includes a radar transmitting unit and a radar transceiver antenna that is directed backwards towards the vehicle interior 18 and is configured for transmitting radar signals towards the vehicle interior 18. The radar sensor 12 is sensitive to a relative motion between the radar transceiver antenna and the person's chest, and is further configured for receiving radar signals that have been transmitted by the radar transmitting unit and have in particular been reflected by the person's chest. The radar sensor 12 is also sensitive to a relative motion of the radar transceiver antenna to parts within the interior 18 of the vehicle 16.
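For intuition on why the radar sensor 12 is sensitive to the chest motion, the following sketch models a continuous-wave Doppler channel; the 60 GHz carrier (wavelength about 5 mm), the sampling rate and the 4 mm chest excursion are assumptions for illustration, not values from this application:

```python
import numpy as np

wavelength = 3e8 / 60e9          # assumed 60 GHz carrier -> 5 mm wavelength
fs = 100.0                       # assumed sampling rate of the slow-time signal
t = np.arange(0, 8, 1.0 / fs)
chest = 0.004 * np.sin(2 * np.pi * 0.25 * t)  # 4 mm motion at 15 breaths/min

# CW Doppler model: a displacement d changes the round-trip path by 2*d,
# shifting the received phase by 4*pi*d/wavelength
phase = 4 * np.pi * chest / wavelength
i_channel = np.cos(phase)
q_channel = np.sin(phase)

# arctangent demodulation recovers the displacement from the I/Q pair
recovered = np.unwrap(np.arctan2(q_channel, i_channel)) * wavelength / (4 * np.pi)
```

Because millimeter-scale displacements already shift the phase by several radians at such wavelengths, the radar picks up both the breathing motion and any vibration of the antenna itself.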
[0054] The radar sensor system 10 moreover comprises an evaluation and control unit 46, shown in detail in Fig. 4, which is configured for evaluating Doppler information from the radar signals received by the radar receiving unit in a detection scenario. To this end, the evaluation and control unit 46 comprises a processor unit and a digital data memory unit (not shown) to which the processor unit has data access. The evaluation and control unit 46 further includes three artificial neural networks, which are formed as deep neural networks 48, 50, 52.
[0055] As the vital sign breathing motion 34 of the person's chest is superimposed by the forced vibrations of the seat/person system 36 (Fig. 3), the evaluated Doppler information is also a superposition of Doppler information generated by the person’s breathing motion 34 and Doppler information generated by the forced vibrations of the seat/person system 36 that are mainly induced by the vertical road profile 44 and the vehicle engine vibrations 40. Other exterior sources that may as well induce forced vibrations of the seat/person system 36 are, for instance, strong winds or heavy oncoming traffic passing in close distance.
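This superposition, and the reason the accelerometer reference is informative for removing it, can be illustrated with synthetic signals and a classical, non-learned reference-cancellation baseline. All amplitudes and frequencies below are assumed for illustration; the invention replaces such a hand-tuned projection with the trained networks:

```python
import numpy as np

fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
breathing = 0.005 * np.sin(2 * np.pi * 0.25 * t)   # chest motion (wanted signal)
road = 0.020 * np.sin(2 * np.pi * 12.0 * t)        # road-induced body vibration
engine = 0.001 * np.sin(2 * np.pi * 30.0 * t)      # engine-induced vibration

radar_signal = breathing + road + engine   # what the radar actually observes
accel_ref = road + engine                  # what the accelerometer observes

# Least-squares projection of the radar signal onto the accelerometer
# reference, then subtraction of the projected component
gain = np.dot(radar_signal, accel_ref) / np.dot(accel_ref, accel_ref)
denoised = radar_signal - gain * accel_ref  # close to the breathing motion
```

In a real vehicle the transfer path from body vibration to radar signal is neither linear nor stationary, which is the motivation for learning the relationship instead of fixing a single projection gain.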
[0056] Furthermore, the radar sensor system 10 includes a motion sensor device, which is formed by an accelerometer device 14 that is arranged at the inside of the vehicle roof 28 (Fig. 1). The accelerometer device 14 comprises a three-axis accelerometer sensor that is designed as an on-chip micro-electromechanical system (MEMS). A digital data link (wireless or by wire connection), indicated in Fig. 3 by a dashed line, is provided between the accelerometer device 14 and the evaluation and control unit 46. The accelerometer device 14 is configured to provide digital accelerometer data to the evaluation and control unit 46 via the digital data link. The accelerometer data contain information regarding vibration or motion of the vehicle body 24 in the detection scenario. The evaluation and control unit 46 is configured to receive digital accelerometer data from the accelerometer device 14 via the digital data link.
[0057] In the embodiment of the radar sensor system 10 illustrated in Fig. 3, the accelerometer device 14 is arranged near the top of the inside of the vehicle roof 28, spaced apart from the remainder of the radar sensor system 10. In other embodiments of the radar sensor system, the accelerometer device 14 may be an integral part of the radar sensor system and, in this way, may be arranged close to the radar transceiver antenna.
[0058] In the following, an embodiment of a method of operating the radar sensor system 10 in the interior 18 of the vehicle 16 will be described with reference to Fig. 4, which schematically illustrates the evaluation and control unit 46 of the radar sensor system 10 pursuant to Fig. 3, and further illustrates steps of the method of operating the radar sensor system 10 in accordance with the invention for detecting vehicle passenger-related physical quantities. In preparation of operating the radar sensor system 10, it shall be understood that all involved units and devices are in an operational state and configured as illustrated in Fig. 3.
[0059] In order to be able to carry out the method automatically and in a controlled way, the evaluation and control unit 46 is equipped with a software module. The method steps to be conducted are converted into a program code of the software module. The program code is implemented in the digital data memory unit of the evaluation and control unit 46 and is executable by the processor unit of the evaluation and control unit 46.
[0060] In a first step 70 of the method, processed data sensed by the radar sensor 12 are provided to a first deep neural network (DNN) 48 of the three DNNs 48, 50, 52. To this end, a data connection is provided within the evaluation and control unit 46 from an output port of the processor unit to an input side 54 of the first DNN 48. In this specific embodiment, the processed data sensed by the radar sensor 12 are given by the evaluated Doppler information from the radar signals received by the radar receiving unit in a detection scenario.
[0061] In another step 72 of the method, which may be executed before, simultaneously with or after the first step 70, processed data sensed by the accelerometer device 14 are provided to a second DNN 50 of the three DNNs 48, 50, 52. To this end, a data connection is provided from an output port of the accelerometer device 14 to an input side 56 of the second DNN 50.
[0062] Next, a combined supervised learning scheme is carried out in another step 74 of the method. This step comprises a preceding step 76 of carrying out a first supervised learning scheme with the first DNN 48 and another preceding step 78 of carrying out a second supervised learning scheme with the second DNN 50.
[0063] The first learning scheme comprises a plurality of exemplary pairs of processed data sensed by the radar sensor 12 and specific vehicle passenger- related physical quantities, which are given by the breathing amplitude and the breathing frequency. The exemplary pairs of the first learning scheme are known a priori.
[0064] The second learning scheme comprises a plurality of exemplary pairs of processed data sensed by the accelerometer device 14 and a specific motion of the vehicle body 24. The exemplary pairs of the second learning scheme are known a priori.
[0065] Both the first DNN 48 and the second DNN 50 use the data obtained from the respective exemplary pairs in order to learn a function that will be used to map a future input to an output. The output 60 of the first DNN 48 will contain main features in the processed data sensed by the radar sensor 12 occurring in the event of a certain combination of the breathing amplitude and the breathing frequency of the vehicle passenger 32. The output 62 of the second DNN 50 will contain main features in the processed data sensed by the accelerometer device 14 occurring in the event of a certain motion of the vehicle body 24.
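The principle of learning a mapping from a priori known exemplary pairs can be reduced to its simplest form: a one-layer linear model fitted by gradient descent (the DNNs 48 and 50 apply the same principle with more layers and non-linearities). The data below are synthetic and the dimensions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# A priori known exemplary pairs (x_i, y_i): 4 input features -> 2 targets
W_true = rng.normal(size=(2, 4))   # hidden relationship to be learned
X = rng.normal(size=(200, 4))      # exemplary processed sensor data
Y = X @ W_true.T                   # a priori known target quantities

# Fit y = W x by plain gradient descent on the mean squared error
W = np.zeros((2, 4))
learning_rate = 0.05
for _ in range(500):
    grad = (X @ W.T - Y).T @ X / len(X)  # gradient of 0.5*MSE w.r.t. W
    W -= learning_rate * grad            # W moves towards W_true
```

After training, the learned function is applied to future inputs exactly as described above: the weights are frozen and new sensor data are simply passed through.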
[0066] In a next step 80 for carrying out the combined supervised learning scheme with the third DNN 52, the output 60 of the first DNN 48 is provided to an input side 58 of the third DNN 52 of the three DNNs 48, 50, 52 as one input. The output 62 of the second DNN 50 is provided to the input side 58 of the third DNN 52 as another input. In this specific embodiment, the step 80 of providing the output 60 of the first DNN 48 and the output 62 of the second DNN 50 to the input side 58 of the third DNN 52 is carried out by a step 82 of combining the output 60 of the first DNN 48 and the output 62 of the second DNN 50.
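Step 82 and the forward pass of the third DNN 52 can be sketched as feature concatenation followed by a small feed-forward network. The feature sizes, layer sizes and random (untrained) weights below are invented for illustration:

```python
import numpy as np

def dense(x, W, b, relu=True):
    # one fully connected layer, optionally with ReLU activation
    y = x @ W + b
    return np.maximum(0.0, y) if relu else y

rng = np.random.default_rng(1)
radar_features = rng.normal(size=64)   # stands in for the output of the first DNN
accel_features = rng.normal(size=16)   # stands in for the output of the second DNN

# step 82: combine the two outputs into one input vector for the third DNN
fused = np.concatenate([radar_features, accel_features])

# third DNN with one hidden layer: 80 -> 40 -> 11 (sizes chosen arbitrarily)
W1, b1 = rng.normal(0.0, 0.1, (80, 40)), np.zeros(40)
W2, b2 = rng.normal(0.0, 0.1, (40, 11)), np.zeros(11)
output = dense(dense(fused, W1, b1), W2, b2, relu=False)
```

Concatenation is the simplest fusion choice; the combined supervised training then lets the third network learn how much weight to give each feature stream.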
[0067] The combined supervised learning scheme comprises the plurality of exemplary pairs of processed data sensed by the radar sensor 12 and processed data sensed by the accelerometer device 14 on the one side and the specific vehicle passenger-related physical quantities, which are given by the breathing amplitude and the breathing frequency, on the other side. The exemplary pairs of processed data and the specific vehicle passenger-related physical quantities are known a priori.
[0068] Once the step 74 of carrying out the combined supervised learning scheme has been completed, the radar sensor system 10 is ready for operation in an actual detection scenario.
[0069] The data flow scheme shown in Fig. 4 is the same for carrying out the supervised learning and for processing data in the actual detection scenario.
[0070] In the actual detection scenario, processed data of the radar sensor 12 and processed data of the accelerometer device 14 are generated in another step 84 of the method. In next steps 86, 88 of the measurement, the processed data of the radar sensor 12 and the processed data of the accelerometer device 14 are provided as input to the third DNN 52 that is trained by the combined supervised learning scheme.
[0071] By operating the third DNN 52 in the next step 90 for processing the provided input data, the third DNN 52 directly derives an output, based on the combined supervised learning scheme carried out, that represents the breathing amplitude, the breathing frequency or the number of passengers 32 present in the interior 18 of the vehicle 16. In other embodiments, the directly derived output may represent a vital sign motion or other vital sign characteristics.
[0072] In this specific embodiment, the output 64 is formatted as an output vector 66 and can be expressed as (number n, breathing amplitude 1, breathing frequency 1, breathing amplitude 2, breathing frequency 2, ..., breathing amplitude n, breathing frequency n), wherein the placeholders for passengers 32 that are not detected are filled up with zeros. In this specific embodiment, the number n is equal to or less than five.

[0073] In other embodiments, the output 64 of the third DNN 52 may contain a plurality of parameters for de-noising the processed radar signal sensed by the radar sensor 12, i.e. for removing a portion of the processed radar signal that has been generated by the radar sensor 12 due to being moved by vehicular movements.
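The described output vector format can be made concrete with two small helper functions; the function names are illustrative, while the limit of five passengers and the zero-filled unused slots follow the description:

```python
MAX_PASSENGERS = 5  # the number n is equal to or less than five here

def encode_output(passengers):
    """Pack (amplitude, frequency) pairs into the fixed-length vector
    (n, amp_1, freq_1, ..., amp_5, freq_5); unused slots stay zero."""
    assert len(passengers) <= MAX_PASSENGERS
    vec = [float(len(passengers))]
    for amplitude, frequency in passengers:
        vec += [amplitude, frequency]
    vec += [0.0, 0.0] * (MAX_PASSENGERS - len(passengers))
    return vec

def decode_output(vec):
    """Read the detected passengers back from the output vector."""
    n = int(round(vec[0]))
    return [(vec[1 + 2 * i], vec[2 + 2 * i]) for i in range(n)]

vector = encode_output([(0.004, 0.25), (0.006, 0.30)])  # two passengers detected
```

A fixed-length vector keeps the network output layer static regardless of how many passengers are actually present.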
[0074] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
[0075] Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality, a plurality being meant to express a quantity of at least two. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
List of Reference Symbols
10 radar sensor system
12 radar sensor
14 accelerometer device
16 vehicle
18 vehicle interior
20 seat
22 wheel
24 vehicle body
26 suspension system
28 vehicle roof
32 person
34 breathing motion
36 seat/person system
38 vertical motion of wheel
40 vehicle engine vibrations
42 roadway
44 vertical road profile
46 evaluation and control unit
48 deep neural network
50 deep neural network
52 deep neural network
54 input side 1st DNN
56 input side 2nd DNN
58 input side 3rd DNN
60 output 1st DNN
62 output 2nd DNN
64 output 3rd DNN
66 output vector
RS radar-based sensor device
ACC accelerometer device
Method steps:
70 provide radar sensor data to 1st DNN
72 provide accelerometer device data to 2nd DNN
74 carry out combined supervised learning scheme
76 carry out 1st supervised learning scheme
78 carry out 2nd supervised learning scheme
80 provide outputs of 1st and 2nd DNN to input of 3rd DNN
82 combine outputs of 1st and 2nd DNN
84 generate processed data of radar sensor and processed data of accelerometer device
86 provide radar sensor data to 1st DNN
88 provide accelerometer device data to 2nd DNN
90 derive output of 3rd DNN directly by operating 3rd DNN

Claims

1. A method of operating a sensor device (10) in an interior (18) of a vehicle (16), the sensor device (10) having at least one sensor (12) that is sensitive to a relative motion to parts of and objects within the interior (18) of the vehicle (16), wherein the at least one sensor (12) is configured for detecting at least one vehicle passenger-related physical quantity, the method comprising at least the steps of
- providing (70, 72) raw or processed data sensed by the at least one sensor (12) and raw or processed data sensed by at least one motion sensor device (14) that is configured to provide information regarding vibration or motion of a body (24) of the vehicle (16) as input data to at least one artificial neural network (48, 50, 52),
- carrying out (74) a combined deep learning scheme with the at least one artificial neural network (52), wherein the combined deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one sensor (12) and raw or processed data sensed by the at least one motion sensor device (14) on the one side and at least one specific vehicle passenger-related physical quantity on the other side, wherein the exemplary pairs of raw or processed data and the at least one specific vehicle passenger-related physical quantity are known a priori,
- generating (84) raw or processed data of the at least one sensor (12) and raw or processed data of the at least one motion sensor device (14) in a detection scenario within the vehicle interior (18),
- providing (86, 88) the raw or processed data generated in the detection scenario as an input to at least one artificial neural network (52) trained by the combined deep learning scheme, and
- by operating at least the artificial neural network (52) trained with the combined deep learning scheme for processing the provided input data, deriving (90) an output representing one or more vehicle passenger-related physical quantity or quantities, based on the carried out combined deep learning scheme.
2. The method as claimed in claim 1, wherein
the step (74) of carrying out a combined deep learning scheme further comprises a preceding step (76) of carrying out a first deep learning scheme with a first artificial neural network (48), wherein the first deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one sensor (12) and at least one specific vehicle passenger-related physical quantity, and further comprises a preceding step (78) of carrying out a second deep learning scheme with a second artificial neural network (50), wherein the second deep learning scheme comprises a plurality of exemplary pairs of raw or processed data sensed by the at least one motion sensor device (14) and at least one specific motion of the vehicle body (24), and wherein
the step (86, 88) of providing the raw or processed data generated in the detection scenario comprises providing an output of the first artificial neural network (48) in response to the generated raw or processed data of the at least one sensor (12) in the detection scenario within the vehicle interior (18) as one input to the artificial neural network (52) trained by the combined deep learning scheme, and providing an output of the second artificial neural network (50) in response to the generated raw or processed data of the at least one motion sensor device (14) in the detection scenario within the vehicle interior (18) as another input to the artificial neural network (52) trained by the combined deep learning scheme.
3. The method as claimed in claim 1 or 2, wherein the step (84) of generating raw or processed data of the at least one sensor (12) and raw or processed data of the at least one motion sensor device (14) in a detection scenario is a step (84) of generating processed data of at least one out of the at least one sensor (12) and the at least one motion sensor device (14) and includes applying a fast Fourier transform on raw data of at least one out of the at least one sensor (12) and the at least one motion sensor device (14).
4. The method as claimed in any one of the preceding claims, wherein the step (90) of deriving an output representing one or more vehicle passenger-related physical quantity or quantities comprises using the derived output of the artificial neural network (52) trained with the combined deep learning scheme for directly deriving the output representing one or more vehicle passenger-related physical quantity or quantities, and/or for determining and applying parameters for de-noising the signal of the at least one sensor (12) by removing a portion of the sensor signal that is generated by the at least one sensor (12) due to being moved by vehicular movements.
5. A sensor device (10) for operation in an interior (18) of a vehicle (16), comprising
- at least one sensor (12) that is sensitive to a relative motion to parts of and objects within the interior (18) of the vehicle (16), wherein the at least one sensor (12) is configured for detecting a vehicle passenger-related physical quantity, and
- an evaluation and control unit (46) that comprises at least one artificial neural network (48, 50, 52) and is at least configured for
• evaluating signals received by the at least one sensor (12) in a detection scenario,
• receiving motion sensor data from at least one motion sensor device (14), wherein the motion sensor data contain information regarding vibration or motion of a body (24) of the vehicle (16) in the detection scenario,
• providing the evaluated signal information and the information regarding vibration or motion of a body (24) of the vehicle (16) as input data to the at least one artificial neural network (48, 50, 52), and
• operating the at least one artificial neural network (52) that has been trained with a combined deep learning scheme for processing the provided input data to derive an output representing one or more vehicle passenger-related physical quantity or quantities, based on the combined deep learning scheme.
6. The sensor device as claimed in claim 5, wherein at least one artificial neural network (48, 50, 52) is formed by a deep neural network (48, 50, 52).
7. The sensor device (10) as claimed in claim 5 or 6, wherein the sensor device (10) is formed as a radar sensor system (10) and the at least one sensor (12) is formed by a radar sensor (12), the radar sensor (12) including
- a radar transmitting unit having at least one radar transmitting antenna and being configured for transmitting radar signals towards at least a portion of the vehicle interior (18),
- a radar receiving unit having at least one radar receiving antenna and being configured for receiving radar signals that have been transmitted by the radar transmitter unit and have been reflected by parts of and objects within the interior (18) of the vehicle (16),
and wherein the evaluation and control unit (46) is configured for
- evaluating information from radar signals received by the at least one radar sensor (12) in the detection scenario, and for
- providing the evaluated radar signal information and the information regarding vibration or motion of a body (24) of the vehicle (16) as input data to the at least one artificial neural network (48, 50, 52).
8. The sensor device (10) as claimed in any one of claims 5 to 7, further including at least one motion sensor device (14) that is configured for providing the motion sensor data.
9. The sensor device (10) as claimed in any one of claims 5 to 8, wherein the at least one motion sensor device (14) comprises at least one accelerometer sensor that is designed as a micro-electromechanical system.
10. Use of the sensor device (10) as claimed in any one of claims 5 to 9, comprising at least one radar sensor (12) as the at least one sensor (12), in an automotive vehicle interior sensing system for detection of vital sign characteristics.
PCT/EP2019/073993 2018-09-10 2019-09-09 Removing noise caused by vehicular movement from sensor signals using deep neural networks WO2020053148A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
LU100925A LU100925B1 (en) 2018-09-10 2018-09-10 Removing noise caused by vehicular movement from sensor signals using Deep Neural Networks
LULU100925 2018-09-10

Publications (1)

Publication Number Publication Date
WO2020053148A1 true WO2020053148A1 (en) 2020-03-19

Family

ID=63713985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/073993 WO2020053148A1 (en) 2018-09-10 2019-09-09 Removing noise caused by vehicular movement from sensor signals using deep neural networks

Country Status (2)

Country Link
LU (1) LU100925B1 (en)
WO (1) WO2020053148A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112957013B (en) * 2021-02-05 2022-11-11 江西国科美信医疗科技有限公司 Dynamic vital sign signal acquisition system, monitoring device and equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013037399A1 (en) 2011-09-12 2013-03-21 Ficomirrors, S.A. System and method for detecting a vital-related signal pattern
US20160354027A1 (en) 2014-02-20 2016-12-08 Faurecia Automotive Seating, Llc. Vehicle seat with integrated sensors
WO2016038148A1 (en) 2014-09-10 2016-03-17 Iee International Electronics & Engineering S.A. Radar sensing of vehicle occupancy
US20170282828A1 (en) 2014-09-10 2017-10-05 Iee International Electronics & Engineering S.A. Radar sensing of vehicle occupancy
CN105534517A (en) * 2016-02-29 2016-05-04 浙江铭众科技有限公司 Method for removing vehicle motion noise in three-lead electrocardiosignal
CN105769173A (en) 2016-02-29 2016-07-20 浙江铭众科技有限公司 Electrocardiogram monitoring system with electrocardiosignal denoising function
CN105796091A (en) * 2016-02-29 2016-07-27 浙江铭众科技有限公司 Intelligent terminal for removing electrocardiosignal vehicle motion noise

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YIN WENFENG ET AL: "Self-adjustable domain adaptation in personalized ECG monitoring integrated with IR-UWB radar", BIOMEDICAL SIGNAL PROCESSING AND CONTROL, ELSEVIER, AMSTERDAM, NL, vol. 47, 23 August 2018 (2018-08-23), pages 75 - 87, XP085502714, ISSN: 1746-8094, DOI: 10.1016/J.BSPC.2018.08.002 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210387584A1 (en) * 2020-06-15 2021-12-16 Lytx, Inc. Sensor fusion for collision detection
US11769332B2 (en) * 2020-06-15 2023-09-26 Lytx, Inc. Sensor fusion for collision detection
CN111965636A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection method based on millimeter wave radar and vision fusion
WO2022240306A1 (en) * 2021-05-11 2022-11-17 Harman Becker Automotive Systems Gmbh Fusing contextual sensors for improving biosignal extraction

Also Published As

Publication number Publication date
LU100925B1 (en) 2020-03-10

Similar Documents

Publication Publication Date Title
LU100925B1 (en) Removing noise caused by vehicular movement from sensor signals using Deep Neural Networks
CN112272779B (en) Method for robust vehicle occupancy detection through vital sign monitoring
EP3652026B1 (en) System and method for radar-based determination of a number of passengers inside a vehicle passenger compartment
US9539969B2 (en) System and method for minimizing occupant injury during vehicle crash events
CN113556975A (en) System, apparatus and method for detecting object in vehicle and obtaining object information
JP4980988B2 (en) Driver state estimation device
CN115023744B (en) Method for operating a radar sensor system for vital sign detection, which eliminates signals excited by interfering movements
SE523753C2 (en) Method for developing a system for identifying the presence and position of an object in a vehicle
Kamann et al. Object tracking based on an extended Kalman filter in high dynamic driving situations
Saeed et al. A novel extension for e-Safety initiative based on developed fusion of biometric traits
CN116680552A (en) Passenger injury prediction method and device and vehicle
US20220087540A1 (en) Systems and methods for remote vital sign monitoring in a vehicle
Kamann et al. Test methodology for automotive surround sensors in dynamic driving situations
Supriya et al. Reliable automotive crash detection using multi sensor decision fusion
LU100364B1 (en) Radar-Based Passenger Classification and Monitoring
US6266593B1 (en) Automotive occupancy sensor gray zone neural net conditioning process for airbag deployment systems
CN105882654A (en) Drive assist apparatus
WO2018224612A1 (en) Radar-based passenger classification and monitoring
JP2017016310A (en) Device, system, program, and method for determining stop mode of mobile body
Tamizharasan et al. Artificial intelligence-based vehicle in-cabin occupant detection and classification system
Hannan et al. Decision fusion of a multi-sensing embedded system for occupant safety measures
US20220009439A1 (en) Enhanced occupant collision safety system
WO2022180656A1 (en) Occupant sensing device and occupant sensing method
JP2023101293A (en) Biological information detection device
Russ et al. A Watchful Eye on the Interior

Legal Events

Code Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19769734; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19769734; Country of ref document: EP; Kind code of ref document: A1)