CN117633709A - Fusion method based on in-vehicle living body carry-over detection - Google Patents

Fusion method based on in-vehicle living body carry-over detection

Info

Publication number
CN117633709A
CN117633709A (application CN202311680672.7A)
Authority
CN
China
Prior art keywords
probability
vehicle
ann
predicted
refers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311680672.7A
Other languages
Chinese (zh)
Inventor
古瑞琴
郭凯
李威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhengzhou Weisen Electronics Technology Co ltd
Original Assignee
Zhengzhou Weisen Electronics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhengzhou Weisen Electronics Technology Co ltd filed Critical Zhengzhou Weisen Electronics Technology Co ltd
Priority to CN202311680672.7A priority Critical patent/CN117633709A/en
Publication of CN117633709A publication Critical patent/CN117633709A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0492Sensor dual technology, i.e. two or more technologies collaborate to extract unsafe condition, e.g. video tracking and RFID tracking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Emergency Management (AREA)
  • Molecular Biology (AREA)

Abstract

The invention discloses a fusion method based on in-vehicle living body carry-over detection, in which the decision layers of two models are fused through a fusion calculation method. Double fusion is thus achieved at both the feature level and the decision level, the missed alarms and false alarms that occur when a single sensor predicts alone are reduced to the greatest extent, and the accuracy and safety of judging whether a living body has been left behind in the vehicle are improved.

Description

Fusion method based on in-vehicle living body carry-over detection
Technical Field
The invention relates to a detection analysis method of an in-vehicle detection device, in particular to a fusion method based on in-vehicle living body carry-over detection.
Background
In existing schemes for detecting a living body in a vehicle, a single sensor is used to detect whether a living body is present, but this has notable defects in most cases: detection based on carbon dioxide content alone needs a long time to observe the change of carbon dioxide in the vehicle; a single pressure sensor detecting the weight on a seat is influenced by inanimate objects such as articles; and a single infrared sensor may fail at high temperatures.
Disclosure of Invention
The invention aims to provide a fusion method based on in-vehicle living body carry-over detection, which can solve the problem in the prior art that detection by individual sensors is not coordinated.
To achieve the object of the invention, the method comprises: a detection system arranged in the vehicle, the detection system comprising a CO₂ sensor, an ambient temperature sensor and a biological temperature sensor, characterized in that the specific implementation is as follows:
1. Wait for the vehicle to stop and the driver to leave the vehicle; the detection system starts after the vehicle is locked;
2. The CO₂ sensor of the detection system monitors the in-vehicle CO₂ concentration in real time, and the rate of change of CO₂ concentration is obtained as the first characteristic value according to the formula V = d/t, where V is the rate of change of CO₂ concentration, t is the calculation period, and d is the change in the CO₂ sensor reading within time t;
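The rate-of-change feature of step 2 can be sketched as follows; the readings, units (ppm) and period length are illustrative assumptions, not values from the patent:

```python
# Sketch of the first characteristic value: V = d/t, the rate of change
# of CO2 concentration over one calculation period.

def co2_rate_of_change(ppm_start: float, ppm_end: float, period_s: float) -> float:
    """Return V = d/t, where d is the CO2 change over period t (seconds)."""
    d = ppm_end - ppm_start
    return d / period_s

# e.g. 450 ppm rising to 480 ppm over a 60 s period
v = co2_rate_of_change(450.0, 480.0, 60.0)
```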
3. Obtain the current in-vehicle temperature T0 through the ambient temperature sensor as the second characteristic value, and obtain the current target detection temperature Ts through the biological temperature sensor. According to the formula P = K(T0⁴ - Ts⁴), where T0 is the current in-vehicle temperature, Ts is the current target detection temperature, and K is the Stefan-Boltzmann formula constant, the value P measured by the biological temperature sensor when only inanimate objects are in the vehicle is taken as the third characteristic value;
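The baseline value of step 3 can be sketched as follows. The T⁴ form matches the Stefan-Boltzmann law, so the constant used here is the Stefan-Boltzmann constant, assuming that is what the source's "formula constant" refers to; temperatures must be in kelvin, and the example values are illustrative:

```python
# Sketch of the third characteristic value: P = K * (T0**4 - Ts**4),
# the reading expected from the biological temperature sensor when only
# inanimate objects are present.

STEFAN_BOLTZMANN = 5.670374419e-8  # W / (m^2 K^4)

def inanimate_baseline(t0_k: float, ts_k: float, k: float = STEFAN_BOLTZMANN) -> float:
    """P = K (T0^4 - Ts^4): net radiative term between cabin and target."""
    return k * (t0_k ** 4 - ts_k ** 4)

p = inanimate_baseline(300.0, 295.0)  # cabin slightly warmer than the target
```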
4. Take the value detected by the biological temperature sensor in each calculation period after locking as the fourth characteristic value;
5. Combine the four characteristic values from the sensors and label each sample: a sample with no living body in the vehicle is labeled a, and a sample with a living body in the vehicle is labeled b. The labeled samples form the original data set Data, realizing fusion at the sensor feature level, and Data is divided into a training set Train and a validation set Test;
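Step 5's feature-level fusion and split can be sketched as follows; the feature values, sample counts and the 75/25 split (the split ratio given in the detailed description) are illustrative:

```python
# Sketch: fuse the four characteristic values into one sample vector,
# label it a (no living body) or b (living body), then split Data.
import random

def make_sample(v, t0, p, bio, label):
    return ([v, t0, p, bio], label)  # feature-level fusion: one vector per sample

data = [make_sample(0.01, 296.0, 2.0, 1.8, "a") for _ in range(8)] + \
       [make_sample(0.40, 296.0, 2.0, 9.5, "b") for _ in range(8)]
random.seed(0)
random.shuffle(data)
split = int(0.75 * len(data))
train, test = data[:split], data[split:]
```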
6. Count the proportion of each category a and b in the validation set Test, calculated by the following formulas: P(A) = y(a)/y(Test), P(B) = y(b)/y(Test), where P(A) is the probability of no living body in the vehicle, P(B) is the probability of a living body in the vehicle, y(a) is the number of samples of event a in the validation set, y(b) is the number of samples of event b in the validation set, and y(Test) is the total number of samples in the validation set Test;
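The class priors of step 6 are simple label proportions over the validation set; a minimal sketch (labels illustrative):

```python
# Sketch of step 6: P(A) = y(a)/y(Test), P(B) = y(b)/y(Test).

def class_priors(test_labels):
    y_test = len(test_labels)
    y_a = sum(1 for lbl in test_labels if lbl == "a")
    y_b = sum(1 for lbl in test_labels if lbl == "b")
    return y_a / y_test, y_b / y_test  # P(A), P(B)

p_a, p_b = class_priors(["a", "a", "b", "a", "b"])
```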
7. Establish a K-nearest-neighbor model (KNN) using the training set Train, as follows:
(1) Calculate the distance L between each current sample in the validation set Test and every sample in the training set Train, and record the distances as a set W; the Euclidean distance in the high-dimensional feature space is used;
(2) Sort the distances L in W in ascending order, select the first K points with the smallest distances, count how often each category (no living body in the vehicle, living body in the vehicle) occurs among these K points, and return the category with the highest frequency as the prediction result;
(3) Repeat steps (1) and (2) until every sample in the validation set Test has been predicted.
(4) Define the two results predicted by the KNN model as m and n. m: the sample in the validation set Test is predicted by the KNN model as no living body in the vehicle; n: the sample is predicted as a living body in the vehicle;
Calculate the probability P_knn(m|a) = y(m)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as m by the KNN model, where y(a) is the number of samples of event a in the validation set Test and y(m) is the number of those a samples predicted as m;
Calculate the probability P_knn(n|a) = y(n)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as n, where y(n) is the number of a samples predicted as n;
Calculate the probability P_knn(m|b) = y(m)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as m, where y(b) is the number of samples of event b in the validation set Test and y(m) is the number of those b samples predicted as m;
Calculate the probability P_knn(n|b) = y(n)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as n, where y(n) is the number of b samples predicted as n;
Record these 4 probability results;
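Step 7 can be sketched end to end: a plain Euclidean-distance KNN with majority vote, followed by the four conditional probabilities estimated by counting on the validation set. The tiny data set, K=3 and helper names are illustrative assumptions:

```python
# Sketch of step 7: KNN prediction plus P_knn(.|.) estimation.
import math
from collections import Counter

def knn_predict(train, x, k=3):
    # set W: training samples ordered by ascending Euclidean distance L
    nearest = sorted(train, key=lambda s: math.dist(s[0], x))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]  # most frequent category among the K points

def knn_conditionals(train, test, k=3):
    # Estimate P_knn(m|a), P_knn(n|a), P_knn(m|b), P_knn(n|b) by counting.
    counts = Counter()
    totals = Counter(label for _, label in test)
    for x, true in test:
        pred = "m" if knn_predict(train, x, k) == "a" else "n"
        counts[(pred, true)] += 1
    return {(pr, t): counts[(pr, t)] / totals[t]
            for pr in ("m", "n") for t in ("a", "b")}

train = [([0, 0], "a"), ([0, 1], "a"), ([1, 0], "a"),
         ([5, 5], "b"), ([5, 6], "b"), ([6, 5], "b")]
test = [([0, 0], "a"), ([1, 1], "a"), ([5, 5], "b"), ([6, 6], "b")]
probs = knn_conditionals(train, test)
```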
8. Establish an artificial neural network model (ANN) using the training set Train; the output layer contains two neural units corresponding to the two prediction categories, no living body in the vehicle and a living body in the vehicle. Define the two results predicted by the ANN model as p and q. p: the sample in the validation set Test is predicted by the ANN model as no living body in the vehicle; q: the sample is predicted as a living body in the vehicle;
Calculate the probability P_ann(p|a) = y(p)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as p by the ANN model, where y(a) is the number of samples of event a in the validation set Test and y(p) is the number of those a samples predicted as p;
Calculate the probability P_ann(q|a) = y(q)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as q, where y(q) is the number of a samples predicted as q;
Calculate the probability P_ann(p|b) = y(p)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as p, where y(b) is the number of samples of event b in the validation set Test and y(p) is the number of those b samples predicted as p;
Calculate the probability P_ann(q|b) = y(q)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as q, where y(q) is the number of b samples predicted as q;
Record these 4 probability results;
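The probability bookkeeping of step 8 is the same counting scheme as for KNN. The sketch below uses a stand-in list of predictions instead of a trained ANN, purely to illustrate the estimation; the data is made up:

```python
# Sketch of step 8's P_ann(.|.) estimation from any model's predictions.
from collections import Counter

def conditionals(true_labels, preds, pred_names=("p", "q")):
    totals = Counter(true_labels)
    counts = Counter(zip(preds, true_labels))
    return {(pr, t): counts[(pr, t)] / totals[t]
            for pr in pred_names for t in totals}

true_labels = ["a", "a", "b", "b"]
ann_preds = ["p", "q", "q", "q"]  # stand-in for ANN outputs on the validation set
probs = conditionals(true_labels, ann_preds)
```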
9. Since the results produced by the KNN model and the ANN model are mutually independent, the joint report probabilities of the models are:
P((m,p)|a) = P_knn(m|a) * P_ann(p|a)
P((m,p)|b) = P_knn(m|b) * P_ann(p|b)
P((n,q)|a) = P_knn(n|a) * P_ann(q|a)
P((n,q)|b) = P_knn(n|b) * P_ann(q|b)
P((m,p)|a) is the probability that, when the true label is a, the KNN prediction is m and the ANN prediction is p; P((m,p)|b) is the same probability when the true label is b; P((n,q)|a) is the probability that, when the true label is a, the KNN prediction is n and the ANN prediction is q; P((n,q)|b) is the same probability when the true label is b;
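Under the independence assumption of step 9, the joint likelihood of the pair of model outputs is just the product of the per-model conditionals; the probability values below are illustrative:

```python
# Sketch of step 9: P((knn_out, ann_out)|label) as a product of conditionals.

def joint_likelihood(p_knn: float, p_ann: float) -> float:
    """P((knn_out, ann_out)|label) = P_knn(knn_out|label) * P_ann(ann_out|label)."""
    return p_knn * p_ann

p_mp_a = joint_likelihood(0.9, 0.8)  # example P((m,p)|a)
p_mp_b = joint_likelihood(0.1, 0.2)  # example P((m,p)|b)
```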
10. Substitute the joint probabilities obtained in step 9 into the Bayes formula:
P(a|(m,p)) = P((m,p)|a)P(A) / [P((m,p)|a)P(A) + P((m,p)|b)P(B)]
P(b|(m,p)) = P((m,p)|b)P(B) / [P((m,p)|a)P(A) + P((m,p)|b)P(B)]
P(a|(m,p)) is the probability that the true in-vehicle situation is a, i.e. no living body, when the KNN prediction is m and the ANN prediction is p; P(b|(m,p)) is the probability that the true situation is b, i.e. a living body is present, under the same predictions.
11. When the obtained probability of no living body in the vehicle is greater than the probability of a living body in the vehicle, the prediction result is no living body in the vehicle; when it is less than or equal to that probability, the prediction result is a living body in the vehicle. That is, in step 10, when the KNN prediction is m and the ANN prediction is p, if P(a|(m,p)) > P(b|(m,p)) the final prediction is a (no living body in the vehicle); otherwise the final prediction is b (a living body in the vehicle).
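Steps 10 and 11 can be sketched together. The posterior formula is the standard Bayes rule, reconstructed here from the text's term definitions (the original formula image is missing from this copy), and the numbers are illustrative; ties go to the living-body class b, as step 11 specifies:

```python
# Sketch of steps 10-11: Bayes posteriors from joint likelihoods and
# priors, then the larger posterior decides.

def fuse_decision(lik_a: float, lik_b: float, prior_a: float, prior_b: float) -> str:
    """Return 'a' (no living body) or 'b' (living body) from the posteriors."""
    evidence = lik_a * prior_a + lik_b * prior_b
    post_a = lik_a * prior_a / evidence  # P(a | (knn_out, ann_out))
    post_b = lik_b * prior_b / evidence  # P(b | (knn_out, ann_out))
    return "a" if post_a > post_b else "b"  # <= goes to b: prefer alarming

result = fuse_decision(0.72, 0.02, 0.5, 0.5)
```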
The invention fuses the decision layers of the two models through a fusion calculation method. Double fusion is thus achieved at both the feature level and the decision level, the missed alarms and false alarms that occur when a single sensor predicts alone are reduced to the greatest extent, and the accuracy and safety of judging whether a living body has been left behind in the vehicle are improved.
Drawings
Fig. 1 is a front view of an embodiment of the present invention.
Detailed Description
As shown in fig. 1, embodiment 1 of the present invention includes:
1. Wait for the vehicle to stop and the driver to leave the vehicle; the detection system starts after the vehicle is locked;
2. The CO₂ sensor of the detection system monitors the in-vehicle CO₂ concentration in real time, and the rate of change of CO₂ concentration is obtained as the first characteristic value according to the formula V = d/t, where V is the rate of change of CO₂ concentration, t is the calculation period, and d is the change in the CO₂ sensor reading within time t;
3. Obtain the current in-vehicle temperature T0 through the ambient temperature sensor as the second characteristic value, and obtain the current target detection temperature Ts through the biological temperature sensor. According to the formula P = K(T0⁴ - Ts⁴), where T0 is the current in-vehicle temperature, Ts is the current target detection temperature, and K is the Stefan-Boltzmann formula constant, the value P measured by the biological temperature sensor when only inanimate objects are in the vehicle is taken as the third characteristic value;
4. Take the value detected by the biological temperature sensor in each calculation period after locking as the fourth characteristic value;
5. Combine the four characteristic values from the sensors and label each sample: a sample with no living body in the vehicle is labeled a, and a sample with a living body in the vehicle is labeled b. The labeled samples form the original data set Data, realizing fusion at the sensor feature level, and Data is divided into a training set Train accounting for 75% of Data and a validation set Test accounting for 25%.
6. Count the proportion of each category a and b in the validation set Test, calculated by the following formulas: P(A) = y(a)/y(Test), P(B) = y(b)/y(Test), where P(A) is the probability of no living body in the vehicle, P(B) is the probability of a living body in the vehicle, y(a) is the number of samples of event a in the validation set, y(b) is the number of samples of event b in the validation set, and y(Test) is the total number of samples in the validation set Test;
7. Establish a K-nearest-neighbor model (KNN) using the training set Train, as follows:
(1) Calculate the distance L between each current sample in the validation set Test and every sample in the training set Train, and record the distances as a set W; the Euclidean distance in the high-dimensional feature space is used;
(2) Sort the distances L in W in ascending order, select the first K points with the smallest distances, count how often each category (no living body in the vehicle, living body in the vehicle) occurs among these K points, and return the category with the highest frequency as the prediction result;
(3) Repeat steps (1) and (2) until every sample in the validation set Test has been predicted.
(4) Define the two results predicted by the KNN model as m and n. m: the sample in the validation set Test is predicted by the KNN model as no living body in the vehicle; n: the sample is predicted as a living body in the vehicle;
Calculate the probability P_knn(m|a) = y(m)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as m by the KNN model, where y(a) is the number of samples of event a in the validation set Test and y(m) is the number of those a samples predicted as m;
Calculate the probability P_knn(n|a) = y(n)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as n, where y(n) is the number of a samples predicted as n;
Calculate the probability P_knn(m|b) = y(m)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as m, where y(b) is the number of samples of event b in the validation set Test and y(m) is the number of those b samples predicted as m;
Calculate the probability P_knn(n|b) = y(n)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as n, where y(n) is the number of b samples predicted as n;
Record these 4 probability results;
8. Establish an artificial neural network model (ANN) using the training set Train; the output layer contains two neural units corresponding to the two prediction categories, no living body in the vehicle and a living body in the vehicle. Define the two results predicted by the ANN model as p and q. p: the sample in the validation set Test is predicted by the ANN model as no living body in the vehicle; q: the sample is predicted as a living body in the vehicle;
Calculate the probability P_ann(p|a) = y(p)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as p by the ANN model, where y(a) is the number of samples of event a in the validation set Test and y(p) is the number of those a samples predicted as p;
Calculate the probability P_ann(q|a) = y(q)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as q, where y(q) is the number of a samples predicted as q;
Calculate the probability P_ann(p|b) = y(p)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as p, where y(b) is the number of samples of event b in the validation set Test and y(p) is the number of those b samples predicted as p;
Calculate the probability P_ann(q|b) = y(q)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as q, where y(q) is the number of b samples predicted as q;
Record these 4 probability results;
9. Since the results produced by the KNN model and the ANN model are mutually independent, the joint report probabilities of the models are:
P((m,p)|a) = P_knn(m|a) * P_ann(p|a)
P((m,p)|b) = P_knn(m|b) * P_ann(p|b)
P((n,q)|a) = P_knn(n|a) * P_ann(q|a)
P((n,q)|b) = P_knn(n|b) * P_ann(q|b)
P((m,p)|a) is the probability that, when the true label is a, the KNN prediction is m and the ANN prediction is p; P((m,p)|b) is the same probability when the true label is b; P((n,q)|a) is the probability that, when the true label is a, the KNN prediction is n and the ANN prediction is q; P((n,q)|b) is the same probability when the true label is b;
10. Substitute the joint probabilities obtained in step 9 into the Bayes formula:
P(a|(m,p)) = P((m,p)|a)P(A) / [P((m,p)|a)P(A) + P((m,p)|b)P(B)]
P(b|(m,p)) = P((m,p)|b)P(B) / [P((m,p)|a)P(A) + P((m,p)|b)P(B)]
P(a|(m,p)) is the probability that the true in-vehicle situation is a, i.e. no living body, when the KNN prediction is m and the ANN prediction is p; P(b|(m,p)) is the probability that the true situation is b, i.e. a living body is present, under the same predictions.
11. When the obtained probability of no living body in the vehicle is greater than the probability of a living body in the vehicle, the prediction result is no living body in the vehicle; when it is less than or equal to that probability, the prediction result is a living body in the vehicle. That is, in step 10, when the KNN prediction is m and the ANN prediction is p, if P(a|(m,p)) > P(b|(m,p)) the final prediction is a (no living body in the vehicle); otherwise the final prediction is b (a living body in the vehicle).

Claims (1)

1. A fusion method based on in-vehicle living body carry-over detection, comprising: a detection system arranged in the vehicle, the detection system comprising a CO₂ sensor, an ambient temperature sensor and a biological temperature sensor, characterized in that the specific implementation is as follows:
1. Wait for the vehicle to stop and the driver to leave the vehicle; the detection system starts after the vehicle is locked;
2. The CO₂ sensor of the detection system monitors the in-vehicle CO₂ concentration in real time, and the rate of change of CO₂ concentration is obtained as the first characteristic value according to the formula V = d/t, where V is the rate of change of CO₂ concentration, t is the calculation period, and d is the change in the CO₂ sensor reading within time t;
3. Obtain the current in-vehicle temperature T0 through the ambient temperature sensor as the second characteristic value, and obtain the current target detection temperature Ts through the biological temperature sensor. According to the formula P = K(T0⁴ - Ts⁴), where T0 is the current in-vehicle temperature, Ts is the current target detection temperature, and K is the Stefan-Boltzmann formula constant, the value P measured by the biological temperature sensor when only inanimate objects are in the vehicle is taken as the third characteristic value;
4. Take the value detected by the biological temperature sensor in each calculation period after locking as the fourth characteristic value;
5. Combine the four characteristic values from the sensors and label each sample: a sample with no living body in the vehicle is labeled a, and a sample with a living body in the vehicle is labeled b. The labeled samples form the original data set Data, realizing fusion at the sensor feature level, and Data is divided into a training set Train and a validation set Test;
6. Count the proportion of each category a and b in the validation set Test, calculated by the following formulas: P(A) = y(a)/y(Test), P(B) = y(b)/y(Test), where P(A) is the probability of no living body in the vehicle, P(B) is the probability of a living body in the vehicle, y(a) is the number of samples of event a in the validation set, y(b) is the number of samples of event b in the validation set, and y(Test) is the total number of samples in the validation set Test;
7. Establish a K-nearest-neighbor model (KNN) using the training set Train, as follows:
(1) Calculate the distance L between each current sample in the validation set Test and every sample in the training set Train, and record the distances as a set W; the Euclidean distance in the high-dimensional feature space is used;
(2) Sort the distances L in W in ascending order, select the first K points with the smallest distances, count how often each category (no living body in the vehicle, living body in the vehicle) occurs among these K points, and return the category with the highest frequency as the prediction result;
(3) Repeat steps (1) and (2) until every sample in the validation set Test has been predicted;
(4) Define the two results predicted by the KNN model as m and n. m: the sample in the validation set Test is predicted by the KNN model as no living body in the vehicle; n: the sample is predicted as a living body in the vehicle;
Calculate the probability P_knn(m|a) = y(m)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as m by the KNN model, where y(a) is the number of samples of event a in the validation set Test and y(m) is the number of those a samples predicted as m;
Calculate the probability P_knn(n|a) = y(n)/y(a), the probability that a sample truly labeled a in the validation set Test is predicted as n, where y(n) is the number of a samples predicted as n;
Calculate the probability P_knn(m|b) = y(m)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as m, where y(b) is the number of samples of event b in the validation set Test and y(m) is the number of those b samples predicted as m;
Calculate the probability P_knn(n|b) = y(n)/y(b), the probability that a sample truly labeled b in the validation set Test is predicted as n, where y(n) is the number of b samples predicted as n;
Record these 4 probability results;
8. Establish an artificial neural network model (ANN) using the training set Train; the output layer contains two neural units corresponding to the two prediction categories, no living body in the vehicle and a living body in the vehicle. Define the two results predicted by the ANN model as p and q. p: the sample in the validation set Test is predicted by the ANN model as no living body in the vehicle; q: the sample is predicted as a living body in the vehicle;
calculating probability P ann (p|a),P ann (p|a) refers to the probability that a sample truly labeled a in the validation set Test is predicted by ANN to be p:
y (a) refers to the number of samples of the a event in the verification set Test, and y (p) refers to the number of a events in the verification set Test predicted by ANN as p;
calculating probability P ann (q|a),P ann (q|a) refers to the probability that a sample truly labeled a in the validation set Test is predicted by ANN to be q:
y (a) refers to the number of samples of the a event in the verification set Test, and y (q) refers to the number of a events in the verification set Test predicted by ANN as q;
calculating probability P ann (p|b),P ann (p|b) refers to the probability that a sample truly labeled b in the validation set Test is predicted by ANN to be p:
y (b) refers to the number of samples of b events in the verification set Test, and y (p) refers to the number of b events in the verification set Test predicted by ANN as p;
calculating probability P ann (q|b),P ann (q|b) refers to the probability that a sample truly labeled b in the validation set Test is predicted by ANN to be q:
y (b) refers to the number of samples of b events in the verification set Test, and y (q) refers to the number of b events in the verification set Test predicted by ANN as q;
record the 4 probability results;
9. Because the results generated by the KNN model and the ANN model are mutually independent, the joint report probability of the two models is:
P((m,p)|a)=P knn (m|a)*P ann (p|a)
P((m,p)|b)=P knn (m|b)*P ann (p|b)
P((n,q)|a)=P knn (n|a)*P ann (q|a)
P((n,q)|b)=P knn (n|b)*P ann (q|b)
P((m,p)|a) represents the probability that the KNN prediction is m and the ANN prediction is p when the true label is a; P((m,p)|b) represents that probability when the true label is b; P((n,q)|a) represents the probability that the KNN prediction is n and the ANN prediction is q when the true label is a; P((n,q)|b) represents that probability when the true label is b;
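The joint report probabilities of step 9 are plain products under the independence assumption. A sketch with hypothetical conditional probabilities (the dictionary keys and the numeric values are illustrative, not measured results):

```python
def joint_report_probs(p_knn, p_ann):
    """Joint report probabilities of step 9 under the independence
    assumption: P((k_pred, a_pred) | t) = P_knn(k_pred | t) * P_ann(a_pred | t).
    p_knn / p_ann map (prediction, true_label) -> conditional probability."""
    joint = {}
    for knn_pred, ann_pred in [("m", "p"), ("n", "q")]:
        for true_label in ("a", "b"):
            joint[(knn_pred, ann_pred, true_label)] = (
                p_knn[(knn_pred, true_label)] * p_ann[(ann_pred, true_label)]
            )
    return joint

# Hypothetical conditional probabilities measured on the validation set.
p_knn = {("m", "a"): 0.9, ("n", "a"): 0.1, ("m", "b"): 0.2, ("n", "b"): 0.8}
p_ann = {("p", "a"): 0.8, ("q", "a"): 0.2, ("p", "b"): 0.3, ("q", "b"): 0.7}
joint = joint_report_probs(p_knn, p_ann)
# joint[("m", "p", "a")] ~ 0.72 and joint[("m", "p", "b")] ~ 0.06
```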
10. Substitute the joint probabilities obtained in step 9 into the Bayes formula:
P(a|(m,p))=P((m,p)|a)*P(a)/(P((m,p)|a)*P(a)+P((m,p)|b)*P(b))
P(b|(m,p))=P((m,p)|b)*P(b)/(P((m,p)|a)*P(a)+P((m,p)|b)*P(b))
where P(a) and P(b) are the prior probabilities of the two in-vehicle states. P(a|(m,p)) represents the probability that the real situation in the vehicle is a, i.e. the probability of no living body, when the KNN prediction result is m and the ANN prediction result is p; P(b|(m,p)) represents the probability that the real situation in the vehicle is b, i.e. the probability that a living body is present, under the same prediction results;
11. When the obtained probability of no living body in the vehicle is greater than the probability of a living body in the vehicle, the prediction result is no living body in the vehicle; when it is less than or equal to that probability, the prediction result is a living body in the vehicle. That is, in step 10, when the KNN prediction result is m and the ANN prediction result is p, if P(a|(m,p)) > P(b|(m,p)), the final prediction result is a (no living body in the vehicle); otherwise, the final prediction result is b (a living body in the vehicle).
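Steps 10 and 11 together can be sketched as a Bayes update followed by the threshold comparison. The priors P(a) and P(b) are treated here as supplied parameters (equal priors in the example; the description does not state how they are chosen), and the joint probability values are hypothetical:

```python
def posterior_and_decide(joint_a, joint_b, prior_a=0.5, prior_b=0.5):
    """Bayes posteriors P(a|(m,p)) and P(b|(m,p)) from the joint report
    probabilities of step 9, followed by the step-11 decision rule."""
    evidence = joint_a * prior_a + joint_b * prior_b
    post_a = joint_a * prior_a / evidence   # probability of no living body
    post_b = joint_b * prior_b / evidence   # probability of a living body
    # Step 11: predict "no living body" only when its posterior is strictly larger.
    decision = "a" if post_a > post_b else "b"
    return post_a, post_b, decision

# Hypothetical joint probabilities P((m,p)|a) = 0.72 and P((m,p)|b) = 0.06.
post_a, post_b, decision = posterior_and_decide(0.72, 0.06)
# post_a is about 0.923, so the final prediction is a (no living body in the vehicle).
```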
CN202311680672.7A 2023-12-08 2023-12-08 Fusion method based on in-vehicle living body carry-over detection Pending CN117633709A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311680672.7A CN117633709A (en) 2023-12-08 2023-12-08 Fusion method based on in-vehicle living body carry-over detection

Publications (1)

Publication Number Publication Date
CN117633709A true CN117633709A (en) 2024-03-01

Family

ID=90030370

Country Status (1)

Country Link
CN (1) CN117633709A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination