CN113591693A - Truck fatigue driving detection method based on image recognition and ADAS device - Google Patents


Info

Publication number
CN113591693A
Authority
CN
China
Prior art keywords
fatigue
vehicle
driver
period
determining
Prior art date
Legal status
Granted
Application number
CN202110864129.7A
Other languages
Chinese (zh)
Other versions
CN113591693B (en)
Inventor
周炜
董轩
贾红
曹琛
刘应吉
Current Assignee
Research Institute of Highway Ministry of Transport
Original Assignee
Research Institute of Highway Ministry of Transport
Priority date
Filing date
Publication date
Application filed by Research Institute of Highway Ministry of Transport
Publication of CN113591693A
Application granted
Publication of CN113591693B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras

Abstract

The invention discloses a truck fatigue driving detection method based on image recognition, and an ADAS device. The method comprises the following steps: S1, determining the driver state from in-cab images acquired by a cab monitoring camera, and determining a detection interval duration T1 according to that state; S2, for each detection interval duration T1, executing the step of determining the driver fatigue value FA, which comprises: S21, within a preset detection period T2, acquiring forward images with the on-board forward camera at a set frequency f; S22, determining the vehicle's lateral displacement Di within each time interval Δt based on the forward images; S23, determining the driver fatigue value FA based on the lateral displacements Di; and S24, determining the fatigue degree level from the driver fatigue value FA.

Description

Truck fatigue driving detection method based on image recognition and ADAS device
Technical Field
The invention relates to the technical field of vehicles, in particular to a truck fatigue driving detection method based on image recognition and an ADAS device.
Background
Fatigue driving is a common phenomenon in freight transportation and can cause extremely serious traffic accidents. At present, fatigue driving is monitored by recognizing the driver's physiological characteristics and driving behavior, for example by blink recognition, mouth-openness recognition, and the like. On the one hand, such methods may require the driver's participation and cooperation; on the other hand, differences between individual drivers make misjudgment likely, and the monitoring is easy for a driver to circumvent, so the results are inaccurate.
For this reason, various technical solutions have been proposed to improve the detection of fatigue driving. Chinese patent document CN111703428A discloses a fatigue driving detection method based on vehicle-road coordination. In that method, a plurality of road side units 1 are arranged at intervals along the roadside, an on-board unit 2 is installed on the vehicle, and the identity and position of the vehicle are identified through communication between the road side units 1 and the on-board unit 2. The spacing between two adjacent road side units 1 is designed so that their signal capture ranges leave no gaps. Each road side unit 1 is provided with a high-definition camera and can accurately capture, at a fixed frequency of 5 Hz, the lateral displacement of a moving vehicle relative to the lane center line. A fatigue index is then determined based on the standard deviation of that lateral displacement.
This prior art has several drawbacks. First, fatigue driving can be detected only by deploying a large number of densely spaced road side units 1, so the cost is high, the construction period is long, and wide coverage is difficult to achieve. Second, it is difficult for a high-definition camera mounted on a road side unit to obtain the lateral deviation of a vehicle relative to the lane center line; this is especially difficult for tall trucks, whose bodies occlude the camera's view. In addition, a roadside camera can hardly achieve a long detection distance and a wide field of view at the same time, so it is difficult to detect the lateral offset of a vehicle over different road sections across a wide range (for example, over 500 m of road).
At present, in road governance, fatigue driving by truck drivers is particularly common and needs to be controlled. A technical solution is therefore desired that can monitor truck fatigue driving as accurately as possible, with a short construction period and as large a monitoring range as possible.
Disclosure of Invention
The invention provides a truck fatigue driving detection method based on image recognition, which comprises the following steps:
S1, determining the driver state from in-cab images acquired by a cab monitoring camera, and determining a detection interval duration T1 according to that state;
S2, for each detection interval duration T1, executing the step of determining the driver fatigue value FA, which comprises the following steps:
S21, within a preset detection period T2, acquiring forward images with the on-board forward camera at a set frequency f;
S22, determining the vehicle lateral displacement Di within each time interval Δt based on the forward images;
S23, determining the driver fatigue value FA based on the lateral displacements Di; and
S24, determining the fatigue degree level from the driver fatigue value FA.
Preferably, the driver fatigue value FA is calculated by the following equation:
[formula for FA, rendered as an image in the original publication]
where:
N is the number of lateral-displacement detections within the preset detection period T2; N = M − 1, where M is the number of forward-image frames acquired within T2;
Di is the lateral displacement detected at the i-th time within T2, i.e. the vehicle's lateral displacement at time t_i of the (i+1)-th forward-image frame relative to time t_(i-1) of the i-th frame;
Di is a signed quantity: a leftward offset gives Di > 0 and a rightward offset gives Di < 0, or vice versa;
|Di| is the absolute value of Di;
K is a weight coefficient, 0.5 ≤ K ≤ 2;
[auxiliary definition, rendered as an image in the original publication]
preferably, the truck fatigue driving detection method based on image recognition comprises a detection start step S0, wherein in the detection start step S0, the vehicle speed is monitored, the continuous driving time length TL is detected, and in the case of a start condition that the continuous driving time length TL is greater than or equal to the set time length T0, the steps S1 and/or S2 are executed,
if 60min < Tv=0Then TL is 0;
if 15min < Tv=0Less than or equal to 60min, then TL ═ TL +. DELTA.T-Tv=0
If 5min < Tv=0Less than or equal to 15min, then TL + DeltaT-0.5v=0
If T isv=0If the time is less than or equal to 5min, then TL is equal to TL plus delta T;
wherein the content of the first and second substances,
Δ T is the length of time that has elapsed,
Tv=0the duration of time that the continuous vehicle speed is zero in Δ T.
Preferably, if the cab monitoring camera detects that the driver has been replaced and the new driver's previous driving ended more than 1 hour ago, the continuous driving duration TL is recalculated from zero.
Preferably, the driver state determined in step S1 includes a normal state and a fatigue state; for the normal state the detection interval duration is T1 = T1_normal, and for the fatigue state T1 = T1_fatigue,
where 1.5·T1_fatigue ≤ T1_normal ≤ 3·T1_fatigue.
The fatigue degree level determined in step S24 includes mild fatigue, moderate fatigue and severe fatigue; if the level determined in step S24 is moderate or severe fatigue, the detection interval duration T1 is set to T1_fatigue.
Preferably, the time of each driver replacement is recorded, together with the continuous driving duration and the driver fatigue value FA at each moment.
Preferably, the truck fatigue driving detection method based on image recognition further includes a fatigue degree checking step S3, in which the period of the vehicle's left-right swinging is detected; if the detected swing period is less than a set swing period, the fatigue degree is determined to be severe fatigue and an immediate stop is required,
wherein the swing period of the vehicle's left-right swinging is detected by a method comprising the following steps:
S31, constructing a swing event sequence: whenever the absolute value of the sum of several consecutive vehicle lateral displacements Di is greater than or equal to a set width, an event is recorded; the number of events in the sequence is R;
S32, determining a plurality of candidate periods T, where T < T2/(R·Δt);
S33, for each candidate period T, dividing the swing time sequence into a plurality of segments;
S34, superimposing the divided segments within the corresponding candidate period T;
S35, obtaining the events S_i(T) observed at each discrete time point within the corresponding candidate period T;
S36, from the events S_i(T) obtained in step S35, obtaining the distribution p_i(T) with which the event sequence, segmented by the candidate period T and superimposed, falls on position i;
S37, from p_i(T), obtaining the entropy H(T) of the corresponding candidate period T;
S38, from the entropy H(T), obtaining the relative entropy KL*(p_i(T) || q_i(T)) of the candidate period T;
and S39, selecting the candidate period T with the minimum relative entropy as the period of the vehicle's left-right swinging.
Preferably, the correlation calculations for detecting the swing period of the vehicle's left-right swinging during driving are performed in the cloud; when the computing capacity of the on-board computing unit is limited, the on-board forward camera acquires forward images at the set frequency f and uploads them to the cloud, where the driver fatigue value FA is calculated.
Preferably, the forward camera comprises two cameras of the same specification mounted at the same height and spaced apart in the lateral direction, and the vehicle lateral displacement Di is determined as follows:
S221, determining a fixed reference feature common to the previous image frame PA_i and the current image frame PA_(i+1) acquired by the first camera;
S222, forming a depth map from the previous image frames PA_i and PB_i acquired by the two cameras, and determining from it the previous relative lateral position Dp between the fixed reference feature and the vehicle;
S223, forming a depth map from the current image frames PA_(i+1) and PB_(i+1) acquired by the two cameras, and determining from it the current relative lateral position Dc between the fixed reference feature and the vehicle;
and S224, calculating the vehicle lateral displacement detected at the current time as Di = Dc − Dp.
Preferably, if a turn-signal-on signal is detected during the step of determining the driver fatigue value FA, the calculation of FA is terminated and performed again after the turn is completed.
The invention also provides an ADAS device which comprises a cab monitoring camera, a forward camera and a processing unit, wherein the processing unit executes the truck fatigue driving detection method based on image recognition.
According to the truck fatigue driving detection method based on image recognition, no densely deployed road side units are required to detect the vehicle's lateral displacement, and truck fatigue driving can be monitored, in particular quantitatively, as accurately as possible with a short construction period and as large a monitoring range as possible.
Drawings
Fig. 1 is a schematic diagram of a system for implementing a fatigue driving detection method based on vehicle-road coordination in the prior art.
Fig. 2 is a schematic block diagram of an ADAS device according to an embodiment of the invention.
Detailed Description
In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Fig. 2 is a schematic block diagram of an ADAS device according to an embodiment of the invention. An ADAS device according to an embodiment of the present invention includes a cab surveillance camera, a forward-facing camera, and a processing unit. The ADAS device may further include a wired communication unit, a wireless communication unit, etc. as needed, so as to communicate with the onboard controller, the cloud server, etc.
The ADAS device according to embodiments of the invention may be of any suitable shape and size. In an alternative embodiment, the ADAS device is integrated with the tachograph.
An ADAS device according to an embodiment of the invention may include a power supply or power supply device; a storage unit may also be included to store the necessary information.
The processing unit executes the truck fatigue driving detection method based on image recognition according to the embodiment of the invention. With this method, no densely deployed road side units are required to detect the truck's lateral displacement, and truck fatigue driving can be monitored, in particular quantitatively, as accurately as possible with a short construction period and as large a monitoring range as possible.
The truck fatigue driving detection method based on image recognition according to an embodiment of the present invention includes the following steps S1 and S2.
S1, determining the state of a driver based on an image in the cab acquired by a cab monitoring camera, and determining the detection interval duration T1 according to the state of the driver.
The cab monitor camera is an on-board camera, and is typically mounted in the cab of the truck, for example, above and to the left of the driver's cab.
The driver' S state determined in step S1 includes, for example, a normal state and a fatigue state (may also be referred to as a suspected fatigue state). In a normal state, a longer detection interval duration may be employed; in a fatigue state, a shorter detection interval duration may be employed.
For example, if the driver's blinking and mouth opening/closing are normal (no eye closing or yawning, or only infrequently), the driver state is confirmed as normal and a long detection interval duration is set, for example 2, 3 or 5 minutes. If blinking or mouth opening/closing is abnormal (eye-closing or yawning actions occur, or their frequency is high), the driver state is determined to be a fatigue state (suspected fatigue state) and a short detection interval duration is set, for example 1 minute or 30 seconds.
The method of determining the driver's state from in-cab images may be any suitable method known in the art; for example, a convolutional neural network is used to recognize actions such as the driver's blinking and mouth opening.
The cab monitoring camera can be an infrared camera or is provided with an infrared camera unit, and can acquire images under dark illumination.
Step S2. For each detection interval duration T1, the step of determining the driver fatigue value FA is performed. That is, FA is not calculated continuously; it is calculated once per detection interval duration T1.
Specifically, the step of determining the driver fatigue value FA comprises steps S21-S24.
And S21, in a preset detection period T2, acquiring a forward image by the vehicle-mounted forward camera at a set frequency f. Generally, the preset detection period T2 is smaller than the detection interval duration T1. For example, the preset detection period T2 is substantially one third, one fifth, or less of the detection interval duration T1.
It is to be noted that each detection interval duration T1 may contain only one preset detection period T2, or two or more. In the case of two or more preset detection periods T2, the calculated driver fatigue values FA may be combined by a simple or weighted average.
The set frequency f corresponds to the frame rate (unit: FPS, frame/second) of the on-vehicle forward camera. Generally, the set frequency f is equal to or less than the frame rate of the in-vehicle forward camera. For example, in the case where the frame rate of the in-vehicle forward camera is 20FPS, the set frequency f may be 20, 10, 5, 2, 1, or the like. And under the condition that the set frequency f is less than the frame rate of the vehicle-mounted forward camera, only selecting partial image frames of the vehicle-mounted forward camera. In the embodiments of the present invention, f is 10 as an example. However, it should be noted that the present invention is not limited thereto.
When the set frequency f is 10, the time interval Δ t between two adjacent frames of pictures is 0.1 second.
Assuming the preset detection period T2 is 1 minute, 601 frames of images are acquired within T2. The first frame corresponds to time 0, the second frame to Δt, the i-th frame to (i−1)·Δt, and the last frame to T2.
And S22, determining the vehicle lateral displacement Di within each Δt based on the forward images. Di is the lateral displacement (lateral position deviation) of the vehicle position corresponding to the (i+1)-th frame relative to the vehicle position corresponding to the i-th frame. The positional difference between the two image frames can be determined by any suitable method known in the art, for example a trained convolutional neural network model.
And S23, determining a driver fatigue value FA based on the vehicle transverse displacement Di. Any suitable method may be employed to determine the driver fatigue value FA based on the vehicle lateral displacement Di. In principle, the greater the vehicle lateral displacement Di, the greater the driver fatigue value FA.
For example,
[example formula, rendered as an image in the original publication]
or
[second example formula, rendered as an image in the original publication]
where:
N is the number of lateral-displacement detections within the preset detection period T2 (e.g. N = 600); N = M − 1, where M is the number of forward-image frames acquired within T2 (e.g. M = 601);
Di is the lateral displacement detected at the i-th time within T2, i.e. the vehicle's lateral displacement at time t_i of the (i+1)-th forward-image frame relative to time t_(i-1) of the i-th frame;
Di is a signed quantity: a leftward offset gives Di > 0 and a rightward offset gives Di < 0, or vice versa;
|Di| is the absolute value of Di.
[auxiliary definition, rendered as an image in the original publication]
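Because the exact FA formulas appear only as images in the original publication, the following Python sketch is a hypothetical illustration rather than the patent's formula: it assumes FA is the K-weighted mean absolute lateral displacement, with an RMS variant as another plausible reading of the second example. The function names and the precise functional form are assumptions.

```python
from math import sqrt

def fatigue_value(displacements, k=1.0):
    """Driver fatigue value FA from the lateral displacements Di (metres).

    Hypothetical form: FA grows with the mean absolute lateral displacement,
    scaled by the load weight coefficient K (0.5 <= K <= 2 per the text).
    """
    if not 0.5 <= k <= 2.0:
        raise ValueError("weight coefficient K must satisfy 0.5 <= K <= 2")
    n = len(displacements)  # N = M - 1 detections within the period T2
    if n == 0:
        raise ValueError("no displacement samples")
    return k * sum(abs(d) for d in displacements) / n

def fatigue_value_rms(displacements, k=1.0):
    """An RMS variant, one plausible reading of the second example formula."""
    n = len(displacements)
    return k * sqrt(sum(d * d for d in displacements) / n)
```

Either form preserves the stated principle that a larger |Di| yields a larger FA.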
and S24, determining the fatigue degree grade according to the fatigue value FA of the driver.
In order to determine the fatigue level, a fatigue level rating table is established in advance, as shown in Table 1, where F1 and F2 are empirical values determined from extensive experimental statistics.
TABLE 1. Driver fatigue level table
Serial number | Driver fatigue value FA | Fatigue degree level
1             | FA < F1                 | Mild fatigue
2             | F1 ≤ FA ≤ F2            | Moderate fatigue
3             | F2 < FA                 | Severe fatigue
After determining the level of fatigue from the measured driver fatigue value FA, corresponding processing may be performed, for example a reminder or an alarm. In an alternative embodiment of the invention, no alarm or reminder is given for mild fatigue. For moderate fatigue, an intermittent audible and visual alarm is given, for example alerting the driver once every set period (e.g. 3 minutes) to advise a rest or a change of driver. For severe fatigue, a continuous audible and visual alarm is given, reminding the driver to pull over for a rest or hand over to another driver.
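The grading and alarm policy described above can be sketched as follows; the thresholds F1 and F2 are empirical values in the patent, so any concrete numbers passed in here are hypothetical.

```python
def fatigue_level(fa, f1, f2):
    """Map a driver fatigue value FA to a level per Table 1.

    f1 and f2 are the empirical thresholds F1 and F2 (caller-supplied).
    """
    if fa < f1:
        return "mild"
    if fa <= f2:
        return "moderate"
    return "severe"

def alarm_policy(level):
    # Per the embodiment: no alarm for mild fatigue, intermittent
    # audible-visual alarm for moderate, continuous alarm for severe.
    return {"mild": None,
            "moderate": "intermittent",
            "severe": "continuous"}[level]
```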
In one embodiment of the invention, the driver fatigue value FA is calculated as follows:
[formula for FA, rendered as an image in the original publication]
where:
N is the number of lateral-displacement detections within the preset detection period T2; N = M − 1, where M is the number of forward-image frames acquired within T2;
Di is the lateral displacement detected at the i-th time within T2, i.e. the vehicle's lateral displacement at time t_i of the (i+1)-th forward-image frame relative to time t_(i-1) of the i-th frame;
Di is a signed quantity: a leftward offset gives Di > 0 and a rightward offset gives Di < 0, or vice versa;
|Di| is the absolute value of Di;
K is a weight coefficient, 0.5 ≤ K ≤ 2. For an empty or lightly loaded truck, K is taken less than 1, e.g. K = 0.75; for a fully loaded truck, K is taken greater than 1, e.g. K = 1.25; the default is K = 1.
[auxiliary definition, rendered as an image in the original publication]
At the very beginning of a trip, the truck driver is normally not in a fatigued state. A detection start step may therefore be provided: the subsequent calculation of the driver fatigue value FA is performed only if the start condition is satisfied.
Specifically, in one embodiment of the present invention, the truck fatigue driving detection method based on image recognition includes a detection start step S0. In step S0, the vehicle speed is monitored and the continuous driving duration TL is tracked; steps S1 and/or S2 are executed under the start condition that TL is greater than or equal to the set duration T0.
The vehicle speed signal comes, for example, from an on-board wheel speed sensor or an on-board GPS positioning device. In the present invention, the continuous driving duration refers to the time during which the vehicle has been driven continuously; a short stop due to a red light or similar factor does not affect its calculation and statistics.
The continuous driving time period TL can be calculated specifically in the following manner.
if 60 min < T_(v=0), then TL = 0;
if 15 min < T_(v=0) ≤ 60 min, then TL = TL + ΔT − T_(v=0);
if 5 min < T_(v=0) ≤ 15 min, then TL = TL + ΔT − 0.5·T_(v=0);
if T_(v=0) ≤ 5 min, then TL = TL + ΔT;
where:
ΔT is the elapsed time length, and
T_(v=0) is the duration within ΔT for which the vehicle speed is continuously zero.
That is, if the continuous parking duration exceeds 60 minutes, TL is recalculated from 0. If it is at most 5 minutes, TL accumulates as usual and the stop has no effect. If it is between 5 and 15 minutes, only half of the parking time is counted toward TL. If it is between 15 and 60 minutes, the parking time is not counted toward TL at all.
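The four TL update rules can be sketched as follows (a minimal illustration; all durations in minutes, function name assumed):

```python
def update_driving_time(tl_minutes, dt_minutes, t_stop_minutes):
    """Update the continuous driving duration TL after an elapsed interval dT.

    t_stop_minutes is T_(v=0): the continuous zero-speed duration within dT.
    """
    if t_stop_minutes > 60:
        return 0.0                                          # long stop: reset TL
    if t_stop_minutes > 15:
        return tl_minutes + dt_minutes - t_stop_minutes     # stop not counted
    if t_stop_minutes > 5:
        return tl_minutes + dt_minutes - 0.5 * t_stop_minutes  # half counted
    return tl_minutes + dt_minutes                          # short stop: fully counted
```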
The continuous driving duration TL should also be recalculated from 0 if the driver is replaced. In one embodiment of the invention, if the cab monitoring camera detects a driver replacement and the new driver's previous driving ended more than 1 hour ago, TL is recalculated from zero.
In one embodiment of the present invention, the driver state determined in step S1 includes a normal state and a fatigue state; for the normal state the detection interval duration is T1 = T1_normal, and for the fatigue state T1 = T1_fatigue, where 1.5·T1_fatigue ≤ T1_normal ≤ 3·T1_fatigue.
The fatigue degree level determined in step S24 includes mild fatigue, moderate fatigue and severe fatigue; if the level determined in step S24 is moderate or severe fatigue, the detection interval duration T1 is set to T1_fatigue.
Preferably, the time of each driver replacement is recorded, together with the continuous driving duration and the driver fatigue value FA at each moment.
Preferably, the truck fatigue driving detection method based on image recognition further comprises a fatigue degree checking step S3, in which the period of the vehicle's left-right swinging is detected; if the detected swing period is less than a set swing period, the fatigue degree is determined to be severe fatigue and the driver must be reminded to stop immediately for a rest,
wherein the swing period of the vehicle's left-right swinging is detected by a method comprising the following steps:
and S31, constructing a swinging event sequence, specifically, setting the absolute value of the sum of the transverse displacements Di of the vehicle for a plurality of times as a set width, and setting the number of the events in the event sequence as R. The set width is, for example, 1/10 to 1/4 equal to the vehicle width. Or the set width is a fixed value, for example, 0.5 m. And taking the moment of the last displacement of the plurality of continuous vehicle transverse displacements Di as the occurrence moment of the corresponding event.
Step S32. Determining a plurality of candidate periods T, where T < T2/(R·Δt). One of the candidate periods must then be selected as the detected swing period.
Essentially, each candidate period T may be represented either as a natural number or as a time duration: in the natural-number representation, the corresponding duration is that number multiplied by Δt; in the duration representation, the corresponding natural number is the duration divided by Δt. The set swing period may likewise be expressed either way.
And S33, for each candidate period T, dividing the swing time sequence into a plurality of segments.
And S34, superimposing the divided segments within the corresponding candidate period T; that is, each segment is overlaid onto one span of the candidate period T.
S35, obtaining the events S_i(T) observed at each discrete time point within the corresponding candidate period T.
Specifically, the events observed at each discrete time point within a period of length T are given by:
S_i(T) = { t | mod(t, T) = i ∧ I(t) = 1 },  t = 0, 1, ..., n−1;  i = 0, 1, ..., T−1
where:
S_i(T) denotes the events occurring at the i-th position within a period of length T;
mod() is the remainder function;
I(t) = 1 indicates that an event is present at time t;
T is the candidate period to be tested;
t is the sequence time, taking values from 0 to n−1, where n is the length of the event sequence;
i is the remainder of the sequence time t divided by T, taking values from 0 to T−1.
Step S36, from the events S_i(T) obtained in step S35, obtaining the distribution p_i(T) with which the event sequence, segmented by the candidate period T and superimposed, falls on position i.
Specifically, this distribution is calculated as:
p_i(T) = |S_i(T)| / Σ_(j=0..T−1) |S_j(T)|
where:
p_i(T) is the probability that an observation falls at position i within the candidate period T;
S_j(T) denotes the events observed at the j-th position within a period of length T, and |·| denotes their count.
Step S37, from p_i(T), obtaining the entropy H(T) of the corresponding candidate period T, which may be calculated as:
H(T) = − Σ_(i=0..T−1) p_i(T) · log p_i(T)
step S38, according to the entropy h (T), obtaining the relative entropy of the candidate period T: KL*(pi(T)||qi(T)). The relative entropy of the alternative period T can be calculated as follows,
KL*(pi(T)||qi(T))=logT-H(T)
wherein q isi(T) represents the probability of uniform distribution at the ith position in T, qi(T)=1/T。
Step S39, selecting the candidate period T whose relative entropy takes the minimum value as the swing period of the vehicle's left-right swing.
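Steps S35 to S38 can be condensed into a short sketch (an illustration, not the patent's reference implementation; natural logarithms are assumed, which only rescales all candidates equally):

```python
import math

def relative_entropy_of_period(I, T):
    """KL*(p_i(T) || q_i(T)) = log T - H(T) for one candidate period T.

    Steps S35-S36: count events at each phase i = t mod T and normalise
    to the distribution p_i(T); steps S37-S38: compute the entropy H(T)
    and the relative entropy against the uniform distribution q_i(T) = 1/T.
    """
    counts = [0] * T
    for t, occurred in enumerate(I):
        if occurred == 1:
            counts[t % T] += 1
    total = sum(counts)                      # total number of events
    p = [c / total for c in counts]          # p_i(T), step S36
    H = -sum(pi * math.log(pi) for pi in p if pi > 0)  # H(T), step S37
    return math.log(T) - H                   # KL*, step S38

# Events strictly periodic with period 3: all mass sits at one phase,
# so H(T) = 0 and KL* reaches its maximum log(3); a mismatched candidate
# (T = 4) spreads the mass uniformly and yields KL* = 0.
I = [1, 0, 0] * 8
kl3 = relative_entropy_of_period(I, 3)
kl4 = relative_entropy_of_period(I, 4)
```

Step S39 then selects the swing period from the relative entropies computed over all candidate periods.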
Considering that the on-board computing capacity of some vehicles is limited, part of the calculation may be offloaded to the cloud. For example, the correlation calculation for detecting the swing period of the vehicle's left-right swing can be performed in the cloud. In addition, when the capacity of the on-board computing unit is limited, the vehicle-mounted forward camera can acquire forward images at a set frequency f and upload them to the cloud, where the driver fatigue value FA is calculated.
In one embodiment of the invention, the forward-facing camera includes two cameras of the same specification positioned at the same height and spaced apart laterally.
Specifically, the forward-facing camera includes a left-eye lens unit, a right-eye lens unit and a video processing chip. The image sensors of the left-eye and right-eye lens units have the same resolution and use the same lens, aperture and target-surface parameters. The video processing chip receives the two video streams from the left-eye and right-eye lens units and processes them to generate a depth map. It also compares the current frame of the left-eye or right-eye lens unit with the previous frame to determine the positional offset of the current position relative to the previous position.
For example, the left-eye and right-eye lens units lie in the same horizontal plane, and the distance between them is greater than or equal to 10 cm. In an alternative embodiment, the distance between the left-eye and right-eye lens units is equal to 12 cm. Such a spacing helps form good binocular stereoscopic vision, and the parallax is calculated using binocular stereo techniques. Binocular stereoscopic vision imitates the human visual system: the disparity of a target is computed from images of the target captured by two cameras at different positions, and a depth map is obtained from it. A depth map is an image reflecting the distance of objects in the scene; each pixel value in the depth map represents the distance between a scene point and the camera. Setting the left-right spacing in this way therefore allows good binocular stereo vision within a compact overall camera size. The lateral displacement of the vehicle is then determined from the change in relative position of a fixed reference object.
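The depth computation described above follows the standard pinhole stereo relation. As a sketch (the focal length and disparity values below are illustrative assumptions, not parameters from the patent):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Textbook pinhole-stereo depth: Z = f * B / d, where d is the
    horizontal disparity in pixels between the left and right images,
    f the focal length in pixels and B the baseline in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# With the 12 cm baseline mentioned above and an assumed focal length
# of 1200 px, a 36 px disparity corresponds to a depth of 4 m.
z = stereo_depth(36.0, 1200.0, 0.12)
```

Evaluating this relation for every matched pixel pair yields the depth map used in steps S222 and S223.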
Specifically, the vehicle lateral displacement Di is determined in the following manner:
Step S221, determining a fixed reference feature common to the previous image frame PA_i and the current image frame PA_{i+1} acquired by the first camera;
Step S222, forming a depth map from the previous image frames PA_i and PB_i acquired by the two cameras respectively, and determining on this basis the previous relative lateral position Dp between the fixed reference feature and the vehicle;
Step S223, forming a depth map from the current image frames PA_{i+1} and PB_{i+1} acquired by the two cameras respectively, and determining on this basis the current relative lateral position Dc between the fixed reference feature and the vehicle;
Step S224, calculating the vehicle lateral displacement detected this time as Di = Dc - Dp.
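Steps S221 to S224 can be sketched as follows (a simplified illustration: the pixel back-projection helper and all numeric values are assumptions for the example, not values from the patent):

```python
def lateral_offset(x_px, cx_px, depth_m, focal_px):
    """Back-project a pixel column to a lateral offset in metres using
    the pinhole model X = Z * (x - cx) / f (hypothetical helper)."""
    return depth_m * (x_px - cx_px) / focal_px

def lateral_displacement(prev_x, cur_x, prev_z, cur_z, cx, f):
    """Di = Dc - Dp (step S224): Dp and Dc are the previous and current
    relative lateral positions of the fixed reference feature, each
    derived from the depth map of its frame pair (steps S222-S223)."""
    Dp = lateral_offset(prev_x, cx, prev_z, f)
    Dc = lateral_offset(cur_x, cx, cur_z, f)
    return Dc - Dp

# A reference feature 10 m ahead drifting from 60 px to 120 px right of
# the image centre (f = 1200 px) indicates a 0.5 m lateral displacement.
di = lateral_displacement(60.0, 120.0, 10.0, 10.0, 0.0, 1200.0)
```

The sign of Di then encodes whether the vehicle drifted left or right, as used in the fatigue value FA.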
Preferably, if a turn-signal-on signal is detected during the step of determining the driver fatigue value FA, the calculation of the driver fatigue value FA is terminated, and the calculation of the driver fatigue value FA is restarted after the turn is completed.
Embodiments of the present invention also provide an ADAS device, which includes a cab surveillance camera, a forward-facing camera, and a processing unit, where the processing unit executes the method for detecting fatigue driving of a truck based on image recognition as described above.
Finally, it should be noted that the above embodiments merely illustrate the technical solutions of the present invention and do not limit them. Those of ordinary skill in the art will understand that modifications may be made to the technical solutions described in the foregoing embodiments, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A truck fatigue driving detection method based on image recognition is characterized by comprising the following steps:
S1, determining a driver state based on an in-cab image acquired by a cab monitoring camera, and determining a detection interval duration T1 according to the driver state;
S2, for each detection interval duration T1, executing a step of determining a driver fatigue value FA, wherein the step of determining the driver fatigue value FA comprises:
S21, within a preset detection period T2, acquiring forward images at a set frequency f with the vehicle-mounted forward camera;
S22, determining the vehicle lateral displacement Di within each Δt based on the forward images; and
S23, determining the driver fatigue value FA based on the vehicle lateral displacement Di;
S24, determining the fatigue degree level according to the driver fatigue value FA.
2. A truck fatigue driving detection method based on image recognition as claimed in claim 1, wherein the driver fatigue value FA is calculated by the following formula:
Figure FDA0003186945860000011
where:
N is the number of times the vehicle lateral displacement is detected within the preset detection period T2, N = M - 1, where M is the number of forward image frames acquired within the preset detection period T2;
Di is the vehicle lateral displacement detected the i-th time within the preset detection period T2, i.e. the vehicle lateral displacement at the moment t_i of the (i+1)-th forward image frame relative to the moment t_{i-1} of the i-th frame;
Di is a signed quantity: either a leftward offset gives Di > 0 and a rightward offset gives Di < 0, or a leftward offset gives Di < 0 and a rightward offset gives Di > 0;
|Di| is the absolute value of Di;
k is a weighting coefficient, with 0.5 ≤ k ≤ 2;
Figure FDA0003186945860000012
3. The truck fatigue driving detection method based on image recognition as claimed in claim 1, comprising a detection start step S0, wherein in the detection start step S0 the vehicle speed is monitored and the continuous driving duration TL is tracked, and steps S1 and/or S2 are executed when the start condition that TL is equal to or greater than a set duration T0 is satisfied, where TL is updated as follows:
if T_{v=0} > 60 min, then TL = 0;
if 15 min < T_{v=0} ≤ 60 min, then TL = TL + ΔT - T_{v=0};
if 5 min < T_{v=0} ≤ 15 min, then TL = TL + ΔT - 0.5·T_{v=0};
if T_{v=0} ≤ 5 min, then TL = TL + ΔT;
where:
ΔT is the elapsed time period, and
T_{v=0} is the duration within ΔT for which the vehicle speed is continuously zero.
4. The truck fatigue driving detection method based on image recognition as claimed in claim 2, wherein if the cab monitoring camera detects a driver change and the replacement driver's previous driving ended more than 1 hour earlier, the continuous driving duration TL is recalculated from zero.
5. The truck fatigue driving detection method based on image recognition according to any one of claims 1 to 4, wherein
the driver state determined in step S1 includes a normal state and a fatigue state; the detection interval duration determined for the normal state is T1 = T1_normal, and the detection interval duration determined for the fatigue state is T1 = T1_fatigue,
wherein 1.5·T1_fatigue ≤ T1_normal ≤ 3·T1_fatigue;
the fatigue degree level determined in step S24 includes light fatigue, moderate fatigue and heavy fatigue, and if the fatigue degree determined in step S24 is moderate fatigue or heavy fatigue, the detection interval duration T1 is set to T1_fatigue.
6. The truck fatigue driving detection method based on image recognition as claimed in claim 1, wherein the time of each driver change is recorded, together with the continuous driving duration and the driver fatigue value FA at each moment.
7. The truck fatigue driving detection method based on image recognition as claimed in claim 1, further comprising a fatigue degree checking step S3, wherein in the fatigue degree checking step S3 the swing period of the vehicle's left-right swing during driving is detected; if the detected swing period is less than a set swing period, the fatigue degree is confirmed as heavy fatigue and the driver is immediately reminded to stop for a break,
wherein the swing period of the vehicle's left-right swing is detected by a method comprising the following steps:
S31, constructing a swing event sequence: when the absolute value of the sum of several consecutive vehicle lateral displacements Di is greater than or equal to a set width, an event is recorded; the number of events in the event sequence is R;
S32, determining a plurality of candidate periods T, wherein each candidate period T < T2/(R·Δt);
S33, for each candidate period T, dividing the swing event sequence into a plurality of segments;
S34, superposing the divided segments within the corresponding candidate period T;
S35, obtaining the events S_i(T) observed at each discrete time point within the corresponding candidate period T;
S36, based on the events S_i(T) obtained in step S35, obtaining the distribution p_i(T) of the event sequence falling at position i after segmentation by the candidate period T and superposition;
S37, according to p_i(T), obtaining the entropy H(T) of the corresponding candidate period T;
S38, according to the entropy H(T), obtaining the relative entropy KL*(p_i(T) || q_i(T)) of the candidate period T;
S39, selecting the candidate period T whose relative entropy takes the minimum value as the swing period of the vehicle's left-right swing.
8. The truck fatigue driving detection method based on image recognition as claimed in claim 7, wherein the correlation calculation for detecting the swing period of the vehicle's left-right swing during driving is performed in the cloud; when the capacity of the on-board computing unit is limited, the vehicle-mounted forward camera acquires forward images at a set frequency f and uploads them to the cloud, where the driver fatigue value FA is calculated.
9. The method as claimed in any one of claims 1 to 8, wherein the forward-facing camera comprises two cameras mounted at the same height and spaced apart laterally, and the vehicle lateral displacement Di is determined by:
Step S221, determining a fixed reference feature common to the previous image frame PA_i and the current image frame PA_{i+1} acquired by the first camera;
Step S222, forming a depth map from the previous image frames PA_i and PB_i acquired by the two cameras respectively, and determining on this basis the previous relative lateral position Dp between the fixed reference feature and the vehicle;
Step S223, forming a depth map from the current image frames PA_{i+1} and PB_{i+1} acquired by the two cameras respectively, and determining on this basis the current relative lateral position Dc between the fixed reference feature and the vehicle;
Step S224, calculating the vehicle lateral displacement detected this time as Di = Dc - Dp.
10. An ADAS device, characterized in that it comprises a cab surveillance camera, a forward-facing camera and a processing unit, which executes the method for detecting fatigue driving of a truck based on image recognition according to any of claims 1-9.
CN202110864129.7A 2021-07-19 2021-07-29 Truck fatigue driving detection method and ADAS device based on image recognition Active CN113591693B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110814012 2021-07-19
CN2021108140128 2021-07-19

Publications (2)

Publication Number Publication Date
CN113591693A true CN113591693A (en) 2021-11-02
CN113591693B CN113591693B (en) 2023-10-27

Family

ID=78251920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110864129.7A Active CN113591693B (en) 2021-07-19 2021-07-29 Truck fatigue driving detection method and ADAS device based on image recognition

Country Status (1)

Country Link
CN (1) CN113591693B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108528448A (en) * 2017-03-02 2018-09-14 比亚迪股份有限公司 Vehicle travels autocontrol method and device
CN110063735A (en) * 2019-04-16 2019-07-30 中国第一汽车股份有限公司 Fatigue monitoring method based on driving behavior
CN111703428A (en) * 2020-07-22 2020-09-25 交通运输部公路科学研究所 Fatigue driving monitoring method based on vehicle-road cooperation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108528448A (en) * 2017-03-02 2018-09-14 比亚迪股份有限公司 Vehicle travels autocontrol method and device
CN110063735A (en) * 2019-04-16 2019-07-30 中国第一汽车股份有限公司 Fatigue monitoring method based on driving behavior
CN111703428A (en) * 2020-07-22 2020-09-25 交通运输部公路科学研究所 Fatigue driving monitoring method based on vehicle-road cooperation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wan Wei (万蔚) et al., "Research on fatigue driving identification algorithms based on driving behavior", Road Traffic and Safety (《道路交通与安全》) *

Also Published As

Publication number Publication date
CN113591693B (en) 2023-10-27

Similar Documents

Publication Publication Date Title
EP3316231B1 (en) Alert generation correlating between head mounted imaging data and external device
US7423540B2 (en) Method of detecting vehicle-operator state
JP5171629B2 (en) Driving information providing device
JP5974915B2 (en) Arousal level detection device and arousal level detection method
US9390568B2 (en) Driver identification based on driving maneuver signature
US10336257B2 (en) Rear vision system for a vehicle and method of using the same
JP5506745B2 Image acquisition unit, method and associated control unit
US10755111B2 (en) Identifying suspicious entities using autonomous vehicles
CN105474265B (en) Object apparatus for predicting and object estimating method
CN107449440A (en) The display methods and display device for prompt message of driving a vehicle
CN112180605B (en) Auxiliary driving system based on augmented reality
DE102005001456A1 (en) Determination device for a collision probability
CN104012081A (en) Object detection device
JP2000161915A (en) On-vehicle single-camera stereoscopic vision system
DE102018216009A1 (en) APPARATUS FOR DETERMINING DISCONTINUED DRIVING, METHOD FOR DETERMINING DISCONTINUED DRIVING AND PROGRAM
CN104471626A (en) Lane departure determination apparatus, lane departure warning apparatus, and vehicle control system using same
US20190147270A1 (en) Information processing apparatus, driver monitoring system, information processing method and computer-readable storage medium
DE102018127600A1 (en) Information processing apparatus, driver monitoring system, information processing method and information processing program
CN111260915B (en) Early warning reminding method for pedestrian stay in expressway traffic abnormal area
JP6481969B2 (en) Driver state estimation device
KR101986734B1 (en) Driver assistance apparatus in vehicle and method for guidance a safety driving thereof
JP2014071627A (en) Driving condition display system, driving condition display program, and driving condition display method
CN113591693B (en) Truck fatigue driving detection method and ADAS device based on image recognition
DE102019109491A1 (en) DATA PROCESSING DEVICE, MONITORING SYSTEM, WECKSYSTEM, DATA PROCESSING METHOD AND DATA PROCESSING PROGRAM
CN117197779A (en) Track traffic foreign matter detection method, device and system based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant