CN113544471A - DMS device performance evaluation system, method, storage medium and electronic device - Google Patents
- Publication number
- CN113544471A (application number CN202080019788.4A)
- Authority
- CN
- China
- Prior art keywords
- driver
- data
- dms device
- real
- dms
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
Abstract
The application discloses a DMS device performance evaluation system, method, storage medium, and electronic device, which are used to solve the problem of evaluating DMS device performance. The system comprises: a data acquisition unit to be tested, used for acquiring the driver state data output by the DMS device; an original data acquisition unit, used for acquiring original driver data collected by a sensor; an original data processing unit, used for calculating the driver's real state data from the original driver data; and a comparison unit, used for comparing the driver state data with the driver's real state data and evaluating the performance of the DMS device according to the comparison result. The application provides a testing and evaluation scheme for DMS devices before mass production, solving the current problem that a product's false alarm rate and detection rate can only be fed back from users' impressions after the product has entered mass production.
Description
The present disclosure relates to the field of device performance evaluation technologies, and in particular, to a DMS device performance evaluation system, a DMS device performance evaluation method, a storage medium, and an electronic device.
This section is intended to provide a background or context to the embodiments of the application that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
A DMS (Driver Monitoring System) recognizes the driver's identity; monitors the driver's fatigue level, head posture, behavior, sight line information, and field of view; and gives an alarm when the driver is in a dangerous state such as fatigue or distraction. A DMS can effectively standardize driving behavior and greatly reduce the probability of traffic accidents.
There are numerous DMS products on the market, whether factory-installed or aftermarket. They are based on similar principles and claim similar functional characteristics, but their software and hardware performance differs, and it is difficult to compare their performance quantitatively side by side. The market also lacks any solution for checking DMS performance; a device's false alarm rate and detection rate can only be fed back through customers' subjective impressions after the product has entered mass production.
Disclosure of Invention
In view of the foregoing analysis, the present application aims to provide a DMS device performance evaluation system, method, storage medium, and electronic device that test and evaluate the performance of a DMS device before its mass production, so that a vehicle manufacturer can select a DMS device according to the evaluation results.
The purpose of the application is mainly realized by the following technical scheme:
according to an aspect of the present application, there is provided a DMS device performance evaluation system including:
the data acquisition unit to be tested is used for acquiring the driver state data output by the DMS equipment;
the original data acquisition unit is used for acquiring the original driver data collected by a sensor;
the original data processing unit is used for calculating the real state data of the driver according to the original driver data;
and the comparison unit is used for comparing the driver state data with the real driver state data and evaluating the performance of the DMS equipment according to a comparison result.
According to another aspect of the present application, there is provided a DMS device performance evaluation method, applied to the device performance evaluation system as described above, including the steps of:
acquiring driver state data output by DMS equipment;
acquiring original driver data acquired by a sensor;
calculating to obtain real state data of the driver according to the original driver data;
and evaluating the performance of the DMS equipment according to the comparison result of the driver state data and the driver real state data.
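Purely for illustration and not as a disclosed embodiment, the four steps above can be sketched in Python; every class, field, and dictionary key below is a hypothetical placeholder:

```python
# Illustrative sketch only; all names and fields are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_yaw_deg: float   # sight line direction (yaw component)
    head_yaw_deg: float   # head posture angle (yaw component)
    fatigue: float        # fatigue level, 0..1

def evaluate_dms(dms_output: DriverState, raw_sensor_data: dict) -> dict:
    # Steps 1 and 2 are assumed to have produced dms_output and raw_sensor_data.
    # Step 3: derive the driver's real state from the raw sensor data
    # (stand-in for the original data processing unit).
    real = DriverState(
        gaze_yaw_deg=raw_sensor_data["gaze_yaw_deg"],
        head_yaw_deg=raw_sensor_data["head_yaw_deg"],
        fatigue=raw_sensor_data["fatigue"],
    )
    # Step 4: compare the DMS output with the real state
    # (stand-in for the comparison unit).
    return {
        "gaze_error_deg": abs(dms_output.gaze_yaw_deg - real.gaze_yaw_deg),
        "head_error_deg": abs(dms_output.head_yaw_deg - real.head_yaw_deg),
        "fatigue_error": abs(dms_output.fatigue - real.fatigue),
    }
```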
According to yet another aspect of the present application, there is provided a storage medium characterized in that the storage medium stores a computer program for executing the DMS device performance evaluation method described above.
According to still another aspect of the present application, there is provided an electronic apparatus, comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is used for reading the executable instructions from the memory and executing the instructions to realize the DMS device performance evaluation method.
By collecting original driver data directly measured by sensors and calculating the driver's real state data on that basis, the present application obtains data that accurately reflect the driver's state and can serve as ground truth for verifying the accuracy of the DMS device's output. The application thus provides a testing and evaluation scheme for DMS devices before mass production, solving the current problem that a product's false alarm rate and detection rate can only be fed back from customers' impressions after mass production.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present application is further described in detail by the accompanying drawings and examples.
The above and other objects, features and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 is a schematic diagram of a DMS device performance evaluation system provided in one embodiment of the present application;
FIG. 2 is a schematic diagram of a DMS device performance evaluation provided in one embodiment of the present application;
FIG. 3 is a data flow diagram of DMS device performance evaluation in online mode as provided in an embodiment of the present application;
FIG. 4 is a data flow diagram of DMS device performance evaluation in an offline mode as provided in an embodiment of the present application;
FIG. 5 is a schematic view of a DMS device performance evaluation system provided in an embodiment of the present application;
FIG. 6 is a flow chart of a DMS device performance evaluation method provided in an embodiment of the present application.
The principles and spirit of the present application will be described with reference to a number of exemplary embodiments. It should be understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the present application, and are not intended to limit the scope of the present application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As will be appreciated by one skilled in the art, embodiments of the present application may be embodied as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
In this document, any number of elements in the drawings is by way of example and not by way of limitation, and any nomenclature is used solely for differentiation and not by way of limitation.
The operating principle of DMS devices commonly available on the market at present is to capture images of the driver during driving with a camera and then input the driver images into a trained model (e.g., a neural network model or other machine learning model) to obtain driver state data including the driver's fatigue level, sight line information, head posture, and the like; analysis of the driver state data then allows detection of, and warning about, fatigued driving, unsafe driving (e.g., smoking, making a call, drinking, distraction), and dangerous driving (e.g., lane departure, following too closely).
Most of these DMS devices output driver state data based on computer-vision recognition technology. However, due to factors such as the camera's installation position, the image quality, and the model selected, a DMS device on the market may output driver state data that is not sufficiently accurate.
However, there is currently no solution on the market for checking the performance of a DMS device; the false alarm rate and detection rate can only be fed back from users' experience after the product enters mass production, and evaluating DMS device performance from user experience is subjective and lacks a quantitative standard.
To solve this problem, an embodiment of the present application proposes a DMS device performance evaluation system, as shown in fig. 1, where a DMS device performance evaluation system 10 is connected to a DMS device 20 to be evaluated.
The DMS device 20 includes an associated camera and a processor. The associated camera collects images of the driver; the processor processes the driver images collected by the camera using computer vision and similar techniques to generate at least one type of driver state data, such as the driver's sight line information, head posture, and fatigue level. In addition, the processor can output the driver state data in the form of text, images, sound, and the like; judge from the driver state data whether an alarm is needed; and, when it is, generate an alarm signal in the form of text, images, sound, and the like.
The sight line information output by the DMS device 20 may include at least one of a sight line direction of the driver, a gaze point of the driver, and the like; the head pose output by the DMS device 20 may include at least one of a head displacement, a head pose angle, and the like of the driver.
The DMS device performance evaluation system 10 is configured to evaluate at least one of a gaze tracking performance, a head pose tracking performance, and a fatigue monitoring performance of the DMS device 20. Wherein, the sight tracking performance is used to reflect whether the sight information such as the sight direction, the fixation point, etc. output by the DMS device 20 is accurate; the head pose tracking performance is used for reflecting whether the head poses such as head displacement, head pose angle and the like output by the DMS device 20 are accurate; the fatigue monitoring performance is used to reflect whether the driver fatigue level output from the DMS device 20 is accurate.
As shown in fig. 2, the DMS device performance evaluation system 10 may specifically include: a data acquisition unit 101 to be tested, an original data acquisition unit 102, an original data processing unit 103, and a comparison unit 104.
The data acquisition unit 101 to be tested is used for acquiring the driver state data output by the DMS device.
Optionally, the data acquisition unit 101 to be tested may include an SDK (Software Development Kit) capable of establishing a data transmission channel with the DMS device 20 to be evaluated, and is configured to receive at least one type of driver state data including gaze information, head posture, fatigue degree, and the like output by the DMS device 20.
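Purely as an illustration of such a data transmission channel (the actual SDK interface is not disclosed here), a receiver might stream newline-delimited JSON records from the device; every name and the record format below are assumptions:

```python
# Hypothetical sketch of a data channel; the real SDK interface is not disclosed.
import json
import socket

class DmsStateReceiver:
    """Receives driver state records (gaze, head pose, fatigue) from the DMS device."""

    def __init__(self, host: str, port: int):
        self.sock = socket.create_connection((host, port))
        self.buffer = b""

    def read_state(self) -> dict:
        # Assume the device streams newline-delimited JSON records.
        while b"\n" not in self.buffer:
            self.buffer += self.sock.recv(4096)
        line, _, self.buffer = self.buffer.partition(b"\n")
        return json.loads(line)  # e.g. {"gaze": [...], "head_pose": [...], "fatigue": 0.3}
```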
The raw data acquiring unit 102 is configured to acquire raw driver data collected by sensors. It should be noted that the raw data acquiring unit 102 may be connected, directly or indirectly, to the various sensors that collect raw driver data. For example, it may be directly connected to sensors that collect raw driver data such as driver images, the driver's physiological characteristic data, and the driver's head posture data. For sensors that collect raw driver data such as driving behavior data and driving scene data, the connection may be indirect: for instance, the raw data acquiring unit 102 may receive data from such sensors by establishing a connection with an automobile driving simulator, an automobile data collector, or similar equipment that aggregates the data those sensors collect. Below, connecting to the various sensors "directly or indirectly" is referred to simply as "connecting" to them.
Optionally, the raw data acquiring unit 102 may include an SDK capable of establishing a data transmission channel with various sensors (including but not limited to image sensors, electrocardiograph sensors, electroencephalograph sensors, displacement attitude sensors, automobile driving simulators, automobile data collectors, and the like) for collecting raw driver data, and is configured to receive the raw driver data collected by these sensors.
Optionally, in some embodiments, the DMS device performance evaluation system includes various sensors (including but not limited to an image sensor, an ecg sensor, an eeg sensor, a displacement posture sensor, a car driving simulator, a car data collector, and the like) for collecting raw driver data.
Optionally, in some embodiments, various types of sensors for collecting raw driver data are not included in the DMS device performance evaluation system.
Optionally, in some embodiments, some sensors (e.g., an image sensor, an electrocardiograph sensor, an electroencephalograph sensor, a displacement attitude sensor, etc.) for acquiring raw driver data are included in the DMS device performance evaluation system, and other sensors (e.g., a car driving simulator, a car data collector, etc.) are not included in the DMS device performance evaluation system.
The following describes various sensors connected to the raw data acquisition unit 102:
The image sensor can be used to capture driver images that include at least one of the driver's head, upper limbs, trunk, lower limbs, and so on, and can also be used to determine data such as the driver's head displacement and head attitude angle from the collected images. In some embodiments, the image sensor may be a camera module comprising a camera and an infrared light source: the camera captures an image of the driver while the infrared light illuminates the driver's eye region, the corneal region produces a light spot through specular reflection of the infrared light, and the image of the eye region therefore contains both the pupil and the light spot, so the driver's sight line information can be determined from the positions of the pupil and the light spot based on the pupil-cornea reflection method. In some embodiments, the image sensor may be a multi-view camera capable of capturing driver images with depth information and calculating head posture data such as the driver's head displacement and head posture angle.
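For illustration only, a drastically simplified version of the pupil-cornea reflection idea maps the pupil-to-glint offset to approximate gaze angles; a real system would use per-user calibration and a 3-D eye model, and the gain value below is a made-up placeholder:

```python
# Simplified illustration of the pupil-cornea reflection idea; not a full gaze model.
import numpy as np

def gaze_angles_from_pupil_and_glint(pupil_px: np.ndarray,
                                     glint_px: np.ndarray,
                                     gain_deg_per_px: float = 0.35) -> np.ndarray:
    """Map the glint-to-pupil-center offset (pixels) to approximate yaw/pitch in degrees."""
    offset = pupil_px - glint_px     # vector from corneal glint to pupil center
    return gain_deg_per_px * offset  # linear approximation, valid only for small angles

# Example: pupil center at (412, 305), corneal glint at (400, 300) in image coordinates
print(gaze_angles_from_pupil_and_glint(np.array([412.0, 305.0]),
                                        np.array([400.0, 300.0])))
```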
The brain wave sensor can be used for collecting at least one of brain waves, head displacement and head posture angles of a driver. In some embodiments, the electroencephalograph sensor may be an electroencephalograph, and the electroencephalograph sensor is worn on the head of the driver in a contact manner to collect the brain waves of the driver. For an electroencephalograph with an embedded IMU (Inertial Measurement Unit), the embedded IMU can be used to acquire the head displacement and the head posture angle of the driver.
The electrocardio sensor can be used for collecting the driver's electrocardiogram. In some embodiments, the electrocardio sensor may be an electrocardiograph (ECG machine); generally, the driver's electrocardiogram, heart rate, blood oxygen saturation, respiratory rate, body temperature, skin resistance, and the like are acquired by attaching electrodes to the driver's chest or limbs in a contact manner.
The displacement attitude sensor can be used for collecting at least one of the driver's head displacement and head attitude angle. In some embodiments, the displacement attitude sensor may be an IMU worn on (or otherwise in contact with) the driver's head, acquiring the driver's head displacement, head attitude angle, and so on through that contact.
The automobile driving simulator can provide a virtual driving environment, and while the driver drives in this virtual environment, at least one of driving behavior data and driving scene data can be collected. In some embodiments, the automobile driving simulator simulates a real driving environment to generate virtual driving data and presents it to the driver on several surrounding screens through a video interface, so that the driver is immersed in the virtual environment; it also provides control devices such as a steering wheel, an accelerator pedal, and a brake pedal so that the driver can perform corresponding operations (steering, accelerating, decelerating, and so on) according to the driving environment seen on the screens. A position sensor can be installed on the simulator's steering wheel to measure the steering wheel angle; displacement sensors can be installed on the accelerator pedal, brake pedal, hand brake handle, and so on to measure the accelerator pedal opening, brake pedal opening, hand brake position, and similar information; and the simulator's data acquisition card can record the type and timestamp of at least one kind of emergency event in the virtual driving data, such as encountering a traffic light, a pedestrian or another vehicle intruding into the lane, an illegal lane merge, a frontal collision, a lane departure, crossing a solid line, or following too closely.
The automobile data collector may be an ECU (Electronic Control Unit) installed in a real vehicle, and it collects at least one of driving behavior data and driving scene data while the driver drives the real vehicle. In some embodiments, the automobile data collector may be connected to a position sensor mounted on the steering wheel to obtain the steering wheel angle, and to displacement sensors mounted on the accelerator pedal, brake pedal, hand brake handle, and so on to obtain the accelerator pedal opening, brake pedal opening, hand brake position, and similar information. In some embodiments, the automobile data collector may also be connected to the OBD (On-Board Diagnostics) interface to obtain the steering wheel angle, accelerator pedal opening, brake pedal opening, hand brake handle displacement, clutch pedal opening, and gear information. In some embodiments, the automobile data collector may collect driving environment images through a built-in camera and recognize them with a built-in vision recognition processor, so as to obtain the type and timestamp of at least one kind of emergency event, such as encountering a traffic light, a pedestrian or another vehicle intruding into the lane, an illegal lane merge, a frontal collision, a lane departure, crossing a solid line, or following too closely.
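Purely as an illustration of what the driving behavior and driving scene data described above might look like once collected, the following record layouts are hypothetical and not any product's actual format:

```python
# Illustrative record layouts; field names are assumptions for this sketch only.
from dataclasses import dataclass

@dataclass
class DrivingBehaviorSample:
    timestamp: float
    steering_wheel_angle_deg: float
    accelerator_opening: float  # 0..1
    brake_opening: float        # 0..1
    handbrake_position: float   # 0..1

@dataclass
class DrivingSceneEvent:
    timestamp: float
    event_type: str  # e.g. "traffic_light", "pedestrian_intrusion", "lane_departure"
```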
A raw data processing unit 103 for calculating driver real state data from the raw driver data.
Optionally, the raw data processing unit 103 may include at least one of the following: a sight line processing unit, a posture processing unit, and a fatigue processing unit.
The sight line processing unit is used for processing the driver images to obtain the driver's real sight line information. The real sight line information includes at least one of the driver's real sight line direction (real sight line angle), the coordinates of the real gaze point, and the like. For example, suppose the driver image is captured by a camera equipped with an infrared light source; the eye region of the driver in the image then contains the pupil and a light spot (the corneal region produces a light spot by reflecting the infrared light), and the sight line processing unit can determine the driver's sight line information from the positions of the pupil and the light spot. However, because this sight line information is obtained from the image captured by the camera, it is expressed in the spatial coordinate system used by the camera (the camera coordinate system) and must be converted into the world coordinate system to obtain the driver's real sight line information. To this end, the camera can be calibrated in advance under conditions where the driver's real sight line information is known, yielding calibration data that include at least one of the pitch angle, yaw angle, lateral displacement, longitudinal displacement, and vertical displacement of the camera coordinate system relative to the world coordinate system; the sight line information determined from the positions of the pupil and the light spot is then converted, using this calibration data, from the camera coordinate system into the world coordinate system to obtain the driver's real sight line information.
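A minimal sketch of that coordinate conversion, assuming the calibration yields a pitch/yaw rotation of the camera frame relative to the world frame; the displacement components matter when locating the gaze point, but a direction vector is translation-invariant, so only the rotation is applied here:

```python
# Sketch under assumed calibration data (pitch/yaw of camera frame vs. world frame).
import numpy as np

def rotation_from_pitch_yaw(pitch_rad: float, yaw_rad: float) -> np.ndarray:
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    cy, sy = np.cos(yaw_rad), np.sin(yaw_rad)
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch about x
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw about y
    return ry @ rx

def camera_gaze_to_world(gaze_cam: np.ndarray, pitch_rad: float, yaw_rad: float) -> np.ndarray:
    """Rotate a gaze direction from the camera coordinate system into the world frame."""
    return rotation_from_pitch_yaw(pitch_rad, yaw_rad) @ gaze_cam
```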
And the posture processing unit is used for processing the head posture data to obtain the real head posture of the driver. For example, the head posture data such as the head displacement and the head posture angle output by the displacement posture sensor or the electroencephalogram sensor is data based on a spatial coordinate system adopted by the IMU built therein, and the posture processing unit may convert the data into a spatial coordinate system used by the DMS device 20 as the real head posture of the driver, so as to compare the real head posture with the driver head posture output by the DMS device; alternatively, the posture processing unit may convert the data into the world coordinate system to obtain the real head posture of the driver, so as to compare the real head posture with the head posture of the driver output by the DMS device which is also converted into the world coordinate system. Optionally, the pose processing unit may also calculate the true head pose of the driver using the image of the driver with the depth information, in which case a depth camera (e.g., a TOF camera, a multi-view camera, etc.) is required to acquire the image of the driver with the depth information.
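A minimal sketch of the frame change described above, assuming a fixed, pre-calibrated mounting transform between the IMU and the DMS device's coordinate system (an assumption for illustration only):

```python
# Sketch: re-express IMU-reported head pose in the DMS device's coordinate system.
import numpy as np

def head_pose_in_dms_frame(R_head_in_imu: np.ndarray,
                           R_dms_from_imu: np.ndarray) -> np.ndarray:
    """Compose rotations: head orientation as seen from the DMS device's frame."""
    return R_dms_from_imu @ R_head_in_imu

def head_displacement_in_dms_frame(t_head_in_imu: np.ndarray,
                                   R_dms_from_imu: np.ndarray,
                                   t_dms_from_imu: np.ndarray) -> np.ndarray:
    """Rigid transform of the head displacement (rotation, then translation)."""
    return R_dms_from_imu @ t_head_in_imu + t_dms_from_imu
```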
The fatigue processing unit is used for processing the physiological characteristic data, the driving behavior data, and the driving scene data to obtain the driver's real fatigue level. For example, the fatigue processing unit may input at least one of the driver image, the driver's physiological characteristic data, the driving behavior data, and the driving scene data into a model trained on physiological principles to obtain the driver's real fatigue level. Such a model is trained using physiological characteristic data that have a clear, established relationship with fatigue; for example, the theta and alpha waves in brain waves are clearly associated with fatigue, and a model trained on such relationships can estimate a person's fatigue level very accurately from the input data.
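As a hedged illustration of how brain-wave band content can be turned into a numeric fatigue indicator: the (theta + alpha)/beta band-power ratio below is one commonly used heuristic, not the physiologically trained model referred to above, and the sampling rate is an assumed placeholder:

```python
# Heuristic sketch only; not the trained physiological fatigue model described above.
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Total spectral power of the signal between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return float(psd[(freqs >= lo) & (freqs < hi)].sum())

def eeg_fatigue_indicator(eeg: np.ndarray, fs: float = 256.0) -> float:
    theta = band_power(eeg, fs, 4.0, 8.0)
    alpha = band_power(eeg, fs, 8.0, 13.0)
    beta = band_power(eeg, fs, 13.0, 30.0)
    return (theta + alpha) / max(beta, 1e-9)  # larger value suggests higher fatigue
```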
The comparison unit 104 is configured to compare the driver state data output by the DMS device 20 with the driver real state data, and evaluate the performance of the DMS device 20 according to the comparison result.
Optionally, the comparison unit 104 may include at least one of the following: a sight line comparison unit, a head posture comparison unit, and a fatigue degree comparison unit.
The sight line comparison unit calculates the deviation between the driver's real sight line information and the sight line information output by the DMS device 20; a smaller deviation indicates better sight line tracking performance of the DMS device 20. It should be noted that, to make the data comparable, the sight line comparison unit needs to calculate the deviation between the real sight line information and the sight line information output by the DMS device 20 in the same coordinate system. Optionally, this coordinate system may be the world coordinate system, the coordinate system adopted by the DMS device 20, or another coordinate system.
The head posture comparison unit calculates the deviation between the driver's real head posture and the head posture output by the DMS device 20; a smaller deviation indicates better head posture tracking performance of the DMS device 20. It should be noted that, to make the data comparable, the head posture comparison unit needs to calculate the deviation between the real head posture and the head posture output by the DMS device 20 in the same coordinate system. Optionally, this coordinate system may be the world coordinate system, the coordinate system adopted by the DMS device 20, or another coordinate system.
The fatigue degree comparison unit calculates the deviation between the driver's real fatigue level and the fatigue level output by the DMS device 20; a smaller deviation indicates better fatigue monitoring performance of the DMS device 20.
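As a purely illustrative sketch of such deviation measures (the concrete metrics are not specified above, so the choices below are assumptions), all quantities are taken to be expressed in one common coordinate system:

```python
# Sketch of possible deviation measures for the three comparison units.
import numpy as np

def angular_deviation_deg(v_true: np.ndarray, v_dms: np.ndarray) -> float:
    """Angle between two direction vectors (sight line or head orientation)."""
    cos_a = np.dot(v_true, v_dms) / (np.linalg.norm(v_true) * np.linalg.norm(v_dms))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

def gaze_point_deviation(p_true: np.ndarray, p_dms: np.ndarray) -> float:
    """Euclidean distance between the real and the reported gaze points."""
    return float(np.linalg.norm(p_true - p_dms))

def fatigue_deviation(f_true: float, f_dms: float) -> float:
    """Absolute difference between the real and the reported fatigue levels."""
    return abs(f_true - f_dms)
```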
To make the evaluation results easy to view intuitively, the comparison unit 104 may further include at least one of the following display units: a sight line display unit, a head posture display unit, and a fatigue degree display unit.
A sight line display unit for displaying a sight line focus corresponding to the real sight line information of the driver, a sight line focus corresponding to the sight line information output by the DMS device 20, and a distance between the two focuses;
optionally, the gaze display unit may further display the distance between two gaze focuses within a set time period in a graph manner, for example: in the coordinate system where the abscissa axis displays time and the ordinate axis displays the focal position, the positions and distances of two kinds of gaze focuses at different times are recorded, so that the user can visually see the accuracy of the DMS device 20 in gaze tracking.
A head posture display unit for displaying a real head posture of the driver, a head posture output from the DMS device 20, and a deviation of the two;
optionally, the head pose display unit may further display two head poses within a set time period in a graph manner, for example: in the coordinate system where the abscissa axis displays time and the ordinate axis displays attitude data, the real head attitude of the driver obtained at different times and the head attitude output by the DMS device 20 are recorded, so that the user can visually recognize the accuracy of the DMS device 20 in detecting the head attitude.
A fatigue degree display unit for displaying the true fatigue degree of the driver, the fatigue degree of the DMS device 20 output, and a deviation of both.
Optionally, the fatigue degree display unit may further display two fatigue degrees within a set time period in a graph manner, for example: in the coordinate system with the abscissa axis displaying time and the ordinate axis displaying fatigue, the real fatigue of the driver and the fatigue output by the DMS device 20 obtained at different times are recorded, so that the user can visually see the accuracy of the DMS device 20 in detecting fatigue.
In one implementable manner, the comparison unit 104 further includes an evaluation unit for evaluating the sight line tracking performance of the DMS device 20 based on the deviation between the driver's real sight line information and the sight line information output by the DMS device 20, the head posture tracking performance based on the deviation between the driver's real head posture and the head posture output by the DMS device 20, and the fatigue monitoring performance based on the deviation between the driver's real fatigue level and the fatigue level output by the DMS device 20. The evaluation unit may also evaluate the overall performance of the DMS device 20 based on a combination of its sight line tracking performance, head posture tracking performance, fatigue monitoring performance, and the like. That is, the evaluation unit may be configured to evaluate at least one of the following:
(1) the quality of the sight tracking performance;
(2) the quality of the head posture tracking performance;
(3) the quality of the fatigue monitoring performance; and
(4) the combined quality of the sight tracking performance, head posture tracking performance, and fatigue monitoring performance.
Optionally, each of the above performance qualities may be quantified using different grades.
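For instance, a purely hypothetical grading rule could map a mean deviation to a grade; the thresholds below are illustrative assumptions, not values disclosed in this application:

```python
# Hypothetical grading rule; thresholds are illustrative placeholders only.
def grade(mean_deviation: float, thresholds=(1.0, 3.0, 6.0)) -> str:
    if mean_deviation <= thresholds[0]:
        return "A (excellent)"
    if mean_deviation <= thresholds[1]:
        return "B (good)"
    if mean_deviation <= thresholds[2]:
        return "C (acceptable)"
    return "D (poor)"
```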
To examine the generalization ability of the DMS device 20 to changes in lighting conditions, the DMS device performance evaluation system may, in addition to the components described above, further include a fill light for generating a light beam that irradiates the driver from at least one of the front, the side, and the back, so as to place the driver under different lighting conditions.
The light beam of the fill light can be controlled manually, automatically, or both:
Under manual control, the fill light is manually adjusted to change the illumination conditions (such as illumination intensity and illumination direction), so that phenomena unfavorable to visual recognition by the DMS occur, such as a half-lit ("yin-yang") face or backlighting. At the same time, the original data acquisition unit 102 is manually controlled to acquire original driver data under each of the different illumination conditions, and these data are processed to obtain the driver's real state data. The comparison unit 104 can then evaluate the accuracy of the output of the DMS device 20 under different illumination conditions and examine the generalization ability of the DMS to changes in illumination.
When the automatic control is performed, the DMS performance evaluation system further comprises:
the acquisition control unit is used for controlling the fill light to change the illumination conditions and for controlling the original data acquisition unit 102 to acquire the original driver data collected by the sensors when the driver is under each of the different illumination conditions;
the original data processing unit 103 is further configured to calculate actual state data of the driver under different illumination conditions according to the original driver data respectively acquired by the acquisition control unit;
the data acquisition unit 101 to be measured is further configured to acquire driver state data output by the DMS device 20 when the driver is in different lighting conditions;
the comparison unit 104 is further configured to compare the driver's real state data with the driver state data under each illumination condition, evaluate from the comparison results the accuracy of the output of the DMS device 20 under different illumination conditions, and thereby examine the generalization ability of the DMS device 20 to changes in illumination conditions.
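The automatic-control loop above can be sketched as follows; every object and method name is a hypothetical placeholder standing in for the fill light, the acquisition units, the processing unit, and the comparison unit:

```python
# Skeleton of the automatic acquisition-control loop; interfaces are hypothetical.
def evaluate_under_illumination(conditions, fill_light, raw_unit, raw_processor,
                                dut_unit, comparator):
    results = {}
    for cond in conditions:                      # e.g. {"intensity": 800, "direction": "side"}
        fill_light.apply(cond)                   # change the illumination condition
        raw = raw_unit.collect()                 # raw driver data under this condition
        real_state = raw_processor.compute(raw)  # driver's real state data
        dms_state = dut_unit.collect()           # DMS output under the same condition
        results[str(cond)] = comparator.compare(real_state, dms_state)
    return results
```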
To facilitate evaluation of the performance of the DMS device 20, the DMS performance evaluation system provided herein may operate in an online mode of operation and an offline mode of operation, the two modes of operation differing in the manner in which the raw driver data is obtained.
In one embodiment, the raw data acquiring unit 102 may include an online acquiring unit for acquiring raw driver data acquired by the sensor in real time in an online operating mode.
In this embodiment, the online acquisition unit acquires various kinds of original driver data online, the original data processing unit 103 calculates actual driver state data according to the original driver data, the comparison unit 104 compares the actual driver state data with the driver state data output online by the DMS device 20, and evaluates the performance of the DMS device 20 according to the comparison result.
Optionally, the online acquisition unit is connected to at least one of an image sensor, an electrocardiograph sensor, an electroencephalogram sensor and a displacement attitude sensor. The image sensor, the electrocardiograph sensor, the electroencephalograph sensor, the displacement attitude sensor and other types of sensors can refer to the foregoing description, and are not described herein again.
Optionally, the online acquisition unit may further be connected to at least one of an automobile driving simulator and an automobile data collector. The automobile driving simulator and the automobile data collector can refer to the above descriptions, and are not described herein again.
By acquiring the driver images, brain waves, electrocardiographic waves, head displacement, head posture angles, and similar data collected by the image sensor, the electroencephalogram sensor, the electrocardio sensor, the displacement posture sensor, and the like, the online acquisition unit can basically meet the requirements for subsequently calculating the driver's real sight line information, real fatigue level, and real head posture. However, to improve the reliability of the evaluation results, the online acquisition unit can also be connected to the automobile driving simulator and the automobile data collector, so that the driving behavior data and driving scene data they collect can be combined to obtain more accurate real sight line information, real fatigue level, and real head posture, and thus evaluate the performance of the DMS device 20 more accurately.
It should be noted that the types of sensors connected to the online acquisition unit can be matched in any combination according to the requirements of testing and evaluation. For the purposes of the present application, any solution that utilizes other sensors or devices to collect raw driver data is within the scope of the present application.
Optionally, the DMS device performance evaluation system 10 may further include an original data storage unit, configured to store various types of original driver data acquired by the online acquisition unit in a database, so as to implement an offline evaluation function of the DMS device performance evaluation system. Further, the raw data storage unit may further store various real driver status data processed by the raw data processing unit 103 in the online mode into a database, so as to facilitate the offline evaluation of the DMS device 20 by the DMS device performance evaluation system.
Fig. 3 shows the data stream transmission in the DMS device performance evaluation system in the online mode. As shown in fig. 3:
the data acquisition unit to be tested establishes a data transmission channel with the DMS device to be evaluated through the SDK, and receives at least one driver state data including sight line information, head posture and fatigue degree output by the DMS device;
the on-line acquisition unit respectively acquires original driver data such as driver images, head attitude data, physiological characteristic data, driving behavior data, driving scene data and the like acquired by an image sensor, an electroencephalogram sensor, an electrocardio sensor, a displacement attitude sensor and an automobile driving simulator (or an automobile data acquisition device), and outputs the driver data to the original data processing unit.
Meanwhile, the original data storage unit stores the original driver data such as the driver image, the head posture data, the physiological characteristic data, the driving behavior data and the driving scene data acquired by the online acquisition unit into a database.
The original data processing unit comprises a sight line processing unit, an attitude processing unit and a fatigue processing unit; the sight line processing unit processes the image data of the driver to obtain the real sight line information of the driver; the posture processing unit processes the head posture data to obtain the real head posture of the driver; the fatigue processing unit processes at least one of the physiological characteristic data, the driving behavior data and the driving scene data to obtain the real fatigue.
The original data storage unit is also used for storing the data such as the real sight line information, the real head posture, the real fatigue degree and the like which are obtained by the original data processing unit in a database.
The comparison unit comprises a sight line comparison unit, a head posture comparison unit and a fatigue degree comparison unit; the sight comparison unit receives the real sight information output by the sight processing unit and the sight information output by the DMS device to be tested, compares the real sight information and the sight information to obtain the deviation of the sight information, and is used for evaluating the sight tracking performance of the DMS device; the posture comparison unit is used for receiving the real head posture output by the posture processing unit and the head posture output by the DMS device, comparing to obtain the head posture deviation of the driver and evaluating the tracking performance of the head posture of the driver of the DMS device; and the fatigue degree comparison unit is used for receiving the real fatigue degree output by the fatigue processing unit and the fatigue degree output by the DMS device, comparing to obtain the fatigue degree deviation of the driver, and evaluating the driver fatigue monitoring performance of the DMS device.
In one embodiment, the raw data acquisition unit 102 may include an offline acquisition unit for reading raw driver data from a database in an offline mode.
In this embodiment, the offline acquisition unit acquires the raw driver data from the database, the raw data processing unit 103 calculates the driver's real state data from these data, and the comparison unit 104 compares the real state data with the driver state data output by the DMS device 20 and evaluates the performance of the DMS device 20 according to the comparison result.
Optionally, the offline acquisition unit includes at least one of an image reading unit, a head posture data reading unit, a physiological characteristic data reading unit, a behavior data reading unit, and a scene data reading unit. The image reading unit reads driver images from the database; the head posture data reading unit reads the driver's head posture data, such as head displacement and head posture angle, from the database; the physiological characteristic data reading unit reads the driver's physiological characteristic data, such as electrocardiograms and brain waves, from the database; the behavior data reading unit reads driving behavior data, such as steering wheel angle, accelerator pedal opening, and brake pedal opening, from the database; and the scene data reading unit reads driving scene data from the database, namely emergency events such as encountering a traffic light, a pedestrian entering the lane, an illegal lane merge, or crossing a solid line, together with their timestamps.
It should be noted that the various kinds of raw driver data read from the database by the image reading unit, the head posture data reading unit, the physiological characteristic data reading unit, the behavior data reading unit and the scene data reading unit are recorded in the driving process of the same driver in the same time period.
Optionally, the offline acquisition unit may also directly read from the database the various real driver state data already calculated from raw driver data recorded during the same driver's driving over the same time period. In this case, the offline acquisition unit can output these real driver state data directly to the comparison unit 104 for comparison, omitting the work of the raw data processing unit and improving evaluation efficiency.
In order to ensure the accuracy of the evaluation result, the driver status data outputted by the DMS device 20 monitoring the driving process of the same driver in the same time period needs to be acquired, and in order to realize this acquisition in the offline mode, in this embodiment, the data acquisition unit 101 to be measured may include an input unit and an output unit. An input unit for reading the driver image from the database and sending it to the DMS device 20 for processing; and an output unit for acquiring the driver state data output after the driver image transmitted from the input unit is processed by the DMS device 20.
Generally, the camera associated with a DMS device is mounted on the A-pillar or the steering column, and this mounting imposes certain constraints on the camera's angle of view and therefore on the field of view of the driver images the DMS device processes. To ensure the accuracy of the evaluation results, the DMS device 20 should be able to exercise its recognition capability fully. For this reason, optionally, in the offline mode the driver images sent by the input unit to the DMS device 20 may be captured as follows: determine the installation position and/or shooting angle of the camera associated with the DMS device 20 to be evaluated, take these as the target installation position and/or target shooting angle, and install the sensor that collects the driver images at the target installation position and/or shoot at the target shooting angle (the images are then stored in the database for reading by the input unit). This ensures that the field of view of the driver images sent by the input unit to the DMS device 20 is as consistent as possible with the field of view of the driver images captured by the camera associated with the DMS device 20, so that the DMS device 20 can exercise its recognition capability fully.
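A sketch of the offline replay step under assumed interfaces (the database schema, table and column names, and the DMS processing call are all hypothetical placeholders):

```python
# Offline-mode replay sketch; database layout and DMS interface are assumptions.
import sqlite3

def replay_images_to_dms(db_path: str, dms_device) -> list:
    """Feed stored driver images to the DMS device and collect its state outputs."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT timestamp, image FROM driver_images ORDER BY timestamp"
    ).fetchall()
    conn.close()
    outputs = []
    for ts, image_blob in rows:
        state = dms_device.process(image_blob)  # hypothetical DMS processing call
        outputs.append((ts, state))
    return outputs
```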
Optionally, the original driver data stored in the database may be from various original driver data acquired by the online acquisition unit (which may be stored in the database by the original data storage unit), or may be from various original driver data acquired by a sensor device other than the DMS device performance evaluation system.
Fig. 4 shows the data stream transmission in the offline mode. As shown in fig. 4:
in the off-line acquisition unit, an image reading unit, a head posture data reading unit, a physiological characteristic data reading unit, a behavior data reading unit and a scene data reading unit respectively read driver image data, head posture data, physiological characteristic data, driving behavior data and driving scene data which are stored in advance from a database.
An input unit and an output unit in the data acquisition unit to be tested respectively establish a data transmission channel with the DMS device to be tested through the SDK, the input unit is used for reading the driver image from the database and transmitting the driver image to the DMS device, and the output unit is used for receiving at least one driver state data such as driver sight line information, head posture, fatigue degree and the like output by the DMS device.
In the original data processing unit, the sight line processing unit obtains real sight line information of a driver according to the image data of the driver; the posture processing unit processes the head posture data to obtain the real head posture of the driver; the fatigue processing unit processes at least one of the physiological characteristic data, the driving behavior data and the driving scene data to obtain the real fatigue.
In the comparison unit, the sight line comparison unit receives the real sight line information output by the sight line processing unit and the sight line information output by the DMS device to be detected, compares the real sight line information and the sight line information to obtain the deviation of the sight line information, and is used for evaluating the sight line tracking performance of the DMS device; the posture comparison unit is used for receiving the real head posture output by the posture processing unit and the head posture output by the DMS device, comparing the real head posture output by the posture processing unit and the head posture output by the DMS device to obtain the head posture deviation of the driver, and evaluating the tracking performance of the head posture of the driver of the DMS device; and the fatigue degree comparison unit is used for receiving the real fatigue degree output by the fatigue processing unit and the fatigue degree output by the DMS device, comparing to obtain the fatigue degree deviation of the driver, and evaluating the driver fatigue monitoring performance of the DMS device.
In the DMS device performance evaluation system described above, raw driver data are directly measured by sensors, and the driver's real state data are calculated from those measurements. Given the reliability of the sensor measurements, the real state data calculated by the evaluation system accurately reflect the driver's state and can therefore serve as ground truth for verifying the accuracy of the DMS device's output and judging the DMS device's performance.
In a specific implementation, the result output by the DMS device performance evaluation system provided by the present application can be used as a true value to adjust the DMS device, such as adjusting the installation angle, the shooting angle, and the shooting quality of a camera associated with the DMS device, and adjusting a model selected by the DMS device, so that the output result of the DMS device approaches the true value as much as possible, thereby achieving the purpose of optimizing the DMS device.
As shown in fig. 5, the present application provides a specific example of a DMS device performance evaluation system. In this example, the DMS device performance evaluation system includes: the device comprises a driving simulator, a light supplementing light source, a sensor combination and an industrial personal computer.
The DMS device to be evaluated comprises a camera and a processor, wherein the camera acquires an image of a driver, and the processor analyzes and processes the image of the driver to obtain and output sight line information, fatigue, head posture and other data of the driver.
The driving simulator comprises a server, a surrounding type display, a driving cabin seat, a data acquisition card and the like. Wherein the wrap around display comprises the three-sided display of FIG. 5; the cockpit comprises a cockpit seat, a steering wheel, a clutch, a brake pedal, an accelerator pedal, a transmission, a turn light switch, an emergency light switch, a horn switch, an ignition switch, a main electric switch, a windshield wiper switch, a high beam switch, a low beam switch, a high beam and low beam alternative switch, a safety belt and the like; the server generates virtual driving data by simulating a real driving environment and outputs the virtual driving data to the surrounding display through the video interface, so that a driver can be immersive fused into the virtual driving environment; the data acquisition card can acquire various driving behavior data (including but not limited to accelerator pedal opening, brake pedal opening, hand brake position and the like) generated by a driver through controlling a cockpit, and various driving scene data (including but not limited to type and timestamp of at least one emergency event of encountering traffic lights, pedestrians or other vehicles intruding into a road on which the vehicle runs, violation of vehicle merging, front collision, lane departure, pressing to a solid line, passing by and close to the vehicle) in the virtual driving data.
The fill light source includes three groups of fill lights arranged around the cockpit seat: the group directly in front of the seat generates a light beam that irradiates the driver from the front; the group at the sides of the seat consists of two fill lights that irradiate the driver from the two sides; and the group directly behind the seat generates a light beam that irradiates the driver from behind. The switching on and off, illumination intensity, and other settings of each group of fill lights are controlled by the industrial personal computer.
The sensor combination comprises three cameras, an electrocardiograph and an electroencephalograph. The three cameras are respectively arranged near the three-side display and used for shooting images of a driver from different angles, and each camera is matched with an infrared light source and used for generating infrared light to irradiate the driver, so that the driver images shot by the cameras can be used for calculating sight line information of the driver based on a pupil cornea reflection method; the electrocardiograph comprises a plurality of electrodes, and the electrodes are attached to the chest, four limbs and other parts of a driver to acquire physiological characteristic data such as heart rate, blood oxygen saturation, respiratory rate, body temperature, skin resistance and the like; the electroencephalograph comprises an electrode cap, wherein an IMU is arranged in the electrode cap, and when the electrode cap is worn on the head of a driver, data such as brain waves, head displacement, head posture angles and the like can be collected.
The industrial personal computer is connected with the surrounding type display, the driving simulator, the light supplementing lamp, the sensor combination and the DMS device to be evaluated. The industrial personal computer is used for acquiring driver state data such as driver sight line information, head postures and fatigue degrees output by the DMS device, calculating original driver data such as driver images, electrocardiograms, brain waves, head displacement and head posture angles acquired by the combination of the sensors to obtain real driver state data such as real driver sight line information, real head postures and real fatigue degrees, comparing the driver sight line information, the head postures and the fatigue degrees output by the DMS device with the calculated real driver sight line information, the calculated real head postures and the calculated real fatigue degrees respectively to evaluate sight line tracking performance, head posture tracking performance, fatigue monitoring performance and comprehensive performance of the DMS device, and finally sending comparison results and evaluation results to the surrounding type display for displaying.
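A minimal sketch of this comparison step, assuming gaze and head posture are expressed as yaw/pitch angles and fatigue as a value in [0, 1], is given below; the metric choices (great-circle angular error, mean absolute fatigue deviation) are illustrative, not prescribed by the application.

```python
# Sketch of comparing DMS output against the reference ("real") state per frame
# and aggregating simple error metrics. Keys and metrics are assumptions.
import math
import statistics

def angular_error_deg(yaw1, pitch1, yaw2, pitch2):
    """Great-circle angle between two directions given as yaw/pitch in degrees."""
    def to_vec(yaw, pitch):
        y, p = math.radians(yaw), math.radians(pitch)
        return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), math.sin(p))
    a, b = to_vec(yaw1, pitch1), to_vec(yaw2, pitch2)
    dot = max(-1.0, min(1.0, sum(u * v for u, v in zip(a, b))))
    return math.degrees(math.acos(dot))

def evaluate(dms_frames, ref_frames):
    gaze_err = [angular_error_deg(d["gaze_yaw"], d["gaze_pitch"],
                                  r["gaze_yaw"], r["gaze_pitch"])
                for d, r in zip(dms_frames, ref_frames)]
    head_err = [angular_error_deg(d["head_yaw"], d["head_pitch"],
                                  r["head_yaw"], r["head_pitch"])
                for d, r in zip(dms_frames, ref_frames)]
    fatigue_err = [abs(d["fatigue"] - r["fatigue"])
                   for d, r in zip(dms_frames, ref_frames)]
    return {
        "mean_gaze_error_deg": statistics.mean(gaze_err),
        "mean_head_posture_error_deg": statistics.mean(head_err),
        "mean_fatigue_error": statistics.mean(fatigue_err),
    }

dms = [{"gaze_yaw": 5, "gaze_pitch": -2, "head_yaw": 3, "head_pitch": 0, "fatigue": 0.2}]
ref = [{"gaze_yaw": 4, "gaze_pitch": -1, "head_yaw": 2, "head_pitch": 1, "fatigue": 0.15}]
print(evaluate(dms, ref))
```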
The industrial personal computer can control the on and off of each group of light supplement lamps, change the illumination intensity and the like, so that the driver is in different illumination conditions (for example, the phenomenon that the visual identification of the DMS device is not facilitated due to the appearance of yin and yang face, backlight and the like of the driver), the industrial personal computer processes various kinds of original driver data respectively collected by the camera, the electrocardiograph, the electroencephalograph and the automobile driving simulator under different illumination conditions of the driver to obtain real state data of the driver, and collects the driver state data output by the DMS device to be evaluated when the driver is in different lighting conditions, by comparing the real driver state data of the driver under different illumination conditions with the driver state data output by the DMS device, the accuracy of the DMS device output result under different illumination conditions can be evaluated, and the generalization capability of the DMS device to illumination condition changes can be inspected.
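One way to quantify this generalization check, sketched below under the assumption that per-frame gaze errors have already been computed and tagged with their illumination condition, is to compare the mean error per condition and the spread across conditions; the condition names and numbers are made up for illustration.

```python
# Sketch: group per-frame errors by illumination condition and report the
# per-condition mean and the spread across conditions. Toy data only.
from collections import defaultdict
from statistics import mean

errors = [  # (condition, gaze_error_deg) pairs from the per-frame comparison
    ("uniform", 1.2), ("uniform", 1.5),
    ("backlight", 4.8), ("backlight", 5.3),
    ("half_face", 3.1), ("half_face", 2.9),
]

by_condition = defaultdict(list)
for condition, err in errors:
    by_condition[condition].append(err)

per_condition = {c: mean(v) for c, v in by_condition.items()}
spread = max(per_condition.values()) - min(per_condition.values())
print(per_condition)
print(f"error spread across illumination conditions: {spread:.1f} deg")
```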
In this embodiment, the DMS device to be evaluated may further obtain and output driver behavior information such as smoking, calling, drinking, unbelted, yawning, and eye closing by recognizing the driver image.
The industrial personal computer can also process various kinds of original driver data collected by a camera, an electrocardiograph, an electroencephalograph, an automobile driving simulator and the like to obtain and output accurate driver behavior information such as smoking, calling, drinking, unbuckled belts, yawning, eye closing and the like, and compares the accurate driver behavior information with the driver behavior information output by the DMS device to evaluate the performance of the DMS device in the aspect of driver behavior detection.
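A sketch of such a behavior-detection comparison, treating each behavior as a per-frame binary label and reporting per-class precision and recall, is shown below; the label names and toy data are assumptions.

```python
# Sketch: score DMS behavior detections against reference labels per class.
from collections import Counter

BEHAVIORS = ["smoking", "phone_call", "drinking", "no_seatbelt", "yawning", "eyes_closed"]

def precision_recall(dms_labels, ref_labels):
    stats = {b: Counter() for b in BEHAVIORS}
    for dms_frame, ref_frame in zip(dms_labels, ref_labels):
        for b in BEHAVIORS:
            if b in dms_frame and b in ref_frame:
                stats[b]["tp"] += 1
            elif b in dms_frame:
                stats[b]["fp"] += 1
            elif b in ref_frame:
                stats[b]["fn"] += 1
    result = {}
    for b, c in stats.items():
        tp, fp, fn = c["tp"], c["fp"], c["fn"]
        if tp + fp + fn == 0:
            continue  # behavior never occurred and was never predicted
        result[b] = {
            "precision": tp / (tp + fp) if tp + fp else 0.0,
            "recall": tp / (tp + fn) if tp + fn else 0.0,
        }
    return result

dms = [{"smoking"}, set(), {"phone_call"}]
ref = [{"smoking"}, {"phone_call"}, {"phone_call"}]
print(precision_recall(dms, ref))
```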
Another embodiment of the present application provides a DMS device performance evaluation method that performs evaluation using the above DMS device performance evaluation system. As shown in Fig. 7, the method comprises the following steps (a minimal code sketch of the whole pipeline follows the step list):
Step S1: acquire the driver state data output by the DMS device;
Step S2: acquire the raw driver data collected by the sensors;
Step S3: compute the driver's real state data from the raw driver data;
Step S4: evaluate the performance of the DMS device according to the comparison between the driver state data and the driver's real state data.
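The sketch below strings steps S1 to S4 together as a single evaluation pass; the three helper functions are placeholders for the DMS interface, the sensor interface, and the reference-state computation, and the returned values are dummies.

```python
# Placeholder pipeline for steps S1-S4; every function body is a stand-in.
def get_dms_state():                 # S1: driver state data output by the DMS device
    return {"gaze_yaw": 5.0, "fatigue": 0.2}

def get_raw_sensor_data():           # S2: raw driver data from the sensors
    return {"image": None, "imu_quat": (1, 0, 0, 0), "ecg": []}

def compute_reference_state(raw):    # S3: "real" state computed from the raw data
    return {"gaze_yaw": 4.2, "fatigue": 0.15}

def evaluate_dms(dms, ref):          # S4: compare and score
    return {k: abs(dms[k] - ref[k]) for k in dms.keys() & ref.keys()}

raw = get_raw_sensor_data()
print(evaluate_dms(get_dms_state(), compute_reference_state(raw)))
```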
Optionally, the DMS device performance evaluation method may operate in an online mode or an offline mode.
In the online mode, the driver state data in step S1 and the raw driver data in step S2 are collected in real time.
In the offline mode, the driver state data in step S1 and the raw driver data in step S2 are read from previously collected data stored in a database.
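A sketch of how the two modes could be abstracted behind a common frame source is given below; the sqlite schema, the JSON payload format, and the sensor.read() call are assumptions for illustration.

```python
# Sketch of the two working modes as interchangeable frame sources:
# online_source reads live frames, offline_source replays stored frames.
import json
import sqlite3
from typing import Iterator

def online_source(sensor) -> Iterator[dict]:
    """Yield frames as the sensor produces them (sensor.read() is assumed)."""
    while True:
        frame = sensor.read()
        if frame is None:
            break
        yield frame

def offline_source(db_path: str) -> Iterator[dict]:
    """Replay frames previously collected and stored as JSON rows."""
    conn = sqlite3.connect(db_path)
    try:
        for (payload,) in conn.execute(
            "SELECT payload FROM frames ORDER BY timestamp_s"
        ):
            yield json.loads(payload)
    finally:
        conn.close()
```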
In yet another embodiment of the present application, a computer-readable storage medium is provided; the storage medium stores a computer program for executing the DMS device performance evaluation method described above.
In yet another embodiment of the present application, an electronic device is provided. The electronic device includes:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to read the executable instructions from the memory and execute them to implement the DMS device performance evaluation method described above.
In summary, the DMS device performance evaluation scheme provided by the embodiments of the present disclosure can be used to test and evaluate a DMS device before mass production, solving the problem that the false alarm rate and detection rate of a product can otherwise only be learned from customer feedback after mass production. Raw driver data measured directly by sensors is collected and the driver's real state data is computed from it, which accurately reflects the driver's state and provides a quantitative standard for judging DMS device performance. Evaluating the device under different illumination conditions makes it possible to examine its generalization capability to illumination changes. Based on the evaluation results, configuration choices such as the installation position of the DMS camera can be optimized to improve the performance of the DMS device.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts may be referred to among the embodiments. Since the method embodiment substantially corresponds to the system embodiment, its description is relatively brief, and reference may be made to the corresponding parts of the system embodiment where relevant.
It should be noted that although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one, and/or one step may be broken down into multiple steps.
While the spirit and principles of the application have been described with reference to several particular embodiments, it is to be understood that the application is not limited to the disclosed embodiments, nor does the division into aspects imply that features in those aspects cannot be combined to advantage; that division is for convenience of presentation only. The application is intended to cover the various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
The above-mentioned embodiments further describe the present invention in detail. It should be understood that they are merely illustrative and are not intended to limit the scope of the invention; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention shall fall within its scope of protection.
Those of skill in the art will further appreciate that the various illustrative logical blocks, units, and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate the interchangeability of hardware and software, various illustrative components, elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The various illustrative logical blocks, or units, or devices described in this application may be implemented or operated by a general purpose processor, a digital signal processor, an Application Specific Integrated Circuit (ASIC), a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
The steps of a method or algorithm described in the embodiments herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. For example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, which may be located in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination of the three. If implemented in software, the functions may be stored on or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media that facilitate transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. For example, such computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store program code in the form of instructions or data structures and that can be read by a general-purpose or special-purpose computer or processor. In addition, any connection is properly termed a computer-readable medium; for example, if the software is transmitted from a website, server, or other remote source via coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then those media are included in the definition of computer-readable medium. Disk and disc, as used here, include compact disc, laser disc, optical disc, DVD, floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically while discs reproduce data optically with lasers. Combinations of the above may also be included within the scope of computer-readable media.
Claims (20)
- A DMS device performance evaluation system, comprising:
a device-under-test data acquisition unit, configured to acquire driver state data output by a DMS device;
a raw data acquisition unit, configured to acquire raw driver data collected by a sensor;
a raw data processing unit, configured to compute the driver's real state data from the raw driver data; and
a comparison unit, configured to compare the driver state data with the driver's real state data and evaluate the performance of the DMS device according to the comparison result.
- The DMS device performance evaluation system of claim 1, further comprising:
a fill light, configured to generate a light beam and direct it at the driver from at least one of the driver's front, sides, and back, so that the driver can be placed under different illumination conditions.
- The DMS device performance evaluation system of claim 2, further comprising:
an acquisition control unit, configured to control the fill light to change the illumination condition and to control the raw data acquisition unit to acquire the raw driver data collected by the sensor while the driver is under each of the different illumination conditions;
wherein the raw data processing unit is further configured to compute the driver's real state data under the different illumination conditions from the raw driver data so acquired;
the device-under-test data acquisition unit is further configured to acquire the driver state data output by the DMS device while the driver is under the different illumination conditions; and
the comparison unit is further configured to compare the driver's real state data with the driver state data under the different illumination conditions and evaluate the performance of the DMS device under the different illumination conditions according to the comparison result.
- The DMS device performance evaluation system of claim 1, wherein:
the driver state data comprises at least one of the driver's fatigue level, gaze information, and head posture;
the raw driver data comprises at least one of a driver image, driver head posture data, driver physiological characteristic data, driving behavior data, and driving scene data; and
the driver's real state data comprises at least one of the driver's real fatigue level, real gaze information, and real head posture.
- The DMS device performance evaluation system of claim 4, wherein:
the driver head posture data comprises at least one of the driver's head displacement and head posture angle;
the driver physiological characteristic data comprises at least one of the driver's electrocardiogram and brain waves;
the driving behavior data comprises at least one of a steering wheel angle, an accelerator pedal opening, and a brake pedal opening; and
the driving scene data comprises the type and timestamp of an emergency event during driving, the emergency event comprising at least one of a traffic light, a pedestrian intruding into the lane in which the vehicle is traveling, and a vehicle illegally crossing a solid line.
- The DMS device performance evaluation system of claim 5, wherein the raw data acquisition unit comprises at least one of an online acquisition unit and an offline acquisition unit;
the online acquisition unit is configured to acquire, in real time, the raw driver data collected by the sensor; and
the offline acquisition unit is configured to read raw driver data that was collected by the sensor and stored in a database.
- The DMS device performance evaluation system of claim 6, further comprising at least one of the following sensors:
an image sensor, configured to acquire at least one of a driver image, a head displacement, and a head posture angle;
an electrocardiogram sensor, configured to acquire the driver's electrocardiogram;
a brain wave sensor, configured to acquire at least one of the driver's brain waves, head displacement, and head posture angle; and
a displacement-posture sensor, configured to acquire at least one of the driver's head displacement and head posture angle.
- The DMS device performance evaluation system of claim 7, further comprising at least one of:
an automobile driving simulator, configured to provide a virtual driving environment and acquire at least one of driving behavior data and driving scene data while the driver drives in the virtual driving environment; and
an automobile data acquisition unit, configured to acquire at least one of driving behavior data and driving scene data while the driver drives a real automobile.
- The DMS device performance evaluation system of claim 6, further comprising:
a raw data storage unit, configured to store the raw driver data acquired by the online acquisition unit into a database.
- The DMS device performance evaluation system of claim 6, wherein the offline acquisition unit comprises at least one of:
an image reading unit, configured to read the driver image from the database;
a head posture data reading unit, configured to read the driver head posture data from the database;
a physiological characteristic data reading unit, configured to read the driver physiological characteristic data from the database;
a behavior data reading unit, configured to read the driving behavior data from the database; and
a scene data reading unit, configured to read the driving scene data from the database.
- The DMS device performance evaluation system of claim 10, wherein the device-under-test data acquisition unit comprises:
an input unit, configured to read the driver image from the database and send it to the DMS device for processing; and
an output unit, configured to acquire the driver state data output by the DMS device after it processes the driver image sent by the input unit.
- The DMS device performance evaluation system of claim 11, wherein the driver image sent by the input unit to the DMS device is captured in at least one of the following ways:
determining the installation position of the camera matched with the DMS device as a target installation position, and capturing with a sensor for acquiring driver images installed at the target installation position; and
determining the shooting angle of view of the camera matched with the DMS device as a target angle of view, and capturing with a sensor for acquiring driver images at the target angle of view.
- The DMS device performance evaluation system of claim 4, wherein the raw data processing unit comprises at least one of:
a gaze processing unit, configured to compute the driver's real gaze information from the driver image;
a posture processing unit, configured to compute the driver's real head posture from at least one of the driver image and the driver head posture data; and
a fatigue processing unit, configured to compute the driver's real fatigue level from at least one of the driver image, the driver physiological characteristic data, the driving behavior data, and the driving scene data.
- The DMS device performance evaluation system of claim 13, wherein:
the gaze processing unit is configured to compute the driver's gaze information from the driver image and transform it into a world coordinate system to obtain the driver's real gaze information;
the posture processing unit is configured to transform the driver head posture data into the world coordinate system to obtain the driver's real head posture; and
the fatigue processing unit is configured to input at least one of the driver image, the driver physiological characteristic data, the driving behavior data, and the driving scene data into a model trained on physiological principles to obtain the driver's real fatigue level.
- The DMS device performance evaluation system of claim 2, wherein the comparison unit comprises at least one of:
a gaze comparison unit, configured to compute the deviation between the driver's real gaze information and the gaze information output by the DMS device;
a head posture comparison unit, configured to compute the deviation between the driver's real head posture and the head posture output by the DMS device; and
a fatigue comparison unit, configured to compute the deviation between the driver's real fatigue level and the fatigue level output by the DMS device.
- The DMS device performance evaluation system of claim 15, wherein the comparison unit further comprises at least one of:
a gaze display unit, configured to display the gaze focus corresponding to the driver's real gaze information, the gaze focus corresponding to the gaze information output by the DMS device, and the distance between the two;
a head posture display unit, configured to display the driver's real head posture, the head posture output by the DMS device, and the deviation between the two; and
a fatigue display unit, configured to display the driver's real fatigue level, the fatigue level output by the DMS device, and the deviation between the two.
- The DMS device performance evaluation system of claim 15, wherein the comparison unit further comprises:
an evaluation unit, configured to determine the performance of the DMS device according to at least one of the deviation between the driver's real gaze information and the gaze information output by the DMS device, the deviation between the driver's real head posture and the head posture output by the DMS device, and the deviation between the driver's real fatigue level and the fatigue level output by the DMS device.
- A DMS device performance evaluation method, applied to the DMS device performance evaluation system of any one of claims 1 to 17, comprising the steps of:
acquiring driver state data output by a DMS device;
acquiring raw driver data collected by a sensor;
computing the driver's real state data from the raw driver data; and
evaluating the performance of the DMS device according to the comparison between the driver state data and the driver's real state data.
- A storage medium, characterized in that the storage medium stores a computer program for executing the DMS device performance evaluation method of claim 18.
- An electronic device, characterized in that the electronic device comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to read the executable instructions from the memory and execute them to implement the DMS device performance evaluation method of claim 18.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/CN2020/070519 (WO2021138775A1) | 2020-01-06 | 2020-01-06 | DMS device performance evaluation system and method, and storage medium and electronic device
Publications (1)
Publication Number | Publication Date |
---|---|
CN113544471A true CN113544471A (en) | 2021-10-22 |
Family
ID=76787713
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202080019788.4A (published as CN113544471A, pending) | DMS device performance evaluation system, method, storage medium and electronic device | 2020-01-06 | 2020-01-06
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113544471A (en) |
WO (1) | WO2021138775A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114040195A (en) * | 2021-11-28 | 2022-02-11 | 苏州市德智电子有限公司 | Intelligent automobile cab monitoring and testing system |
CN114200849A (en) * | 2021-12-06 | 2022-03-18 | 苏州挚途科技有限公司 | Virtual simulation test system and method for automatic driving |
CN114580082B (en) * | 2022-03-11 | 2024-04-12 | 东风汽车股份有限公司 | Electric installation checking method for comfort level of driving seat of light commercial vehicle |
CN116570835B (en) * | 2023-07-12 | 2023-10-10 | 杭州般意科技有限公司 | Method for determining intervention stimulation mode based on scene and user state |
CN118470694A (en) * | 2024-07-10 | 2024-08-09 | 广东工业大学 | Non-contact fatigue detection method and system based on multiple sensors |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN202383802U (en) * | 2011-12-31 | 2012-08-15 | 长安大学 | Acousto-optic warning system for truck fatigue driving |
CN102778670A (en) * | 2011-05-10 | 2012-11-14 | 通用汽车环球科技运作有限责任公司 | Novel sensor alignment process and tools for active safety vehicle applications |
CN103818256A (en) * | 2012-11-16 | 2014-05-28 | 西安众智惠泽光电科技有限公司 | Automobile fatigue-driving real-time alert system |
CN104925001A (en) * | 2014-03-18 | 2015-09-23 | 沃尔沃汽车公司 | Vehicle sensor diagnosis system and method and a vehicle comprising such a system |
CN105987717A (en) * | 2015-03-18 | 2016-10-05 | 福特全球技术公司 | Driver visual sensor performance test system |
US20160300242A1 (en) * | 2015-04-10 | 2016-10-13 | Uber Technologies, Inc. | Driver verification system for transport services |
CN208498370U (en) * | 2017-12-03 | 2019-02-15 | 南京理工大学 | Fatigue driving based on steering wheel detects prior-warning device |
CN109515318A (en) * | 2018-12-07 | 2019-03-26 | 安徽江淮汽车集团股份有限公司 | A kind of test method and system monitored for assessing vehicle blind spot |
CN109910899A (en) * | 2019-04-01 | 2019-06-21 | 广东科学技术职业学院 | A kind of safe and intelligent drive manner and system |
2020
- 2020-01-06: WO application PCT/CN2020/070519 filed (published as WO2021138775A1), status: active, Application Filing
- 2020-01-06: CN application CN202080019788.4A filed (published as CN113544471A), status: active, Pending
Also Published As
Publication number | Publication date |
---|---|
WO2021138775A1 (en) | 2021-07-15 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |