CN116594858A - Intelligent cabin man-machine interaction evaluation method and system - Google Patents


Info

Publication number
CN116594858A
CN116594858A (application CN202211726814.4A)
Authority
CN
China
Prior art keywords
interaction
data
scores
cabin
score
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211726814.4A
Other languages
Chinese (zh)
Other versions
CN116594858B (en)
Inventor
Name withheld at the applicant's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kingfar International Inc
Original Assignee
Kingfar International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kingfar International Inc
Priority to CN202211726814.4A
Publication of CN116594858A
Application granted
Publication of CN116594858B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3604 - Software analysis for verifying properties of programs
    • G06F 11/3612 - Software analysis for verifying properties of programs by runtime analysis
    • G06F 11/3664 - Environments for testing or debugging software
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application provides an intelligent cabin man-machine interaction evaluation method and system. A virtual driving scene is presented through an XR headset, and multi-modal data are collected by a simulated driver module that supports interaction modes such as eye control and brain control, so that operators can interact in a simulated real driving scene and test data close to the real scene are obtained while safety is guaranteed. Multi-modal human-factor data are introduced, and based on the vital sign data and operator behavior data generated during interaction, multi-dimensional scores are calculated, including a cognitive load score, a comfort score, and a behavioral efficiency score. An all-round evaluation scheme for intelligent cockpit design is thereby constructed: evaluation is fully automatic, evaluation efficiency is improved, the influence of individual subjective factors is reduced, and quantitative evaluation of multi-modal data makes the results more reliable.

Description

Intelligent cabin man-machine interaction evaluation method and system
Technical Field
The application relates to the technical field of intelligent driving, in particular to an intelligent cabin man-machine interaction evaluation method and system.
Background
With the continuous development of intelligent driving technology, the man-machine interaction functions of the intelligent cockpit have become increasingly rich, enabling monitoring of passenger state and safe driving assistance during the driving process. Meanwhile, innovative applications of AI, big data, and 5G have made the multi-modal forms of intelligent cabin man-machine interaction more diverse and the interaction scenes more fine-grained. For example, the smart car DMS (driver monitoring system) can monitor driver fatigue, distraction, and other dangerous behaviors (such as making phone calls or eating) in real time, and can issue timely warnings for abnormal states.
On this basis, the requirements for man-machine interaction evaluation of intelligent cockpit designs are also rising. Traditionally, such design evaluation relies on the experience of experts, engineers, or designers and is mainly subjective. At the same time, the test environment is complex: on one hand, safety cannot be guaranteed; on the other hand, objective, systematic evaluation techniques and complete automatic evaluation schemes are lacking. Efficiency is therefore low, and the safety of subsequent products and the experience of operators are also affected.
Disclosure of Invention
In view of this, the embodiments of the application provide an intelligent cockpit man-machine interaction evaluation method and system, so as to eliminate or mitigate one or more defects in the prior art and solve the problem that the prior art cannot provide an automatic evaluation scheme capable of simulating a real scene for the intelligent cockpit.
The technical scheme of the application is as follows:
In one aspect, the application provides an intelligent cabin man-machine interaction evaluation method. The method is executed on a cabin man-machine interaction design evaluation recommendation subsystem; the subsystem is connected with a simulated driver module, which in turn is connected with an XR cabin interaction module. The simulated driver module provides, via an XR headset, a virtual driving scene for operators to interact with. The method comprises the following steps:
acquiring interaction data, wherein the interaction data are collected by an interaction data acquisition component of the simulated driver module while an operator interacts in the cockpit; the XR cabin interaction module makes an interaction decision based on a preset interaction control scheme according to the interaction data and feeds the decision back to the simulated driver module so as to realize continuous interaction;
calculating grade scores of the man-machine interaction behavior in a plurality of scoring dimensions based on the multi-modal human-factor interaction data acquired by the interaction data acquisition component, and calculating a weighted average to obtain a policy score for the interaction control scheme. The scoring dimensions include a cognitive load score, a comfort score, and a behavioral efficiency score; the multi-modal human-factor data include electroencephalogram data, eye movement data, physiological data, near infrared data, and operator behavior data.
In some embodiments, the electroencephalogram data include EEG signal time-domain, frequency-domain, and non-linear indicators; the eye movement data include pupil diameter, fixation point coordinates, and corresponding fixation durations; the physiological data include skin temperature data, skin electrical data, and respiratory data; the near infrared data include blood oxygen data, total hemoglobin concentration values, and heart rate variability data; the respiratory data include respiratory rate, tidal volume, and vital capacity; the operator behavior data include the execution completion rate of driving behaviors and the stimulus response duration.
In some embodiments, the cognitive load score is evaluated from a first type of interaction data, which includes the EEG signal time-domain, frequency-domain, and non-linear indicators, the pupil diameter, the blood oxygen data, and the total hemoglobin concentration value;
the comfort score is evaluated from a second type of interaction data, which includes the heart rate variability data, the skin temperature data, and the skin electrical data;
the behavioral efficiency score is evaluated from a third type of interaction data, which includes the execution completion rate and the stimulus response duration.
In some embodiments, the method comprises:
mapping the first type of interaction data to the cognitive load score using a pre-trained first neural network, the cognitive load score comprising a first set number of grade scores;
mapping the second type of interaction data to the comfort score using a pre-trained second neural network, the comfort score comprising a second set number of grade scores;
mapping the third type of interaction data to the behavioral efficiency score using a pre-trained third neural network, the behavioral efficiency score comprising a third set number of grade scores.
In some embodiments, the first neural network, the second neural network, and the third neural network are decision trees or BP neural networks.
In other embodiments, the first neural network, the second neural network, and the third neural network each comprise a convolutional neural network.
In some embodiments, before calculating the weighted average to obtain the policy score of the interaction control scheme, the method further includes:
normalizing the grade scores of the scoring dimensions.
In some embodiments, the method further comprises: calculating the policy score corresponding to each interaction control scheme, and selecting the interaction control scheme with the highest policy score as the optimal interaction control scheme.
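The weighted-average scoring and scheme-selection steps described above can be sketched as follows. The dimension names, weights, and grade values are illustrative assumptions, not values from the application:

```python
# Hypothetical sketch of the policy-scoring and scheme-selection step;
# dimension names, weights, and grades are invented for illustration.

def policy_score(grades, weights):
    """Weighted average of the per-dimension grade scores (1-5)."""
    total_w = sum(weights.values())
    return sum(weights[d] * grades[d] for d in weights) / total_w

def best_scheme(schemes, weights):
    """Return the interaction control scheme with the highest policy score."""
    return max(schemes, key=lambda name: policy_score(schemes[name], weights))

weights = {"cognitive_load": 0.4, "comfort": 0.3, "efficiency": 0.3}
schemes = {
    "eye_control":   {"cognitive_load": 4, "comfort": 3, "efficiency": 5},
    "brain_control": {"cognitive_load": 3, "comfort": 4, "efficiency": 4},
}
print(best_scheme(schemes, weights))  # prints "eye_control"
```

Here "eye_control" wins because its weighted average (4.0) exceeds that of "brain_control" (3.6).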
In another aspect, the application further provides an intelligent cabin man-machine interaction evaluation system, which comprises:
a simulated driver module, comprising a simulated cockpit and an interaction data acquisition component, wherein the cockpit provides the hardware environment for simulated driving, and the interaction data acquisition component comprises an XR headset for brain-control and eye-control interaction, skin temperature and skin electrical sensors and a near infrared sensor arranged on the steering wheel, a respiration sensor arranged on the seat belt, and an interaction behavior sensor arranged in the cockpit; the XR headset collects electroencephalogram data and eye movement data, the skin temperature and skin electrical sensors collect skin temperature data and skin electrical data, the near infrared sensor collects near infrared data, the respiration sensor collects respiratory data, and the interaction behavior sensor collects operator behavior data;
an XR cabin interaction module, configured to control the XR headset to present a virtual driving scene, to make interaction decisions on the electroencephalogram data, eye movement data, skin temperature data, skin electrical data, near infrared data, respiratory data, and operator behavior data based on a preset interaction control scheme, and to feed the interaction decisions back to the simulated driver module for execution;
a cabin man-machine interaction design evaluation recommendation subsystem, configured to execute the above intelligent cabin man-machine interaction evaluation method and obtain the policy score of the interaction control scheme.
In some embodiments, the system is further connected to a cloud server in wired or wireless form and uploads the policy score of the interaction control scheme.
The advantages of the application are as follows:
In the intelligent cabin man-machine interaction evaluation method and system, a virtual driving scene is presented through the XR headset, multi-modal data are collected by the simulated driver module, and interaction modes such as eye control and brain control are realized, so that operators can interact in a simulated real driving scene and test data close to the real scene are obtained while safety is guaranteed. Multi-modal human-factor data are introduced, and based on the vital sign data and operator behavior data generated during interaction, multi-dimensional scores are calculated, including a cognitive load score, a comfort score, and a behavioral efficiency score. An all-round evaluation scheme for intelligent cockpit design is thereby constructed: evaluation is fully automatic, evaluation efficiency is improved, the influence of subjective human factors is reduced, and reliability is higher.
Additional advantages, objects, and features of the application will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present application are not limited to the above-described specific ones, and that the above and other objects that can be achieved with the present application will be more clearly understood from the following detailed description.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate and together with the description serve to explain the application. In the drawings:
fig. 1 is a schematic structural diagram of a man-machine interaction evaluation system for an intelligent cabin according to an embodiment of the application.
Fig. 2 is a schematic diagram of a calculation structure of a cabin man-machine interaction design score in an intelligent cabin man-machine interaction evaluation method according to an embodiment of the application.
Fig. 3 is a schematic diagram of evaluation logic of an interaction control scheme by the intelligent cabin man-machine interaction evaluation method according to an embodiment of the application.
Detailed Description
The present application will be described in further detail with reference to the following embodiments and the accompanying drawings, in order to make the objects, technical solutions and advantages of the present application more apparent. The exemplary embodiments of the present application and the descriptions thereof are used herein to explain the present application, but are not intended to limit the application.
It should be noted here that, in order to avoid obscuring the present application due to unnecessary details, only structures and/or processing steps closely related to the solution according to the present application are shown in the drawings, while other details not greatly related to the present application are omitted.
It should be emphasized that the term "comprises/comprising" when used herein is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
The intelligent cockpit can be defined as an intelligent service system that actively observes, understands, and satisfies the demands of operators. From the perspective of end-consumer demand and application scenarios, passengers not only need not worry about driving and travel, but can also obtain a comfortable experience in the intelligent cabin. An automobile built around an intelligent cockpit treats the vehicle as a travel tool and aims to realize intelligent interaction of the cockpit with people, vehicles, and roads. The intelligent cockpit provides a complete interaction control scheme and can generate and execute interaction decisions based on the physiological characteristic data and operator behavior data produced by operators during driving, including driving assistance such as emergency braking and steering, warnings related to driver fatigue supervision and dangerous driving behaviors, and intelligent regulation of the in-vehicle driving environment. For example, a smart car DMS can detect driver fatigue, distraction, and other dangerous behaviors, and can regulate the in-vehicle air-conditioning temperature based on changes in the driver's body temperature and heart rate. However, whether the interaction control scheme provided by the intelligent cockpit suits the scene requirements, or achieves the expected effect, needs to be evaluated accurately.
In one aspect, the application provides an intelligent cabin man-machine interaction evaluation method, executed on a cabin man-machine interaction design evaluation recommendation subsystem. The subsystem is connected with a simulated driver module, which is connected with an XR cabin interaction module; the simulated driver module provides, via an XR headset, a virtual driving scene for operators to interact with. The method comprises the following steps S101-S102:
step S101: acquiring interaction data, wherein the interaction data are acquired by an interaction data acquisition component of the simulated driver module in the interaction process of operators in a cockpit; and the XR cabin interaction module makes an interaction decision based on a preset interaction control scheme according to the interaction data and feeds back the interaction decision to the simulation driver module so as to realize continuous interaction.
Step S102: calculating grade scores of the human-computer interaction behaviors in a plurality of score dimensions based on the multi-mode human-computer interaction data acquired by the interaction data acquisition component, and calculating a weighted average value to obtain strategy scores of an interaction control scheme; the scoring dimension includes a cognitive load score, a comfort score, and a behavioral efficiency score; wherein the multi-modal human factor comprises: electroencephalogram data, eye movement data, physiological data, near infrared data, and operator behavioral data.
In this embodiment, the simulated driver module includes a cockpit and an interaction data acquisition component. The cockpit provides the driving environment and the required hardware devices, such as seats, a steering wheel, an air-conditioning system, an audio-visual system, central control equipment, and other control keys. The interaction data acquisition component comprises an XR headset for brain-control and eye-control interaction, skin temperature and skin electrical sensors and a near infrared sensor arranged on the steering wheel, a respiration sensor arranged on the seat belt, and an interaction behavior sensor arranged in the cockpit. The XR headset collects electroencephalogram and eye movement data, the skin temperature and skin electrical sensors collect skin temperature and skin electrical data, the near infrared sensor collects near infrared data, the respiration sensor collects respiratory data, and the interaction behavior sensor collects operator behavior data.
The XR cabin interaction module responds, according to the preset interaction control scheme, to the physiological characteristic data collected by the interaction data acquisition component and to the operators' behaviors, forming an interaction strategy that is presented to the operators; the strategy is continuously adjusted during the interaction process, forming interaction continuity.
In step S102, the cabin man-machine interaction design evaluation recommendation subsystem acquires the physiological characteristic data and operator behavior data collected by the interaction data acquisition component and evaluates the interaction control scheme in several scoring dimensions, covering the operator's cognitive load, comfort, and behavioral efficiency. The cognitive load evaluation reflects whether certain intelligent interaction schemes are easily accepted by customers, whether comprehension barriers exist, and whether a learning cost must be paid. The comfort evaluation reflects the operators' experience in use, for example whether the provided temperature regulation and in-vehicle lighting regulation strategies are appropriate. The behavioral efficiency evaluation mainly assesses the reaction time and accuracy with which operators execute corresponding actions given the driving assistance provided, for example the accuracy and reaction time of the driving operations executed at each position given voice navigation prompts. The application calculates the weighted average of the scores in these evaluation dimensions to obtain the policy score of the interaction control scheme.
It should be noted that the virtual driving scene is provided by the XR headset for operator interaction, and simulation of real road conditions can be realized in the virtual scene in combination with the hardware of the cockpit. No road test is needed, which guarantees the safety of the testing process; in particular, testing of intelligent driving-assistance functions can be carried out with zero accidents.
In some embodiments, the electroencephalogram data include EEG signal time-domain, frequency-domain, and non-linear indicators; the eye movement data include pupil diameter, fixation point coordinates, and corresponding fixation durations; the physiological data include skin temperature data, skin electrical data, and respiratory data; the near infrared data include blood oxygen data, total hemoglobin concentration values, and heart rate variability data; the respiratory data include respiratory rate, tidal volume, and vital capacity; the operator behavior data include the execution completion rate of driving behaviors and the stimulus response duration. Those skilled in the art will appreciate that the parameters the application can use are not limited to the above; any other physiological characteristic data and operator interaction data may serve as parameters for evaluating cognitive load, comfort, and behavioral efficiency.
In some embodiments, the cognitive load score is evaluated from the first type of interaction data, which includes the EEG signal time-domain, frequency-domain, and non-linear indicators, the pupil diameter, the blood oxygen data, and the total hemoglobin concentration value.
The comfort score is evaluated from the second type of interaction data, which includes the heart rate variability data, the skin temperature data, and the skin electrical data.
The behavioral efficiency score is evaluated from the third type of interaction data, which includes the execution completion rate and the stimulus response duration.
In some embodiments, the method comprises:
the first type of interaction data is mapped to cognitive load scores using a pre-trained first neural network, the cognitive load scores comprising a first set number of rank scores.
The second class of interaction data is mapped to comfort scores using a pre-trained second neural network, the comfort scores comprising a second set number of level scores.
And mapping the third type of interaction data to a behavior high-efficiency score by using a pre-trained third neural network, wherein the behavior high-efficiency score comprises a third set number of grade scores.
Specifically, in this embodiment, the first, second, and third neural networks are pre-trained on existing data: each group of data in an existing database is manually labeled with scores for cognitive load, comfort, and behavioral efficiency. The score may be divided into several grades; illustratively, five grades may be scored from high to low, with grade 5 excellent, grade 4 good, grade 3 medium, grade 2 poor, and grade 1 bad.
In some embodiments, the first neural network, the second neural network, and the third neural network are decision trees or BP neural networks. In other embodiments, the first neural network, the second neural network, and the third neural network each comprise a convolutional neural network.
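As a concrete illustration of the decision-tree embodiment, the mapping from first-type features to a cognitive load grade could look like the hand-written tree below. Every threshold is an invented placeholder, not a trained value from the application:

```python
# Hand-written decision tree of the kind the embodiment names; all
# thresholds below are hypothetical, for illustration only.

def cognitive_load_grade(theta_beta, pupil_mm, hbo2):
    """Map first-type features (EEG theta/beta ratio, pupil diameter in mm,
    blood-oxygen fraction) to a 1-5 cognitive load grade (5 = lowest load)."""
    if theta_beta < 1.0:                # relaxed EEG profile
        return 5 if pupil_mm < 4.0 else 4
    if theta_beta < 2.0:                # moderate load
        return 3
    return 2 if hbo2 > 0.95 else 1      # high load; check oxygenation

print(cognitive_load_grade(0.8, 3.5, 0.97))  # prints 5
```

In practice such a tree would be fitted from the manually labeled database described above rather than written by hand.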
In some embodiments, before calculating the weighted average to obtain the policy score of the interaction control scheme, the method further includes: normalizing the grade scores of the scoring dimensions.
In some embodiments, the method further comprises: calculating the policy score corresponding to each interaction control scheme, and selecting the interaction control scheme with the highest policy score as the optimal interaction control scheme.
In another aspect, the application further provides an intelligent cabin man-machine interaction evaluation system, referring to fig. 1, comprising:
a simulated driver module, comprising a simulated cockpit and an interaction data acquisition component, wherein the cockpit provides the hardware environment for simulated driving, and the interaction data acquisition component comprises an XR headset for brain-control and eye-control interaction, skin temperature and skin electrical sensors and a near infrared sensor arranged on the steering wheel, a respiration sensor arranged on the seat belt, and an interaction behavior sensor arranged in the cockpit; the skin temperature and skin electrical sensors collect skin temperature and skin electrical data, the near infrared sensor collects near infrared data, the respiration sensor collects respiratory data, and the interaction behavior sensor collects operator behavior data.
An XR cabin interaction module, configured to control the XR headset to present a virtual driving scene, to make interaction decisions on the electroencephalogram data, eye movement data, skin temperature data, skin electrical data, near infrared data, respiratory data, and operator behavior data based on a preset interaction control scheme, and to feed the interaction decisions back to the simulated driver module for execution.
A cabin man-machine interaction design evaluation recommendation subsystem, configured to execute the intelligent cabin man-machine interaction evaluation method of steps S101-S102 and obtain the policy score of the interaction control scheme.
In some embodiments, the system is further connected to a cloud server in wired or wireless form and uploads the policy score of the interaction control scheme.
The following description is made in connection with specific embodiments:
referring to fig. 1, the present embodiment provides an XR cockpit interaction control system, which includes an XR cockpit interaction module and an analog driver module. The system is used for the design of a man-machine interaction interface of a driver cabin, the realization of a man-machine interaction control mode design, the presentation of driving scenes, and the collection and recording of various driving behavior data and physiological data of a driver.
Wherein, XR cabin interaction module: the system is used for presenting a cockpit interaction interface design scheme, a man-machine interaction control mode design and a driving scene, and is provided with various intelligent and natural man-machine interaction control modes, including but not limited to: brain control interactions, eye control interactions, skin electrical interactions, skin temperature interactions, respiratory interactions, myoelectrical interactions, gesture interactions, and the like. The electroencephalogram data comprises EEG signal time domain, frequency domain and nonlinear indexes; the eye movement data comprise pupil diameters, fixation point coordinates and corresponding fixation time lengths; the near infrared data includes blood oxygen data, total hemoglobin concentration values, and heart rate variability data; respiratory data includes respiratory rate, tidal volume, and vital capacity; the operator behavior data includes the execution completion rate of the driving behavior and the stimulus response time period. It will be appreciated by those skilled in the art that the parameters to which the present application can be directed include not only those described above, but also any other physiological characteristic data and operator interaction data as parameters for evaluating cognitive load, comfort and behavioral efficiency. The interactive instruction is fed back to the simulator to drive the operation of the simulated driver.
The man-machine interaction control mode is designed and customized by the operator according to the design scheme. Control instructions may be implemented in, but are not limited to, the following two ways:
First, a custom threshold: the operator sets the threshold of a control instruction as required. For example, in eye-control interaction, instruction output can be triggered by setting the gaze duration threshold to 2 s.
Second, an algorithm-adapted threshold: an algorithm adaptively adjusts the threshold according to the operator's habits, the driving scene, and the recognized vehicle state. For example, in eye-control interaction, when the vehicle is recognized to be driving at high speed, the gaze duration threshold can be adaptively reduced; when the driving scene is recognized to be complex, demanding more of the driver's attention and cognitive resources, the gaze duration threshold for interface control can likewise be adaptively reduced.
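The two threshold modes above can be sketched as follows. The 2 s base value comes from the custom-threshold example; the scaling factors are assumptions for illustration:

```python
# Sketch of the adaptively adjusted gaze-duration threshold (second mode).
# The 0.75 scaling factors are hypothetical, not from the application.

def gaze_threshold(base_s=2.0, high_speed=False, complex_scene=False):
    """Dwell time (s) required to trigger an eye-control instruction;
    lowered at high speed or in a complex driving scene."""
    t = base_s
    if high_speed:
        t *= 0.75
    if complex_scene:
        t *= 0.75
    return t

print(gaze_threshold())                 # 2.0 s in the default (custom) case
print(gaze_threshold(high_speed=True))  # 1.5 s at high speed
```

A production system would learn these reductions from the operator's habits and recognized vehicle state rather than use fixed factors.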
A simulated driver module: the method is used for executing the interaction strategy and collecting and recording various man-machine interaction data and physiological data of the driver. Acquisition means include, but are not limited to: the brain control and eye control interaction data are acquired by XR head-mounted equipment, the skin temperature and skin electricity data are acquired by a steering wheel of a driving simulator, the breathing data are acquired by a safety belt of the driving simulator, and the gesture interaction data are acquired by a sensor built in the driving simulator.
Cabin man-machine interaction design evaluation recommendation subsystem: takes as input the multi-channel driver physiological data and man-machine interaction data, and performs grade evaluation of the cabin man-machine interaction design according to the scoring dimensions. Grades are divided into 5 levels (excellent 5, good 4, medium 3, poor 2, very poor 1).
The evaluation dimensions cover the driver's state or performance when completing the driving task under a specific driving scene, interaction interface, and interaction mode, each scored on the 5-level scale. They include, but are not limited to, cognitive load, comfort, and efficiency.
Referring to fig. 2, the cabin man-machine interaction design grade is calculated as a weighted average of the individual rating-dimension scores:
S = (w1·x1 + w2·x2 + … + wn·xn) / (w1 + w2 + … + wn)
where S is the score of the cabin man-machine interaction design scheme; wi is the weight of dimension i; xi is the grade score of dimension i; and n is the number of evaluation dimensions. The S value is rounded to the nearest integer and taken as the grade of the cabin scheme.
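The grade calculation just described, a weighted average of the per-dimension scores rounded to a final 1-5 grade, can be sketched as follows; the example scores and expert weights are illustrative assumptions:

```python
# Weighted average of dimension scores, rounded to the scheme grade.
def scheme_grade(scores, weights):
    """Return round(S) where S = sum(w*x) / sum(w)."""
    if len(scores) != len(weights):
        raise ValueError("one weight per dimension score")
    s = sum(w * x for w, x in zip(weights, scores)) / sum(weights)
    return round(s)  # rounded S is the grade of the cabin scheme

# Cognitive load 4, comfort 3, efficiency 5, assumed weights 0.5/0.3/0.2:
grade = scheme_grade([4, 3, 5], [0.5, 0.3, 0.2])  # -> 4
```

Because the weights are normalized inside the function, experts can supply raw importance values without making them sum to 1.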
The correspondence between the evaluation dimensions and the secondary indices includes, but is not limited to: building a model through a machine-learning or deep-learning algorithm (e.g., a decision tree or a neural network) to establish a mapping from specific operator physiological feature indices and operator behavior data to the cognitive load score, comfort score, and behavioral efficiency score. The operators' subjective scores (1-5) on each dimension are used as the labels corresponding to the objective physiological indices.
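As a hedged illustration of such a learned mapping, the sketch below substitutes a simple nearest-neighbor lookup over hypothetical labeled samples for the decision tree or neural network the patent names; the feature choice ([theta/beta ratio, pupil diameter in mm]) and all sample values are assumptions:

```python
# Stand-in for the learned mapping from physiological features to a 1-5
# dimension score: 1-nearest-neighbor over assumed (features, label) pairs.
import math

TRAINING = [  # hypothetical labeled samples for the cognitive-load dimension
    ([0.8, 3.0], 1),
    ([1.2, 3.5], 2),
    ([1.6, 4.0], 3),
    ([2.0, 4.5], 4),
    ([2.4, 5.0], 5),
]

def predict_score(features):
    """Return the subjective label of the nearest training sample."""
    def dist(sample):
        return math.dist(sample[0], features)  # Euclidean distance
    return min(TRAINING, key=dist)[1]
```

A production system would replace this lookup with a model fitted on many operators' subjective ratings, but the input/output shape, physiological features in, a 1-5 score out, is the same.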
After each secondary-index score is normalized (to the range 0-1), a weighted average (also in the range 0-1) is taken; the 0-1 value is then divided into 5 quantile bins and mapped to a grade score of 1-5. The weight values are defined by experts.
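The normalize, weighted-average, and quantile-mapping steps above can be sketched as follows; the raw index ranges and expert weights in the example are illustrative assumptions:

```python
# Normalize secondary indices to [0, 1], take an expert-weighted mean,
# then map the 0-1 value into five equal bins -> grade 1..5.
def normalize(value, lo, hi):
    """Min-max normalize a raw secondary index into [0, 1], clamped."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def grade_from_indices(values, ranges, weights):
    """Normalize each index, weighted-average, map to a 1-5 grade."""
    normed = [normalize(v, lo, hi) for v, (lo, hi) in zip(values, ranges)]
    mean = sum(w * x for w, x in zip(weights, normed)) / sum(weights)
    return min(int(mean * 5) + 1, 5)  # bins [0,0.2)->1 ... [0.8,1.0]->5

# Two indices with assumed raw ranges and equal expert weights:
g = grade_from_indices([0.9, 40.0], [(0.0, 1.0), (0.0, 100.0)], [1.0, 1.0])
```

Clamping in `normalize` keeps out-of-range sensor readings from producing grades outside 1-5.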
Taking the above three evaluation dimensions as examples, features selected for the cognitive load (L) rating include, but are not limited to: EEG time-domain, frequency-domain, and nonlinear indices (e.g., the theta/beta ratio), near-infrared indices (oxygenated, deoxygenated, and total hemoglobin concentration values), and eye movement indices (pupil diameter) during the task. Features selected for the comfort (H) rating include, but are not limited to: HRV frequency-domain indices (LF/HF) and skin-electrical indices (SC). Features selected for the efficiency (Q) rating include, but are not limited to: driving behavior indices (task accuracy and reaction time). Referring to fig. 3, the cabin interaction interface, the cabin interaction mode, and the driving scene together form a complete interaction control scheme; cognitive load is evaluated based on EEG and eye movement indices, comfort is evaluated based on physiological indices, and efficiency is evaluated based on driving behavior indices.
This embodiment provides an automatic evaluation method for cockpit man-machine interaction design based on the combination of cockpit interaction interface and cockpit interaction mode in a specific driving scene, which helps identify the optimal man-machine interaction scheme under different driving conditions.
In summary, the intelligent cabin man-machine interaction evaluation method and system provide a virtual driving scene through the XR head-mounted device, and the simulated driver collects multi-modal data to realize interaction modes such as eye control and brain control, so that operators can interact in a simulation of a real driving scene and test data close to the real scene are obtained while safety is ensured. Multi-modal human-factor data are introduced: based on the vital-sign data and operator behavior data generated during interaction, multi-dimensional scores are calculated, comprising the cognitive load score, comfort score, and behavioral efficiency score, forming an all-round evaluation scheme for intelligent cockpit design. The evaluation is fully automatic, which improves evaluation efficiency, removes the influence of subjective human factors, and yields higher reliability.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems, and methods described in connection with the embodiments disclosed herein can be implemented as hardware, software, or a combination of both. Whether a particular implementation is hardware or software depends on the specific application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application. When implemented in hardware, the implementation may be, for example, an electronic circuit, an application-specific integrated circuit (ASIC), suitable firmware, a plug-in, or a function card. When implemented in software, the elements of the application are the programs or code segments used to perform the required tasks. The programs or code segments may be stored in a machine-readable medium or transmitted over a transmission medium or communication link by a data signal carried in a carrier wave. A "machine-readable medium" may include any medium that can store or transfer information. Examples of machine-readable media include electronic circuitry, semiconductor memory devices, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber-optic media, radio frequency (RF) links, and the like. The code segments may be downloaded via computer networks such as the Internet or an intranet.
It should also be noted that the exemplary embodiments mentioned in this disclosure describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the steps described above; that is, the steps may be performed in the order mentioned in the embodiments, in a different order, or with several steps performed simultaneously.
In this disclosure, features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, and various modifications and variations can be made to the embodiments of the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. An intelligent cabin man-machine interaction evaluation method, characterized in that the method is executed on a cabin man-machine interaction design evaluation recommendation subsystem, the cabin man-machine interaction design evaluation recommendation subsystem is connected with a simulated driver module, the simulated driver module is connected with an XR cabin interaction module, and the simulated driver module provides, via an XR head-mounted device, a virtual driving scene for operators to interact with; the method comprises the following steps:
acquiring interaction data, wherein the interaction data are acquired by an interaction data acquisition component of the simulated driver module during the operator's interaction process in the cockpit; the XR cabin interaction module makes an interaction decision according to the interaction data based on a preset interaction control scheme and feeds the interaction decision back to the simulated driver module so as to realize continuous interaction;
calculating grade scores of the man-machine interaction behavior in a plurality of scoring dimensions based on the multi-modal human-factor data acquired by the interaction data acquisition component, and calculating a weighted average to obtain a strategy score of the interaction control scheme; the scoring dimensions comprise a cognitive load score, a comfort score, and a behavioral efficiency score; wherein the multi-modal human-factor data comprise: electroencephalogram data, eye movement data, physiological data, near-infrared data, and operator behavior data.
2. The intelligent cockpit human-computer interaction assessment method according to claim 1, wherein the electroencephalogram data comprises electroencephalogram EEG signal time domain, frequency domain and nonlinear indexes; the eye movement data comprise pupil diameters, fixation point coordinates and corresponding fixation time lengths; the physiological data includes skin temperature data, skin electrical data, and respiratory data; the near infrared data comprises blood oxygen data, total hemoglobin concentration values and heart rate variability data; the respiratory data includes respiratory rate, tidal volume, and vital capacity; the operator behavior data includes an execution completion rate of driving behavior and a stimulus response time period.
3. The intelligent cabin human-computer interaction evaluation method according to claim 2, wherein the cognitive load score is obtained by evaluating first-class interaction data, wherein the first-class interaction data comprises the electroencephalogram EEG signal time domain, the electroencephalogram EEG signal frequency domain, nonlinear indexes, the pupil diameter, the blood oxygen data and the total hemoglobin concentration value;
the comfort score is obtained by evaluating second-class interaction data, wherein the second-class interaction data comprises heart rate variability data, skin temperature data and skin electricity data;
the behavioral efficiency score is obtained by evaluating third-class interaction data, wherein the third-class interaction data comprises the execution completion rate and the stimulus response duration.
4. A method of intelligent cockpit human-computer interaction assessment according to claim 3, wherein the method comprises:
mapping the first type of interaction data to the cognitive load scores by using a pre-trained first neural network, wherein the cognitive load scores comprise a first set number of grade scores;
mapping the second type of interaction data to the comfort scores using a pre-trained second neural network, the comfort scores comprising a second set number of rank scores;
mapping the third type of interaction data to the behavioral efficiency scores using a pre-trained third neural network, the behavioral efficiency scores comprising a third set number of rank scores.
5. The intelligent cabin human-computer interaction assessment method according to claim 4, wherein the first neural network, the second neural network and the third neural network are decision trees or BP neural networks.
6. The intelligent cockpit human-computer interaction assessment method of claim 4 wherein the first neural network, the second neural network and the third neural network each comprise a convolutional neural network.
7. The intelligent cabin man-machine interaction evaluation method according to claim 1, wherein before calculating the weighted average to obtain the strategy score of the interaction control scheme, the method further comprises:
normalizing the grade scores of the scoring dimensions.
8. The intelligent cockpit human-computer interaction assessment method of claim 1, further comprising:
and calculating strategy scores corresponding to the interaction control schemes, and selecting the interaction control scheme with the highest strategy score as the optimal interaction control scheme.
9. An intelligent cabin man-machine interaction evaluation system, which is characterized by comprising:
the simulated driver module comprises a simulated cockpit and an interaction data acquisition component, wherein the simulated cockpit is used for providing a hardware environment for simulated driving, and the interaction data acquisition component comprises an XR head-mounted device for brain-control and eye-control interaction, skin-temperature/skin-electrical sensors and a near-infrared sensor arranged on the steering wheel, a respiration sensor arranged on the seat belt, and an interaction behavior sensor arranged in the cockpit; the XR head-mounted device acquires electroencephalogram data and eye movement data, the skin-temperature/skin-electrical sensors acquire skin temperature data and skin electrical data, the near-infrared sensor acquires near-infrared data, the respiration sensor acquires respiration data, and the interaction behavior sensor is used for acquiring operator behavior data;
the XR cabin interaction module is used for controlling the XR head-mounted device to present a virtual driving scene, making an interaction decision on the electroencephalogram data, the eye movement data, the skin temperature data, the near-infrared data, the respiration data, and the operator behavior data based on a preset interaction control scheme, and feeding the interaction decision back to the simulated driver module for execution;
the cabin man-machine interaction design evaluation recommendation subsystem is used for executing the intelligent cabin man-machine interaction evaluation method according to any one of claims 1 to 8 to obtain the strategy score of the interaction control scheme.
10. The intelligent cockpit human-computer interaction assessment system of claim 9, wherein the system is further connected to a cloud server in a wired or wireless manner and uploads the policy scores of the interaction control scheme.
CN202211726814.4A 2022-12-30 2022-12-30 Intelligent cabin man-machine interaction evaluation method and system Active CN116594858B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211726814.4A CN116594858B (en) 2022-12-30 2022-12-30 Intelligent cabin man-machine interaction evaluation method and system


Publications (2)

Publication Number Publication Date
CN116594858A 2023-08-15
CN116594858B 2024-08-27

Family

ID=87603212


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117992333A (en) * 2023-12-22 2024-05-07 北京津发科技股份有限公司 Evaluation method and device for man-machine interaction system of vehicle and evaluation system

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855410A (en) * 2012-09-20 2013-01-02 上海品铭机械工程有限公司 Method and system for evaluation of man-machine work efficiency of cabin simulation test bed
JP2014035649A (en) * 2012-08-08 2014-02-24 Toyota Motor Corp Vehicle driving evaluation apparatus and vehicle driving evaluation method
CN103761581A (en) * 2013-12-31 2014-04-30 西北工业大学 Method for civil aircraft flight deck human-computer interface comprehensive evaluation
US20180286269A1 (en) * 2017-03-29 2018-10-04 The Boeing Company Systems and methods for an immersive simulator
CN109740936A (en) * 2019-01-03 2019-05-10 中国商用飞机有限责任公司 A kind of system for civil aircraft cockpit availability assessment
US20200241525A1 (en) * 2019-01-27 2020-07-30 Human Autonomous Solutions LLC Computer-based apparatus system for assessing, predicting, correcting, recovering, and reducing risk arising from an operator's deficient situation awareness
CN111767611A (en) * 2020-06-30 2020-10-13 南京航空航天大学 Load balance-based airplane cockpit man-machine function distribution method
CN111783355A (en) * 2020-06-17 2020-10-16 南京航空航天大学 Man-machine interaction risk assessment method under multi-agent architecture
CN113420952A (en) * 2021-05-17 2021-09-21 同济大学 Automobile human-computer interaction testing and evaluating system based on simulated driving
CN113962022A (en) * 2021-09-30 2022-01-21 西南交通大学 Passenger experience-based intelligent cabin comfort evaluation method
CN114298469A (en) * 2021-11-24 2022-04-08 重庆大学 User experience test evaluation method for intelligent cabin of automobile
WO2022095985A1 (en) * 2020-11-09 2022-05-12 清华大学 Method and system for evaluating comfort of passenger of intelligent driving vehicle
CN217008449U (en) * 2022-02-21 2022-07-19 苏州壹心汽车科技有限公司 New generation automobile driving simulator with man-machine interaction intelligent cabin
CN115489402A (en) * 2022-09-27 2022-12-20 上汽通用五菱汽车股份有限公司 Vehicle cabin adjusting method and device, electronic equipment and readable storage medium


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHANG Jie; FANG Yu; PANG Jiazhen; JI Baoning: "Evaluation of virtual reality cockpit control device layout based on flight operation procedures", Computer Integrated Manufacturing Systems, no. 03, 15 March 2020 (2020-03-15) *
QIU Guohua: "Intelligent Interactive Interior and Exterior Design for Automobiles", 31 December 2021, Beijing: China Machine Press, pages 195-196 *
HAO Yunfei; YANG Jiguo; CAO Yonggang; BING Yanghai: "Application of physiological monitoring equipment in cockpit ergonomics measurement", Aircraft Design, no. 04, 15 August 2017 (2017-08-15) *
HUANG Diqing; XU Lin; CHEN Cheng: "Research on automobile seat comfort based on neural networks", Automotive Parts, no. 07, 28 July 2020 (2020-07-28) *



Similar Documents

Publication Publication Date Title
Sikander et al. Driver fatigue detection systems: A review
CN110456635B (en) Control method of electric automobile power system based on digital twin technology
Xing et al. Toward human-vehicle collaboration: Review and perspectives on human-centered collaborative automated driving
CN112041910B (en) Information processing apparatus, mobile device, method, and program
Fan et al. EEG-based affect and workload recognition in a virtual driving environment for ASD intervention
US20190092337A1 (en) System for Monitoring an Operator
CN116594858B (en) Intelligent cabin man-machine interaction evaluation method and system
Yang et al. Real-time driver cognitive workload recognition: Attention-enabled learning with multimodal information fusion
CN107531236A (en) Wagon control based on occupant
CN110123266B (en) Maneuvering decision modeling method based on multi-modal physiological information
CN112068315A (en) Adjusting method and system of AR helmet
CN115407872B (en) Evaluation method, device and storage medium for intelligent man-machine cooperative system
CN111361567B (en) Method and equipment for emergency treatment in vehicle driving
CN109858178A (en) A kind of commercial vehicle drivers giving fatigue pre-warning method based on Intelligent bracelet
CN112455461B (en) Human-vehicle interaction method for automatically driving vehicle and automatically driving system
CN117876807A (en) Personnel state adjustment training method and device based on simulation situation multi-mode data
DE102021122037A1 (en) PREDICTING CHASSIS INPUT INTENT VIA BRAIN-MACHINE INTERFACE AND DRIVER MONITORING SENSOR FUSION
Zhenhai et al. The Driver's Steering Feel Assessment Using EEG and EMG Signals
CN115116297B (en) Method suitable for taking over training of human-machine co-driving vehicle driver
CN110723145A (en) Vehicle control method and device, computer-readable storage medium and wearable device
CN116168371A (en) 5G remote driving-oriented safety work load estimation system and estimation method
CN109934171A (en) Driver's passiveness driving condition online awareness method based on layered network model
CN115946702A (en) Vehicle-mounted driver health detection method and system
Bao et al. Driving risk and intervention: Subjective risk lane change dataset
CN118474957B (en) Control method and device for atmosphere lamp in vehicle and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant