CN111309628A - Vehicle human-computer interface benchmarking evaluation method


Info

Publication number
CN111309628A
CN111309628A
Authority
CN
China
Prior art keywords
human
parameter
vehicle
parameters
hardware
Prior art date
Legal status
Granted
Application number
CN202010163702.7A
Other languages
Chinese (zh)
Other versions
CN111309628B (en)
Inventor
王伟力
刘瑞祥
陈思承
夏雪
Current Assignee
Ecarx Hubei Tech Co Ltd
Original Assignee
Hubei Ecarx Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hubei Ecarx Technology Co Ltd filed Critical Hubei Ecarx Technology Co Ltd
Priority to CN202010163702.7A
Publication of CN111309628A
Application granted
Publication of CN111309628B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3604 Software analysis for verifying properties of programs
    • G06F11/3608 Software analysis for verifying properties of programs using formal methods, e.g. model checking, abstract interpretation

Abstract

The invention provides a vehicle human-computer interface benchmarking evaluation method. Data relevant to the human-computer interface are recorded and read, including the hardware basic parameters of the hardware devices in the vehicle, the ergonomic relationship parameters describing the relative relationship between the hardware devices and the driver of the vehicle under a set driving environment, and the human-computer interaction parameters. After the data are collected, each group is quantized to obtain a hardware basic parameter quantized value, an ergonomic relationship parameter quantized value and a human-computer interaction parameter quantized value, which are input together into an evaluation model for analysis; the evaluation model then outputs a human-machine interface evaluation value of the vehicle. The method combines reading and analysing vehicle data with evaluating human-computer interaction tasks between the vehicle and the user, which can greatly improve the accuracy, objectivity and authenticity of the vehicle human-computer interface evaluation and provides accurate and efficient data and guidance for the analysis and benchmarking evaluation of vehicle human-computer interfaces.

Description

Vehicle human-computer interface benchmarking evaluation method
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle human-machine interface benchmarking evaluation method.
Background
In the design and research of intelligent vehicle-mounted systems and automobile human-computer interfaces, benchmarking evaluation of the automobile human-computer interface is an important basis for that work. For example, during human-computer interface research, benchmarking analysis of various competing or benchmark products is required: the human-computer interfaces of the competing and benchmark products are evaluated, and the benchmarking analysis and evaluation results are output to guide product research and development.
The existing benchmarking evaluation method only records the human-computer interface and then has an evaluator subjectively assess the recorded data, so the personal experience and environment of the evaluator may make the evaluation result subjective and lacking in accuracy.
Disclosure of Invention
The invention provides a vehicle human-computer interface benchmarking evaluation method and system to overcome the above problems or at least partially solve them.
According to one aspect of the invention, a vehicle human-computer interface benchmarking method is provided, which comprises the following steps:
reading hardware basic parameters of hardware equipment in a vehicle and carrying out quantization processing to obtain quantized values of the hardware basic parameters; wherein the hardware device comprises a central control system and an instrument panel system of the vehicle;
recording an ergonomic relationship parameter, namely a parameter describing the relative relationship between the hardware device and a driver of the vehicle under a set driving environment, and carrying out quantization processing on the ergonomic relationship parameter to obtain an ergonomic relationship parameter quantized value;
recording human-computer interaction parameters of the hardware equipment and a voice system in the vehicle for completing a preset human-computer interaction task, and carrying out quantization processing on the human-computer interaction parameters to obtain human-computer interaction parameter quantization values;
and inputting the hardware basic parameter quantitative value, the human-machine engineering relation parameter quantitative value and the human-machine interaction parameter quantitative value into a pre-established evaluation model, and outputting a human-machine interface evaluation value of the vehicle through the evaluation model.
Optionally, the recording of the ergonomic relationship parameter describing the relative relationship between the hardware device and a driver of the vehicle in a set driving environment to obtain an ergonomic relationship parameter quantized value includes:
recording a first group of human-machine engineering relation parameters of the hardware equipment in a first set driving environment and carrying out quantization processing to obtain a first group of human-machine engineering relation parameter quantized values;
recording a second group of human-machine engineering relation parameters of the hardware equipment in a second set driving environment and carrying out quantization processing to obtain a second group of human-machine engineering relation parameter quantized values;
taking the average value of the first group of quantized values of the ergonomic relationship parameters and the second group of quantized values of the ergonomic relationship parameters as a final quantized value of the ergonomic relationship parameters;
wherein the first set driving environment and the second set driving environment are different in user population.
Optionally, the ergonomic relation parameter includes at least one of an inclination angle relation between the central control screen and the instrument panel, a finger operation position, a position height relation, a sight line blocking, and an up-down inclination angle.
Optionally, the human-computer interaction parameters include a scenarized task evaluation parameter and a voice interaction parameter;
the recording of the human-computer interaction parameters of the hardware equipment and the voice system in the vehicle for completing the preset human-computer interaction task, and the quantization of the human-computer interaction parameters to obtain the quantized values of the human-computer interaction parameters comprise:
recording a scene task evaluation parameter of the hardware equipment for executing at least one scene interaction task under the set driving environment, and carrying out quantization processing on the scene task evaluation parameter to obtain a scene task evaluation parameter quantization value;
recording voice interaction parameters of at least one or more voice conversation tasks executed by the central control system under the set driving environment, and carrying out quantization processing on the voice interaction parameters to obtain a voice interaction parameter quantization value.
Optionally, the scenized task evaluation parameters include execution steps and time when the hardware device executes at least one scenized task, and task completion time of the scenized task when the vehicle is in a static driving state and a dynamic driving state, respectively;
the voice interaction parameter comprises at least one of a recognition success rate, a support capability parameter and a voice state feedback parameter.
Optionally, outputting, by the evaluation model, a human-machine interface evaluation value of the vehicle, including:
respectively determining the respective weights of the hardware basic parameter quantized value, the human-machine engineering relation parameter quantized value and the human-machine interaction parameter quantized value by utilizing the evaluation model;
and carrying out weighted summation on the hardware basic parameter quantized value, the human-machine engineering relation parameter quantized value and the human-machine interaction parameter quantized value and respective weights to obtain and output a human-machine interface evaluation value of the vehicle.
Optionally, the determining, by using the evaluation model, respective weights of the hardware basic parameter quantized value, the human-machine engineering relationship parameter quantized value, and the human-machine interaction parameter quantized value includes:
determining a first weight of the quantized value of the hardware basic parameter based on the functional coverage and degree of the central control system and the instrument panel system and the number of the intelligent hardware devices through the evaluation model;
determining a second weight of the ergonomic relationship parameter quantitative value based on the recorded relationship parameters of the central control system and the instrument panel system relative to the driver of the vehicle;
calculating a third weight of the human-computer interaction parameter quantized value based on the hardware basis parameter quantized value and the human-computer interaction parameter quantized value;
wherein a sum of the first weight, the second weight, and the third weight is 1.
Optionally, before reading the hardware basic parameters of the hardware device in the vehicle and performing quantization processing, the method further includes:
constructing a test environment, constructing a preset number of cameras with preset visual angles in a set vehicle to establish a simulated driving environment, and recording a central control system of the simulated driving environment and a human-computer interface of an instrument panel system through the cameras;
the simulated driving environment is obtained by adjusting the positions of a seat, a steering wheel and an inside and outside rearview mirror of the vehicle to the corresponding positions in the driving environment according to body parameters corresponding to different user groups.
According to another aspect of the present invention, there is also provided a computer-readable storage medium, wherein at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the storage medium, and the at least one instruction, at least one program, a set of codes, or a set of instructions is loaded by a processor and executes the vehicle human machine interface benchmarking method according to any one of the above items.
According to another aspect of the present invention, there is also provided an electronic device, which is characterized by comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the vehicle human-machine interface benchmarking method according to any one of the above items.
The invention provides a vehicle human-computer interface benchmarking evaluation method: data relevant to the human-computer interface are recorded and read, including the hardware basic parameters of the hardware devices in the vehicle, the ergonomic relationship parameters of the hardware devices relative to the driver under a set driving environment, and the human-computer interaction parameters; after the data are collected, they are quantized separately to obtain the hardware basic parameter quantized value, the ergonomic relationship parameter quantized value and the human-computer interaction parameter quantized value, which are input into a pre-established evaluation model; the evaluation model analyses the data and outputs a human-machine interface evaluation value of the vehicle. The method combines reading and analysing vehicle data with evaluating human-computer interaction tasks between the vehicle and the user, which can greatly improve the accuracy, objectivity and authenticity of the vehicle human-computer interface evaluation and provides accurate and efficient data and guidance for the analysis and benchmarking evaluation of vehicle human-computer interfaces.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
The above and other objects, advantages and features of the present invention will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic flow chart of a vehicle human-machine interface benchmarking method according to an embodiment of the present invention;
FIG. 2 is a graphical illustration of a first person viewing a central control system human machine interface in accordance with an embodiment of the invention;
FIG. 3 is a diagram illustrating a quantization standard of the high-low position relationship of the center control screen according to an embodiment of the present invention;
fig. 4 is a schematic diagram of image quantization of the human-machine interface of the central control system shown in fig. 2 according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Fig. 1 is a schematic flow chart of a vehicle human-machine interface benchmarking method according to an embodiment of the present invention, and as can be seen from fig. 1, the vehicle human-machine interface benchmarking method according to the embodiment of the present invention may include:
step S102, reading hardware basic parameters of hardware equipment in a vehicle and carrying out quantization processing to obtain quantized values of the hardware basic parameters; the hardware equipment comprises a central control system and an instrument panel system of the vehicle;
step S104, recording the human-machine engineering relation parameters of the relative relation parameters of the hardware equipment and the driver of the vehicle under the set driving environment, and carrying out quantization processing on the human-machine engineering relation parameters to obtain quantized values of the human-machine engineering relation parameters;
step S106, recording human-computer interaction parameters of a hardware device and a voice system in a vehicle for completing a preset human-computer interaction task, and obtaining a human-computer interaction parameter quantization value after the human-computer interaction parameters are subjected to quantization processing;
and S108, inputting the hardware basic parameter quantitative value, the human-machine engineering relation parameter quantitative value and the human-machine interaction parameter quantitative value into a pre-established evaluation model, and outputting a human-machine interface evaluation value of the vehicle through the evaluation model.
The embodiment of the invention provides a vehicle human-computer interface benchmarking evaluation method: data relevant to the human-computer interface are recorded or read, such as the hardware basic parameters of the hardware devices in the vehicle, the ergonomic relationship parameters of the hardware devices under a set driving environment, and the human-computer interaction parameters; after these parameter data are collected, they are quantized separately to obtain the hardware basic parameter quantized value, the ergonomic relationship parameter quantized value and the human-computer interaction parameter quantized value, which are input into a pre-established evaluation model; the evaluation model analyses the data and outputs a human-machine interface evaluation value of the vehicle. The method provided by the embodiment of the invention combines reading and analysing vehicle data with evaluating human-computer interaction tasks between the vehicle and the user, which can greatly improve the accuracy, objectivity and authenticity of the vehicle human-computer interface evaluation and provides accurate and efficient data and guidance for the benchmarking analysis and evaluation of the vehicle human-computer interface.
In an optional embodiment of the invention, before the benchmarking analysis of the vehicle human-computer interface, a test environment can be constructed according to the characteristics and requirements of the evaluated vehicle. Specifically, a preset number of cameras with preset visual angles can be installed in the vehicle to establish a simulated driving environment, and the human-computer interfaces of the central control system and the instrument panel system in the simulated driving environment are recorded through the cameras. The simulated driving environment is obtained by adjusting the positions of the seat, the steering wheel and the inside and outside rear-view mirrors of the vehicle to the positions they would occupy in a real driving environment, according to body parameters corresponding to different user groups, where the body parameters include age, sex, height, weight and the like. The test environment needs to provide variable illumination conditions to simulate different light environments and thus the users' conditions of use, and a suitable site for simulating the users' real driving and usage scenarios.
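As a concrete illustration of such a test environment, the following Python sketch shows one possible way to represent the simulated driving environment in software; all class names, fields and values are illustrative assumptions rather than part of the described method.

```python
# A minimal sketch of the simulated-driving test environment described above.
# All names and numbers are illustrative assumptions, not part of the patent.
from dataclasses import dataclass
from typing import List

@dataclass
class CameraSetup:
    position: str          # e.g. "driver head rest", "A-pillar"
    view_angle_deg: float  # preset visual angle of the camera
    target: str            # "central control screen" or "instrument panel"

@dataclass
class UserGroupProfile:
    label: str             # e.g. "male 30-40", "female 20-30"
    height_cm: float
    weight_kg: float

@dataclass
class SimulatedDrivingEnvironment:
    cameras: List[CameraSetup]
    user_group: UserGroupProfile
    seat_position_mm: float         # fore/aft seat position adjusted to the user group
    steering_wheel_tilt_deg: float
    mirror_angles_deg: List[float]  # inside and outside rear-view mirrors
    illumination_lux: float         # variable lighting condition of the test site

# Example: one environment per user group, each recorded by two cameras.
env_male = SimulatedDrivingEnvironment(
    cameras=[CameraSetup("driver head rest", 60.0, "central control screen"),
             CameraSetup("driver head rest", 45.0, "instrument panel")],
    user_group=UserGroupProfile("male 30-40", 175.0, 72.0),
    seat_position_mm=120.0,
    steering_wheel_tilt_deg=25.0,
    mirror_angles_deg=[12.0, 8.0, 8.0],
    illumination_lux=500.0,
)
```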
In step S102, the hardware basic parameters of the hardware devices in the vehicle are read. Optionally, the hardware devices in this embodiment may include the central control system (IHU/CSD) and the instrument panel system (DIM/Cluster) of the vehicle, together with their respective controllers. The central control system of the vehicle controls comfort and entertainment devices such as the air conditioner and the audio system. It includes the central door-lock system, through which the driver can control the opening and closing of all doors and the window-lifting system, and it may further include a center console and various vehicle controllers such as an audio control panel. The instrument panel system mainly comprises the automobile instruments, chiefly a speedometer, a tachometer, an oil pressure gauge, a water temperature gauge, a fuel gauge, a charging gauge, warning devices, and supporting LCD (liquid crystal display) or TFT (thin film transistor) screens. In practical applications, besides the central control system and the instrument panel system described above, the hardware devices of the vehicle may also include intelligent hardware devices such as a secondary instrument console control panel, a head-up display system, a secondary driving entertainment screen and a rear row entertainment screen; these may be added or removed according to the particular vehicle, and the present invention is not limited thereto.
In the embodiment of the present invention the hardware devices are exemplified by the central control system and the instrument panel system. The hardware basic parameters may include various functional parameters of both systems, such as the display size, display ratio and resolution of the central control screen and of the instrument panel, the control modes of the central control system and of the instrument panel system (for example touch, voice control, or an integrated controller), abnormal noise, and stability (system software stability, e.g. crashes). In practical applications, because the central control system and the instrument panel system have many functions, the number of hardware basic parameters read will increase accordingly. Therefore, in an optional embodiment of the present invention, after reading the hardware basic parameters, data processing and screening may be performed to retain only the data needed for the human-machine interface evaluation; the type and amount of data can be adjusted according to actual requirements, and the present invention is not limited thereto.
Continuing with step S102, after the hardware basic parameters are read, quantization processing is performed to obtain the hardware basic parameter quantized values. Before the data are quantized, a hardware basic parameter quantization table may be established which stores the scores corresponding to the different states of each hardware device, for example according to the degree of influence that each state parameter of the in-vehicle hardware has on the human body; the read hardware basic parameters are then quantized against this table.
Taking the abnormal noise parameter of the central control system mentioned above as an example, a sound (noise) detection device placed in the vehicle beforehand may detect a central control system noise of 10 dB with the vehicle powered on and stationary. According to the quantization table, a noise value of at most 15 dB is classified as excellent according to its degree of influence on the human body and is converted into a quantized value of 5 points; the higher the noise value, the lower the score. In this way the hardware basic parameter is quantized to obtain its quantized value. In practical applications other quantization manners may also be used, which are not described again here.
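A minimal Python sketch of this noise quantization step is shown below; the 15 dB / 5-point band comes from the example above, while the remaining band boundaries and scores are illustrative assumptions.

```python
# Map a measured central-control-system noise level to a score via a
# pre-established quantization table (<= 15 dB -> 5 points; higher noise,
# lower score). Bands other than 15 dB are illustrative assumptions.
def quantize_noise(noise_db: float) -> int:
    bands = [      # (upper bound in dB, score)
        (15.0, 5), # "excellent" band from the example above
        (25.0, 4),
        (35.0, 3),
        (45.0, 2),
        (55.0, 1),
    ]
    for upper, score in bands:
        if noise_db <= upper:
            return score
    return 0

print(quantize_noise(10.0))  # the 10 dB measurement in the example -> 5
```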
In this embodiment, since the hardware devices comprise a central control system and an instrument panel system, then, assuming the hardware basic parameter quantized value is A, A may include the central control system hardware parameter quantized value A1 and the instrument panel hardware parameter quantized value A2.
In practical applications the hardware basic parameters may include multiple types of data. Specifically, during quantization, all types of hardware data contained in the central control system and in the instrument panel system may be quantized separately, and the resulting values accumulated to finally obtain the central control system hardware parameter quantized value A1 and the instrument panel hardware parameter quantized value A2. Of course, the hardware basic parameter quantized value may also be calculated in other ways, which are not described here. After the hardware basic parameter quantized value A and its components A1 and A2 are obtained, they can be stored for subsequent use.
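The accumulation into A1 and A2 can be pictured with the following minimal sketch; the parameter names and individual scores are illustrative assumptions.

```python
# Quantize every hardware data type of the central control system and of the
# instrument panel first, then sum the per-type scores into A1 and A2.
# All names and scores below are illustrative assumptions.
central_control_scores = {"display_size": 4, "resolution": 5, "noise": 5, "stability": 4}
dashboard_scores = {"display_size": 3, "resolution": 4, "noise": 5, "stability": 5}

A1 = sum(central_control_scores.values())  # central control system hardware quantized value
A2 = sum(dashboard_scores.values())        # instrument panel hardware quantized value
print(A1, A2)  # 18 17
```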
Referring to step S104, the ergonomic relationship parameters, i.e. the parameters describing the relative relationship between the hardware devices and the driver of the vehicle in the set driving environment, are recorded, and the ergonomic relationship parameters are quantized to obtain the ergonomic relationship parameter quantized value. The ergonomic relationship parameters of this embodiment are relationship data between the hardware devices of the vehicle and the driver, for example at least one of the inclination angle relationship of the central control screen and the instrument panel, the finger operation position, the position height relationship, line-of-sight obstruction, and the up-down inclination angle. Of course, the ergonomic parameters may also include data other than those listed above, and the present invention is not limited thereto.
In practical applications the recorded ergonomic parameters may differ because of differences in the individual characteristics of different types of users; for example, differences in height and weight between male and female user groups lead to different recorded data. In the embodiment of the invention, several driving environments can be set so that the ergonomic relationship parameters are recorded separately in each set driving environment, analysed and quantized as a whole, and a more accurate ergonomic relationship parameter quantized value is obtained, achieving an objective evaluation that is applicable to users of different groups.
Optionally, the step S104 may further include: firstly, recording a first group of human-machine engineering relation parameters of hardware equipment in a first set driving environment and carrying out quantization processing to obtain a first group of human-machine engineering relation parameter quantized values; secondly, recording a second group of human-machine engineering relation parameters of the hardware equipment in a second set driving environment and carrying out quantization processing to obtain a second group of human-machine engineering relation parameter quantized values; and finally, taking the average value of the first group of quantized values of the ergonomic relation parameters and the second group of quantized values of the ergonomic relation parameters as a final quantized value of the ergonomic relation parameters. The first set driving environment and the second set driving environment are different in user group. For example, the first set driving environment is a simulated driving scene in which a male user is the driver, and the second set driving environment is a simulated driving scene in which a female user is the driver. In addition, the age groups of the users can be divided, and the driving environments are set for the users in different age groups, so that the accuracy of the quantitative values of the human-machine engineering relation parameters is higher, and the objectivity is higher.
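The averaging across the two set driving environments amounts to the following minimal sketch; the numeric values are illustrative assumptions.

```python
# The final ergonomic relationship parameter quantized value is the average of the
# values obtained in the two set driving environments (e.g. male and female driver
# groups). The numbers are illustrative only.
def final_ergonomic_value(first_env_value: float, second_env_value: float) -> float:
    return (first_env_value + second_env_value) / 2.0

b_male_group = 4.2    # quantized value from the first set driving environment
b_female_group = 3.8  # quantized value from the second set driving environment
print(final_ergonomic_value(b_male_group, b_female_group))  # 4.0
```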
In this embodiment, assuming that the finally obtained quantized value of the ergonomic parameters is B, B may include quantized values of central control system ergonomic parameters B1 and dashboard ergonomic parameters B2.
For example, to obtain the central control system ergonomic parameter quantized value B1 among the ergonomic relationship parameter quantized values, the high-low position relationship of the central control screen within B1 is quantized as follows: the shooting device captures an image showing the high-low position of the central control screen and transmits it to the background computing device; the background computing device reads the image of the central control system human-machine interface observed from the first-person perspective in the simulated real driving environment (as shown in fig. 2) and compares it with the quantization standard for the high-low position relationship of the central control screen stored in the computer (as shown in fig. 3) to obtain the quantized value of the high-low position relationship. A specific quantization rule may be: if the central control system interface lies within 50-60% of the longitudinal y axis of the captured picture, the time the simulated driver's eyes leave the road surface to observe the central control system interface while driving is minimal, this is the optimal position, and it is correspondingly quantized to 5 points (as shown in fig. 4). The same quantization approach can be used for other ergonomic data, or a quantization table of scores corresponding to the different data states can be established; for example, the degree of line-of-sight obstruction can be quantized into different scores.
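A minimal sketch of this screen-height quantization rule follows; the 50-60% band scoring 5 points comes from the rule above, and the scores for the other bands are illustrative assumptions.

```python
# Quantize the high-low position of the central control screen from its vertical
# position in the first-person image. The 50-60% band of the y axis scores 5
# points (optimal, minimal eyes-off-road time); the other bands are assumptions.
def quantize_screen_height(screen_center_y: float, image_height: float) -> int:
    ratio = screen_center_y / image_height   # position along the longitudinal y axis
    if 0.50 <= ratio <= 0.60:
        return 5   # optimal position per the rule above
    if 0.40 <= ratio < 0.50 or 0.60 < ratio <= 0.70:
        return 4
    if 0.30 <= ratio < 0.40 or 0.70 < ratio <= 0.80:
        return 3
    return 2

print(quantize_screen_height(540, 1000))  # ratio 0.54 -> 5 points
```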
The central control system ergonomic parameter quantized value B1 and the instrument panel ergonomic parameter quantized value B2 may be obtained by quantizing the specific data contained in the central control system and in the instrument panel respectively and then accumulating the results, or in other ways, which the present invention does not limit. After the ergonomic relationship parameter quantized value B and its components B1 and B2 are obtained, they may be stored for later use.
In addition to the hardware device data and the ergonomic relationship data described above, referring to step S106, the human-computer interaction parameters of a preset human-computer interaction task completed with the hardware devices and the voice system in the vehicle also need to be recorded, and the human-computer interaction parameter quantized values are obtained after quantizing them. In an optional embodiment of the present invention, the human-computer interaction parameters may include the scenarized task evaluation parameters generated while the hardware devices execute scenarized interaction tasks, and the voice interaction parameters of the voice system completing voice conversation tasks. That is, step S106 may further include:
and S1, recording the scene task evaluation parameters of the hardware equipment for executing at least one scene interaction task under the set driving environment, and carrying out quantitative processing on the scene task evaluation parameters to obtain scene task evaluation parameter quantitative values. Optionally, the scenized task evaluation parameters may include an execution step and time when the hardware device executes at least one scenized task, and a task completion time of the scenized task when the vehicle is in a static driving state and a dynamic driving state, respectively.
Taking the scenarized task of connecting a Bluetooth device as an example: for any set driving environment, the execution steps and the time taken to connect the Bluetooth device in the static and dynamic driving states are recorded by the camera, the execution steps and corresponding times are transmitted to the background computing device for data processing, and quantization is performed according to the quantization table corresponding to the Bluetooth connection scenario (which may be a preset score table over the Bluetooth connection time) to obtain the quantized value of the scenarized task evaluation for connecting the Bluetooth device.
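A minimal sketch of this scenarized-task quantization for the Bluetooth example is given below; the step and time thresholds stand in for the preset quantization table and are illustrative assumptions.

```python
# Map the recorded execution steps and completion times (static and dynamic
# driving states) of the Bluetooth pairing task to a score. The concrete
# thresholds are illustrative assumptions for the preset quantization table.
def quantize_bluetooth_task(steps: int, static_time_s: float, dynamic_time_s: float) -> int:
    avg_time = (static_time_s + dynamic_time_s) / 2.0
    if steps <= 3 and avg_time <= 20.0:
        return 5
    if steps <= 5 and avg_time <= 40.0:
        return 4
    if steps <= 7 and avg_time <= 60.0:
        return 3
    return 2

print(quantize_bluetooth_task(steps=4, static_time_s=25.0, dynamic_time_s=35.0))  # 4
```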
And S2, recording voice interaction parameters of the central control system executing at least one or more voice conversation tasks under the set driving environment, and quantizing the voice interaction parameters to obtain quantized values of the voice interaction parameters. The voice interaction parameter may include at least one of a recognition success rate, a support capability parameter, and a voice state feedback parameter.
For any set driving environment, the background computing device selects one or more voice conversation patterns from a conversation semantic library and inputs them into the central control system; the computer records and reads the data returned by the voice system in the central control system, such as the recognition success rate, the support capability and the voice state feedback, processes the data to obtain the voice interaction parameters, and quantizes them according to the quantization table corresponding to the voice interaction parameters (for example, scores corresponding to accuracy, the types of semantics that can be recognized, and so on) to obtain the voice interaction parameter quantized value.
For example, for the use scenario of navigating to a given destination with the voice system, five different dialogue semantics are input into the voice system of the central control system by the background computing device; the support capability data returned by the voice system are recorded and read by the background computing device and processed to obtain the voice interaction parameters, which are quantized according to the corresponding quantization table to obtain the voice interaction parameter quantized value. For example, if 3 of the dialogue semantics are supported, the quantized value is 3 points; further scores can be assigned according to the accuracy, speed and the like of the speech recognition.
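A minimal sketch of this voice-interaction quantization follows; the support score (one point per supported semantic) mirrors the example above, while the accuracy mapping is an illustrative assumption.

```python
# Quantize the voice-interaction parameters for the navigation example: the
# number of supported dialogue semantics gives the support score (3 of 5 -> 3
# points); the recognition-accuracy mapping to a 5-point scale is an assumption.
def quantize_voice_interaction(supported: int, total: int, recognition_success: int) -> dict:
    support_score = supported                 # e.g. 3 of 5 semantics -> 3 points
    accuracy = recognition_success / total    # recognition success rate
    accuracy_score = round(accuracy * 5)      # assumed mapping to a 5-point scale
    return {"support": support_score, "accuracy": accuracy_score}

print(quantize_voice_interaction(supported=3, total=5, recognition_success=4))
# {'support': 3, 'accuracy': 4}
```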
In the present embodiment, assuming the human-computer interaction parameter quantized value is C, C may include the scenarized task evaluation parameter quantized value C1 and the voice interaction parameter quantized value C2. C1 and C2 may be obtained by quantizing the specific data they respectively contain and then accumulating the results, or in other ways, which the present invention does not limit. After the human-computer interaction parameter quantized value C and its components C1 and C2 are obtained, they can be stored for later use.
Referring to step S108, after the hardware basic parameter quantized value, the human-machine engineering relationship parameter quantized value, and the human-machine interaction parameter quantized value are obtained, the hardware basic parameter quantized value, the human-machine engineering relationship parameter quantized value, and the human-machine interaction parameter quantized value are input into a pre-established evaluation model, and a human-machine interface evaluation value of the vehicle is output through the evaluation model.
In an optional embodiment of the invention, when the human-computer interface evaluation value of the vehicle is output through the evaluation model, the respective weights of the hardware basic parameter quantitative value, the human-computer engineering relation parameter quantitative value and the human-computer interaction parameter quantitative value can be respectively determined by using the evaluation model; and carrying out weighted summation on the quantized values of the hardware basic parameters, the quantized values of the human-computer engineering relation parameters and the quantized values of the human-computer interaction parameters and respective weights to obtain and output a human-computer interface evaluation value of the vehicle.
When determining respective weights of a hardware basic parameter quantized value, a human-machine engineering relation parameter quantized value and a human-machine interaction parameter quantized value, determining a first weight of the hardware basic parameter quantized value based on the functional coverage range and degree of the central control system and the instrument panel system and the number of the intelligent hardware devices through the evaluation model; determining a second weight of the ergonomic relationship parameter quantitative value based on the recorded relationship parameters of the central control system and the instrument panel system relative to the driver of the vehicle; calculating a third weight of the human-computer interaction parameter quantized value based on the hardware basis parameter quantized value and the human-computer interaction parameter quantized value.
That is, the evaluation model mainly assigns weights to the hardware basic parameter quantized value, the ergonomic relationship parameter quantized value and the human-computer interaction parameter quantized value. The weight a of the hardware basic parameter quantized value A is preferably 0.1-0.3. A comprises two parts, the central control system hardware parameter quantized value A1 and the instrument panel hardware parameter quantized value A2, whose weights a1 and a2 lie in the ranges 0.5-0.9 and 0.1-0.5 respectively.
The specific weight values of the central control system hardware parameter quantized value A1 and the instrument panel hardware parameter quantized value A2 can be selected and set with different weights according to different functional coverage ranges and degrees of a central control system and an instrument panel system of a vehicle and the difference of the number of intelligent hardware devices.
For the basic hardware data, in the usual case where only a central control system and an instrument panel system are read for the vehicle, the weight a of the hardware basic parameter quantized value A is set to 0.2. If a larger number of intelligent hardware devices is read for the test vehicle, for example when it also has, in addition to the central control system and the instrument panel system, a secondary instrument console control panel, a head-up display system, a secondary driving entertainment screen, a rear row entertainment screen or similar intelligent hardware devices, the weight a of the hardware basic parameter quantized value A can be set to 0.3.
For the setting of the weight a1 of the central control system hardware parameter quantized value A1 and the weight a2 of the instrument panel hardware parameter quantized value A2: when the instrument panel system of the vehicle is found to have many functions, including but not limited to cruise control, adaptive cruise control, advanced driving assistance (ADAS) and automatic driving, a1 and a2 can be set to 0.5 and 0.5 respectively. If the instrument panel has few functions and only displays basic vehicle information, a1 and a2 can be set to 0.9 and 0.1 respectively. The sum of the weight a1 of the central control system hardware parameter quantized value A1 and the weight a2 of the instrument panel hardware parameter quantized value A2 is equal to 1.
Finally, the hardware device basic evaluation value X can be calculated from the central control system hardware parameter quantized value A1, the instrument panel hardware parameter quantized value A2 and their weights a1 and a2: X = a × (a1 × A1 + a2 × A2).
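A minimal sketch of this hardware-side weighting and of the calculation of X follows; the weight-selection rules paraphrase the examples given above, and anything beyond them is an assumption.

```python
# Compute the hardware basic evaluation value X = a * (a1*A1 + a2*A2). The weight
# a grows with the number of intelligent hardware devices; a1/a2 split A between
# the central control system and the instrument panel according to the dashboard's
# functional coverage. Rules paraphrase the text above; the rest are assumptions.
def hardware_evaluation(A1: float, A2: float,
                        n_intelligent_devices: int,
                        dashboard_has_adas: bool) -> float:
    a = 0.3 if n_intelligent_devices > 2 else 0.2   # only IHU + DIM -> 0.2, more devices -> 0.3
    if dashboard_has_adas:
        a1, a2 = 0.5, 0.5   # feature-rich instrument panel (cruise, ADAS, ...)
    else:
        a1, a2 = 0.9, 0.1   # instrument panel only shows basic vehicle information
    return a * (a1 * A1 + a2 * A2)

X = hardware_evaluation(A1=18, A2=17, n_intelligent_devices=2, dashboard_has_adas=False)
print(round(X, 2))  # 0.2 * (0.9*18 + 0.1*17) = 3.58
```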
For the ergonomic relationship parameter quantized value B, the weight b is preferably 0.1-0.3. B comprises the central control system ergonomic parameter quantized value B1 and the instrument panel ergonomic parameter quantized value B2, whose weights b1 and b2 lie in the ranges 0.5-0.9 and 0.1-0.5 respectively, and the sum of the weight b1 of B1 and the weight b2 of B2 is equal to 1.
The weight b of the ergonomic relationship parameter quantized value B is set to 0.2 in the usual case, for example when only a central control system and a steering wheel controller are present; if, besides the central control system and the steering wheel controller, hardware such as a system integrated controller, an air conditioner key control panel, a multimedia knob or a driving function control panel can also be acquired, b can be set to 0.3.
For the central control system ergonomic parameter quantized value B1: when the central control area of the test vehicle is found to include, in addition to the touch screen, other components such as a control panel, keys and controllers, the weight b1 of B1 may be set to 0.9 and the weight b2 of the instrument panel ergonomic parameter quantized value B2 to 0.1. Conversely, if the central control system has only a touch screen while the instrument panel system has devices other than the steering wheel controller, such as a joystick or a key panel, b1 and b2 can be set to 0.5 and 0.5 respectively. The sum of the weight b1 of the central control system ergonomic parameter quantized value B1 and the weight b2 of the instrument panel ergonomic parameter quantized value B2 is equal to 1.
Finally, the ergonomic relationship evaluation value Y can be calculated from the central control system ergonomic parameter quantized value B1, the instrument panel ergonomic parameter quantized value B2 and their weights b1 and b2: Y = b × (b1 × B1 + b2 × B2).
For the human-computer interaction parameter quantized value C, the weight c preferably lies in the range 0.4-0.8. C comprises the scenarized task evaluation parameter quantized value C1 and the voice interaction parameter quantized value C2, whose weights c1 and c2 lie in the ranges 0.5-0.9 and 0.1-0.5 respectively.
Finally, the human-computer interaction evaluation value Z can be calculated from the scenarized task evaluation parameter quantized value C1, the voice interaction parameter quantized value C2 and their weights c1 and c2: Z = c × (c1 × C1 + c2 × C2).
In practical applications, the sum of the weight a of the hardware basic parameter quantized value A, the weight b of the ergonomic relationship parameter quantized value B and the weight c of the human-computer interaction parameter quantized value C is 1; therefore, once a and b are determined, c can be calculated from a + b + c = 1. For example, if a = 0.3 and b = 0.3, then c = 0.4.
For the weight c1 of the scenarized task evaluation parameter quantized value C1 and the weight c2 of the voice interaction parameter quantized value C2: corresponding task evaluations can be added according to the number of intelligent devices or driving assistance functions of the vehicle, for example a head-up display system or an advanced driving assistance function (ADAS) in addition to the central control system and the instrument panel system, and the weight c1 of C1 can then be increased up to 0.9. Similarly, if the acquired voice system is fully featured, the weight c2 of the voice interaction parameter quantized value C2 can be increased.
The sum of the three partial evaluation values X, Y and Z given by the evaluation model is the final human-machine interface evaluation value O, i.e. O = X + Y + Z. In this way the human-machine interface evaluation value of one vehicle is obtained; in practical applications the human-machine interface evaluation values of several vehicles can be calculated in the manner described in the above embodiment, so that an evaluated-vehicle data analysis library containing the human-machine interface evaluation value of each vehicle is established and the vehicles can be compared laterally. Optionally, the evaluated-vehicle data analysis library in this embodiment may store not only the final human-machine interface evaluation value but also the quantized value of each group of data and its corresponding weight, making lateral comparison between vehicles easier.
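Putting the three parts together, a minimal sketch of the complete evaluation model and of a hypothetical evaluated-vehicle analysis library is given below; all numeric inputs and weights are illustrative assumptions.

```python
# Combine the three weighted partial scores into X, Y and Z with a + b + c = 1
# (c derived from a and b) and return the final evaluation value O = X + Y + Z.
# All numeric inputs and default weights are illustrative assumptions.
def hmi_evaluation(A1, A2, B1, B2, C1, C2,
                   a=0.2, b=0.2, a1=0.9, a2=0.1, b1=0.9, b2=0.1, c1=0.7, c2=0.3):
    c = 1.0 - a - b                      # weights of the three parts sum to 1
    X = a * (a1 * A1 + a2 * A2)          # hardware basic evaluation value
    Y = b * (b1 * B1 + b2 * B2)          # ergonomic relationship evaluation value
    Z = c * (c1 * C1 + c2 * C2)          # human-computer interaction evaluation value
    return X + Y + Z                     # final human-machine interface evaluation value O

# One entry of a hypothetical evaluated-vehicle analysis library for lateral comparison.
vehicle_library = {"vehicle_A": hmi_evaluation(A1=18, A2=17, B1=4.0, B2=3.5, C1=4.5, C2=3.0)}
print(vehicle_library)
```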
In the embodiment of the invention, in the pre-constructed standard evaluation test environment for automobile human-computer interfaces, the camera records the human-computer interfaces in the vehicle, such as the central control system and the instrument panel system, while the real driving environment is simulated, yielding the human-machine interface data. During the simulated operation the camera records these data, and the computer reads the hardware parameters of the central control system and of the instrument panel system and processes the data to obtain the human-machine interface system parameters. The recording device (such as a camera) or the computer quantizes these parameters and inputs the quantized values into the evaluation model. Based on the quantized hardware basic parameter values, ergonomic relationship parameter values and human-computer interaction parameter values, and on the functional coverage range and degree of the central control system and the instrument panel and the number of intelligent hardware devices, the evaluation model determines the weights of the quantized parameters, then calculates and outputs the final evaluation value. This evaluation value reflects the operability and usability of the human-machine interface of the evaluated vehicle, and the final evaluation values are collected into the evaluated-vehicle-model database for lateral comparison and analysis.
The method provided by the embodiment of the invention has the following beneficial effects:
1. The embodiment of the invention combines the two methods of data reading and analysis and user scenario task evaluation, which can greatly improve the accuracy, objectivity and authenticity of the vehicle human-computer interface evaluation, and provides accurate and efficient data and guidance for the analysis and benchmarking evaluation of vehicle human-computer interfaces;
2. the embodiment of the invention highly simulates the real use condition and environment of the user and obtains the evaluation data and result which are highly consistent with the real use environment.
3. The embodiment of the invention can set different simulated real driving environments and evaluate the simulated real driving environments so as to achieve the objective evaluation purpose with higher applicability aiming at applicable users of different groups.
4. After vehicles are evaluated, an evaluated-vehicle data analysis database is established, so that lateral comparison and analysis can be performed, the evaluation system is enriched, and comprehensive design analysis guidance is provided.
In an optional embodiment of the present invention, there is further provided a computer-readable storage medium, wherein at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the storage medium, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded by a processor and executes the vehicle human-machine interface benchmarking method according to any one of the above embodiments.
In an optional embodiment of the present invention, there is further provided an electronic device, characterized by comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the vehicle human-machine interface benchmarking method according to any one of the above embodiments.
It can be clearly understood by those skilled in the art that the specific working process of the system described above may refer to the corresponding process in the foregoing method embodiments, and for the sake of brevity, no further description is provided herein.
Those of ordinary skill in the art will understand that: the above-described method, if implemented in software and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computing device (e.g., a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention when the instructions are executed. And the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, and other various media capable of storing program code.
Alternatively, all or part of the steps of implementing the foregoing method embodiments may be implemented by hardware (such as a computing device, e.g., a personal computer, a server, or a network device) associated with program instructions, which may be stored in a computer-readable storage medium, and when the program instructions are executed by a processor of the computing device, the computing device executes all or part of the steps of the method according to the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments can be modified or some or all of the technical features can be equivalently replaced within the spirit and principle of the present invention; such modifications or substitutions do not depart from the scope of the present invention.

Claims (10)

1. A vehicle human-computer interface benchmarking method comprises the following steps:
reading hardware basic parameters of hardware equipment in a vehicle and carrying out quantization processing to obtain quantized values of the hardware basic parameters; wherein the hardware device comprises a central control system and an instrument panel system of the vehicle;
recording an ergonomic relationship parameter, namely a parameter describing the relative relationship between the hardware device and a driver of the vehicle under a set driving environment, and carrying out quantization processing on the ergonomic relationship parameter to obtain an ergonomic relationship parameter quantized value;
recording human-computer interaction parameters of the hardware equipment and a voice system in the vehicle for completing a preset human-computer interaction task, and carrying out quantization processing on the human-computer interaction parameters to obtain human-computer interaction parameter quantization values;
and inputting the hardware basic parameter quantitative value, the human-machine engineering relation parameter quantitative value and the human-machine interaction parameter quantitative value into a pre-established evaluation model, and outputting a human-machine interface evaluation value of the vehicle through the evaluation model.
2. The method of claim 1, wherein the recording of the ergonomic relationship parameter of the hardware device relative to the driver of the vehicle in the set driving environment to obtain the quantified value of the ergonomic relationship parameter comprises:
recording a first group of human-machine engineering relation parameters of the hardware equipment in a first set driving environment and carrying out quantization processing to obtain a first group of human-machine engineering relation parameter quantized values;
recording a second group of human-machine engineering relation parameters of the hardware equipment in a second set driving environment and carrying out quantization processing to obtain a second group of human-machine engineering relation parameter quantized values;
taking the average value of the first group of quantized values of the ergonomic relationship parameters and the second group of quantized values of the ergonomic relationship parameters as a final quantized value of the ergonomic relationship parameters;
wherein the first set driving environment and the second set driving environment are different in user population.
3. The method according to claim 1 or 2,
the ergonomic relation parameters comprise at least one of the inclination angle relation between the central control screen and the instrument panel, the finger operation position, the position height relation, the sight shielding and the upper and lower inclination angles.
4. The method of claim 1, wherein the human-computer interaction parameters comprise a scenario task evaluation parameter and a voice interaction parameter; and
recording the human-computer interaction parameters generated when the hardware devices and the in-vehicle voice system complete the preset human-computer interaction task, and quantizing them to obtain the human-computer interaction parameter quantized value, comprises:
recording a scenario task evaluation parameter of the hardware devices executing at least one scenario interaction task under the set driving environment, and quantizing it to obtain a scenario task evaluation parameter quantized value;
and recording voice interaction parameters of one or more voice dialogue tasks executed by the central control system under the set driving environment, and quantizing them to obtain a voice interaction parameter quantized value.
5. The method of claim 4, wherein
the scenario task evaluation parameters comprise the execution steps and execution time of the hardware devices performing at least one scenario task, and the task completion time of the scenario task when the vehicle is in a static driving state and in a dynamic driving state, respectively; and
the voice interaction parameters comprise at least one of a recognition success rate, a support capability parameter and a voice state feedback parameter.
6. The method of claim 1, wherein outputting the human-computer interface evaluation value of the vehicle through the evaluation model comprises:
determining, by means of the evaluation model, respective weights for the hardware basic parameter quantized value, the ergonomic relationship parameter quantized value and the human-computer interaction parameter quantized value;
and performing a weighted summation of the three quantized values with their respective weights to obtain and output the human-computer interface evaluation value of the vehicle.
7. The method of claim 6, wherein determining, by means of the evaluation model, the respective weights of the hardware basic parameter quantized value, the ergonomic relationship parameter quantized value and the human-computer interaction parameter quantized value comprises:
determining, through the evaluation model, a first weight of the hardware basic parameter quantized value based on the functional coverage and degree of the central control system and the instrument panel system and on the number of intelligent hardware devices;
determining a second weight of the ergonomic relationship parameter quantized value based on the recorded relationship parameters of the central control system and the instrument panel system relative to the driver of the vehicle;
calculating a third weight of the human-computer interaction parameter quantized value based on the hardware basic parameter quantized value and the human-computer interaction parameter quantized value;
wherein the sum of the first weight, the second weight and the third weight is 1.
8. The method of claim 1, wherein before reading the hardware basic parameters of the hardware devices in the vehicle and quantizing them, the method further comprises:
constructing a test environment by arranging a preset number of cameras at preset viewing angles in a set vehicle to establish a simulated driving environment, and recording, through the cameras, the human-computer interfaces of the central control system and the instrument panel system in the simulated driving environment;
wherein the simulated driving environment is obtained by adjusting the positions of the seat, the steering wheel and the interior and exterior rearview mirrors of the vehicle to the positions corresponding to the driving environment according to body parameters of different user populations.
9. A computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the vehicle human-computer interface benchmarking evaluation method according to any one of claims 1-8.
10. An electronic device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the vehicle human-computer interface benchmarking evaluation method according to any one of claims 1-8.
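
Illustrative sketch of the evaluation flow (not part of the claims). The following Python sketch restates the flow described in claims 1, 2, 6 and 7 under stated assumptions: a 0-100 scoring scale, simple proportional normalisation as the weighting rule, and hypothetical parameter values. The data structure and function names (QuantizedScores, quantize_ergonomic, determine_weights, evaluate) are invented for demonstration and are not taken from the patent; how the evaluation model actually derives the weights from functional coverage and the recorded relationship parameters is left open here.

# Illustrative sketch only: names, scales and the weighting rule below are
# assumptions for demonstration and are not taken from the patent.
from dataclasses import dataclass
from statistics import mean
from typing import List, Tuple


@dataclass
class QuantizedScores:
    hardware: float      # hardware basic parameter quantized value (claim 1)
    ergonomic: float     # ergonomic relationship parameter quantized value (claims 2-3)
    interaction: float   # human-computer interaction parameter quantized value (claims 4-5)


def quantize_ergonomic(group_a: List[float], group_b: List[float]) -> float:
    """Claim 2: average the quantized values recorded under two set driving
    environments that differ in user population."""
    return (mean(group_a) + mean(group_b)) / 2.0


def determine_weights(s: QuantizedScores) -> Tuple[float, float, float]:
    """Claim 7 only requires the three weights to sum to 1; the proportional
    normalisation used here is a placeholder for the evaluation model."""
    total = s.hardware + s.ergonomic + s.interaction
    if total == 0:
        return (1 / 3, 1 / 3, 1 / 3)
    return (s.hardware / total, s.ergonomic / total, s.interaction / total)


def evaluate(s: QuantizedScores) -> float:
    """Claim 6: weighted summation of the three quantized values."""
    w_hw, w_ergo, w_hci = determine_weights(s)
    return w_hw * s.hardware + w_ergo * s.ergonomic + w_hci * s.interaction


if __name__ == "__main__":
    # Hypothetical quantized values on a 0-100 scale.
    ergonomic = quantize_ergonomic(group_a=[82.0, 78.5], group_b=[75.0, 80.0])
    scores = QuantizedScores(hardware=85.0, ergonomic=ergonomic, interaction=72.5)
    print(f"Human-computer interface evaluation value: {evaluate(scores):.1f}")

The sketch enforces only the constraints stated explicitly in the claims (averaging over two driving environments, weighted summation, weights summing to 1); any real implementation of the pre-established evaluation model would replace the placeholder weighting rule.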
CN202010163702.7A 2020-03-10 2020-03-10 Vehicle human-computer interface benchmarking evaluation method Active CN111309628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010163702.7A CN111309628B (en) 2020-03-10 2020-03-10 Vehicle human-computer interface benchmarking evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010163702.7A CN111309628B (en) 2020-03-10 2020-03-10 Vehicle human-computer interface benchmarking evaluation method

Publications (2)

Publication Number Publication Date
CN111309628A true CN111309628A (en) 2020-06-19
CN111309628B (en) 2021-05-28

Family

ID=71158948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010163702.7A Active CN111309628B (en) 2020-03-10 2020-03-10 Vehicle human-computer interface benchmarking evaluation method

Country Status (1)

Country Link
CN (1) CN111309628B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009013291A1 (en) * 2009-03-14 2010-09-16 Audi Ag Method for preparing control process for active vehicle component influencing driving dynamics of vehicle, involves simulating defined vehicle maneuver by vehicle-modeling system for vehicle components
CN104200267A (en) * 2014-09-23 2014-12-10 清华大学 Vehicle driving economy evaluation system and vehicle driving economy evaluation method
CN109887373A (en) * 2019-01-30 2019-06-14 北京津发科技股份有限公司 Driving behavior collecting method, assessment method and device based on vehicle drive

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG KANG: "Cloud Model for Safety Evaluation of Rail Transit Vehicle Systems", China Safety Science Journal *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113836790A (en) * 2021-08-25 2021-12-24 成都鲁易科技有限公司 Method and device for evaluating intelligent level of electric bicycle and computer equipment
CN113836790B (en) * 2021-08-25 2024-02-02 成都鲁易科技有限公司 Method and device for evaluating intelligent grade of electric bicycle and computer equipment

Also Published As

Publication number Publication date
CN111309628B (en) 2021-05-28

Similar Documents

Publication Publication Date Title
Roberts et al. How persuasive is a good fit? A comment on theory testing.
Zhang et al. Understanding multitasking through parallelized strategy exploration and individualized cognitive modeling
Ma et al. Assessing the driving distraction effect of vehicle HMI displays using data mining techniques
CN107766577B (en) Public opinion monitoring method, device, equipment and storage medium
de Jong et al. A common dynamic prior for time in duration discrimination
CN111309628B (en) Vehicle human-computer interface benchmarking evaluation method
CN116165988A (en) Production quality control method and system for automobile center console
Liu et al. Strategy and implementing techniques for the sound quality target of car interior noise during acceleration
CN106663260B (en) Information presentation device, method, and program
Jeong et al. Modeling of stimulus-response secondary tasks with different modalities while driving in a computational cognitive architecture
Islam et al. Measuring user responses to driving simulators: A galvanic skin response based study
CN111026267A (en) VR electroencephalogram idea control interface system
Ma et al. Study on the evaluation method of in-vehicle gesture control
CN117195399A (en) Intelligent cabin user experience and interaction design method, device, equipment and medium
CN116578921A (en) Method and device for evaluating eye movement interaction usability of central control interface, vehicle and storage medium
Zhang et al. Mid-air gestures for in-vehicle media player: elicitation, segmentation, recognition, and eye-tracking testing
Ma et al. Impact of in-vehicle touchscreen size on visual demand and usability
Ma et al. The Impact of In-Vehicle Voice Interaction System on Driving Safety
CN113971897A (en) Driving simulation system, and method and device for calibrating degree of truth of driving simulation system
Pan et al. Vehicle interior sound quality evaluation index selection scheme based on grey relational analysis
JP6860602B2 (en) General-purpose artificial intelligence device and general-purpose artificial intelligence program
Torkkola et al. Sensor selection for maneuver classification
Tideman et al. Design and evaluation of a virtual gearshift application
Black et al. Creation of a driver preference objective metric to evaluate ground vehicle steering systems
Nopiah et al. Optimisation of acoustical comfort in vehicle cabin using goal programming

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220411

Address after: 430051 No. b1336, chuanggu startup area, taizihu cultural Digital Creative Industry Park, No. 18, Shenlong Avenue, Wuhan Economic and Technological Development Zone, Wuhan, Hubei Province

Patentee after: Yikatong (Hubei) Technology Co.,Ltd.

Address before: No.c101, chuanggu start up area, taizihu cultural Digital Industrial Park, No.18 Shenlong Avenue, Wuhan Economic Development Zone, Hubei Province

Patentee before: HUBEI ECARX TECHNOLOGY Co.,Ltd.
