CN115409535A - Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data - Google Patents

Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data

Info

Publication number
CN115409535A
CN115409535A
Authority
CN
China
Prior art keywords: evaluation, value, matrix, eye movement, data
Prior art date: 2022-07-20
Legal status: Pending
Application number
CN202210852388.2A
Other languages
Chinese (zh)
Inventor
陆蔚华
姜冠岳
刘雨婷
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date: 2022-07-20
Filing date: 2022-07-20
Publication date: 2022-11-29
Application filed by Nanjing University of Aeronautics and Astronautics
Priority to CN202210852388.2A
Publication of CN115409535A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0282 Rating or review of business operators or products
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70 Multimodal biometrics, e.g. combining information from different biometric modalities


Abstract

The invention discloses a method for evaluating the perceptual interaction performance of complex products by fusing multi-source heterogeneous data. The method first acquires subjective evaluation data, eye-movement experiment data and facial-expression experiment data from subjects in k evaluation dimensions; it then builds an evaluation matrix from the mean of each data type and calculates the weight of each matrix; finally, it constructs a comprehensive evaluation matrix and, by comparison with a threshold, judges whether the user evaluation satisfaction of each sample in each evaluation dimension is positive. The disclosed method integrates subjective and objective dimensions by fusing three indices: subjective evaluation, eye-movement data and facial-expression data. It obtains a subject's perceptual interaction evaluation of a complex product accurately and comprehensively without interfering with the subject's normal operation, provides a new experimental paradigm and data-analysis system for complex product design, maps out users' implicit requirements, overcomes the uncertainty and fuzziness of traditional methods, and provides an effective reference for product design optimization.

Description

Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data
Technical Field
The invention relates to a method for evaluating the perceptual interaction performance of complex products.
Background
Complex products, such as the cockpits of aircraft, high-speed trains, submarines and tanks, are characterized by tight spaces, dense contact points, and complex, highly integrated functions. They face complex dynamic environments, different working scenarios and different operation stages. The perceptual interaction between a user and a complex product is a dynamic process that evolves over time, is influenced by many factors together, and has a complex interaction mechanism. The perceptual interaction process between user and complex product is therefore particularly complex, and the user's implicit requirements cannot be comprehensively and reasonably explained by single-dimensional data alone.
The patent "a method for quantitatively evaluating food consumer acceptance by applying facial expression emotion recognition and electroencephalogram analysis" (CN 113570211A) adopts a method of facial expression recognition and electroencephalogram analysis to evaluate food acceptance, which is an evaluation means commonly used for simple products at present, but in the environment of complex products, an electroencephalogram analysis instrument can influence the normal driving operation of a testee, so that the accuracy of evaluation results is greatly reduced, and therefore, the problem of which appropriate combination of quantification tools is adopted to represent perceptual quality to map design requirements in the dynamic environment of complex products is the key for optimizing the appearance design of complex products.
Disclosure of Invention
Purpose of the invention: to provide a method that integrates subjective and objective dimensions to comprehensively and accurately evaluate the perceptual interaction between a user and a complex product.
Technical scheme: the invention discloses a method for evaluating the perceptual interaction performance of complex products by fusing multi-source heterogeneous data, comprising the following steps:
(1) Establish a complex product form sample library, and acquire subjective evaluation data, eye-movement experiment data and facial-expression experiment data of the subjects in k evaluation dimensions;
(2) Calculate the mean of the subjective evaluation data, the mean of the eye-movement experiment data and the mean of the facial-expression experiment data of the subjects in each evaluation dimension, and establish a subjective evaluation matrix, an eye-movement evaluation matrix and a facial-expression evaluation matrix;
(3) Calculate the weights of the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix;
(4) Establish a comprehensive evaluation matrix from the subjective evaluation matrix, the eye-movement evaluation matrix, the facial-expression evaluation matrix and their weights, and judge whether the user evaluation satisfaction of the product in each evaluation dimension is positive according to whether the value of the comprehensive evaluation matrix is greater than a threshold.
Further, the subjective evaluation data in step (1) are Likert scale scores; the eye-movement data are the fixation duration and the dwell duration; the facial-expression data are emotional valence values.
Further, the eye-movement evaluation matrix in step (2) is built from the mean of the reciprocals of the subjects' eye-movement fixation degrees in each evaluation dimension, where the eye-movement fixation degree is the ratio of fixation duration to dwell duration.
Further, before the weights are calculated in step (3), the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix are normalized.
Further, the weights in step (3) are calculated by taking the reciprocal of the multiple correlation coefficient of each of the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix, and then normalizing the reciprocals.
Further, in step (4), if the value of the comprehensive evaluation matrix is greater than the threshold, the user evaluation satisfaction of the product in that evaluation dimension is positive; if the value equals the threshold, the satisfaction is neutral; and if the value is less than the threshold, the satisfaction is negative.
Further, the complex product form sample library consists of pictures of complex product interior forms.
Further, the threshold in step (4) is selected from the normalized results of the Likert scale scores.
Beneficial effects: compared with the prior art, the invention has the following advantages. (1) Tailored to the characteristics of complex products, the subjective evaluation method and the physiological measurement instruments adopted by the invention are neither disruptive nor intrusive, so they avoid unnecessary interference with the user's operation and do not distort the genuine feedback of the perceptual interaction between user and product. (2) The selected multi-source heterogeneous data indices comprehensively reflect the overall performance of the user's perceptual interaction experience with the complex product; their strengths complement one another, giving both comprehensiveness and accuracy. The method provides a new experimental paradigm and data-analysis system for complex product design, maps out users' implicit requirements, overcomes the uncertainty and fuzziness of traditional methods, and offers an effective reference for product design optimization. (3) The comprehensive evaluation method fusing subjective evaluation, eye-movement and facial-expression data yields a specific value in the interval [0, 1]; selecting the middle value of the normalized Likert-scale results as the threshold allows the perceptual interaction satisfaction in each evaluation dimension to be judged quickly and intuitively. For complex products with complicated interactive operations, users must follow strict operating rules and also handle emergencies scientifically and reasonably, so users of complex products need to stay rational while avoiding fatigue and misoperation; a neutral threshold is therefore better suited as the evaluation threshold for perceptual interaction with complex products.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The technical scheme of the invention is further explained below with reference to the accompanying drawings.
As shown in FIG. 1, the method for evaluating the perceptual interaction performance of a complex product comprises the following steps:
(1) Input the experimental data of subjective evaluation, eye movement and facial expression, and establish a data sample library.
Let n denote the number of subjects, m the number of product samples, and k the number of evaluation dimensions, where n, m and k are positive integers. The subjective evaluation data are Likert scale scores, the eye-movement data are the fixation duration and the dwell duration, and the facial-expression data are emotional valence values.
(2) Construct the evaluation matrix for each data modality
(2-1) Establishing the subjective evaluation matrix
According to the measured subjective evaluation data of the n subjects in the k-th evaluation dimension, a subjective evaluation data element-set matrix $Q_k$ is established:

$$Q_k = \begin{bmatrix} q_{11}^k & q_{12}^k & \cdots & q_{1m}^k \\ q_{21}^k & q_{22}^k & \cdots & q_{2m}^k \\ \vdots & \vdots & \ddots & \vdots \\ q_{n1}^k & q_{n2}^k & \cdots & q_{nm}^k \end{bmatrix}$$

where $q_{nm}^k$ represents the Likert score of the n-th subject for the m-th product sample in the k-th evaluation dimension. In this embodiment a 5-level Likert scale is used, with scores (-2, -1, 0, 1, 2), which directly distinguish negative, neutral and positive user evaluations.
The mean Likert score of the n subjects for each product sample in the k-th evaluation dimension is calculated and taken as the subjective evaluation score, giving the overall evaluation trend of a single product sample in the k-th evaluation dimension as explained by the subjective evaluation data:

$$s_m^k = \frac{1}{n}\sum_{i=1}^{n} q_{im}^k$$
A matrix is established from the subjective evaluation scores of the m product samples in the k-th evaluation dimension, giving the subjective evaluation matrix $S_k$:

$$S_k = \left[ s_1^k, s_2^k, \cdots, s_m^k \right]^{\mathrm{T}}$$
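A minimal Python/NumPy sketch of this step, with hypothetical array names and randomly generated scores standing in for real measurements:

```python
import numpy as np

# Hypothetical element-set matrix Q_k: q_k[i, j] is subject i's 5-level
# Likert score (-2..2) for product sample j in one evaluation dimension.
n_subjects, m_samples = 29, 10
rng = np.random.default_rng(0)
q_k = rng.integers(-2, 3, size=(n_subjects, m_samples))

# Subjective evaluation score: mean over the n subjects for each sample,
# giving the m-element subjective evaluation matrix S_k.
s_k = q_k.mean(axis=0)
print(s_k.shape)  # (10,)
```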
(2-2) Establishing the eye-movement evaluation matrix
According to the measured eye-movement data of the n subjects in the k-th evaluation dimension, an eye-movement data element-set matrix $T_k$ is established:

$$T_k = \begin{bmatrix} (t_{11}^{f,k}, t_{11}^{d,k}) & \cdots & (t_{1m}^{f,k}, t_{1m}^{d,k}) \\ \vdots & \ddots & \vdots \\ (t_{n1}^{f,k}, t_{n1}^{d,k}) & \cdots & (t_{nm}^{f,k}, t_{nm}^{d,k}) \end{bmatrix}$$

where $t_{nm}^{f,k}$ and $t_{nm}^{d,k}$ respectively denote the fixation duration and the dwell duration of the n-th subject on the m-th product sample in the k-th evaluation dimension. Their ratio gives the eye-movement fixation degree:

$$a_{nm}^k = \frac{t_{nm}^{f,k}}{t_{nm}^{d,k}}$$
Because the eye-movement score is inversely proportional to the emotional value, the reciprocal of the eye-movement fixation degree is used. The mean over the n subjects for each product sample in the k-th evaluation dimension is taken as the eye-movement score, giving the overall evaluation trend of a single product sample in the k-th evaluation dimension as explained by the eye-movement data:

$$e_m^k = \frac{1}{n}\sum_{i=1}^{n} \frac{1}{a_{im}^k}$$
A matrix is established from the eye-movement scores of the m product samples in the k-th evaluation dimension, giving the eye-movement evaluation matrix $E_k$:

$$E_k = \left[ e_1^k, e_2^k, \cdots, e_m^k \right]^{\mathrm{T}}$$
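A corresponding sketch for the eye-movement score, again with hypothetical names and randomly generated durations:

```python
import numpy as np

# Hypothetical eye-movement measurements for one evaluation dimension:
# per-subject, per-sample fixation duration and dwell duration (seconds).
n_subjects, m_samples = 29, 10
rng = np.random.default_rng(1)
dwell = rng.uniform(2.0, 8.0, size=(n_subjects, m_samples))
fixation = dwell * rng.uniform(0.3, 0.9, size=(n_subjects, m_samples))

# Fixation degree a = fixation / dwell; since fixation degree is inversely
# related to the emotional value, the score averages its reciprocal.
fixation_degree = fixation / dwell
e_k = (1.0 / fixation_degree).mean(axis=0)  # eye-movement evaluation matrix E_k
```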
(2-3) Establishing the facial-expression evaluation matrix
According to the measured facial-expression data of the n subjects in the k-th evaluation dimension, a facial-expression data element-set matrix $C_k$ is established:

$$C_k = \begin{bmatrix} c_{11}^k & \cdots & c_{1m}^k \\ \vdots & \ddots & \vdots \\ c_{n1}^k & \cdots & c_{nm}^k \end{bmatrix}$$

where $c_{nm}^k$ denotes the facial-expression valence value of the n-th subject for the m-th product sample in the k-th evaluation dimension.
The mean facial-expression valence value of the n subjects for each product sample in the k-th evaluation dimension is calculated and taken as the facial-expression score, giving the overall evaluation trend of a single product sample in the k-th evaluation dimension as explained by the facial-expression data:

$$f_m^k = \frac{1}{n}\sum_{i=1}^{n} c_{im}^k$$
A matrix is established from the facial-expression scores of the m product samples in the k-th evaluation dimension, giving the facial-expression evaluation matrix $F_k$:

$$F_k = \left[ f_1^k, f_2^k, \cdots, f_m^k \right]^{\mathrm{T}}$$
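The facial-expression score follows the same averaging pattern; a sketch with hypothetical valence values:

```python
import numpy as np

# Hypothetical facial-expression valence values for one evaluation
# dimension, e.g. in [-1, 1] as output by an expression-analysis tool.
n_subjects, m_samples = 29, 10
rng = np.random.default_rng(2)
c_k = rng.uniform(-1.0, 1.0, size=(n_subjects, m_samples))

# Facial-expression score: mean valence over subjects for each sample,
# giving the facial-expression evaluation matrix F_k.
f_k = c_k.mean(axis=0)
```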
(3) Matrix data normalization processing
The data in each evaluation matrix are linearly transformed so that the evaluation result values are mapped onto [0, 1]:

$$x^* = \frac{x - x_{\min}}{x_{\max} - x_{\min}}$$

Substituting $S_k$, $E_k$ and $F_k$ into this formula yields $S_k^*$, $E_k^*$ and $F_k^*$.
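A sketch of this normalization, assuming the linear transform is the usual min-max mapping:

```python
import numpy as np

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    """Linearly map the values of one evaluation matrix onto [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

# Hypothetical score vector for one dimension (10 product samples).
s_k = np.array([-1.2, 0.4, 1.1, -0.3, 0.0, 0.9, -1.8, 0.6, 1.5, -0.7])
s_k_star = min_max_normalize(s_k)  # S_k*, values now in [0, 1]
```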
(4) Calculating the weight of each evaluation matrix
The multiple correlation coefficient of each evaluation matrix is calculated as:

$$R_{ik} = \sqrt{1 - \frac{\sum_{j=1}^{m}\left(x_j - \hat{x}_j\right)^2}{\sum_{j=1}^{m}\left(x_j - \bar{x}\right)^2}}$$

where $\bar{x}$ is the mean of the parameter and $\hat{x}_j$ is its estimated value, obtained by regressing the index on the remaining indices. Taking the reciprocal of $R_{ik}$ and normalizing gives the weight $w_{ik}$ of each single-modality index:

$$w_{ik} = \frac{1/R_{ik}}{\sum_{j} 1/R_{jk}}$$

Substituting the data of the matrices $S_k^*$, $E_k^*$ and $F_k^*$ into the above formulas yields $w_{sk}$, $w_{ek}$ and $w_{fk}$.
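A sketch of this weighting scheme, assuming the multiple correlation coefficient is obtained by regressing each normalized index on the other two (the function and variable names are illustrative):

```python
import numpy as np

def multiple_corr(y: np.ndarray, others: np.ndarray) -> float:
    """Multiple correlation coefficient R of one index with the others,
    via least-squares regression of y on them (with an intercept)."""
    X = np.column_stack([np.ones(len(y)), others])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    ss_res = float(((y - y_hat) ** 2).sum())
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return float(np.sqrt(max(0.0, 1.0 - ss_res / ss_tot)))

# Hypothetical normalized score vectors for one evaluation dimension.
rng = np.random.default_rng(3)
s_star, e_star, f_star = rng.random(10), rng.random(10), rng.random(10)

r = np.array([
    multiple_corr(s_star, np.column_stack([e_star, f_star])),
    multiple_corr(e_star, np.column_stack([s_star, f_star])),
    multiple_corr(f_star, np.column_stack([s_star, e_star])),
])

# Reciprocal of R, then normalize, giving the weights w_sk, w_ek, w_fk.
w = (1.0 / r) / (1.0 / r).sum()
```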
(5) Constructing a comprehensive evaluation matrix and outputting an evaluation value
Subjective evaluation matrix S after comprehensive normalization processing k * Eye movement evaluation matrix E k * Facial expression evaluation matrix F k * And the weight w corresponding to the matrix in turn sk 、w ek 、w fk Constructing a single evaluation dimension product comprehensive evaluation matrix Z k
Figure BDA0003755147050000057
The comprehensive evaluation matrix $Z$ of the product is:

$$Z = \left[ Z_1, Z_2, \cdots, Z_k \right]$$

$Z$ is an m × k matrix representing the evaluation value of each product sample in the k evaluation dimensions during the product evaluation process.
(6) Analysis of evaluation results
After the 5-level Likert scale scores are normalized, the middle value 0.5 represents a neutral evaluation result, values greater than 0.5 represent positive results, and values less than 0.5 represent negative results. Since the subjective evaluation, eye-movement evaluation and facial-expression evaluation matrices have all been normalized, the normalized Likert-scale result can be used as a threshold to represent the satisfaction indicated by the subjective, eye-movement and facial-expression evaluations. In practice, other thresholds may be selected according to the number of Likert scale levels, the kind of product, and so on.
Different products carry different user expectations for perceptual interaction, so the threshold chosen for judging perceptual interaction performance also differs. For example, product homogeneity is common among consumer products such as household goods, small home appliances and 3C products, so the user's perceptual interaction with such a product is expected to be a distinctly positive, pleasant emotion, and the threshold is accordingly biased toward positive emotion for the specific product. A complex product, by contrast, demands strict operating rules because of its complicated interactive operations, as well as scientific and reasonable handling of emergencies. Users of complex products need to stay rational as much as possible while avoiding fatigue and misoperation, so a neutral threshold is the better choice for perceptual interaction.
When the evaluation value is greater than 0.5, the user evaluation satisfaction of the product in that dimension is positive, and the closer the value is to 1, the higher the satisfaction;
when the evaluation value is equal to 0.5, the user evaluation satisfaction of the product in that evaluation dimension is neutral;
when the evaluation value is less than 0.5, the user evaluation satisfaction of the product in that evaluation dimension is negative, and the closer the value is to 0, the lower the satisfaction.
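A sketch of the threshold judgment (the function name is illustrative):

```python
def satisfaction_label(z_value: float, threshold: float = 0.5) -> str:
    """Classify one comprehensive evaluation value against the threshold."""
    if z_value > threshold:
        return "positive"
    if z_value < threshold:
        return "negative"
    return "neutral"

print(satisfaction_label(0.62))  # positive: closer to 1 means higher satisfaction
print(satisfaction_label(0.41))  # negative: closer to 0 means lower satisfaction
```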
The method of the present invention is verified below with specific experimental data.
Taking the interior form of a business jet cockpit as an example, a perceptual interaction performance evaluation fusing multi-source heterogeneous data is carried out.
(1) After screening and picture processing, 10 pictures of business jet cockpit interior forms were selected as the complex product sample library, i.e., m = 10. 29 male pilots were invited as subjects, i.e., n = 29. In this embodiment, 3 subjective evaluation dimensions of perceptual interaction are defined: the line-type features of the cockpit interior appearance, the functional effect corresponding to the cockpit form, and the emotional experience in the morphological perceptual interaction between pilot and cockpit interior, corresponding to k = 1, k = 2 and k = 3 respectively.
(2) The pilots' subjective evaluation, eye-movement and facial-expression experimental data in each perceptual interaction dimension were recorded, and a data sample library was established. The subjective evaluation data are 5-level Likert scale scores, the eye-movement data are the fixation duration and the dwell duration, and the facial-expression data are emotional valence values.
(3) Construct the evaluation matrix for each data modality
(3-1) Establishing the subjective evaluation matrix
A subjective evaluation data element-set matrix $Q_k$ was established from the measured subjective evaluation data of the 29 subjects in the k-th evaluation dimension; from $Q_k$, the mean Likert score of the 29 subjects for each product sample in the k-th evaluation dimension was calculated as the subjective evaluation score, from which the subjective evaluation matrix $S_k$ was established (matrix values shown as images in the original publication).
(3-2) Establishing the eye-movement evaluation matrix
An eye-movement data element-set matrix $T_k$ was established from the measured eye-movement data of the 29 subjects in the k-th evaluation dimension; from $T_k$, the mean of the reciprocal eye-movement fixation degrees of the 29 subjects for each product sample in the k-th evaluation dimension was obtained, from which the eye-movement evaluation matrix $E_k$ was established (matrix values shown as images in the original publication).
(3-3) Establishing the facial-expression evaluation matrix
A facial-expression data element-set matrix $C_k$ was established from the measured facial-expression data of the 29 subjects in the k-th evaluation dimension; from $C_k$, the mean facial-expression valence value of the 29 subjects for each product sample in the k-th evaluation dimension was calculated as the facial-expression score, from which the facial-expression evaluation matrix $F_k$ was established (matrix values shown as images in the original publication).
(4) The data in each evaluation matrix were normalized, linearly mapping the evaluation result values onto [0, 1] and yielding $S_k^*$, $E_k^*$ and $F_k^*$ (matrix values shown as images in the original publication).
(5) The weights $w_{sk}$, $w_{ek}$, $w_{fk}$ of the matrices were determined:

w_s1 = 36.04%, w_e1 = 36.63%, w_f1 = 27.33%
w_s2 = 41.11%, w_e2 = 28.32%, w_f2 = 30.57%
w_s3 = 27.25%, w_e3 = 30.92%, w_f3 = 41.83%
(6) The single-dimension comprehensive evaluation matrices $Z_1$, $Z_2$ and $Z_3$ of the business jet samples were constructed, giving the perceptual interaction comprehensive performance evaluation matrix of the sample library $Z = [Z_1, Z_2, Z_3]$ (matrix values shown as images in the original publication).
when the evaluation value is larger than 0.5, the user evaluation satisfaction of the business machine in the evaluation dimension is represented as a positive value, and the closer the numerical value is to 1, the higher the satisfaction is.
When the evaluation value is equal to 0.5, the user evaluation satisfaction degree of the business machine in the evaluation dimension is represented as a neutral value;
and when the evaluation value is less than 0.5, the user evaluation satisfaction degree of the business machine in the evaluation dimension is a negative value. And the closer the value is to 0, the lower the satisfaction.
For example, the user evaluation satisfaction of sample 1 in the dimensions k = 2 and k = 3 is greater than 0.5 while its satisfaction for k = 1 is lower, so subsequent design optimization can focus on the k = 1 dimension. The user satisfaction of sample 5 is below 0.5 in all three evaluation dimensions, so its design needs further optimization in all three. The user satisfaction of sample 8 is greater than 0.5 in the k = 2 dimension while its satisfaction for k = 1 and k = 3 is lower, with the k = 3 value far below 0.5; subsequent optimization can therefore address the k = 1 and k = 3 dimensions, with emphasis on the latter.

Claims (10)

1. A complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data, characterized by comprising the following steps:
(1) establishing a complex product form sample library, and acquiring subjective evaluation data, eye-movement experiment data and facial-expression experiment data of the subjects in k evaluation dimensions;
(2) calculating the mean of the subjective evaluation data, the mean of the eye-movement experiment data and the mean of the facial-expression experiment data of the subjects in each evaluation dimension, and establishing a subjective evaluation matrix, an eye-movement evaluation matrix and a facial-expression evaluation matrix;
(3) calculating the weights of the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix;
(4) establishing a comprehensive evaluation matrix from the subjective evaluation matrix, the eye-movement evaluation matrix, the facial-expression evaluation matrix and their weights, and judging whether the user evaluation satisfaction of the product in each evaluation dimension is positive according to whether the value of the comprehensive evaluation matrix is greater than a threshold.
2. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the subjective evaluation data in step (1) are Likert scale scores.
3. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the eye-movement data in step (1) are the fixation duration and the dwell duration.
4. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the facial-expression data in step (1) are emotional valence values.
5. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the eye-movement evaluation matrix in step (2) is built from the mean of the reciprocals of the subjects' eye-movement fixation degrees in each evaluation dimension, the eye-movement fixation degree being the ratio of fixation duration to dwell duration.
6. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that, before the weights are calculated in step (3), the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix are normalized.
7. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the weights in step (3) are calculated by taking the reciprocal of the multiple correlation coefficient of each of the subjective evaluation matrix, the eye-movement evaluation matrix and the facial-expression evaluation matrix, and then normalizing the reciprocals.
8. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that, in step (4), if the value of the comprehensive evaluation matrix is greater than the threshold, the user evaluation satisfaction of the product in the evaluation dimension is positive; if the value equals the threshold, the satisfaction is neutral; and if the value is less than the threshold, the satisfaction is negative.
9. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 1, characterized in that the complex product form sample library consists of pictures of complex product interior forms.
10. The complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data according to claim 2, characterized in that the threshold in step (4) is selected from the normalized results of the Likert scale scores.
CN202210852388.2A (filed 2022-07-20, priority 2022-07-20): Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data; published as CN115409535A; status: pending

Priority Applications (1)

Application number: CN202210852388.2A; priority date: 2022-07-20; filing date: 2022-07-20; title: Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data

Publications (1)

Publication number: CN115409535A; publication date: 2022-11-29

Family

ID: 84158413

Family Applications (1)

Application number: CN202210852388.2A; title: Complex product perceptual interaction performance evaluation method fusing multi-source heterogeneous data; priority date: 2022-07-20; filing date: 2022-07-20

Country Status (1)

CN: CN115409535A, pending

Cited By (1)

* Cited by examiner, † Cited by third party

CN115904911A *, priority 2022-12-24, published 2023-04-04, 北京津发科技股份有限公司: Web human factor intelligent online evaluation method, system and device based on cloud server


Patent Citations (5)

* Cited by examiner, † Cited by third party

US20210248656A1 *, priority 2019-10-30, published 2021-08-12, Lululemon Athletica Canada Inc.: Method and system for an interface for personalization or recommendation of products
CN115004308A *, priority 2019-10-30, published 2022-09-02, 加拿大露露乐檬运动用品有限公司: Method and system for providing an interface for activity recommendations
CN112288287A *, priority 2020-10-30, published 2021-01-29, 清华大学: Evaluation method and equipment of vehicle-mounted information system
CN113052475A *, priority 2021-04-02, published 2021-06-29, 江苏徐工工程机械研究院有限公司: Engineering machinery icon visual performance testing method and device and storage medium
CN113569901A *, priority 2021-06-07, published 2021-10-29, 飞友科技有限公司: Method and system for evaluating and analyzing satisfaction quality of aviation enterprise



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2022-11-29)