CN112346945B - Man-machine interaction data analysis method and device - Google Patents

Man-machine interaction data analysis method and device

Info

Publication number
CN112346945B
CN112346945B
Authority
CN
China
Prior art keywords
behavior
data
human
duration
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011143828.4A
Other languages
Chinese (zh)
Other versions
CN112346945A (en)
Inventor
赵起超
杨苒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kingfar International Inc
Original Assignee
Kingfar International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kingfar International Inc filed Critical Kingfar International Inc
Priority to CN202011143828.4A priority Critical patent/CN112346945B/en
Publication of CN112346945A publication Critical patent/CN112346945A/en
Application granted granted Critical
Publication of CN112346945B publication Critical patent/CN112346945B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3466Performance evaluation by tracing or monitoring
    • G06F11/3476Data logging

Abstract

The invention provides a human-computer interaction data analysis method and device. The method includes: acquiring human-computer interaction data; acquiring one or more data value range interval combinations of device data, where each data value range interval combination corresponds to one behavior or behavior characteristic, and setting a corresponding behavior code for each behavior or behavior characteristic; converting the human-computer interaction data into a primary behavior array containing timestamps and corresponding behavior codes according to the data value range interval combinations; merging arrays with consecutive timestamps and identical behavior codes in the primary behavior array to obtain a secondary behavior array; and counting one or more of the frequency, minimum duration, maximum duration, average duration and total duration with which each behavior code appears in the secondary behavior array, so as to analyze and evaluate the characteristics of the human-computer interaction behavior. The method avoids the influence of human subjective factors and produces accurate analysis results.

Description

Man-machine interaction data analysis method and device
Technical Field
The invention relates to the technical field of data behavior analysis, in particular to a human-computer interaction data analysis method and device.
Background
With the development of science and technology, working and living with electronic devices has become mainstream, and large numbers of human-computer interaction devices are used in production and daily life. In order to improve human-computer interaction flows and optimize the functions of electronic devices, human behavior during human-computer interaction needs to be analyzed. In the prior art, manual evaluation is strongly influenced by subjective factors, struggles to process large batches of test samples, and has poor accuracy and low efficiency. Analyzing a subject's interaction behavior through data generated or collected by electronic devices is the trend of technical development, but such data cannot directly express behavior characteristics, so a data analysis method is urgently needed to analyze human-computer interaction behavior accurately and efficiently.
Disclosure of Invention
The embodiment of the invention provides a human-computer interaction data analysis method and device, which are used for analyzing interaction behaviors based on data generated by human-computer interaction.
The technical scheme of the invention is as follows:
in one aspect, the present invention provides a human-computer interaction data analysis method, including:
acquiring human-computer interaction data, wherein the human-computer interaction data comprises: acquiring equipment data generated by one or more kinds of equipment and corresponding timestamps according to specified interval duration;
acquiring one or more data value range interval combinations of equipment data, wherein each data value range interval combination corresponds to one behavior or behavior characteristic, and setting corresponding behavior codes for each behavior or behavior characteristic respectively;
converting the human-computer interaction data into a primary behavior array according to the combination of the data value domain intervals, wherein the primary behavior array comprises: a timestamp and a corresponding behavior code;
combining the arrays with continuous time stamps and same behavior codes in the primary behavior array to obtain a secondary behavior array;
and counting one or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration of each behavior code appearing in the secondary behavior array so as to analyze and evaluate the characteristics of the human-computer interaction behavior.
In some embodiments, converting the human-computer interaction data into a primary behavior array according to a combination of the data value range intervals, includes:
adding sequence numbers to the human-computer interaction data according to the sequence of the timestamps;
traversing the human-computer interaction data to which the sequence numbers have been added; if device data in the human-computer interaction data belong to one of the data value range interval combinations, marking them with the behavior code corresponding to that data value range interval combination, or otherwise marking the device data as empty;
and recording each serial number and the corresponding behavior code to form the primary behavior array.
In some embodiments, after converting the human-computer interaction data into a primary behavior array according to a combination of the data value range intervals, the method further includes:
and respectively marking the various behavior codes with different colors for visual presentation.
In some embodiments, merging the arrays with consecutive timestamps and identical behavior codes in the primary behavior array to obtain the secondary behavior array includes:
and keeping the earliest or latest timestamp in a plurality of arrays with the same timestamp continuous and behavior coding.
In some embodiments, counting the frequency with which each behavior code appears in the secondary behavior array comprises:
respectively creating corresponding temporary variables for various behavior codes, wherein the temporary variables are 0 by default;
traversing the secondary behavior array, and accumulating one temporary variable corresponding to a behavior code when each behavior code appears to obtain the occurrence times corresponding to various behavior codes;
and dividing the occurrence times corresponding to the various behavior codes by the total duration of the man-machine interaction data to obtain the frequency corresponding to the various behavior codes.
In some embodiments, after merging the arrays with consecutive timestamps and identical behavior codes in the primary behavior array and retaining the latest timestamp of each such run to obtain the secondary behavior array, the method further includes:
traversing the secondary behavior array, and subtracting the sequence number corresponding to the previous behavior code from that of the current behavior code to obtain a sequence number difference;
multiplying the sequence number difference value by the specified interval duration to obtain the duration corresponding to the current behavior code;
and counting the minimum duration, the maximum duration, the average duration and the total duration of each behavior code in the secondary behavior array according to the duration corresponding to each behavior code in the secondary behavior array.
In some embodiments, the behavior code is one or more of a number, a letter, a symbol, and a pattern.
In some embodiments, after counting one or more of a frequency, a minimum duration, a maximum duration, an average duration, and a total duration corresponding to each behavior code in the secondary behavior array to analyze and evaluate characteristics of the human-computer interaction behavior, the method further includes:
acquiring early warning event information, and if the characteristics of the human-computer interaction behavior exist in the early warning event recorded in the early warning event information, generating warning information; and/or
And generating a visual histogram according to one or more of the corresponding frequency, the minimum duration, the maximum duration, the average duration and the total duration of the various behavior codes.
In another aspect, the present invention provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method when executing the program.
In another aspect, the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method.
The invention has the beneficial effects that:
the human-computer interaction data analysis method and the human-computer interaction data analysis device match human-computer interaction data acquired by equipment with combinations of preset data value range intervals, each combination of the data value range intervals corresponds to a behavior code of a behavior or behavior characteristic, so that the equipment data are accurately mapped to the behavior code of the corresponding behavior or behavior characteristic, and a primary behavior array capable of presenting the behavior or behavior characteristic is formed. And combining the continuous arrays with the same behavior codes in the primary behavior array to form a secondary behavior array, so that the same behavior is recorded uniquely and unrepeatedly. One or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration of the behavior codes in each secondary behavior array are counted to analyze the behavior characteristics of the testee, so that the influence of artificial subjective factors can be avoided, and an accurate analysis result is formed.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present invention are not limited to the specific details set forth above, and that these and other objects that can be achieved with the present invention will be more clearly understood from the detailed description that follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a flowchart illustrating a human-computer interaction data analysis method according to an embodiment of the invention;
FIG. 2 is a logic diagram of the human-computer interaction data analysis method according to an embodiment of the present invention;
FIG. 3 is a flow chart illustrating a process of generating a primary behavior array in the human-computer interaction data analysis method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a statistical frequency process in the human-computer interaction data analysis method according to an embodiment of the present invention;
fig. 5 is a schematic flow chart illustrating a process of generating a secondary behavior array in the human-computer interaction data analysis method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the scheme according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
With the popularization of electronic devices, most of the production and life needs to be completed by the electronic devices. On one hand, the characteristics and the quality of the interaction behavior in the human-computer interaction process can reflect the usability of the electronic equipment, and the improvement of the electronic equipment can be guided. On the other hand, the interactive behavior can reflect the behavior habit and other characteristics of the equipment user.
During human-computer interaction, the device can actively or passively collect and generate human-computer interaction data, which may be one-dimensional data generated from a single object or index, or multi-dimensional data generated from several objects or indices. These data cannot by themselves directly express the interactive behavior for analysis, so further transformation is required to relate the human-computer interaction data to the behavior or behavior characteristics that need to be analyzed. The prior art lacks a method for converting and analyzing human-computer interaction data in this way.
The application provides a human-computer interaction data analysis method, referring to fig. 1 and 2, comprising steps S101 to S105:
it should be noted that the description of steps S101 to 105 in the present application is not limited to the order of the steps, and it should be understood that the steps may be performed in parallel or in the order of the steps may be changed in a specific usage scenario.
Step S101: acquiring human-computer interaction data, wherein the human-computer interaction data comprises the following steps: and acquiring equipment data generated by one or more kinds of equipment according to the specified interval duration and corresponding time stamps.
Step S102: acquiring one or more data value range interval combinations of the equipment data, wherein each data value range interval combination corresponds to one behavior or behavior characteristic, and setting corresponding behavior codes for the behaviors or behavior characteristics.
Step S103: converting the human-computer interaction data into a first-level behavior array according to the combination of each data value domain interval, wherein the first-level behavior array comprises: time stamps and corresponding behavior encodings.
Step S104: and combining the arrays with continuous time stamps and same behavior codes in the first-level behavior array to obtain a second-level behavior array.
Step S105: and counting one or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration of each behavior code appearing in the secondary behavior array so as to analyze and evaluate the characteristics of the human-computer interaction behavior.
In step S101, the human-computer interaction data are data generated by a person during the use of an electronic device and may take various types and forms, such as physiological data generated by a vital sign monitoring device, eye movement data collected by an eye tracker, mouse data recorded by a PC, vehicle driving data recorded by an on-board computer, and positioning data recorded by a positioning module. Depending on the data form, the acquired data may be one-dimensional data containing only one object or index, or multi-dimensional data containing several objects or indices. Illustratively, heart rate variability data used to analyze emotions during human-computer interaction contain only one parameter and are therefore one-dimensional; eye movement data are marked with position information comprising a horizontal coordinate and a vertical coordinate and are therefore two-dimensional; in other embodiments, several objects or index parameters may be collected simultaneously to form multi-dimensional data according to actual needs. During data acquisition, the sampling times of the various devices are synchronized and the timestamps are kept consistent.
In step S102, behaviors or behavior characteristics may be defined for various scenarios. For example, in analyzing the interaction between automobile driving-assistance software and the driver, behaviors can be defined as emotional behaviors, gaze behaviors, driving operation behaviors, and the like. Behavior characteristics are finer subdivisions of a class of behaviors; for example, emotional behaviors can be divided into fear, calmness, excitement, and the like.
For a given behavior or behavior characteristic, the device data generated by each device fall into particular value intervals. Therefore, by setting a data value range interval for each item of device data, it can be determined which behavior or behavior characteristic the device data correspond to at the current moment. Illustratively, in analyzing the interaction between automobile driving-assistance software and the driver, when the heart rate is within 100-140 beats/min, the brake pedal pressure exceeds 20 N, and the vehicle's deceleration exceeds 9.8 m/s², it can be determined that the driver is performing emergency braking. A corresponding data value range interval combination is set for each behavior or behavior characteristic, and the device data are thereby mapped to the corresponding behaviors and behavior characteristics.
Further, a behavior code is set for each behavior and behavior characteristic; the code may be marked with one or more of a number, letter, symbol or pattern. Illustratively, A may represent a mouse operation, A1 a left click, A2 a left double-click, A3 a right click, A4 scrolling the wheel up, and A5 scrolling the wheel down.
In step S103, the collected device data are converted into a primary behavior array recorded by timestamp and behavior code according to the data value range interval combination corresponding to each behavior or behavior characteristic acquired in step S102. Specifically, according to the data value range interval combinations set in step S102, when every item of device data corresponding to a certain moment falls within the ranges defined by a certain data value range interval combination, it is determined that the corresponding behavior or behavior characteristic occurs at that moment, and the corresponding behavior code is marked. The data form in the primary behavior array is [time, behavior code], where time is the timestamp and the behavior code identifies the behavior. Illustratively, the primary behavior array M may be {[t1,B2],[t2,B3],[t3,B3],[t4,B3],[t5,B1]}, where t1~t5 are timestamps and B1~B3 are the behavior codes corresponding to three behaviors.
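For illustration only, the following minimal Python sketch (not part of the patent text; function and variable names are hypothetical) shows one way step S103 could be carried out, assuming half-open value range intervals [low, high) and samples of the form (timestamp, value1, value2, ...):

```python
# Illustrative sketch of step S103 (hypothetical names, half-open intervals).

def matches(values, interval_combination):
    """True if every data dimension falls inside its [low, high) interval."""
    return all(low <= v < high
               for v, (low, high) in zip(values, interval_combination))

def to_primary_array(samples, code_to_intervals):
    """Convert (timestamp, v1, v2, ...) samples into [timestamp, code] pairs."""
    primary = []
    for timestamp, *values in samples:
        code = None                                  # "empty" when nothing matches
        for behavior_code, intervals in code_to_intervals.items():
            if matches(values, intervals):
                code = behavior_code
                break
        primary.append([timestamp, code])
    return primary
```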
Since a subject's behavior may persist over time, device data collected continuously are converted into and recorded as the same behavior code. The same behavior that occurs continuously and is recorded repeatedly in the primary behavior array should therefore be merged. Specifically, in step S104, arrays with consecutive timestamps and identical behavior codes are merged and recorded as one entry; either only the earliest or only the latest timestamp may be recorded. Merging the data produced by the same behavior in the primary behavior array forms the secondary behavior array. Illustratively, in the primary behavior array M described above, the behavior codes corresponding to timestamps t2, t3 and t4 are identical and represent the same continuously occurring behavior; these three arrays are merged and the earliest or latest timestamp is retained, so that the retained timestamps can embody durations. When the array with the earliest timestamp is retained, the primary behavior array M is converted into the secondary behavior array N1 = {[t1,B2],[t2,B3],[t5,B1]}, and the time difference between t2 and t5 embodies the duration of B3. When the array with the latest timestamp is retained, the primary behavior array M is converted into the secondary behavior array N2 = {[t1,B2],[t4,B3],[t5,B1]}, and the time difference between t1 and t4 embodies the duration of B3.
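A corresponding sketch of step S104 (again illustrative, not the patent's implementation) merges runs of identical consecutive behavior codes and keeps either the earliest or the latest timestamp of each run, reproducing the arrays N1 and N2 above:

```python
# Illustrative sketch of step S104: merge consecutive identical behavior codes.

def to_secondary_array(primary, keep="earliest"):
    secondary = []
    for timestamp, code in primary:
        if secondary and secondary[-1][1] == code:
            if keep == "latest":
                secondary[-1][0] = timestamp     # replace with the newer timestamp
            # for keep == "earliest", the stored timestamp is left unchanged
        else:
            secondary.append([timestamp, code])
    return secondary

M = [["t1", "B2"], ["t2", "B3"], ["t3", "B3"], ["t4", "B3"], ["t5", "B1"]]
print(to_secondary_array(M, keep="earliest"))  # [['t1', 'B2'], ['t2', 'B3'], ['t5', 'B1']]
print(to_secondary_array(M, keep="latest"))    # [['t1', 'B2'], ['t4', 'B3'], ['t5', 'B1']]
```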
In step S105, the frequency, minimum duration, maximum duration, average duration and total duration with which each behavior code appears in the secondary behavior array are counted; these are in fact the frequency, minimum duration, maximum duration, average duration and total duration of the corresponding behavior or behavior characteristic. From the occurrence of the different behaviors, the behavior characteristics of the subject during interaction with the electronic device can be analyzed. For example, in driving behavior analysis, the higher the frequency of the subject's emergency braking and acceleration behaviors, the more this reflects the subject's preference for aggressive driving.
In some embodiments, in step S103, the man-machine interaction data is converted into a primary behavior array according to the combination of the data value range intervals, as shown in fig. 3, including steps S1031 to S1033:
step S1031: and adding sequence numbers to the human-computer interaction data according to the sequence of the time stamps.
Step S1032: traversing the human-computer interaction data to which the sequence numbers have been added; if device data in the human-computer interaction data belong to one of the data value range interval combinations, marking them with the behavior code corresponding to that data value range interval combination, and otherwise marking the device data as empty.
Step S1033: and recording each serial number and the corresponding behavior code to form a primary behavior array.
Timestamps record exact sampling time points, but their data format is complex; computing durations directly from timestamp data is cumbersome and inefficient. To process the time information more efficiently, in step S1031 of this embodiment the timestamps are converted into sequence numbers arranged in chronological order, and the duration corresponding to a behavior code can then be obtained as the product of a sequence number difference and the specified interval duration. Further, in step S1032, it is determined whether the device data corresponding to each timestamp or sequence number fall within one of the set data value range interval combinations, so as to determine whether a corresponding behavior or behavior characteristic occurs and to mark the corresponding behavior code. If the device data corresponding to a timestamp or sequence number do not correspond to any behavior or behavior characteristic, the device data are marked as empty (NULL). In step S1033, only the sequence number and the corresponding behavior code are recorded, so that the behaviors or behavior characteristics occurring during the interaction are recorded more clearly and definitely.
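A minimal sketch of steps S1031 to S1033 (hypothetical names; the classify argument is a per-sample classification function, for example one built from the interval-matching helper sketched earlier) replaces timestamps with 1-based sequence numbers in chronological order and marks unmatched samples as empty:

```python
# Illustrative sketch of steps S1031-S1033: sequence numbers instead of timestamps.

def to_indexed_primary_array(samples, classify):
    """`classify(values)` returns a behavior code, or None for "empty"."""
    ordered = sorted(samples, key=lambda sample: sample[0])   # order by timestamp
    return [[seq, classify(values)]
            for seq, (_timestamp, *values) in enumerate(ordered, start=1)]
```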
In some embodiments, after step S103, after converting the human-computer interaction data into a primary behavior array according to the combination of the data value range intervals, the method further includes:
and respectively marking the various behavior codes with different colors for visual presentation.
In this embodiment, the interactive behaviors within the human-computer interaction period are presented visually, and the various behavior codes are marked with different colors, making the behavior codes easier to evaluate and analyze.
In some embodiments, in step S105, the frequency of occurrence of each behavior code in the secondary behavior array is counted, as shown in fig. 4, including steps S201 to S203:
step S201: and respectively creating corresponding temporary variables for various behavior codes, wherein the temporary variables are 0 by default.
Step S202: and traversing the secondary behavior array, and accumulating one temporary variable corresponding to the behavior code when each behavior code appears to obtain the occurrence times corresponding to various behavior codes.
Step S203: and dividing the occurrence times corresponding to the various behavior codes by the total duration of the man-machine interaction data to obtain the frequency corresponding to the various behavior codes.
In the embodiment, temporary variables are respectively established for various behavior codes, so that various behavior codes in the secondary behavior array are quickly and accurately counted. In other embodiments, the count may be performed in other manners. Furthermore, the frequency corresponding to each behavior code is the number of times of occurrence of the behavior code in unit time, and can be recorded as times/min or times/h.
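As a sketch of steps S201 to S203 (illustrative only, not the patent's implementation), the per-code counters below play the role of the temporary variables; the total duration of the interaction is assumed to be given in seconds:

```python
# Illustrative sketch of steps S201-S203: occurrence counts and frequencies.

def behavior_frequencies(secondary, total_duration_s):
    counts = {}                                   # one "temporary variable" per code
    for _seq, code in secondary:
        if code is None:
            continue                              # skip entries marked as empty
        counts[code] = counts.get(code, 0) + 1
    return {code: n / total_duration_s for code, n in counts.items()}
```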
In some embodiments, after step S104, that is, after the arrays with consecutive timestamps and identical behavior codes in the primary behavior array are merged and the latest timestamp of each such run is retained to obtain the secondary behavior array, as shown in fig. 5, the method further includes steps S301 to S303:
step S301: and traversing the secondary behavior array, and subtracting the serial number value corresponding to the current behavior code and the previous behavior code to obtain a serial number difference value.
Step S302: and multiplying the sequence number difference by the specified interval time length to obtain the duration corresponding to the current behavior code.
Step S303: and counting the minimum duration, the maximum duration, the average duration and the total duration of each behavior code in the secondary behavior array according to the duration corresponding to each behavior code in the secondary behavior array.
In this embodiment, the duration corresponding to each behavior code, that is, the duration of the corresponding behavior or behavior feature, is quickly and easily calculated by combining the sequence number difference and the specified interval duration.
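The duration statistics of steps S301 to S303 might be computed as in the sketch below (illustrative only; it assumes the secondary array retains the latest sequence number of each run and that start_seq marks the sequence number just before the first sample):

```python
# Illustrative sketch of steps S301-S303: durations from sequence-number differences.

def behavior_durations(secondary, interval_s, start_seq=0):
    per_code = {}                                  # behavior code -> list of durations
    previous = start_seq
    for seq, code in secondary:
        per_code.setdefault(code, []).append((seq - previous) * interval_s)
        previous = seq
    return {code: {"min": min(d), "max": max(d),
                   "mean": sum(d) / len(d), "total": sum(d)}
            for code, d in per_code.items()}
```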
In some embodiments, after step S105, that is, after counting one or more of the frequency, the minimum duration, the maximum duration, the average duration, and the total duration corresponding to each behavior code in the secondary behavior array, to analyze and evaluate the characteristics of the human-computer interaction behavior, the method further includes:
acquiring early warning event information, and if the characteristics of the human-computer interaction behavior exist in the early warning event recorded in the early warning event information, generating warning information; and/or
And generating a visual histogram according to one or more of the corresponding frequency, the minimum duration, the maximum duration, the average duration and the total duration of the various behavior codes.
In this embodiment, the early warning event information may directly define thresholds on one or more of the frequencies, minimum durations, maximum durations, average durations and total durations of the various behavior codes. For example, in driving behavior analysis, when the average duration corresponding to the eye-closing behavior exceeds 0.5 seconds, a fatigue-state early warning event can be triggered.
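One possible shape for such early warning event information is a per-code threshold table checked against the duration statistics sketched above; the rule format and the behavior code name below are purely illustrative assumptions:

```python
# Illustrative sketch: raise warnings when a statistic exceeds a configured threshold.

def check_warnings(stats, warning_rules):
    """warning_rules example (hypothetical format): {"eye_closed": {"mean": 0.5}}."""
    alerts = []
    for code, thresholds in warning_rules.items():
        for statistic, limit in thresholds.items():
            if code in stats and stats[code][statistic] > limit:
                alerts.append(f"warning: {statistic} duration of {code} exceeds {limit} s")
    return alerts
```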
Further, a visual histogram may be generated from the frequencies, minimum durations, maximum durations, average durations and total durations of the various behavior codes, for direct use in visual comparative analysis.
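For the visual histogram, a plotting library such as matplotlib could be used; the sketch below is illustrative and not prescribed by the patent:

```python
# Illustrative sketch: bar chart of one statistic per behavior code.
import matplotlib.pyplot as plt

def plot_behavior_statistic(value_by_code, title):
    codes = list(value_by_code)
    plt.bar(codes, [value_by_code[c] for c in codes])
    plt.title(title)
    plt.xlabel("behavior code")
    plt.ylabel("value")
    plt.show()
```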
In some embodiments, the flow of the human-computer interaction data analysis method includes:
1) Create a project, and determine the analysis object and the device data types.
2) Collect device data, which may be physiological data, eye movement data, mouse data, driving data, indoor positioning data and the like, and store the collected device data.
3) Import the device data into the visual analysis module, add a behavior coding analysis task, select the device data segment to be analyzed, select the device data type to be analyzed, and select a preset behavior segmentation (data value range interval combination) or create a new one.
Creating a new single-dimension behavior segmentation: define the overall name of the behavior group, define the behavior name of each segment in the group and set a behavior code; the minimum and maximum values of the segments must not overlap (for example, the interval of segment 1 is [0,1) and the interval of segment 2 is [1,+∞)).
Creating a new two-dimensional behavior segmentation: define the overall name of the behavior group, define the behavior name of each segment in the group, set behavior codes, and define the minimum and maximum values of the first-dimension and second-dimension value segments; the segments must not overlap in any dimension (for example, the interval of segment 1 in dimension 1 is [0,1) and the interval of segment 2 in dimension 1 is [1,+∞)).
4) Convert the device data into data composed of behavior codes according to the segmented value ranges.
5) Perform behavior code analysis, and count the total number of occurrences, average duration, frequency, minimum duration, maximum duration and total duration of each behavior code in the current data segment. Each statistical result is presented in a numerical table and a visual histogram.
Specifically, one-dimensional data means only one item of device data per timestamp, such as galvanic skin response data; its data structure is [timestamp, data value]. Two-dimensional data means two items of device data per timestamp, such as eye movement data; its data structure is [time, (X value, Y value)], where the X value and Y value are proportional values relative to the width and height of the display, each ranging from 0 to 1. A track position point in indoor positioning consists of the X-axis and Y-axis coordinates in the current map coordinate system, and a track in vehicle data is a longitude and latitude value.
Behavior coding means segmenting the value range of the device data according to behaviors or behavior characteristics, i.e., setting a data value range interval combination of each item of device data for a given behavior or behavior characteristic. For example, after a subject completes an experimental recording, a behavior group (an emotional behavior group) can be added to the behavior module and can contain behavior characteristics such as surprise, depression and excitement. After the behavior group is added, the collected device data can be divided, e.g., time 1 to time 2 is an excited behavior, time 2 to time 3 is a surprised behavior, time 5 to time 8 is a depressed behavior, and time 3 to time 5 is no designated behavior.
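One way such behavior groups and their value range segments might be represented is sketched below; the group names, dimension names and interval values are illustrative assumptions, not the patent's data model:

```python
# Illustrative representation of single- and two-dimensional behavior segmentations.
# Each segment maps a behavior code to one half-open [low, high) interval per dimension.

emotional_behavior_group = {
    "name": "emotional behaviors",
    "dimensions": ["heart_rate"],                 # single-dimension segmentation
    "segments": {
        "calm":    [(60.0, 90.0)],
        "excited": [(90.0, 140.0)],               # intervals do not overlap
    },
}

gaze_behavior_group = {
    "name": "gaze position behaviors",
    "dimensions": ["x_ratio", "y_ratio"],         # two-dimensional segmentation
    "segments": {
        "upper_left":  [(0.0, 0.5), (0.0, 0.5)],
        "lower_right": [(0.5, 1.0), (0.5, 1.0)],  # no overlap in either dimension
    },
}
```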
In some embodiments, when device data are compared against a data value range interval, the comparison with the smaller endpoint is greater-than-or-equal-to and the comparison with the larger endpoint is less-than; that is, the intervals are half-open [min, max).
A behavior analysis case in the process of interacting with the simulated driving equipment is given below, and for convenience of explanation, part of the data content of the equipment is simplified.
During simulated driving, the subject's heart rate (beats/min) is detected by a heart rate detection device, the brake pedal pressure (N) is detected by a brake pedal pressure detection device, and the accelerator pedal pressure (N) is detected by an accelerator pedal pressure detection device. The three types of device data are collected continuously at a specified interval duration of 1 s to form three-dimensional data (time, m, n, h), where time is the timestamp, m is the heart rate, n is the brake pedal pressure and h is the accelerator pedal pressure. The human-computer interaction data collected for seconds 1 to 20 of the driving process are Q = {(1,80,0,10), (2,85,0,11), (3,83,0,12), (4,85,0,20), (5,80,0,21), (6,80,0,10), (7,84,0,12), (8,95,30,0), (9,100,34,0), (10,120,30,0), (11,83,0,23), (12,81,0,22), (13,83,0,12), (14,84,0,12), (15,96,31,0), (16,101,34,0), (17,120,30,0), (18,80,0,10), (19,85,0,11), (20,83,0,12)}.
Further, a uniform driving behavior K1 (green) is assigned the data value range interval combination: heart rate segment [60,90), brake pedal pressure segment [0,2), accelerator pedal pressure segment [9,15).
An accelerated driving behavior K2 (yellow) is assigned the combination: heart rate segment [60,90), brake pedal pressure segment [0,2), accelerator pedal pressure segment [15,25).
A braking behavior K3 (red) is assigned the combination: heart rate segment [95,140), brake pedal pressure segment [30,40), accelerator pedal pressure segment [0,3).
According to these data value range interval combinations, the human-computer interaction data Q can be converted into the primary behavior array P = {(1,K1),(2,K1),(3,K1),(4,K2),(5,K2),(6,K1),(7,K1),(8,K3),(9,K3),(10,K3),(11,K2),(12,K2),(13,K1),(14,K1),(15,K3),(16,K3),(17,K3),(18,K1),(19,K1),(20,K1)}.
Merging the arrays with consecutive timestamps and identical behavior codes in P gives the secondary behavior array W = {(1,K1),(4,K2),(6,K1),(8,K3),(11,K2),(13,K1),(15,K3),(18,K1)}. Since the end timestamps are merged away, an empty array (18, NULL) may be appended at the end of the data to mark the time at which the interaction ended.
Further, statistics show that the frequency of K1 is 0.2 times/s, the frequency of K2 is 0.1 times/s, and the frequency of K3 is 0.1 times/s. The analysis shows that the subject's acceleration and braking frequencies are relatively high and comparable, and the subject's driving habit tends toward aggressive driving.
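The worked example can be reproduced with a short script such as the following (illustrative Python, not part of the patent; the two samples printed without a fourth value in the original publication are assumed here to end with an accelerator pedal pressure of 0, consistent with their K3 classification):

```python
# Illustrative end-to-end sketch of the simulated-driving example (1 s interval;
# dimensions: heart rate, brake pedal pressure, accelerator pedal pressure).

INTERVALS = {                                   # half-open [low, high) per dimension
    "K1": [(60, 90), (0, 2), (9, 15)],          # uniform driving
    "K2": [(60, 90), (0, 2), (15, 25)],         # accelerated driving
    "K3": [(95, 140), (30, 40), (0, 3)],        # braking
}

Q = [(1,80,0,10), (2,85,0,11), (3,83,0,12), (4,85,0,20), (5,80,0,21),
     (6,80,0,10), (7,84,0,12), (8,95,30,0), (9,100,34,0), (10,120,30,0),
     (11,83,0,23), (12,81,0,22), (13,83,0,12), (14,84,0,12), (15,96,31,0),
     (16,101,34,0), (17,120,30,0), (18,80,0,10), (19,85,0,11), (20,83,0,12)]

def classify(values):
    for code, intervals in INTERVALS.items():
        if all(low <= v < high for v, (low, high) in zip(values, intervals)):
            return code
    return None

P = [(t, classify(vals)) for t, *vals in Q]                        # primary array
W = [e for i, e in enumerate(P) if i == 0 or P[i - 1][1] != e[1]]  # keep earliest

total_s = len(Q) * 1.0                                             # 20 s of data
frequency = {}
for _, code in W:
    frequency[code] = frequency.get(code, 0) + 1
frequency = {code: n / total_s for code, n in frequency.items()}

print(W)          # [(1, 'K1'), (4, 'K2'), (6, 'K1'), (8, 'K3'), (11, 'K2'), (13, 'K1'), (15, 'K3'), (18, 'K1')]
print(frequency)  # {'K1': 0.2, 'K2': 0.1, 'K3': 0.1}
```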
In another aspect, the present invention provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method when executing the program.
In another aspect, the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method.
In summary, the human-computer interaction data analysis method and device of the present invention match the human-computer interaction data collected by the device with the preset combination of data value range intervals, where each combination of data value range intervals corresponds to a behavior code of a behavior or behavior feature, so as to accurately map the device data to the behavior code of the corresponding behavior or behavior feature, thereby forming a primary behavior array capable of presenting the behavior or behavior feature. And combining the continuous arrays with the same behavior codes in the primary behavior array to form a secondary behavior array, so that the same behavior is recorded uniquely and unrepeatedly. One or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration of the behavior codes in each secondary behavior array are counted to analyze the behavior characteristics of the testee, so that the influence of artificial subjective factors can be avoided, and an accurate analysis result is formed.
Furthermore, converting timestamp processing into sequence-number processing makes it easier to calculate the duration corresponding to a specific behavior code, saving computation and improving data-processing efficiency.
Furthermore, each behavior code is configured with a different color for visual presentation, which is more conducive to visual evaluation by analysts.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems, and methods described in connection with the embodiments disclosed herein may be implemented as hardware, software, or combinations of both. Whether this is done in hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the invention are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link. A "machine-readable medium" may include any medium that can store or transfer information. Examples of a machine-readable medium include electronic circuits, semiconductor memory devices, ROM, flash memory, Erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, Radio Frequency (RF) links, and so forth. The code segments may be downloaded via computer networks such as the internet, intranet, etc.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously. Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments in the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (8)

1. A human-computer interaction data analysis method is characterized by comprising the following steps:
acquiring human-computer interaction data, wherein the human-computer interaction data comprises: acquiring equipment data generated by one or more kinds of equipment and corresponding timestamps according to specified interval duration;
acquiring one or more data value range interval combinations of equipment data, wherein each data value range interval combination corresponds to one behavior or behavior characteristic, and setting corresponding behavior codes for each behavior or behavior characteristic respectively;
converting the human-computer interaction data into a primary behavior array according to the combination of the data value domain intervals, wherein the primary behavior array comprises: a timestamp and a corresponding behavior code;
combining the arrays with continuous timestamps and identical behavior codes in the first-level behavior array, and reserving the earliest or latest timestamp in a plurality of arrays with continuous timestamps and identical behavior codes to obtain a second-level behavior array;
counting one or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration of each behavior code appearing in the secondary behavior array so as to analyze and evaluate the characteristics of the human-computer interaction behaviors;
counting the frequency of occurrence of each behavior code in the secondary behavior array, wherein the counting comprises the following steps: respectively creating corresponding temporary variables for various behavior codes, wherein the temporary variables are 0 by default; traversing the secondary behavior array, and accumulating one temporary variable corresponding to a behavior code when each behavior code appears to obtain the occurrence times corresponding to various behavior codes; and dividing the occurrence times corresponding to the various behavior codes by the total duration of the man-machine interaction data to obtain the frequency corresponding to the various behavior codes.
2. The human-computer interaction data analysis method according to claim 1, wherein converting the human-computer interaction data into a primary behavior array according to a combination of data value range intervals comprises:
adding sequence numbers to the human-computer interaction data according to the sequence of the timestamps;
traversing the human-computer interaction data to which the sequence numbers have been added; if device data in the human-computer interaction data belong to one of the data value range interval combinations, marking them with the behavior code corresponding to that data value range interval combination, and otherwise marking the device data as null;
and recording each serial number and the corresponding behavior code to form the primary behavior array.
3. The human-computer interaction data analysis method according to claim 1, wherein after converting the human-computer interaction data into a primary behavior array according to the combination of the data value range intervals, the method further comprises:
and respectively marking the various behavior codes with different colors for visual presentation.
4. The human-computer interaction data analysis method according to claim 1, wherein the steps of merging arrays with continuous timestamps and identical behavior codes in the primary behavior array, retaining the latest timestamps in a plurality of arrays with continuous timestamps and identical behavior codes to obtain a secondary behavior array further comprise:
traversing the secondary behavior array, and subtracting the numerical value of the sequence number corresponding to the current behavior code and the previous behavior code to obtain a sequence number difference value;
multiplying the sequence number difference value by the specified interval duration to obtain the duration corresponding to the current behavior code;
and counting the minimum duration, the maximum duration, the average duration and the total duration of each behavior code in the secondary behavior array according to the duration corresponding to each behavior code in the secondary behavior array.
5. The human-computer interaction data analysis method of claim 1, wherein the behavior code is one or more of a number, a letter, a symbol, and a pattern.
6. The method for analyzing human-computer interaction data according to claim 1, wherein after counting one or more of the frequency, the minimum duration, the maximum duration, the average duration and the total duration corresponding to each behavior code in the secondary behavior array to analyze and evaluate the characteristics of the human-computer interaction behavior, the method further comprises:
acquiring early warning event information, and if the characteristics of the human-computer interaction behavior exist in the early warning event recorded in the early warning event information, generating warning information; and/or
And generating a visual histogram according to one or more of the corresponding frequency, the minimum duration, the maximum duration, the average duration and the total duration of the various behavior codes.
7. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 6 are implemented when the processor executes the program.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202011143828.4A 2020-10-23 2020-10-23 Man-machine interaction data analysis method and device Active CN112346945B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011143828.4A CN112346945B (en) 2020-10-23 2020-10-23 Man-machine interaction data analysis method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011143828.4A CN112346945B (en) 2020-10-23 2020-10-23 Man-machine interaction data analysis method and device

Publications (2)

Publication Number Publication Date
CN112346945A CN112346945A (en) 2021-02-09
CN112346945B true CN112346945B (en) 2022-04-12

Family

ID=74359821

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011143828.4A Active CN112346945B (en) 2020-10-23 2020-10-23 Man-machine interaction data analysis method and device

Country Status (1)

Country Link
CN (1) CN112346945B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489327B (en) * 2021-12-30 2024-03-19 北京津发科技股份有限公司 Sequence analysis method and system for reaction behavior based on man-machine interaction
CN115209079B (en) * 2022-02-23 2023-05-02 北京拙河科技有限公司 Method and equipment suitable for long-time data storage of high-speed camera

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105579929B (en) * 2013-10-29 2019-11-05 英特尔公司 Human-computer interaction based on gesture
CN108572720A (en) * 2017-04-10 2018-09-25 佘均连 Man-machine interactive system, control device and man-machine interaction method
CN110069135A (en) * 2019-04-28 2019-07-30 联想(北京)有限公司 The data processing method of human-computer interaction device a kind of and human-computer interaction device
CN111265225B (en) * 2020-01-20 2022-04-12 北京津发科技股份有限公司 Method and device for selecting usability of mobile terminal

Also Published As

Publication number Publication date
CN112346945A (en) 2021-02-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant