CN111265225A - Method and device for selecting usability of mobile terminal


Info

Publication number
CN111265225A
CN111265225A
Authority
CN
China
Prior art keywords
style
terminal
information content
type terminal
styles
Prior art date
Legal status
Granted
Application number
CN202010067582.0A
Other languages
Chinese (zh)
Other versions
CN111265225B (en)
Inventor
赵起超
杨苒
朱小青
Current Assignee
Kingfar International Inc
Original Assignee
Kingfar International Inc
Priority date
Filing date
Publication date
Application filed by Kingfar International Inc
Priority to CN202010067582.0A
Publication of CN111265225A
Application granted
Publication of CN111265225B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02405: Determining heart rate variability
    • A61B5/02416: Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163: Evaluating the psychological state by tracking eye movement, gaze or pupil change
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/02: Constructional features of telephone sets
    • H04M1/0202: Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026: Details of the structure or mounting of specific components
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403: User interfaces with means for local support of applications that increase the functionality

Abstract

The invention provides a method and a device for selecting the usability of a mobile terminal, wherein the method comprises the following steps: keeping the information content style of the interactive area fixed and adjusting the non-interactive area style to form a first type terminal style set; keeping the non-interactive area style fixed and adjusting the information content style of the interactive area to form a second type terminal style set; acquiring eye movement data and heart rate variability data of a subject over a specified interaction duration while the subject interacts with the styles in the first type and second type terminal style sets, and calculating the number and/or occurrence frequency of eye jump points and an emotion change index; selecting the terminal non-interactive area style according to the number and/or occurrence frequency of eye jump points corresponding to the first type terminal style set; and selecting the information content style of the terminal interaction area according to the emotion change indexes corresponding to the second type terminal style set. The method avoids the influence of subjective factors in manual evaluation and obtains accurate evaluation data for selecting the style with the highest usability.

Description

Method and device for selecting usability of mobile terminal
Technical Field
The invention belongs to the technical field of mobile terminals, and particularly relates to a method and a device for selecting the usability of a mobile terminal.
Background
International standard ISO 9241-11 defines "usability" as the degree of effectiveness, efficiency and satisfaction with which a particular user uses a product in a particular environment to achieve a particular goal. For a mobile terminal, including a mobile phone, a notebook computer, a vehicle-mounted computer and the like, the interface of the mobile terminal includes a non-interactive area and an interactive area, and different non-interactive area styles and different interactive area information content styles affect the user's state of use and degree of interaction with the mobile terminal, thereby affecting the usability of the mobile terminal.
In the prior art, the non-interactive area style and the interactive area information content style of a mobile terminal are generally selected by manual evaluation, which is strongly influenced by subjective factors and cannot accurately identify the style with the highest usability.
Disclosure of Invention
The present invention aims to provide a method and a device for selecting the usability of a mobile terminal, which detect and evaluate the influence of different non-interactive area styles and different interactive area information content styles of the mobile terminal on the user's state of use and degree of interaction, so as to select the style with the highest usability.
The technical scheme for solving the problems is as follows:
in one aspect, the present invention provides a method for selecting usability of a mobile terminal, including:
keeping the information content style of the interactive area of the mobile terminal unchanged, and adjusting the non-interactive area style to form a first type terminal style set; and
keeping the style of the non-interactive area of the mobile terminal unchanged, adjusting the style of the information content of the interactive area to form a second type terminal style set;
acquiring eye movement data of a testee within a specified interaction duration when interacting with the styles in the first type terminal style set and the second type terminal style set, wherein the eye movement data at least comprises: sampling time points, and visual directions, eye movement point coordinates and blinking times of a tested person corresponding to the sampling time points; acquiring pulse time domain signals of the testee recorded by a pulse sensor as heart rate variability data;
calculating the number and/or the occurrence frequency of corresponding eye jumping points according to each eye movement data of the testee;
calculating a corresponding emotion change index according to each heart rate variability data of the testee;
acquiring the number and/or the occurrence frequency of eye jump points of a testee corresponding to each style in the first type terminal style set, and taking a non-interactive area style of which the number of the eye jump points is smaller than a first set threshold value and/or the occurrence frequency is smaller than a set frequency value as a terminal non-interactive area style;
and acquiring emotion change indexes of testees corresponding to the styles in the second type of terminal style set, and taking the interactive area information content style with the emotion change index higher than a second set threshold value as a terminal interactive area information content style.
In some embodiments, calculating the number and/or occurrence frequency of corresponding eye jumping points according to each eye movement data of the human subject includes:
carrying out linear filling on data gaps with the duration less than a first set duration in the eye movement data of the testee corresponding to each style in the first type terminal style set;
taking the second set duration as the window length, sequentially taking each sampling time point as the middle point of the window, obtaining the visual direction included angle between the first and the last sampling time point within the window length and the corresponding time difference, and dividing the included angle by the time difference to obtain the eye movement angular velocity corresponding to each sampling time point;
and marking the sampling time points of which the eye movement angular velocity is higher than a third set threshold as eye jump points, marking the sampling time points of which the eye movement angular velocity is lower than the third set threshold as fixation points, and calculating the number and/or occurrence frequency of the eye jump points.
In some embodiments, calculating a corresponding index of change in emotion from each heart rate variability data of the subject comprises:
marking peak points in the heart rate variability data of the testee corresponding to the styles in the second type terminal style set;
calculating heartbeat interval time corresponding to each peak point according to the time stamp of each peak point;
marking peak points of which the heart beat interval time growth rate exceeds a first set percentage as singular points;
and replacing the heartbeat interval time of the corresponding singular point by the mean value or the median value of the heartbeat interval time corresponding to the peak values of the first set number on the two sides of each singular point, and calculating the total standard deviation of the adjusted heartbeat interval time as the emotion change index of the testee corresponding to each type in the second type terminal type set.
In some embodiments, after calculating the number and/or the occurrence frequency of the corresponding eye jump points according to each eye movement data of the human subject, the method further includes:
dividing each interactive area information content style in the second type terminal style set into a plurality of blocks according to corner styles;
generating a visual hotspot graph and/or a visual track graph corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set;
and according to the corner styles marked in the visual hotspot graph and/or the visual track graph, taking the corner style whose total fixation duration is lower than the third set duration as the terminal corner style.
In some embodiments, after calculating the number and/or the occurrence frequency of the corresponding eye jump points according to each eye movement data of the human subject, the method further includes:
according to eye movement data corresponding to the styles in the second type terminal style set, the first sight time, the fixation frequency proportion and the fixation time proportion corresponding to the information content styles of the interaction areas are counted;
and taking the interactive area information content style of which the first-view time is longer than a fourth set time length, the watching frequency proportion is higher than a second set percentage, the watching time proportion is higher than a third set percentage and the emotion change index is higher than a fourth set threshold as the terminal interactive area information content style.
In some embodiments, after obtaining the eye movement data of the human subject within the specified interaction duration when interacting with the styles in the first terminal style set and the second terminal style set, the method further includes:
acquiring user interaction data of a subject, the user interaction data comprising at least: click times and stagnation times;
and in the second type terminal style set, taking the interactive area information content style whose click times are higher than a second set number, whose stagnation times are higher than a third set number and whose emotion change index is higher than a fifth set threshold as the terminal interactive area information content style.
In another aspect, the present invention further provides a device for selecting the usability of a mobile terminal, including:
the interactive display unit is used as an interactive platform of a testee and is used for displaying a first type terminal style in the first type terminal style set and a second type terminal style in the second type terminal style set; the first type terminal style set comprises a plurality of different first type terminal styles, each first type terminal style comprises an interaction area information content style and a non-interaction area style, the interaction area information content styles of different first type terminal styles are the same, and the non-interaction area styles are different from each other; the second type terminal style set comprises a plurality of different second type terminal styles, each second type terminal style comprises an interactive area information content style and a non-interactive area style, the non-interactive area styles of the different second type terminal styles are the same, and the interactive area information content styles are different from each other;
an eye movement data acquisition unit for acquiring eye movement data of a subject, comprising at least: sampling time points and visual directions, eye movement point coordinates and blinking times corresponding to the sampling time points;
the heart rate variability data acquisition unit is used for recording pulse time domain signals of the testee as heart rate variability data;
the processor unit is used for calculating the number and/or the occurrence frequency of corresponding eye jump points according to the eye movement data of the testee corresponding to the style in the first type terminal style set, and taking the non-interactive area style of which the number and/or the occurrence frequency of the eye jump points are smaller than a first set threshold value and/or a set frequency value as a terminal non-interactive area style; and calculating a corresponding emotion change index according to the heart rate variability data of the testee corresponding to each style in the second type of terminal style set, and taking the interactive area information content style with the emotion change index higher than a second set threshold value as a terminal interactive area information content style.
In some embodiments, the processor unit is further configured to:
dividing each interactive area information content style in the second type terminal style set into a plurality of blocks according to corner styles;
generating a visual hotspot graph and/or a visual track graph corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set;
and according to the corner styles marked in the visual hotspot graph and/or the visual track graph, taking the corner style whose total fixation duration is lower than the third set duration as the terminal corner style.
In some embodiments, the interactive display unit is further configured to: acquiring user interaction data of a subject, the user interaction data comprising at least: click times and stagnation times;
the processor unit is further configured to: in the second type terminal style set, take the interactive area information content style whose click times are higher than a second set number, whose stagnation times are higher than a third set number and whose emotion change index is higher than a fifth set threshold as the terminal interactive area information content style.
In another aspect, the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method.
According to the method for selecting the usability of a mobile terminal, the non-interactive area style and the interactive area information content style are adjusted, and the eye movement data and heart rate variability data of the user under the different styles are collected. The number and/or occurrence frequency of eye jump points is calculated from the eye movement data to evaluate and select the non-interactive area style, and the emotion change index is calculated from the heart rate variability data to evaluate the interactive area information content style. This avoids the influence of subjective factors in manual evaluation and yields accurate evaluation data for selecting the style with the highest usability.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts. In the drawings:
fig. 1 is a schematic diagram of a style in the method for selecting the usability of a mobile terminal according to an embodiment of the present invention;
fig. 2 shows heart rate variability data collected by the method for selecting the usability of a mobile terminal according to an embodiment of the present invention;
fig. 3 is a flowchart of the method for selecting the usability of a mobile terminal according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of calculating the number and/or occurrence frequency of corresponding eye jump points from each eye movement data of a subject according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of calculating a corresponding emotion change index from the heart rate variability data of the subject according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of selecting a block corner style in the interactive area in the method for selecting the usability of a mobile terminal according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a style in the method for selecting the usability of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a style in the method for selecting the usability of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The objects and functions of the present invention and methods for accomplishing the same will be apparent by reference to the exemplary embodiments. However, the present invention is not limited to the exemplary embodiments disclosed below; it can be implemented in different forms. The nature of the description is merely to assist those skilled in the relevant art in a comprehensive understanding of the specific details of the invention.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the drawings, the same reference numerals denote the same or similar parts, or the same or similar steps.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the scheme according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
The interactive interface of the mobile terminal generally comprises a non-interactive area and an interactive area; the non-interactive area is usually a fixed part in an interactive interface of the mobile terminal, and is usually arranged at the top and the bottom of the interactive interface, and in some embodiments, may also be arranged at a side or other position for displaying a title or an operation key, and in a certain interactive interface, the non-interactive area is fixed, and a user cannot move the position of the non-interactive area, and cannot perform a replacement or change operation on the non-interactive area. The interactive area refers to an area where a user can change the content by an operation to read information, such as sliding to scroll the content up and down or page up and down.
Illustratively, the user interface of the mobile terminal shown in fig. 1 includes a top non-interactive area, a bottom non-interactive area and a middle interactive area. The top non-interactive area may carry a title or some fixed operation keys; the bottom non-interactive area may carry fixed operation keys that have no content or style change and are only used for operation interactions pointing to a specific result, such as return, settings and input areas. Style differences between the top and bottom non-interactive areas may include differences in shape, position, color, and so on. The middle interactive area is used for displaying text or video information; its information content style varies not only in overall layout and color, but also in how the content is divided into blocks for display. Different corner styles can be set for different blocks, and other specific structural styles of each block, for example the subtitle style of each block, can be set separately.
During use and interaction, different interactive area information content styles and non-interactive area styles lead to different degrees of immersion because of differences in structure, color or position. If the design of the interactive area information content style or the non-interactive area style is too abrupt, it draws unnecessary attention from the user and immersion during use is poorer; if the design is reasonable, the user's attention stays focused on the specific information and immersion during use is good.
In the prior art, the evaluation and selection of the interactive area information content style and the non-interactive area style are mainly carried out by manual selection, which is strongly influenced by subjective awareness and makes the evaluation and selection results inaccurate.
The present invention provides a method for selecting the usability of a mobile terminal which, as shown in fig. 3, comprises steps S101 to S107:
step S101: keeping the style of the information content in the interactive area of the mobile terminal unchanged, and adjusting the style without the interactive area to form a first type terminal style set.
Step S102: keeping the style of the non-interactive area of the mobile terminal unchanged, and adjusting the style of the information content of the interactive area to form a second type terminal style set.
Step S103: acquiring eye movement data of a testee within a specified interaction duration when interacting with the styles in the first type terminal style set and the second type terminal style set, wherein the eye movement data at least comprises the following steps: sampling time points, and visual directions, eye movement point coordinates and blinking times of a tested person corresponding to the sampling time points; and acquiring pulse time domain signals of the testee recorded by the pulse sensor as heart rate variability data.
Step S104: and calculating the number and/or the occurrence frequency of the corresponding eye jumping points according to the data of each eye movement of the testee.
Step S105: and calculating a corresponding emotion change index according to each heart rate variability data of the testee.
Step S106: and acquiring the number and/or the occurrence frequency of eye jumping points of a testee corresponding to each style in the first type terminal style set, and taking the non-interactive area style of which the number of the eye jumping points is smaller than a first set threshold value and/or the occurrence frequency is smaller than a set frequency value as the terminal non-interactive area style.
Step S107: and acquiring emotion change indexes of testees corresponding to the styles in the second type of terminal style set, and taking the interactive area information content style with the emotion change index higher than a second set threshold value as the terminal interactive area information content style.
In step S101, through controlled variables, only the non-interactive area style is adjusted while the interactive area information content style is kept unchanged, forming the first type terminal style set. This ensures that the parameters obtained during interaction accurately reflect the influence of the non-interactive area style on the user's immersion during the interaction. Illustratively, referring to fig. 1, the interactive area information content style in the middle of the mobile terminal is kept unchanged, and the top and/or bottom non-interactive area styles are adjusted to form the first type terminal style set.
Similarly, in step S102, only the interactive region information content style is adjusted by controlling the variable, and the non-interactive region style is controlled to be unchanged, so as to form a second type of terminal style set, which can ensure that the parameters obtained in the interactive process accurately reflect the influence of the interactive region information content style on the immersion degree of the user in the interactive process. Illustratively, referring to fig. 1, the style of the top non-interactive area and the bottom non-interactive area of the mobile terminal is kept unchanged, and the information content style of the interactive area is adjusted to form a second type terminal style set.
In step S103, in order to achieve effective evaluation in the interaction process with the mobile terminal, the present invention objectively evaluates the immersion degree in the interaction process of the mobile terminal by collecting both eye movement data and heart rate variability data of the subject, and is used to select a mobile terminal non-interaction region style and an interaction region information content style with better usability.
Specifically, the subject interacts with the patterns in the first terminal pattern set and the second terminal pattern set respectively, and the eye movement data and the heart rate variability data are collected. In order to improve the generalization of data, a plurality of subjects may be arranged to perform interaction and detection respectively, and the mean value, the median value, the maximum value, or the like may be taken for a plurality of sets of data corresponding to each pattern.
During the test, the eye movement data can be collected by an eye tracker; specifically, the eye movement data may include: sampling time points and the visual direction, eye movement point coordinates and blinking times corresponding to each sampling time point. The visual direction can simply be the visual direction of the dominant eye (also called the sighting or master eye; physiologically, everyone has a dominant eye, which may be the left or the right eye, and what the dominant eye sees is preferentially accepted by the brain), and the eye movement point is the coordinate of the intersection between the dominant eye's visual direction and the interactive interface of the mobile terminal.
The heart rate variability data can be the pulse time domain signal recorded by a pulse sensor; specifically, a wireless photoplethysmographic ear-tip pulse sensor can be used, with data transmitted by 2.4 GHz radio frequency, Bluetooth or ZigBee communication. The sensor irradiates the skin surface with infrared light of a certain wavelength; the light reaches a photoelectric sensor by transmission or reflection, and its intensity is attenuated by the absorption of skin, muscle tissue and blood. Specifically, when the heart contracts, the peripheral blood vessels expand, the blood volume is largest and the light absorption is strongest, so the detected light intensity is smallest; when the heart is in diastole, the peripheral blood vessels contract, the blood volume is smallest and the light absorption is weakest, so the detected light intensity is largest. The light intensity detected by the photoelectric sensor therefore pulses with the heartbeat. The light intensity variation is converted into an electrical signal and amplified, reflecting the change of peripheral blood flow with each heartbeat and forming the pulse time domain signal shown in fig. 2.
In step S104, the eye movement points may be classified into two types, and a point where visual processing has occurred is defined as a fixation point, and a point where visual processing has not occurred is defined as a jumping point. And calculating and analyzing according to the eye movement data, dividing the eye movement points corresponding to the sampling time points into fixation points or eye jump points, and further calculating the number and/or the occurrence frequency of the jump points.
In some embodiments, calculating the number and/or occurrence frequency of corresponding eye jumping points according to each eye movement data of the subject, as shown in fig. 4, may include steps S1041 to S1043:
step S1041: the method comprises the steps of obtaining eye movement data of a testee corresponding to each style in a first type terminal style set and a second type terminal style set, and carrying out linear filling on data gaps, smaller than a first set time length, in each eye movement data.
In the process of acquiring eye movement data by using an eye tracker, complete eye movement data may not be acquired due to factors such as blinking (short eye closure) or eye closure (long eye closure) occurring at a sampling time point, which means that an effective visual direction and coordinates of an eye movement point cannot be acquired, and a data gap is formed along a time axis.
The duration of each data gap is obtained; specifically, it is expressed as the sum of the sampling intervals over the consecutive missing samples. When the duration of a data gap is longer than the first set duration (for example 75 ms), the corresponding state is eye closure and the gap is marked as a valid gap. When the duration of a data gap is shorter than the first set duration, the corresponding state is blinking, and the gap is filled linearly according to the sampling frequency and the data values before and after the gap, so that the data stay smooth. For example, for sampling time points t1, t2, t3 and t4, if the data at t2 and t3 are missing and the sampling interval is 20 ms, the data gap lasts 60 ms, which is less than 75 ms; if the eye movement points at t1 and t4 are (5,6) and (8,9) respectively, then, to preserve the eye movement trend, the eye movement points at t2 and t3 are linearly filled as (6,7) and (7,8). As another example, for sampling time points t1, t2, t3, t4 and t5, if the data at t2, t3 and t4 are consecutively missing and the sampling interval is 20 ms, the gap lasts 80 ms, which is greater than 75 ms, so it is a valid data gap; it is not processed and is marked as eye closure. Similarly, the visual direction can be filled in the same way, with the visual direction expressed as a space vector.
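As an illustration only, the following Python sketch reproduces the gap-filling rule just described; the sample layout, function name and the 75 ms first set duration follow the example above, and everything else is an assumption rather than the patent's implementation.

```python
# Minimal sketch of blink-gap filling: gaps shorter than the first set duration
# are filled linearly, longer gaps are kept and treated as eye closure.
def fill_blink_gaps(samples, first_set_duration_ms=75):
    """samples: list of (timestamp_ms, point) where point is (x, y) or None."""
    out = [(t, p) for t, p in samples]
    i = 0
    while i < len(out):
        if out[i][1] is not None:
            i += 1
            continue
        j = i
        while j < len(out) and out[j][1] is None:
            j += 1                       # extent of the missing run
        if i == 0 or j == len(out):
            i = j                        # gap at the edge of the recording: leave untouched
            continue
        t0, p0 = out[i - 1]
        t1, p1 = out[j]
        if (t1 - t0) >= first_set_duration_ms:
            i = j                        # valid gap: mark as eye closure, do not fill
            continue
        for k in range(i, j):            # linear interpolation keeps the eye movement trend
            a = (out[k][0] - t0) / (t1 - t0)
            out[k] = (out[k][0], (p0[0] + a * (p1[0] - p0[0]),
                                  p0[1] + a * (p1[1] - p0[1])))
        i = j
    return out

# Reproduces the example above: t2 and t3 missing, 20 ms sampling interval.
samples = [(0, (5, 6)), (20, None), (40, None), (60, (8, 9))]
print(fill_blink_gaps(samples))   # filled points are (6.0, 7.0) and (7.0, 8.0)
```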
Step S1042: taking the second set duration as the window length, sequentially taking each sampling time point as the middle point of the window, obtaining the visual direction included angle between the first and the last sampling time point within the window length and the corresponding time difference, and dividing the included angle by the time difference to obtain the eye movement angular velocity corresponding to each sampling time point.
Illustratively, with a sampling time point interval of 20 ms, the second set duration used as the window length may be set to 80 ms. Taking sampling time point M as the midpoint of the window, the first point within the window length is M-2 and the last point is M+2. The included angle between the visual directions corresponding to sampling time points M-2 and M+2 is then obtained and divided by the window time to obtain the eye movement angular velocity corresponding to sampling time point M.
Step S1043: and marking the sampling time points of which the eye movement angular velocity is higher than a third set threshold as eye jump points, marking the sampling time points of which the eye movement angular velocity is lower than the third set threshold as fixation points, and calculating the number and/or occurrence frequency of the eye jump points.
When the eye movement angular velocity corresponding to a sampling time point is higher than the third set threshold, no visual processing has occurred and the point is defined as an eye jump point; when the eye movement angular velocity corresponding to a sampling time point is lower than the third set threshold, visual processing has occurred and the point is defined as a fixation point. The number and/or occurrence frequency of eye jump points can then be obtained. In some embodiments, the third set threshold may be 30°/s; in other embodiments, a specific value may be set for a specific scene.
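As a minimal sketch of steps S1042 and S1043, the code below computes the windowed eye movement angular velocity and classifies each sampling time point; it assumes the visual direction of each sample is available as a 3D unit vector, and the window size, threshold and names are illustrative assumptions.

```python
import math

def angle_deg(u, v):
    """Included angle between two unit gaze vectors, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    dot = max(-1.0, min(1.0, dot))          # guard against rounding errors
    return math.degrees(math.acos(dot))

def classify_samples(directions, half_window=2, threshold_deg_per_s=30.0):
    """directions: list of (timestamp_ms, (dx, dy, dz)) unit gaze vectors.
    With a 20 ms sampling interval, half_window=2 gives the 80 ms window above."""
    labels = []
    for m in range(len(directions)):
        lo = max(0, m - half_window)               # first point in the window
        hi = min(len(directions) - 1, m + half_window)  # last point in the window
        t_lo, d_lo = directions[lo]
        t_hi, d_hi = directions[hi]
        dt_s = (t_hi - t_lo) / 1000.0
        velocity = angle_deg(d_lo, d_hi) / dt_s if dt_s > 0 else 0.0
        labels.append("eye_jump" if velocity > threshold_deg_per_s else "fixation")
    return labels

def jump_count_and_frequency(labels, interaction_duration_min):
    n_jumps = sum(1 for x in labels if x == "eye_jump")
    return n_jumps, n_jumps / interaction_duration_min   # count and jumps per minute
```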
In step S105, a corresponding emotion change index is calculated from each heart rate variability data. The emotion change index may be the mean of the heartbeat intervals (MEAN), the total standard deviation (SDNN), the standard deviation of the interval averages (SDANN), or the root mean square of successive differences (r-MSSD).
In some embodiments, calculating a corresponding index of change in emotion from each heart rate variability data of the subject, as shown in fig. 5, may include steps S1051 to S1054:
step S1051: and acquiring heart rate variability data of the testee corresponding to each type in the first type terminal type set and the second type terminal type set, and marking peak points in each group of heart rate variability data.
Referring to fig. 2, the peak points in the heart rate variability data are the R peaks. Specifically, a maximum heart rate value (Max Heart Rate) and a peak threshold (R-peak Mark Threshold) are obtained from the heart rate variability data.
In the process of obtaining the peak points, the length of a search window is first obtained from the maximum heart rate value, the search window length being 60/Max Heart Rate. The search window is slid along the time axis, the QRS waves within the window are found, and the QRS waves whose amplitude is larger than the peak threshold are marked as peak points.
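A hedged sketch of the search-window peak marking described above; the signal layout, the window stepping and the names are assumptions, not the patent's implementation.

```python
def mark_r_peaks(signal, fs_hz, max_heart_rate_bpm, r_peak_threshold):
    """signal: list of pulse-signal amplitudes sampled at fs_hz.
    Returns the sample indices marked as peak points (R peaks)."""
    window_s = 60.0 / max_heart_rate_bpm            # search-window length = 60 / Max Heart Rate
    window_n = max(1, int(window_s * fs_hz))
    peaks = []
    start = 0
    while start < len(signal):
        seg = signal[start:start + window_n]
        if seg:
            local_max = max(range(len(seg)), key=lambda i: seg[i])
            if seg[local_max] > r_peak_threshold:    # keep only QRS waves above the peak threshold
                peaks.append(start + local_max)
        start += window_n                            # slide the window along the time axis
    return peaks
```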
Step S1052: and calculating the heartbeat interval time (IBI) corresponding to each peak point according to the timestamp of each peak point.
And according to the marked peak point and the corresponding timestamp, taking the time interval between the designated peak point and the previous peak point as the corresponding heartbeat interval time. It should be noted that if the number of peak points is N, the number of heartbeat interval times is N-1.
Step S1053: the peak points where the beat interval time growth rate exceeds a first set percentage are marked as singularities.
Using a percentage detection method, when the heartbeat interval growth rate of a peak point is higher than the first set percentage, the peak point is marked as a singular point. Illustratively, with the first set percentage set to 20%, if the Nth heartbeat interval is 500 ms and the (N-1)th heartbeat interval is 490 ms, the calculated growth rate is |500 - 490|/500 x 100% = 2%, and the peak point is not marked as a singular point. Conversely, if the heartbeat interval growth rate exceeds 20%, the peak point is marked as a singular point.
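The following sketch illustrates steps S1052 and S1053 under assumed data structures: the heartbeat intervals (IBI) are derived from the peak timestamps, and singular points are marked with the percentage growth-rate rule.

```python
def ibi_from_peaks(peak_times_ms):
    """N peak timestamps give N-1 heartbeat intervals (IBI), as noted above."""
    return [t2 - t1 for t1, t2 in zip(peak_times_ms, peak_times_ms[1:])]

def mark_singular_by_percentage(ibi_ms, first_set_percentage=0.20):
    """Indices of intervals whose growth rate versus the previous interval
    exceeds the first set percentage (20% in the example above)."""
    singular = []
    for i in range(1, len(ibi_ms)):
        growth = abs(ibi_ms[i] - ibi_ms[i - 1]) / ibi_ms[i]
        if growth > first_set_percentage:
            singular.append(i)
    return singular

# The example above: a 500 ms interval following a 490 ms interval gives
# |500 - 490| / 500 = 2%, below 20%, so no singular point is marked.
print(mark_singular_by_percentage([490, 500]))   # -> []
```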
In other embodiments, a median filter may also be used as an impulse rejection filter with a threshold to perform ectopic detection. A detection value D(n) is calculated for each peak point:
D(n) = |x(n) - med(x)| / med(|x(n) - med(x)|)
where x(n) is the heartbeat interval corresponding to the nth peak point, med(x) is the median of the heartbeat intervals corresponding to a specified number of peak points before and after the nth peak point, and med(|x(n) - med(x)|) is the median of the absolute differences between the heartbeat interval of each peak point and med(x).
When D (n) is larger than the set parameter, marking the corresponding peak point as a singular point.
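A sketch of the median-filter (impulse rejection) detection following the D(n) definition above; the neighbourhood size and the set parameter (threshold) are illustrative assumptions.

```python
import statistics

def mark_singular_by_median_filter(ibi_ms, half_window=5, set_parameter=4.0):
    """Mark interval n as singular when D(n) = |x(n) - med(x)| / med(|x - med(x)|)
    exceeds the set parameter. Window size and threshold are assumptions."""
    singular = []
    for n in range(len(ibi_ms)):
        lo, hi = max(0, n - half_window), min(len(ibi_ms), n + half_window + 1)
        neighbourhood = ibi_ms[lo:hi]                       # intervals before and after point n
        med_x = statistics.median(neighbourhood)
        abs_dev = [abs(x - med_x) for x in neighbourhood]
        scale = statistics.median(abs_dev) or 1e-9          # avoid division by zero
        d_n = abs(ibi_ms[n] - med_x) / scale
        if d_n > set_parameter:
            singular.append(n)
    return singular
```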
Step S1054: and replacing the heartbeat interval time corresponding to the singular points by the mean value or the median value of the heartbeat interval time corresponding to the peak values of the first set number on the two sides of each singular point, and calculating the total standard deviation of the adjusted heartbeat interval time as the emotion change index of the testees corresponding to each type in the first type terminal type set and the second type terminal type set.
Singular points of the detected heart rate variability data are replaced, data changes caused by external interference factors are removed, accuracy of the detected data is guaranteed, and objective evaluation capability of the data is improved. Specifically, two ways of mean replacement and median replacement may be employed. For example, the singular point may be replaced by a mean value of the heartbeat interval time corresponding to 3 peak points on both sides of the singular point and 6 peak points in total.
And for mean value replacement, namely selecting mean values of heartbeat interval time corresponding to a specified number of peak points before and after the singular point to replace the singular point. Similarly, for the median replacement, the median of the heartbeat interval time corresponding to the specified number of peak points before and after the singular point is selected to replace the singular point.
Finally, calculating the total standard deviation of the heartbeat interval time as an emotion change index, wherein the formula is as follows:
SDNN = sqrt( (1/N) * sum_{i=1..N} (RRi - MeanRR)^2 )
wherein, SDNN is the total standard deviation, N is the number of heartbeat interval times, RRi is the heartbeat interval time corresponding to the ith peak point, and MeanRR is the average value of the heartbeat interval times.
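A sketch of step S1054 under stated assumptions: singular intervals are replaced by the mean or median of neighbouring intervals, and the total standard deviation is computed as the emotion change index. The neighbour count, the exclusion of other singular intervals from the replacement window and the use of the population standard deviation are assumptions.

```python
import statistics

def replace_singular_points(ibi_ms, singular_idx, neighbours=3, use_median=False):
    """Replace each singular interval with the mean (or median) of up to `neighbours`
    intervals on each side; other singular intervals are skipped (an added assumption)."""
    cleaned = list(ibi_ms)
    singular = set(singular_idx)
    for n in singular_idx:
        side = [cleaned[i]
                for i in range(max(0, n - neighbours), min(len(cleaned), n + neighbours + 1))
                if i != n and i not in singular]
        if side:
            cleaned[n] = statistics.median(side) if use_median else statistics.mean(side)
    return cleaned

def sdnn(ibi_ms):
    """Total standard deviation of the heartbeat intervals, used as the emotion change index."""
    mean_rr = statistics.mean(ibi_ms)
    return (sum((rr - mean_rr) ** 2 for rr in ibi_ms) / len(ibi_ms)) ** 0.5
```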
In step S106, with the interactive area information content style unchanged, the non-interactive area style is evaluated by the number and/or occurrence frequency of eye jump points. Specifically, the differences within the first type terminal style set lie in the non-interactive area style. During interaction, the subject's attention is affected by the non-interactive area style, so that fixation behavior stops or shifts, producing eye jumps. The immersion of a non-interactive area style is therefore evaluated by counting the number and/or occurrence frequency of eye jump points; in other embodiments, the proportion of eye jump points can also be calculated. When the number and/or occurrence frequency of eye jump points is less than the first set threshold and/or the set frequency value, the corresponding non-interactive area style is taken as the terminal non-interactive area style. The first set threshold and the set frequency value can be adjusted for a specific mobile terminal. For example, if the specified interaction duration is 20 min and the sampling interval is 20 ms, 60000 sampling time points and corresponding data are generated in one interaction; the criterion can then be that the number of eye jumps is less than 600 or the frequency of eye jumps is less than 30 per minute, and the corresponding non-interactive area style is taken as the terminal non-interactive area style.
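A sketch of the selection rule of step S106, using the thresholds from the 20-minute example above; the data structure and style names are assumptions.

```python
def select_non_interactive_styles(jump_stats, max_jump_count=600, max_jump_freq_per_min=30):
    """jump_stats: {style_name: (jump_count, jump_freq_per_min)} for the first type
    terminal style set. Keeps styles meeting the count and/or frequency criterion."""
    return [style for style, (count, freq) in jump_stats.items()
            if count < max_jump_count or freq < max_jump_freq_per_min]

# Hypothetical example data.
stats = {"top_bar_plain": (420, 21.0), "top_bar_animated": (980, 49.0)}
print(select_non_interactive_styles(stats))   # -> ['top_bar_plain']
```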
In step S107, with the non-interactive area style unchanged, the emotion change index is used to evaluate the immersion of each interactive area information content style; when the emotion change index is higher than the second set threshold, the corresponding interactive area information content style is taken as the terminal interactive area information content style. The second set threshold can be adjusted for a specific mobile terminal.
In some embodiments, when the information content style of the terminal interaction area is selected according to the emotion change index, the evaluation criteria of the emotion change index may be set according to different age groups of the subject, and the evaluation criterion value of the emotion change index is higher when the age group is lower. Illustratively, when SDNN is used as the index of mood change, a second set threshold is set at 180 for the age range of 18-29 years; the second set threshold value of the age range of 30-49 years is set to be 160; the second set threshold for the age range of 50-69 years is set at 140.
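A sketch of the age-dependent criterion using the example SDNN thresholds above; behaviour outside the listed age bands is not specified in the text, so the fallback here is an assumption.

```python
def sdnn_threshold_for_age(age_years):
    """Second set threshold for the emotion change index (SDNN) by age band,
    following the example values above."""
    if 18 <= age_years <= 29:
        return 180
    if 30 <= age_years <= 49:
        return 160
    if 50 <= age_years <= 69:
        return 140
    return 140   # assumption: fall back to the lowest listed threshold

def passes_emotion_criterion(sdnn_value, age_years):
    return sdnn_value > sdnn_threshold_for_age(age_years)
```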
In other embodiments, after step S104, that is, after calculating the number and/or the occurrence frequency of the corresponding eye jump points according to each eye movement data of the subject, as shown in fig. 6, the method further includes steps S201 to S203:
step S201: and dividing each interactive area information content style in the second type terminal style set into a plurality of blocks according to the corner style.
The interactive region information content pattern is divided into blocks according to the content, and the design and combination of the corner patterns among the blocks attract the attention of the subject to a certain extent. Referring to fig. 7 to 8, with respect to fig. 7, the corner pattern of the block in the interactive area information content pattern is a rounded corner, and the corner pattern of the block in the interactive area information content pattern in fig. 8 is a right angle. During the interaction, the two forms of rounded corners and right angles of the block will attract the attention of the subject to different degrees.
Step S202: and generating a visual heat point diagram and/or a visual track diagram corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set.
Since the emotion change index in step S107 cannot specifically evaluate the style of each block, in order to further evaluate whether the design and combination of the corner styles between blocks have sufficient usability, evaluation should be made for the specific block corner style. Therefore, in the embodiment, the corresponding visual hotspot graph and/or visual track graph is generated according to the eye movement data and the information content style of each interactive area. The visual hotspot graph is obtained by marking the density of fixation points at each position on a pattern of an information content style of the interactive area, the more visual processing is performed, the more intensive the fixation points are, the longer the total fixation duration is, the darker the marked color is; the visual track graph is a track for marking the stay and connection of the fixation points on the pattern of the information content style of the interactive area according to time lines, the more the visual processing is, the more the number of the fixation points is, the more the track is dense, and the longer the corresponding total fixation time is.
Step S203: according to the corner styles marked in the visual hotspot graph and/or the visual track graph, taking the corner style whose total fixation duration is lower than the third set duration as the terminal corner style.
According to the total fixation duration of the corner style of each block marked in the visual hotspot graph and/or the visual track graph, the corner style whose total fixation duration is lower than the third set duration is taken as the terminal corner style. The total fixation duration of the corner styles can be calculated block by block, or the fixation durations of blocks sharing the same corner style can be accumulated.
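A sketch of the corner-style selection of steps S201 to S203, assuming fixation durations have already been attributed to blocks labelled with their corner style; the data layout, names and example values are assumptions.

```python
def select_corner_styles(block_fixations, third_set_duration_ms):
    """block_fixations: list of (corner_style, fixation_duration_ms) entries taken
    from the visual hotspot / track graph. Accumulates gaze per corner style and keeps
    the styles whose total fixation duration is below the third set duration."""
    totals = {}
    for corner_style, duration_ms in block_fixations:
        totals[corner_style] = totals.get(corner_style, 0) + duration_ms
    return [style for style, total in totals.items() if total < third_set_duration_ms]

# Hypothetical example: rounded corners attract less gaze than right angles.
fixations = [("rounded", 800), ("rounded", 600), ("right_angle", 2600)]
print(select_corner_styles(fixations, third_set_duration_ms=2000))   # -> ['rounded']
```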
In other embodiments, after step S104, that is, after calculating the number and/or the occurrence frequency of the corresponding eye jump points according to each eye movement data of the subject, the method further includes steps S301 to S302:
s301: and according to the eye movement data corresponding to each style in the second type terminal style set, counting the first sight time, the fixation frequency proportion and the fixation time proportion corresponding to each interactive area information content style.
For evaluating immersion in the use of the mobile terminal, assessing usability with the emotion change index as the only parameter may be inaccurate. In this embodiment, the first-view time, the fixation count ratio and the fixation time ratio are additionally used to reflect the subject's immersion during the interaction. Specifically, the first-view time is the duration of the first fixation, the fixation count ratio is the proportion of fixation points to the total number of samples, and the fixation time ratio is the proportion of fixation time to the total test duration.
S302: and taking the interactive area information content style with the first-view time being higher than a fourth set time length, the watching frequency proportion being higher than a second set percentage, the watching time proportion being higher than a third set percentage and the emotion change index being higher than a fourth set threshold as the terminal interactive area information content style.
In the embodiment, the first-view time, the fixation number ratio, the fixation time ratio and the emotion change index are respectively limited, so that the accuracy of the immersion degree and usability evaluation can be improved. Specific values of the fourth setting duration, the second setting percentage, the third setting percentage, and the fourth setting threshold may be set for a specific mobile terminal according to an actual application scenario. For example, the fourth set time period is 10 seconds, the second set percentage is 90%, the third set percentage is 91%, and the fourth set threshold is 160.
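A sketch combining the metrics of step S301 with the selection rule of step S302, using the example values given above; the data layout and names are assumptions, and how fixation events are segmented is left to the caller.

```python
def interaction_metrics(first_fixation_ms, fixation_point_count, total_sample_count,
                        total_fixation_time_ms, total_test_duration_ms):
    """First-view time, fixation count ratio and fixation time ratio as defined above."""
    return (first_fixation_ms,
            fixation_point_count / total_sample_count,
            total_fixation_time_ms / total_test_duration_ms)

def is_terminal_content_style(metrics, emotion_change_index,
                              fourth_set_duration_ms=10_000,   # 10 seconds
                              second_set_percentage=0.90,
                              third_set_percentage=0.91,
                              fourth_set_threshold=160):
    """Selection rule of step S302 with the example values given above."""
    first_view_ms, count_ratio, time_ratio = metrics
    return (first_view_ms > fourth_set_duration_ms
            and count_ratio > second_set_percentage
            and time_ratio > third_set_percentage
            and emotion_change_index > fourth_set_threshold)
```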
In other embodiments, in step S103, after obtaining the eye movement data of the human subject within the specified interaction duration when interacting with the styles in the first type terminal style set and the second type terminal style set, S401 to S402 may be further included:
s401: acquiring user interaction data of a subject, the user interaction data comprising at least: number of clicks, number of stalls.
In addition to evaluating immersion during interaction with the mobile terminal, the interaction process between the subject and the mobile terminal can also be reflected by cross-referencing user interaction data such as the number of clicks and the number of stalls. The number of clicks is the number of effective triggers on the content; the number of stalls is the number of times the screen remains without movement for a specified duration, which can be set to 3 seconds or another length. In other embodiments, the user interaction data may further include the time and number of operation behaviors such as scrolling down, scrolling up, alternating between upper and lower pages, stalling, single-clicking the screen and double-clicking the screen.
S402: and concentrating the second type of terminal styles, wherein the interaction area information content styles with the click times higher than a second set number, the stagnation times higher than a third set number and the emotion change index higher than a fifth set threshold are used as the terminal interaction area information content styles.
In this embodiment, the information content style of the interactive region is determined by combining the user interactive data, so that the evaluation accuracy can be effectively improved. Specifically, the second set number and the third set number may be set according to the actual situation of the mobile terminal.
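A sketch of the combined rule of step S402; the second and third set numbers are left to the specific terminal in the text, so the values and the data layout here are assumptions.

```python
def select_by_interaction_data(candidates, second_set_number, third_set_number,
                               fifth_set_threshold):
    """candidates: {style_name: (click_times, stagnation_times, emotion_change_index)}
    for the second type terminal style set. Returns the styles that satisfy the
    click, stagnation and emotion-change criteria of step S402."""
    return [style for style, (clicks, stalls, emotion) in candidates.items()
            if clicks > second_set_number
            and stalls > third_set_number
            and emotion > fifth_set_threshold]
```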
In some embodiments, step S107 is followed by generating an evaluation report for the first terminal pattern set and the second terminal pattern set.
In still other embodiments, an apparatus for selecting availability of a mobile terminal includes: the device comprises an interactive display unit, an eye movement data acquisition unit, a heart rate variability data acquisition unit and a processor unit.
The interactive display unit is used as an interactive platform of a testee and is used for displaying a first type terminal style in the first type terminal style set and a second type terminal style in the second type terminal style set; the first type terminal style set comprises a plurality of different first type terminal styles, each first type terminal style comprises an interaction area information content style and a non-interaction area style, the interaction area information content styles of different first type terminal styles are the same, and the non-interaction area styles are different from each other; the second type terminal style set comprises a plurality of different second type terminal styles, each second type terminal style comprises an interactive area information content style and a non-interactive area style, the non-interactive area styles of the different second type terminal styles are the same, and the interactive area information content styles are different from each other. The interactive display unit can be a mobile terminal platform such as a mobile phone, a tablet, a pc and the like, and can also be other elements capable of being used for displaying and interacting.
An eye movement data acquisition unit for acquiring eye movement data of a subject, comprising at least: sampling time points and visual directions, eye movement point coordinates and blinking times corresponding to the sampling time points. An eye tracker or similar eye movement data acquisition device may be employed.
And the heart rate variability data acquisition unit is used for recording the pulse time domain signals of the testee as heart rate variability data. A wireless photoelectric plethysmographic ear tip pulse sensor may be employed. The data transmission can be carried out by adopting 2.4G wireless radio frequency technology, Bluetooth communication or zigbee communication technology.
The processor unit is used for calculating the number and/or the occurrence frequency of corresponding eye jumping points according to the eye movement data of the testee corresponding to the style in the first type terminal style set, and taking the non-interactive area style of which the number and/or the occurrence frequency of the eye jumping points are smaller than a first set threshold value and/or a set frequency value as a terminal non-interactive area style; and calculating a corresponding emotion change index according to the heart rate variability data of the testee corresponding to each style in the second type of terminal style set, and taking the interactive area information content style with the emotion change index higher than a second set threshold value as the terminal interactive area information content style. The processor unit may be a single-chip microcomputer, a CPU processor or other computer storage media capable of storing and executing computer programs.
The interactive display unit, the eye movement data acquisition unit and the heart rate variability data acquisition unit are respectively connected with the processor unit.
In other embodiments, the processor unit is further configured to: divide each interactive area information content style in the second type terminal style set into a plurality of blocks according to corner styles; generate a visual hotspot graph and/or a visual track graph corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set; and, according to the corner styles marked in the visual hotspot graph and/or the visual track graph, take the corner style whose total fixation duration is lower than the third set duration as the terminal corner style.
In other embodiments, the interactive display unit is further configured to acquire user interaction data of the subject, the user interaction data including at least: click times and stagnation times;
the processor unit is further configured to: in the second type terminal style set, take the interactive area information content style whose click times are higher than a second set number, whose stagnation times are higher than a third set number and whose emotion change index is higher than a fifth set threshold as the terminal interactive area information content style.
In another aspect, the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method.
In summary, the method and device for selecting the usability of a mobile terminal adjust the non-interactive area style and the interaction area information content style, collect the eye movement data and heart rate variability data of the user under the different styles, evaluate the non-interactive area styles by calculating the number and/or occurrence frequency of eye jump points from the eye movement data, and evaluate the interaction area information content styles by calculating the emotion change index from the heart rate variability data. This avoids the influence of subjective factors inherent in manual evaluation and yields accurate evaluation data from which the style with the highest usability can be selected.
In the description herein, reference to the description of the terms "one embodiment," "a particular embodiment," "some embodiments," "for example," "an example," "a particular example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The sequence of steps involved in the various embodiments is provided to schematically illustrate the practice of the invention, and the sequence of steps is not limited and can be suitably adjusted as desired.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method for selecting usability of a mobile terminal, comprising:
keeping the interactive area information content style of the mobile terminal unchanged, and adjusting the non-interactive area style, to form a first type terminal style set; and
keeping the style of the non-interactive area of the mobile terminal unchanged, adjusting the style of the information content of the interactive area to form a second type terminal style set;
acquiring eye movement data of a testee within a specified interaction duration when interacting with the styles in the first type terminal style set and the second type terminal style set, wherein the eye movement data at least comprises: sampling time points, and the visual directions, eye movement point coordinates and blinking times of the testee corresponding to the sampling time points; and acquiring pulse time domain signals of the testee recorded by a pulse sensor as heart rate variability data;
calculating the number and/or the occurrence frequency of corresponding eye jump points according to each set of eye movement data of the testee;
calculating a corresponding emotion change index according to each set of heart rate variability data of the testee;
acquiring the number and/or the occurrence frequency of eye jump points of a testee corresponding to each style in the first type terminal style set, and taking a non-interactive area style of which the number of the eye jump points is smaller than a first set threshold value and/or the occurrence frequency is smaller than a set frequency value as a terminal non-interactive area style;
and acquiring emotion change indexes of testees corresponding to the styles in the second type of terminal style set, and taking the interactive area information content style with the emotion change index higher than a second set threshold value as a terminal interactive area information content style.
2. The method for selecting the usability of a mobile terminal according to claim 1, wherein calculating the number and/or the occurrence frequency of corresponding eye jump points according to each set of eye movement data of the subject comprises:
acquiring eye movement data of a testee corresponding to each style in the first type terminal style set and the second type terminal style set, and performing linear filling on data gaps smaller than a first set time length in each set of eye movement data;
taking the second set duration as the window length and sequentially taking each sampling time point as the middle point of the window, respectively obtaining the visual direction included angle between the first sampling time point and the last sampling time point within the window length and the corresponding time difference, and dividing the visual direction included angle by the corresponding time difference to obtain the eye movement angular velocity corresponding to each sampling time point;
and marking the sampling time points of which the eye movement angular velocity is higher than a third set threshold as eye jump points, marking the sampling time points of which the eye movement angular velocity is lower than the third set threshold as fixation points, and calculating the number and/or occurrence frequency of the eye jump points.
3. The method for selecting the usability of a mobile terminal according to claim 1, wherein calculating a corresponding emotion change index according to each set of heart rate variability data of the subject comprises:
acquiring heart rate variability data of a testee corresponding to each style in the first type terminal style set and the second type terminal style set, and marking peak points in each set of heart rate variability data;
calculating heartbeat interval time corresponding to each peak point according to the time stamp of each peak point;
marking, as singular points, the peak points of which the heartbeat interval time growth rate exceeds a first set percentage;
and replacing the heartbeat interval time corresponding to each singular point with the mean or median of the heartbeat interval times corresponding to a first set number of peak points on either side of the singular point, and calculating the overall standard deviation of the adjusted heartbeat interval times as the emotion change index of the testee corresponding to each style in the first type terminal style set and the second type terminal style set.
4. The method for selecting the usability of a mobile terminal according to claim 1, wherein after calculating the number and/or the occurrence frequency of corresponding eye jump points according to each set of eye movement data of the subject, the method further comprises:
dividing each interactive area information content style in the second type terminal style set into a plurality of blocks according to corner styles;
generating a visual hotspot graph and/or a visual track graph corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set;
and taking, according to the corner styles marked in the visual hotspot graph and/or the visual track graph, the corner style whose total fixation duration is lower than a third set duration as the terminal corner style.
5. The method for selecting the usability of a mobile terminal according to claim 1, wherein after calculating the number and/or the occurrence frequency of corresponding eye jump points according to each set of eye movement data of the subject, the method further comprises:
counting, according to the eye movement data corresponding to each style in the second type terminal style set, the first-view time, the fixation frequency proportion and the fixation time proportion corresponding to each interactive area information content style;
and taking, as the terminal interactive area information content style, the interactive area information content style whose first-view time is longer than a fourth set duration, whose fixation frequency proportion is higher than a second set percentage, whose fixation time proportion is higher than a third set percentage, and whose emotion change index is higher than a fourth set threshold.
6. The method for selecting usability of a mobile terminal according to claim 1, wherein after obtaining the eye movement data of the subject within the specified interaction duration when interacting with the styles in the first type terminal style set and the second type terminal style set, the method further comprises:
acquiring user interaction data of a subject, the user interaction data comprising at least: click times and stagnation times;
and taking, from the second type terminal style set, the interaction area information content style whose number of clicks is higher than a second set number, whose number of stagnations is higher than a third set number, and whose emotion change index is higher than a fifth set threshold, as the terminal interaction area information content style.
7. An apparatus for selecting usability of a mobile terminal, comprising:
the interactive display unit is used as an interactive platform of a testee and is used for displaying a first type terminal style in the first type terminal style set and a second type terminal style in the second type terminal style set; the first type terminal style set comprises a plurality of different first type terminal styles, each first type terminal style comprises an interaction area information content style and a non-interaction area style, the interaction area information content styles of different first type terminal styles are the same, and the non-interaction area styles are different from each other; the second type terminal style set comprises a plurality of different second type terminal styles, each second type terminal style comprises an interactive area information content style and a non-interactive area style, the non-interactive area styles of the different second type terminal styles are the same, and the interactive area information content styles are different from each other;
an eye movement data acquisition unit for acquiring eye movement data of a subject, comprising at least: sampling time points and visual directions, eye movement point coordinates and blinking times corresponding to the sampling time points;
the heart rate variability data acquisition unit is used for recording pulse time domain signals of the testee as heart rate variability data;
the processor unit is used for calculating the number and/or the occurrence frequency of corresponding eye jump points according to the eye movement data of the testee corresponding to each style in the first type terminal style set, and taking, as the terminal non-interactive area style, the non-interactive area style of which the number of eye jump points is smaller than a first set threshold and/or the occurrence frequency is smaller than a set frequency value; and for calculating a corresponding emotion change index according to the heart rate variability data of the testee corresponding to each style in the second type terminal style set, and taking, as the terminal interactive area information content style, the interactive area information content style whose emotion change index is higher than a second set threshold.
8. The apparatus for selecting the usability of a mobile terminal according to claim 7, wherein the processor unit is further configured to:
dividing each interactive area information content style in the second type terminal style set into a plurality of blocks according to corner styles;
generating a visual hotspot graph and/or a visual track graph corresponding to each interactive area information content style according to the eye movement data of the subject corresponding to each style in the second type terminal style set;
and taking, according to the corner styles marked in the visual hotspot graph and/or the visual track graph, the corner style whose total fixation duration is lower than a third set duration as the terminal corner style.
9. The apparatus for selecting usability of a mobile terminal according to claim 7, wherein the interactive display unit is further configured to: acquiring user interaction data of a subject, the user interaction data comprising at least: click times and stagnation times;
the processor unit is further configured to: take, from the second type terminal style set, the interaction area information content style whose number of clicks is higher than a second set number, whose number of stagnations is higher than a third set number, and whose emotion change index is higher than a fifth set threshold, as the terminal interaction area information content style.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202010067582.0A 2020-01-20 2020-01-20 Method and device for selecting usability of mobile terminal Active CN111265225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010067582.0A CN111265225B (en) 2020-01-20 2020-01-20 Method and device for selecting usability of mobile terminal

Publications (2)

Publication Number Publication Date
CN111265225A true CN111265225A (en) 2020-06-12
CN111265225B CN111265225B (en) 2022-04-12

Family

ID=70991103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010067582.0A Active CN111265225B (en) 2020-01-20 2020-01-20 Method and device for selecting usability of mobile terminal

Country Status (1)

Country Link
CN (1) CN111265225B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030006913A1 (en) * 2001-07-03 2003-01-09 Joyce Dennis P. Location-based content delivery
US20060090130A1 (en) * 2004-10-21 2006-04-27 Microsoft Corporation System and method for styling content in a graphical user interface control
US20100039618A1 (en) * 2008-08-15 2010-02-18 Imotions - Emotion Technology A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text
CN108309327A (en) * 2018-01-30 2018-07-24 中国人民解放军海军总医院 A kind of test and appraisal of vision attention distribution capability and training method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FAN, Mingkun et al.: "Influence of different instrument display interfaces on pilot workload", 《航空计算技术》 (Aeronautical Computing Technique) *
LYU, Mengsha et al.: "Study of users' visual browsing patterns under different page designs of mobile news clients", 《人类工效学》 (Chinese Journal of Ergonomics) *
CHANG, Fangyuan: "Eye-tracker-based usability evaluation of smartphone APP graphical user interface design", 《包装工程》 (Packaging Engineering) *
WANG, Jiake et al.: "Research on civil UAV control interfaces based on physiological assessment technology", 《设计》 (Design) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112346945A (en) * 2020-10-23 2021-02-09 北京津发科技股份有限公司 Man-machine interaction data analysis method and device
WO2023056753A1 (en) * 2021-10-08 2023-04-13 北京津发科技股份有限公司 Time-space movement track acquisition and generation method and apparatus, device, system, and storage medium
EP4184445A4 (en) * 2021-10-08 2024-01-03 Kingfar Int Inc Time-space movement track acquisition and generation method and apparatus, device, system, and storage medium

Also Published As

Publication number Publication date
CN111265225B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
US20160302680A1 (en) Biological information processing system, biological information processing device, terminal device, method for generating analysis result information, and biological information processing method
US8515531B2 (en) Method of medical monitoring
CN105190691B (en) Equipment for obtaining the vital sign of object
DE69833656T2 (en) DEVICE FOR DIAGNOSIS OF PULSE WAVES
US11288685B2 (en) Systems and methods for assessing the marketability of a product
CN111265225B (en) Method and device for selecting usability of mobile terminal
CN105852846A (en) Equipment, system and method for testing cardiac motion function
CN108478209A (en) Ecg information dynamic monitor method and dynamic monitor system
CN108882872A (en) Biont information analytical equipment, Biont information analysis system, program and Biont information analysis method
AU2021250923B2 (en) Computer-implemented method of handling electrocardiogram data
JP5760351B2 (en) Sleep evaluation apparatus, sleep evaluation system, and program
Hercegfi Heart rate variability monitoring during human-computer interaction
JP6079824B2 (en) Sleep evaluation apparatus and program
US20140257124A1 (en) Atrial fibrillation analyzer and program
KR101536361B1 (en) Method for analyzing ecg and ecg apparatus therefor
CN106333663A (en) Blood pressure monitoring method and device
EP4282327A1 (en) Deep learning-based heart rate measurement method and wearable device
JP6060563B2 (en) Atrial fibrillation determination device, atrial fibrillation determination method and program
WO2018168235A1 (en) Information display device, biological signal measurement system and computer-readable recording medium
JP5425647B2 (en) ECG analyzer
Mehrgardt et al. Deep learning fused wearable pressure and PPG data for accurate heart rate monitoring
Somia et al. A computer analysis of reflex eyelid motion in normal subjects and in facial neuropathy
JP2020151082A (en) Information processing device, information processing method, program, and biological signal measuring system
EP3265937B1 (en) Systems for displaying medical data
JP7135845B2 (en) Information processing device, information processing method, program, and biological signal measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant