CN115576464A - User evaluation method, device, equipment and storage medium - Google Patents


Info

Publication number
CN115576464A
CN115576464A (application CN202211569149.2A)
Authority
CN
China
Prior art keywords
user
icon
prompt
target information
selection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211569149.2A
Other languages
Chinese (zh)
Inventor
韩璧丞
武晓龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mental Flow Technology Co Ltd
Original Assignee
Shenzhen Mental Flow Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mental Flow Technology Co Ltd filed Critical Shenzhen Mental Flow Technology Co Ltd
Priority to CN202211569149.2A priority Critical patent/CN115576464A/en
Publication of CN115576464A publication Critical patent/CN115576464A/en
Pending legal-status Critical Current

Classifications

    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/168 Evaluating attention deficit, hyperactivity
    • A61B5/369 Electroencephalography [EEG]

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Developmental Disabilities (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Educational Technology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention belongs to the technical field of user evaluation and discloses a user evaluation method, apparatus, device, and storage medium. The method includes the following steps: acquiring target information from a preset information base; searching for a corresponding prompt image according to the target information; generating a prompt icon group in a prompt area according to the prompt image, the prompt icon group being used to prompt a user to trigger a selection instruction; generating a selectable icon group in a selection area according to the target information; responding to a user selection instruction and determining a user-selected icon, which is an icon in the selectable icon group; and scoring the user according to the user-selected icon. A selectable icon group and a prompt icon group are generated from the target information, i.e. the correct answer; the prompt icons hint at the correct answer to guide the user's selection, and the final answer is determined from the user's selections, improving user satisfaction during the question-and-answer process and increasing the user's willingness to participate.

Description

User evaluation method, device, equipment and storage medium
Technical Field
The present invention relates to the field of user evaluation technologies, and in particular, to a user evaluation method, apparatus, device, and storage medium.
Background
Existing user thinking-evaluation approaches can only be carried out through questionnaires; the results are poor and users have little willingness to participate.
The above is provided only to aid understanding of the technical solution of the present invention and does not constitute an admission that it is prior art.
Disclosure of Invention
The main object of the present invention is to provide a user evaluation method, apparatus, device, and storage medium, aiming to solve the technical problem of poor user experience in prior-art user evaluation approaches.
In order to achieve the above object, the present invention provides a user evaluation method, comprising the steps of:
acquiring target information from a preset information base;
searching a corresponding prompt image according to the target information;
generating a prompt icon group in the prompt area according to the prompt image, wherein the prompt icon group is used for prompting a user to trigger a selection instruction;
generating a selectable icon group in a selection area according to the target information;
responding to a user selection instruction, and determining a user selection icon which is an icon in the selectable icon group;
and scoring the user according to the user selection icon.
Optionally, the generating a selectable icon group in the selection area according to the target information includes:
determining an answer image and an interference image according to the target information;
determining an answer icon according to the answer image;
generating a plurality of interference icons according to the interference images;
and generating a selectable icon group according to the answer icon and the plurality of interference icons.
Optionally, the intelligent terminal is connected to the wearable control device, and before determining that the user selects the icon in response to the user selection instruction, the method further includes:
detecting a user selection instruction, the user selection instruction being generated according to the wearable control device.
Optionally, the intelligent terminal is connected to a wearable control device, the wearable control device is provided with a brain signal detection device, and before responding to the user selection instruction and determining that the user selects the icon, the method further includes:
detecting a user selection instruction, the user selection instruction being generated according to the wearable control device;
receiving a brain signal detected by the brain signal detecting device;
determining a user concentration from the brain signals;
and when the concentration degree is smaller than a preset concentration degree threshold value, not responding to a user selection instruction.
Optionally, the intelligent terminal is connected with a wearable control device, the wearable control device is provided with a brain signal detection device, and after scoring the user according to the user selection icon, the method further includes:
acquiring brain signals, wherein the brain signals are acquired according to the brain signal detection device;
determining a user concentration degree change curve according to the brain signals;
generating a display image according to the user concentration degree change curve;
and completing a user concentration change-trend animation according to the display images.
Optionally, the generating a display image according to the user concentration degree variation curve further includes:
recording the time length of each icon selected by the user;
and generating a display image according to the user concentration degree change curve and the time length of each icon selected by the user.
Optionally, the scoring the user according to the user selection icon includes:
matching the user selection icon with a preset answer image;
determining the user graph selection score according to the matching result;
and scoring the user according to the user map selection score.
In addition, to achieve the above object, the present invention further provides a user evaluation apparatus, including:
the acquisition module 10 is used for acquiring target information from a preset information base;
the processing module 20 is configured to search for a corresponding prompt image according to the target information;
the processing module 20 is further configured to generate a prompt icon group in a prompt area according to the prompt image, where the prompt icon group is used to prompt a user to trigger a selection instruction;
the processing module 20 is further configured to generate a selectable icon group in a selection area according to the target information;
the processing module 20 is further configured to respond to a user selection instruction, and determine a user-selected icon, where the user-selected icon is an icon in the selectable icon group;
the processing module 20 is further configured to score the user according to the user selection icon.
In addition, to achieve the above object, the present invention further provides a user evaluation apparatus, including: a memory, a processor and a user evaluation program stored on the memory and executable on the processor, the user evaluation program being configured to implement the steps of the user evaluation method as described above.
Furthermore, to achieve the above object, the present invention also provides a storage medium, on which a user evaluation program is stored, the user evaluation program, when executed by a processor, implementing the steps of the user evaluation method as described above.
The method includes: acquiring target information from a preset information base; searching for a corresponding prompt image according to the target information; generating a prompt icon group in a prompt area according to the prompt image, the prompt icon group being used to prompt a user to trigger a selection instruction; generating a selectable icon group in a selection area according to the target information; responding to a user selection instruction and determining a user-selected icon, which is an icon in the selectable icon group; and scoring the user according to the user-selected icon. A selectable icon group and a prompt icon group are generated from the target information, i.e. the correct answer; the prompt icons hint at the correct answer to guide the user's selection, and the final answer is determined from the user's selections. This improves user satisfaction during the question-and-answer process and increases the user's willingness to participate.
Drawings
FIG. 1 is a schematic diagram of a user evaluation device of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a user evaluation method according to a first embodiment of the present invention;
FIG. 3 is a flowchart illustrating a user evaluation method according to a second embodiment of the present invention;
FIG. 4 is a block diagram of a first embodiment of a user evaluation apparatus according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a user evaluation device of a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the user evaluation device may include: a processor 1001, such as a central processing unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 enables communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally also include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be random access memory (RAM) or non-volatile memory (NVM), such as disk storage, and may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in FIG. 1 does not constitute a limitation of the user evaluation device, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and a user evaluation program.
In the user evaluation device shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The user evaluation device calls the user evaluation program stored in the memory 1005 through the processor 1001 and executes the user evaluation method provided by the embodiments of the present invention.
An embodiment of the present invention provides a user evaluation method, and referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of a user evaluation method according to the present invention.
In this embodiment, the user evaluation method includes the following steps:
step S10: and acquiring target information from a preset information base.
It should be noted that the execution subject of this embodiment is an intelligent terminal, which may be a mobile phone, a tablet computer, a personal computer, or another intelligent terminal with the same or similar functions.
It should be noted that this embodiment is applied to the user evaluation process: a user evaluation program on an intelligent terminal guides the user to perform corresponding operations, and the user's concentration is evaluated according to how the user operates. Because current user evaluation relies on simple questionnaires filled in manually, efficiency is low and the user experience is poor. This embodiment therefore provides a method of user evaluation through user operation: the user can use a wearable control device combined with screen interaction as the input mode for user parameters; graphical prompts and graphical choices are presented through a graphical interface so the user can select answers; the selected content is obtained; and the user's selections are scored for further thinking evaluation and training. Taking a Raven-style test as an example, the icon that should be selected can be inferred from several icons in a prompt icon group; the user is given different scores depending on which icon is selected and when; and finally the user's thinking can be evaluated from the scores.
It can be understood that the preset information base is a preset answer base; the target information, i.e. the answer, is extracted from the preset information base according to a preset rule or acquired at random, and the current interface is arranged accordingly.
Step S20: and searching a corresponding prompt image according to the target information.
It should be noted that, after the target information is determined, a corresponding prompt image needs to be determined according to the target information, where the prompt image is an image for prompting the user to select the target information. For example: assuming that the representation mode of the target information is an icon, the prompt image may be a plurality of graphic elements included in the icon, and assuming that the icon corresponding to the target information includes graphic elements such as wavy lines, crosses, red crosses, circles, and the like, the prompt image may be a set formed by the graphic elements.
Step S30: and generating a prompt icon group in the prompt area according to the prompt image, wherein the prompt icon group is used for prompting a user to trigger a selection instruction.
It is understood that generating the prompt icon group in the prompt area according to the prompt image means generating several icons for prompting from the information elements in the prompt image. For example, assuming the icon corresponding to the target information contains graphical elements such as a wavy line, a cross, a red cross, and a circle, icons each containing one of these elements (an icon containing a wavy line, an icon containing a red cross, and so on), or icons containing several elements, are generated from those graphical elements; these icons then form the prompt icon group, so the user can trigger a selection instruction according to the group's prompt.
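As a minimal, hypothetical sketch of this step (the function name, element names, and the subset strategy are illustrative assumptions, not taken from the patent), the prompt icons can be modeled as subsets of the target icon's graphical elements:

```python
import itertools

def build_prompt_icons(target_elements, group_size=2):
    """Generate prompt icons as subsets of the target icon's graphical
    elements, so that together they hint at the full answer icon."""
    # one icon per single graphical element
    prompts = [{e} for e in target_elements]
    # plus icons that combine several elements for richer hints
    prompts += [set(pair) for pair in itertools.combinations(target_elements, group_size)]
    return prompts

icons = build_prompt_icons(["wavy line", "cross", "red cross", "circle"])
```

With four elements this yields four single-element icons plus six pairwise combinations, i.e. ten prompt icons.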
Step S40: and generating a selectable icon group in the selection area according to the target information.
It should be noted that, according to the target information, a selectable icon group is generated in a selection area for the user to select, where the selection area refers to an area on the screen for the user to operate, and the selectable icon group is an icon provided for the user to select in the selection area, and includes an answer icon corresponding to the target information.
In the embodiment, an answer image and an interference image are determined according to the target information; determining an answer icon according to the answer image; generating a plurality of interference icons according to the interference images; and generating a selectable icon group according to the answer icon and the plurality of interference icons.
It can be understood that an answer image and an interference image are determined according to the target information. The answer image is the icon the prompt icon group points to, referred to in this scheme as the answer icon; interference icons can also be generated from the target information. For example, assuming the icon corresponding to the target information contains graphical elements such as wavy lines, crosses, red crosses, and circles, interference items can be formed by combining those elements with many irrelevant elements, or interference icons can even be generated from graphical elements not contained in the target icon. This raises the difficulty of the user's selection and improves the effect of thinking training.
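A minimal sketch of this distractor-generation idea (all names and the mixing rule are illustrative assumptions, not the patent's implementation) might mix answer elements with irrelevant elements:

```python
import random

def build_selectable_icons(answer_elements, irrelevant_elements, n_distractors=3, rng=None):
    """Combine the answer icon with distractor icons that mix answer
    elements and irrelevant elements, then shuffle the result."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    distractors = []
    for _ in range(n_distractors):
        # each distractor keeps one answer element and adds one irrelevant one,
        # so it resembles the answer without matching it
        distractors.append({rng.choice(answer_elements), rng.choice(irrelevant_elements)})
    icons = [set(answer_elements)] + distractors
    rng.shuffle(icons)
    return icons
```

Because every distractor contains an irrelevant element, the true answer icon is always uniquely present in the shuffled group.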
Step S50: and responding to a user selection instruction, and determining a user selection icon which is an icon in the selectable icon group.
It can be understood that the user-selected icon can be determined directly from a user selection instruction issued through touch-screen operation; the user can only select icons from the selection area.
In an embodiment, a user selection instruction is detected, the user selection instruction being generated in accordance with the wearable control device.
It can be understood that the intelligent terminal is connected with the wearable control device, and the display interface of the intelligent terminal contains the controllable image and the target images. The wearable device may be a head ring, a wrist band, a waist band, smart glasses, and so on; this embodiment does not limit its form. The connection between the intelligent terminal and the wearable device may be wired or wireless.
It should be noted that the wearable control device may be a head ring worn by the user. For example, when the head ring tilts to the left of the user's head, it generates a left-tilt instruction and the selection moves one icon to the left; when it tilts to the right, the selection moves one icon to the right; and when it tilts forward, the currently highlighted icon is selected.
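The tilt-to-selection mapping described above can be sketched as a small state machine (function and state names are illustrative assumptions):

```python
def apply_tilt(selected_index, tilt_state, num_icons):
    """Map a head-ring tilt state to a selection action:
    'left'  -> move the highlight one icon left,
    'right' -> move the highlight one icon right,
    'front' -> confirm the currently highlighted icon.
    Returns (new_index, confirmed)."""
    if tilt_state == "left":
        return max(0, selected_index - 1), False
    if tilt_state == "right":
        return min(num_icons - 1, selected_index + 1), False
    if tilt_state == "front":
        return selected_index, True  # selection confirmed
    return selected_index, False     # unknown state: no change
```

Clamping at the edges keeps the highlight inside the selectable icon group regardless of how often the user tilts.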
In an embodiment, a user selection instruction is detected, the user selection instruction being generated in accordance with the wearable control device; receiving a brain signal detected by the brain signal detecting device; determining a user concentration from the brain signals; and when the concentration degree is smaller than a preset concentration degree threshold value, not responding to a user selection instruction.
It can be understood that determining the user's concentration from brain signals is a mature technique in this field, and this embodiment does not describe it in detail. By monitoring concentration, the system blocks the user's operations when concentration is insufficient. This urges the user to stay focused, and also ensures that when the user is distracted, for example by a phone call or some other interruption, those other actions neither affect the evaluation flow nor produce erroneous operations.
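The gating rule can be sketched as follows (the function name and the threshold value are illustrative assumptions; the patent only specifies comparing concentration against a preset threshold):

```python
def handle_selection(instruction, concentration, threshold=0.6):
    """Suppress a selection instruction when the measured concentration
    falls below the preset threshold, so distracted input has no effect."""
    if concentration < threshold:
        return None  # instruction ignored; evaluation flow unaffected
    return instruction
```

A distracted input simply returns `None`, leaving the current question untouched until the user refocuses.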
Step S60: and scoring the user according to the user selection icon.
It should be noted that whether the user's selection is correct can be determined from the user-selected icon, and the selection is then scored, for example by adding points for each correct answer and computing the final cumulative score.
In one embodiment, matching is performed according to the user selection icon and a preset answer image; determining the user graph selection score according to the matching result; and scoring the user according to the user map selection score.
In a specific implementation, the user-selected icons are matched against the preset answer images, i.e. the number of correct selections over the whole process is counted and the user's score is generated. Within a preset time, for example 2 minutes, more correct answers or a higher accuracy rate yields a higher score.
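A minimal scoring sketch under these rules (the points-per-correct value and function name are illustrative assumptions; the patent does not fix a scoring formula):

```python
def score_user(selections, answers, points_per_correct=10):
    """Match each user-selected icon against the preset answer icon and
    return (cumulative score, accuracy rate) for the session."""
    correct = sum(1 for s, a in zip(selections, answers) if s == a)
    accuracy = correct / len(answers) if answers else 0.0
    return correct * points_per_correct, accuracy
```

Any monotone function of `correct` and `accuracy` would satisfy the "more correct answers, higher score" requirement; this linear rule is just the simplest instance.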
In this embodiment, a state instruction generated by the wearable control device is detected; the current state information of the target image is determined according to the state instruction; the controllable image is moved in response to a controllable-image movement instruction; when the controllable image moves to a preset position and the target image is in an acquirable state, the target image is eliminated; and an operation score is generated according to the number of eliminated target images. By combining screen operation with wearable-device operation, two control parameters are controlled in two ways during training, and concentration is then evaluated according to the final control effect, which improves the effect of concentration training.
Referring to fig. 3, fig. 3 is a flowchart illustrating a user evaluation method according to a second embodiment of the present invention.
Based on the first embodiment, after step S60, the user evaluation method in this embodiment further includes:
step S70: and acquiring a brain signal, wherein the brain signal is acquired according to the brain signal detection device.
In addition, since the brain signal is acquired by the brain signal detection device, a wearable device worn on the head, such as a head ring or smart glasses, is preferably used as the wearable control device in this embodiment.
Step S80: and determining a user concentration degree change curve according to the brain signals.
It should be noted that concentration values at each moment can be calculated from the historical brain signal data collected during the user evaluation. For example, brain waves are detected by a smart head ring and the brain's concentration value is estimated by an algorithm; this value serves as a state evaluation value that feeds into the training process. Computing a concentration value from brain waves is a common technique for those skilled in the art and is not repeated here. A user concentration change curve is then generated from the concentration values at each moment.
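As a stand-in for the unspecified concentration algorithm, the curve construction can be sketched with a simple moving average over per-sample concentration values (purely an illustrative assumption; the patent leaves the algorithm open):

```python
def concentration_curve(brain_samples, window=3):
    """Smooth per-sample concentration values into a change curve using
    a trailing moving average over the last `window` samples."""
    curve = []
    for i in range(len(brain_samples)):
        lo = max(0, i - window + 1)
        chunk = brain_samples[lo:i + 1]
        curve.append(sum(chunk) / len(chunk))
    return curve
```

The smoothed curve is what the display-image step consumes; a real system would replace the input with algorithmically estimated concentration values.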
Step S90: and generating a display image according to the user concentration degree change curve.
It can be understood that a changing image can be drawn from the user concentration change curve. For example, after the evaluation is finished, each selection stage is replayed: the span from generating the prompt icon group to the user completing a selection counts as one answer. A scene of a vehicle driving across a bridge can be generated and viewed from above, with the bridge serving as the time axis of the user's answers; each question corresponds to one segment of the bridge. If the user answered 20 questions in total, each question corresponds to one unit length of bridge in the image. The average user concentration during that answer stage is used as the bridge width, so the change in concentration is reflected by the changing width of the bridge extending forward. Describing the user's concentration through a vivid image helps increase the user's enthusiasm, improves the evaluation effect, and gives the user an image for review.
In one embodiment, the time the user takes to select each icon is recorded, and the display image is generated according to the user concentration degree change curve together with these selection times.
It should be noted that the time taken to select each icon is the user's answer time: the longer the time, the longer the user deliberated. On this basis, the displayed image can be adjusted according to the answer time of each icon selection stage. For example, keeping the scene of the vehicle driving on the bridge, the longer the user takes to select the icons for a question, the slower the vehicle travels over that question's section. The specific calculation may be V = S/T, where V is the vehicle speed, S is the bridge length corresponding to one question, and T is the time the user took to select the icons for the current question.
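A minimal sketch of this replay-speed rule, assuming one unit of bridge per question and answer time measured in seconds:

```python
def vehicle_speed(bridge_length, answer_time):
    """Replay speed for one question: V = S / T.

    Longer answer times replay as slower travel over that question's
    bridge section, so deliberation is visible in the animation.
    """
    if answer_time <= 0:
        raise ValueError("answer time must be positive")
    return bridge_length / answer_time
```

For example, a question answered in 2 seconds over a unit-length section replays at half the speed of one answered in 1 second.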
Step S100: completing the user concentration degree change trend animation according to the display image.
It can be understood that the user concentration trend animation is completed from the display images of the questions: the images are connected to generate an animation of the vehicle driving on the bridge. Presenting the user's concentration in animated form, linked to the answering process, lets the user observe clearly and intuitively how their attention changed while answering and whether those changes relate to answering speed. This provides secondary motivation and lets the user make an intuitive judgment, improving the user experience.
In this embodiment, brain signals are acquired by the brain signal detection device; a user concentration degree change curve is determined according to the brain signals; a display image is generated according to the curve; and the user concentration trend animation is completed according to the display image. In this way, the user's icon selection process is displayed as an animation, improving user satisfaction.
In addition, an embodiment of the present invention further provides a storage medium, where the storage medium stores a user evaluation program, and the user evaluation program, when executed by a processor, implements the steps of the user evaluation method described above.
Referring to fig. 4, fig. 4 is a block diagram of a first embodiment of the user evaluation apparatus according to the present invention.
As shown in fig. 4, the user evaluation apparatus according to the embodiment of the present invention includes:
the obtaining module 10 is configured to obtain target information from a preset information base.
And the processing module 20 is configured to search for a corresponding prompt image according to the target information.
The processing module 20 is further configured to generate a prompt icon group in a prompt area according to the prompt image.
The processing module 20 is further configured to generate a selectable icon group in the selection area according to the target information, where the prompt icon group is used to prompt a user to trigger a selection instruction.
The processing module 20 is further configured to determine a user-selected icon in response to a user selection instruction, where the user-selected icon is an icon in the selectable icon group.
The processing module 20 is further configured to score the user according to the user selection icon.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited thereto.
In this embodiment, the obtaining module 10 obtains target information from a preset information base; the processing module 20 searches for a corresponding prompt image according to the target information; the processing module 20 generates a prompt icon group in the prompt area according to the prompt image, where the prompt icon group is used to prompt the user to trigger a selection instruction; the processing module 20 generates a selectable icon group in the selection area according to the target information; the processing module 20 determines a user-selected icon in response to a user selection instruction, the user-selected icon being an icon in the selectable icon group; and the processing module 20 scores the user according to the user-selected icon. The target information of the correct answer is used to generate the selectable icon group and the prompt icon group, the prompt icons hint at the correct answer to guide the user's selection, and the final answer is determined from the user's selection. This improves the user's satisfaction in the question-and-answer process and the user's willingness to participate.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not described in detail in this embodiment may refer to the user evaluation method provided in any embodiment of the present invention, and are not described herein again.
In an embodiment, the processing module 20 is further configured to determine an answer image and an interference image according to the target information;
determining an answer icon according to the answer image;
generating a plurality of interference icons according to the interference images;
and generating a selectable icon group according to the answer icon and the plurality of interference icons.
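The selectable-icon-group generation just described can be sketched as follows. Icons are represented here as simple labels, and the shuffling step is an assumption (the patent does not state how the answer icon is positioned among the interference icons):

```python
import random

def build_selectable_icons(answer_icon, interference_icons, rng=None):
    """Combine the answer icon with the interference icons for the
    selection area, shuffled so the answer's position is unpredictable.

    A sketch: real icons would be images derived from the answer image
    and interference images; labels stand in for them here.
    """
    rng = rng or random.Random()
    icons = [answer_icon] + list(interference_icons)
    rng.shuffle(icons)
    return icons
```

The prompt icon group generated from the prompt image then hints at which of these icons is correct.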
In an embodiment, the processing module 20 is further configured to detect a user selection instruction, where the user selection instruction is generated according to the wearable control device.
In an embodiment, the processing module 20 is further configured to detect a user selection instruction, where the user selection instruction is generated according to the wearable control device;
receiving a brain signal detected by the brain signal detecting device;
determining a user concentration from the brain signals;
and when the concentration degree is smaller than a preset concentration degree threshold value, not responding to a user selection instruction.
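The concentration-gated response described above can be sketched as follows; the 0.4 default threshold is illustrative, since the patent only specifies that a preset threshold exists:

```python
def handle_selection(instruction, concentration, threshold=0.4):
    """Respond to a wearable-device selection instruction only when the
    concentration read from the brain signal meets a preset threshold.

    Returns the selected icon when accepted, or None when the
    concentration is below the threshold (no response).
    """
    if concentration < threshold:
        return None          # below threshold: do not respond
    return instruction       # accepted: pass the selection through
```

This way, selections made while the user is inattentive are ignored rather than scored.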
In an embodiment, the processing module 20 is further configured to acquire a brain signal, where the brain signal is acquired according to the brain signal detecting device;
determining a user concentration degree change curve according to the brain signals;
generating a display image according to the user concentration degree change curve;
and finishing the animation of the change trend of the concentration degree of the user according to the display image.
In an embodiment, the processing module 20 is further configured to record a time length for the user to select each icon;
and generating a display image according to the user concentration degree change curve and the time length of each icon selected by the user.
In an embodiment, the processing module 20 is further configured to match the icon selected by the user with a preset answer image;
determining the user graph selection score according to the matching result;
and scoring the user according to the user map selection score.
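The matching-and-scoring step can be sketched as follows. Real icons would be compared against the preset answer image by content; here labels stand in for them, and the one-point-per-match rule is an illustrative assumption:

```python
def score_selection(selected_icon, answer_icon, points=1):
    """Match one user-selected icon against the preset answer and
    return the per-question map-selection score."""
    return points if selected_icon == answer_icon else 0

def score_user(selections, answers):
    """Total map-selection score across all questions, used to
    score the user at the end of the evaluation."""
    return sum(score_selection(s, a) for s, a in zip(selections, answers))
```

The per-question results feed the overall user score reported by the evaluation.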
Furthermore, it should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another identical element in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A user evaluation method is applied to an intelligent terminal, a display interface of the intelligent terminal comprises a prompt area and a selection area, and the user evaluation method comprises the following steps:
acquiring target information from a preset information base;
searching a corresponding prompt image according to the target information;
generating a prompt icon group in the prompt area according to the prompt image, wherein the prompt icon group is used for prompting a user to trigger a selection instruction;
generating a selectable icon group in a selection area according to the target information;
responding to a user selection instruction, and determining a user selection icon which is an icon in a selectable icon group;
and scoring the user according to the user selection icon.
2. The method of claim 1, wherein said generating a set of selectable icons in a selection area based on said target information comprises:
determining an answer image and an interference image according to the target information;
determining an answer icon according to the answer image;
generating a plurality of interference icons according to the interference images;
and generating a selectable icon group according to the answer icon and the plurality of interference icons.
3. The method of claim 1, wherein the smart terminal is connected to a wearable control device, and before the determining a user selection icon in response to a user selection instruction, the method further comprises:
detecting a user selection instruction, the user selection instruction being generated according to the wearable control device.
4. The method of claim 1, wherein the smart terminal is connected to a wearable control device, the wearable control device is provided with a brain signal detection device, and the method further comprises, before determining that the user selects the icon in response to the user selection instruction:
detecting a user selection instruction, the user selection instruction being generated according to the wearable control device;
receiving a brain signal detected by the brain signal detecting device;
determining a user concentration from the brain signals;
and when the concentration degree is smaller than a preset concentration degree threshold value, not responding to a user selection instruction.
5. The method of claim 1, wherein the smart terminal is connected to a wearable control device, the wearable control device is provided with a brain signal detection device, and after scoring the user according to the user-selected icon, the method further comprises:
acquiring brain signals, wherein the brain signals are acquired according to the brain signal detection device;
determining a user concentration degree change curve according to the brain signals;
generating a display image according to the user concentration degree change curve;
and finishing the animation of the change trend of the concentration degree of the user according to the display image.
6. The method of claim 5, wherein the generating a display image according to the user concentration degree change curve further comprises:
recording the time length of each icon selected by the user;
and generating a display image according to the user concentration degree change curve and the time length of each icon selected by the user.
7. The method of any of claims 1-6, wherein said scoring a user according to said user selected icon comprises:
matching the user selection icon with a preset answer image;
determining the user graph selection score according to the matching result;
and scoring the user according to the user map selection score.
8. A user assessment apparatus, characterized in that the user assessment apparatus comprises:
the acquisition module 10 is used for acquiring target information from a preset information base;
the processing module 20 is configured to search for a corresponding prompt image according to the target information;
the processing module 20 is further configured to generate a prompt icon group in a prompt area according to the prompt image;
the processing module 20 is further configured to generate a selectable icon group in the selection area according to the target information, where the prompt icon group is used to prompt a user to trigger a selection instruction;
the processing module 20 is further configured to respond to a user selection instruction, and determine a user-selected icon, where the user-selected icon is an icon in the selectable icon group;
the processing module 20 is further configured to score the user according to the user selection icon.
9. A user evaluation device, the device comprising: a memory, a processor and a user evaluation program stored on the memory and executable on the processor, the user evaluation program being configured to implement the steps of the user evaluation method of any one of claims 1 to 7.
10. A storage medium, characterized in that the storage medium has stored thereon a user evaluation program which, when executed by a processor, carries out the steps of the user evaluation method according to any one of claims 1 to 7.
CN202211569149.2A 2022-12-08 2022-12-08 User evaluation method, device, equipment and storage medium Pending CN115576464A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211569149.2A CN115576464A (en) 2022-12-08 2022-12-08 User evaluation method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN115576464A true CN115576464A (en) 2023-01-06

Family

ID=84590163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211569149.2A Pending CN115576464A (en) 2022-12-08 2022-12-08 User evaluation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115576464A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115756177A (en) * 2023-01-10 2023-03-07 深圳市心流科技有限公司 User evaluation method, device, equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014001271A2 (en) * 2012-06-25 2014-01-03 Hospices Civils De Lyon Device for assessing the cognitive abilities of a patient
US20170337834A1 (en) * 2016-05-17 2017-11-23 Rajaa Shindi Interactive brain trainer
CN109589122A (en) * 2018-12-18 2019-04-09 中国科学院深圳先进技术研究院 A kind of cognitive ability evaluation system and method
CN111227849A (en) * 2020-02-11 2020-06-05 杭州同绘科技有限公司 Attention assessment system and method based on VR
CN111700611A (en) * 2020-06-16 2020-09-25 中国科学院深圳先进技术研究院 Method for assessing insight capabilities and related device
CN112957049A (en) * 2021-02-10 2021-06-15 首都医科大学宣武医院 Attention state monitoring device and method based on brain-computer interface equipment technology
CN113191438A (en) * 2021-05-08 2021-07-30 啊哎(上海)科技有限公司 Learning style recognition model training and recognition method, device, equipment and medium
CN115329901A (en) * 2022-10-12 2022-11-11 深圳市心流科技有限公司 Cognitive training method, device, equipment and storage terminal




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230106