KR101447563B1 - Evaluation system of cognitive ability based on physical object and method thereof - Google Patents
- Publication number
- KR101447563B1 (application KR1020130037446A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- physical object
- cognitive
- cognitive ability
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Description
The present invention relates to a cognitive capability evaluation system and method, and more particularly, to a cognitive capability evaluation system and method based on physical objects.
In an aging society, the evaluation of cognitive function in the elderly has become increasingly important. Conventional cognitive ability assessment techniques are generalized across age groups and have not been developed with the characteristics and tool-manipulation abilities of the elderly in mind. Currently available cognitive ability measurement and training techniques are performed using paper documents, computers, or direct observation. Examples include the Korea Computerized Neuropsychological Test (KCNT), computerized neuropsychological tests (CNT), 3D virtual-reality ability measurement programs, and Internet-based cognitive stability indices.
However, paper-based methods of measuring cognitive ability are difficult to apply to elderly people who have trouble reading because of physical impairment or illiteracy, and computer-based methods are unsuitable for elderly people who are unfamiliar with computers; evaluating a user with unfamiliar input tools (a mouse or keyboard) can misrepresent the user's actual abilities.
In addition, conventional methods of measuring activities of daily living (ADL) performance are subject to subjective factors, because a nurse or therapist directly rates the degree to which the subject performs a given task.
To solve these problems, a system and method are required that evaluate cognitive ability from the user's operations while providing an easy-to-operate interface, so that an objective evaluation result can be obtained that is unaffected by environmental variables and measurement tools.
According to one aspect of the present invention, there is provided a physical object based cognitive ability evaluation system comprising: a physical object; an operation data receiving unit that receives, from the physical object, operation data on an operation of a user manipulating the physical object; an operation element extracting unit that extracts at least one operation element from the operation data; and a cognitive ability evaluation unit that evaluates the user's cognitive ability level by applying the extracted at least one operation element to at least one cognitive ability evaluation element.
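The three units named above form a simple pipeline: raw operation data comes in, operation elements are extracted from it, and those elements are turned into cognitive ability scores. The following Python sketch illustrates that flow; all class names, the sample fields, and the toy scoring rule are assumptions for illustration, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class OperationDataReceiver:
    """Collects raw operation data samples sent by the physical object."""
    samples: list = field(default_factory=list)

    def receive(self, sample: dict) -> None:
        # each sample might look like {"t": 0.0, "accel": 0.1, "tilt": 2.0, "pressure": 0.3}
        self.samples.append(sample)

class OperationElementExtractor:
    """Pulls simple operation elements out of the received samples."""
    def extract(self, samples: list) -> dict:
        if not samples:
            return {"duration": 0.0, "events": 0}
        duration = samples[-1]["t"] - samples[0]["t"]
        return {"duration": duration, "events": len(samples)}

class CognitiveAbilityEvaluator:
    """Applies operation elements to a cognitive ability evaluation element."""
    def evaluate(self, elements: dict) -> dict:
        # toy rule: shorter task duration yields a higher psychomotor-speed score
        speed = 100.0 / (1.0 + elements["duration"])
        return {"psychomotor_speed": round(speed, 1)}

receiver = OperationDataReceiver()
for t in (0.0, 1.0, 3.0):
    receiver.receive({"t": t, "accel": 0.0, "tilt": 0.0, "pressure": 0.0})
elements = OperationElementExtractor().extract(receiver.samples)
scores = CognitiveAbilityEvaluator().evaluate(elements)
```

The separation into three small classes mirrors the receiving unit, extracting unit, and evaluation unit of the system described above.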
Also, in the physical object based cognitive ability evaluation system, the physical object may include a sensor unit capable of sensing at least one of acceleration, tilt, and pressure, and the sensor unit may sense the user's operation to generate the operation data.
The system may further include a camera that captures an image of the user's operation to generate image data and transmits the image data to the operation data receiving unit; in this case the operation data may include the image data.
Also, the physical object may include a touch screen, and the user's operation may be a direct touch on the touch screen. The physical object may further include a handheld tool, in which case the user's operation is to touch the touch screen with the handheld tool.
Also, the operation element extracting unit may analyze the operation data to divide the user's operation into a plurality of operation sections and extract operation elements from a predetermined operation section among them. The predetermined operation section may be an operation section that contains an operation element corresponding to a cognitive ability evaluation element.
The system may further include a database storing data that defines at least one cognitive ability evaluation element corresponding to each operation element; the cognitive ability evaluation unit then evaluates the user's cognitive ability by applying each extracted operation element to its corresponding cognitive ability evaluation element.
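Such a database can be thought of as a lookup table from operation elements to the evaluation elements they feed. A minimal sketch, assuming illustrative element names that are not taken from the patent:

```python
# Hypothetical mapping table: each operation element name is keyed to the
# cognitive ability evaluation elements it corresponds to, as the database
# described above might define.
ELEMENT_MAP = {
    "completion_time": ["psychomotor_speed", "execution"],
    "selection_accuracy": ["attention", "memory"],
    "placement_error": ["visuospatial_function"],
}

def evaluation_elements_for(operation_elements):
    """Collect the evaluation elements that the extracted operation
    elements correspond to, preserving first-seen order."""
    seen = []
    for op in operation_elements:
        for ev in ELEMENT_MAP.get(op, []):
            if ev not in seen:
                seen.append(ev)
    return seen
```

For example, `evaluation_elements_for(["completion_time", "selection_accuracy"])` would select the psychomotor speed, execution, attention, and memory elements for scoring.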
In addition, in the cognitive ability evaluation system based on a physical object, the cognitive ability evaluation elements may include psychomotor speed, attention, linguistic ability, calculation, visuospatial function, memory, and execution.
The system may further include a status presentation unit that delivers a message requesting a specific operation to the user by voice or on a display, and an output unit that outputs the cognitive ability evaluation result audibly or visually.
According to another aspect of the present invention, there is provided a method for evaluating cognitive ability based on a physical object, comprising the steps of: receiving, from the physical object, operation data on an operation of a user manipulating the physical object; extracting at least one operation element from the operation data; and applying the extracted at least one operation element to at least one cognitive ability evaluation element to evaluate the user's cognitive ability level.
The method may further include sensing at least one of the acceleration, tilt, and pressure of the physical object to generate the operation data.
Further, the method may include photographing the operation of the user manipulating the physical object to generate image data, in which case the operation data includes the image data.
Also, in the method, the physical object may include a touch screen, and the user's operation may be a direct touch on the touch screen. The physical object may further include a handheld tool, in which case the user's operation is to touch the touch screen with the handheld tool.
Also, in the method, the step of extracting the operation element may include analyzing the operation data to divide the user's operation into a plurality of operation sections and extracting operation elements from a predetermined operation section among them. The predetermined operation section may be an operation section that contains an operation element corresponding to a cognitive ability evaluation element.
The method may further include defining at least one cognitive ability evaluation element corresponding to each operation element; the step of evaluating the user's cognitive ability level then evaluates the user's cognitive ability by applying each extracted operation element to its corresponding cognitive ability evaluation element.
In addition, in the physical object based cognitive ability evaluation method, the cognitive ability evaluation elements may include psychomotor speed, attention, linguistic ability, calculation, visuospatial function, memory, and execution.
In addition, the method may include delivering a message requesting a specific operation to the user by voice or on a display, and outputting the cognitive ability evaluation result audibly or visually.
According to the present invention, the user performs the cognitive ability evaluation using tools familiar from daily life, so the evaluation result is objective and less influenced by other environmental factors.
FIG. 1 is a configuration diagram of a cognitive ability evaluation system based on a physical object according to an embodiment of the present invention.
FIG. 2 is a configuration diagram of a physical object according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a user operation through a touch screen according to an exemplary embodiment of the present invention.
FIG. 4 is a diagram illustrating a user's operation divided into a plurality of operation sections according to an embodiment of the present invention.
FIG. 5 is a diagram for explaining the correspondence between operation sections (operation elements) and cognitive ability evaluation elements according to an embodiment of the present invention.
FIG. 6 is a flowchart of a cognitive ability evaluation method according to an embodiment of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of others.
Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meaning in the context of the relevant art, and are not to be construed as ideal or overly formal unless expressly so defined herein. Like reference numerals in the drawings denote like elements.
In the following description, well-known functions or constructions are not described in detail to avoid unnecessarily obscuring the subject matter of the present invention. In addition, the size of each component in the drawings may be exaggerated for clarity of explanation and does not represent an actual applied size.
Embodiments described herein may be wholly hardware, partly hardware and partly software, or entirely software. In this specification, a "unit," "module," "device," or "system" refers to a computer-related entity, such as hardware, a combination of hardware and software, or software. For example, it may be, but is not limited to, a process, a processor, an object, an executable, a thread of execution, a program, or a computer; both an application running on a computer and the computer itself may correspond to a unit, module, device, or system of this specification.
Embodiments are described with reference to the flowcharts shown in the drawings. Although the methods are shown and described as a series of blocks for simplicity, the invention is not limited to the order of the blocks: some blocks may occur in a different order from, or concurrently with, other blocks, and various other branches, flow paths, and orderings of blocks that achieve the same or a similar result may be implemented. Not all illustrated blocks are necessarily required to implement the methods described herein. Furthermore, a method according to an embodiment of the present invention may be implemented as a computer program for performing a series of processes, and the computer program may be recorded on a computer-readable recording medium.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a configuration diagram of a cognitive ability evaluation system based on a physical object according to an embodiment of the present invention. Referring to FIG. 1, the cognitive ability evaluation system includes a physical object 100, an operation data receiving unit 200, an operation element extracting unit 300, a cognitive ability evaluation unit 400, a camera 500, and a database 600.
FIG. 2 is a configuration diagram of a physical object according to an embodiment of the present invention. The physical object may include a sensor unit capable of sensing at least one of acceleration, tilt, and pressure; the sensor unit senses the user's operation and generates the operation data.
For example, if a model knife and a model fruit are the physical objects, sensors may be attached to the model knife and the fruit. In this case, the sensor unit senses the user's operation, such as cutting the model fruit with the model knife, and generates the corresponding operation data.
FIG. 3 is a diagram illustrating a user operation through a touch screen according to an exemplary embodiment of the present invention. Referring to FIG. 3, virtual objects 121-126 are displayed on the touch screen, and the user manipulates them by touching the screen.
The touch input unit (not shown) of the touch screen records the coordinates and time of each touch, which can be used to generate the operation data. The user's touch may be a direct touch with the body or a touch made with a handheld tool. The handheld tool may be any physical object usable for cognitive ability assessment, including balls, cups, and model food. At least one of the camera, the touch screen, and sensor-equipped physical objects may be used at the same time to determine the user's operation; when the physical object is a touch screen, the camera may be installed integrally with the touch screen.
FIG. 4 is a diagram illustrating a user's operation divided into a plurality of operation sections according to an embodiment of the present invention. FIG. 4 shows the process of making a sandwich, divided into a plurality of operation sections. The operation of FIG. 4 may be performed using a physical object that includes an acceleration sensor or the like, or using a touch screen. When acceleration sensors are used, a sensor must be provided for each of the bread, vegetable, and cheese models.
The operation element extracting unit analyzes the operation data to divide the user's operation into these operation sections and extracts operation elements from a predetermined operation section among them.
An operation element is information representing how the user manipulates the physical object, such as an action of moving a physical object a certain distance, a specific selection action, or a process of assembling or disassembling a physical object.
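The division of a logged operation into operation sections can be sketched concretely. The following Python example assumes each sensor event is tagged with the object it came from (the bread, vegetable, and cheese models of the sandwich example) and starts a new section whenever the handled object changes; this tagging heuristic is an illustrative assumption, not the patent's stated method.

```python
def split_into_sections(events):
    """Divide a time-ordered operation log into operation sections.

    events: list of (timestamp, object_name) tuples. A new section is
    opened whenever the manipulated object changes.
    """
    sections = []
    for t, obj in events:
        if not sections or sections[-1]["object"] != obj:
            sections.append({"object": obj, "start": t, "end": t})
        else:
            sections[-1]["end"] = t
    return sections

# a hypothetical sandwich-making log: place bread, add vegetable, add cheese
log = [(0.0, "bread"), (0.5, "bread"), (1.2, "vegetable"),
       (2.0, "cheese"), (2.4, "cheese")]
sections = split_into_sections(log)
```

Each resulting section carries its own start and end times, from which per-section operation elements (such as how long the user took with each ingredient) can be extracted.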
FIG. 5 is a diagram for explaining the correspondence between operation sections (operation elements) and cognitive ability evaluation elements according to an embodiment of the present invention. As shown in FIG. 5, each operation element extracted from an operation section is applied to the cognitive ability evaluation element that corresponds to it, and the cognitive ability evaluation unit evaluates the user's cognitive ability level from the result. The database stores the data defining which cognitive ability evaluation element corresponds to each operation element.
The status presentation unit delivers a message requesting a specific operation to the user by voice or on a display, and the output unit outputs the cognitive ability evaluation result audibly or visually.
By the organic combination and function of the above-described components, the cognitive ability evaluation system can evaluate the user's cognitive ability objectively, with less influence from environmental factors.
FIG. 6 is a flowchart of a cognitive ability evaluation method according to an embodiment of the present invention. Referring to FIG. 6, the cognitive ability evaluation method includes receiving, from a physical object, operation data on an operation of a user manipulating the physical object (S10), extracting at least one operation element from the operation data (S20), and applying the extracted at least one operation element to at least one cognitive ability evaluation element to evaluate the user's cognitive ability level (S30).
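The three steps S10 through S30 can be sketched as three small functions chained together. The function names, event fields, and scoring rules below are illustrative assumptions, not the patent's formulas.

```python
def receive_operation_data(raw_events):
    """S10: collect raw events from the physical object, time-ordered."""
    return sorted(raw_events, key=lambda e: e["t"])

def extract_operation_elements(data):
    """S20: derive simple operation elements from the ordered data."""
    return {
        "completion_time": data[-1]["t"] - data[0]["t"],
        "touch_count": len(data),
    }

def evaluate_cognitive_ability(elements):
    """S30: apply the operation elements to evaluation elements."""
    return {
        # toy rules: faster completion -> higher psychomotor-speed score,
        # more completed interactions -> higher execution score (capped at 1.0)
        "psychomotor_speed": 1.0 / (1.0 + elements["completion_time"]),
        "execution": min(1.0, elements["touch_count"] / 10.0),
    }

events = [{"t": 2.0}, {"t": 0.0}, {"t": 5.0}]
result = evaluate_cognitive_ability(
    extract_operation_elements(receive_operation_data(events)))
```

Chaining the functions keeps each step independently testable, which matches the flowchart's decomposition into S10, S20, and S30.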
The cognitive ability evaluation method may further include sensing at least one of the acceleration, tilt, and pressure of the physical object to generate the operation data. It may also include photographing the operation of the user manipulating the physical object to generate image data, and the image data may be used as operation data.
In one embodiment, the physical object may be a touch screen, and the user's motion may be evaluated from operations that touch the touch screen. The touch may be a direct touch with the body or a touch made with a tool. In another embodiment, the physical object may further comprise a handheld tool in addition to the touch screen, and operations in which the user manipulates, moves, assembles, or disassembles the handheld tool may be used as data. In this case, the coordinates and times on the touch screen where the handheld tool is located can be used to generate the operation data.
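Touch coordinates and times can be turned into operation data such as path length and movement speed. A minimal sketch, assuming (x, y, t) samples in screen units and seconds (the field layout is an assumption for illustration):

```python
import math

def path_metrics(touches):
    """Compute path length and average speed from touch samples.

    touches: list of (x, y, t) tuples recorded by the touch screen.
    """
    dist = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(touches, touches[1:]):
        dist += math.hypot(x1 - x0, y1 - y0)  # straight-line segment length
    elapsed = touches[-1][2] - touches[0][2]
    speed = dist / elapsed if elapsed > 0 else 0.0
    return {"distance": dist, "elapsed": elapsed, "speed": speed}

# a drag from (0, 0) to (3, 4) in one second, then a one-second hold
metrics = path_metrics([(0, 0, 0.0), (3, 4, 1.0), (3, 4, 2.0)])
```

Metrics like these could feed evaluation elements such as psychomotor speed, though the patent does not specify the exact quantities used.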
In another embodiment, the step of extracting the operation element (S20) includes analyzing the operation data to divide the user's operation into a plurality of operation sections and extracting operation elements from a predetermined operation section among them. Here, the predetermined operation section may be an operation section that contains an operation element corresponding to a cognitive ability evaluation element.
Further, the method may include defining at least one cognitive ability evaluation element corresponding to each operation element. The step of evaluating the user's cognitive ability level (S30) may then evaluate the user's cognitive ability by applying each extracted operation element to the cognitive ability evaluation element corresponding to it.
In one embodiment, the method may further include delivering a message requesting a specific operation to the user by voice or on a display, and outputting the cognitive ability evaluation result audibly or visually.
While the invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes and modifications may be made therein without departing from the spirit and scope of the invention as defined by the appended claims, and that such modifications fall within the technical scope of the present invention. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.
Physical object: 100
Operation data receiving unit: 200
Operation element extracting unit: 300
Cognitive ability evaluation unit: 400
Camera: 500
Database: 600
Claims (20)
An operation data receiving unit operable to receive operation data on an operation of a user operating the physical object from the physical object;
An operation element extracting unit for extracting at least one operation element from the operation data; And
And a cognitive ability evaluation unit for evaluating a cognitive ability level of the user by applying the extracted at least one operation element to two or more cognitive ability evaluation elements,
Wherein the operation element extracting section analyzes the operation data to divide the operation of the user into a plurality of operation sections, extracts operation elements in a predetermined operation section from the divided operation sections,
Wherein the predetermined operation section is an operation section including an operation element corresponding to the cognitive ability evaluation element.
Wherein the physical object includes a sensor portion capable of sensing at least one of acceleration, tilt, and pressure,
Wherein the sensor unit senses the operation of the user and generates the operation data.
Further comprising a camera,
The camera captures the operation of the user to generate image data, transmits the image data to the operation data receiver,
Wherein the operation data includes the image data.
Wherein the physical object comprises a touch screen,
Wherein the operation of the user directly touches the touch screen.
The physical object further includes a handheld tool,
Wherein the action of the user is to touch the handheld tool to the touch screen.
Further comprising: a database storing data defining at least one cognitive performance evaluation element corresponding to each action element,
Wherein the cognitive ability evaluation unit evaluates the cognitive ability of the user by applying the extracted operation element to the cognitive ability evaluation element corresponding to the extracted operation element.
Wherein the cognitive ability evaluation element includes at least one of psychomotor speed, attention, linguistic ability, calculation, visuospatial function, memory, and execution.
A status presentation unit for delivering a message requesting a specific operation to the user to a voice or a display; And
Further comprising an output unit for audibly or visually outputting the cognitive ability evaluation result.
Extracting at least one operation element from the operation data; And
Applying the extracted at least one operation element to two or more cognitive ability evaluation elements to evaluate a cognitive ability level of the user,
Wherein the extracting of the operating element comprises:
Analyzing the operation data to divide the operation of the user into a plurality of operation sections and extracting operation elements in a predetermined operation section from the separated operation sections,
Wherein the predetermined operation section is an operation section including an operation element corresponding to the cognitive ability evaluation element.
Further comprising sensing at least one of acceleration, tilt, and pressure of the physical object to generate the operation data.
Further comprising the step of photographing an operation of a user manipulating the physical object to generate image data,
Wherein the operation data includes the image data.
Wherein the physical object comprises a touch screen,
Wherein the operation of the user directly touches the touch screen.
Wherein the physical object further comprises a handheld tool,
Wherein the action of the user is to touch the handheld tool to the touch screen.
Further comprising defining at least one cognitive capability assessment element corresponding to each action element,
Evaluating the user's cognitive ability level comprises:
And evaluating the cognitive ability of the user by applying the extracted action elements to a cognitive ability assessment element corresponding to the extracted action element.
Wherein the cognitive ability evaluation element includes at least one of psychomotor speed, attention, linguistic ability, calculation, visuospatial function, memory, and execution.
Delivering a message to the user requesting a specific action to a voice or display; And
Further comprising the step of audibly or visually outputting the cognitive ability evaluation result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130037446A KR101447563B1 (en) | 2013-04-05 | 2013-04-05 | Evaluation system of cognitive ability based on physical object and method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130037446A KR101447563B1 (en) | 2013-04-05 | 2013-04-05 | Evaluation system of cognitive ability based on physical object and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101447563B1 true KR101447563B1 (en) | 2014-10-08 |
Family
ID=51996643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130037446A KR101447563B1 (en) | 2013-04-05 | 2013-04-05 | Evaluation system of cognitive ability based on physical object and method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101447563B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160142597A (en) | 2015-06-03 | 2016-12-13 | 조선대학교산학협력단 | Health condition estimation apparatus and method for tracking through the person's hand function |
KR102123869B1 (en) * | 2019-02-12 | 2020-06-23 | 장성철 | Training device and method for improving cognitive response |
WO2022085853A1 (en) * | 2020-10-20 | 2022-04-28 | 빅픽쳐스 주식회사 | Method for determining whether object is recognized, and device using same |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006320424A (en) * | 2005-05-17 | 2006-11-30 | Tama Tlo Kk | Action teaching apparatus and method |
JP2010527642A (en) * | 2007-04-13 | 2010-08-19 | ナイキ インターナショナル リミテッド | Inspection and training of visual cognitive ability and cooperative behavior |
KR20120107736A (en) * | 2011-03-22 | 2012-10-04 | 한국과학기술연구원 | Norm-based cognitive measuring and evaluation system |
- 2013-04-05: Application KR1020130037446A filed; patent KR101447563B1 active (IP Right Grant)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006320424A (en) * | 2005-05-17 | 2006-11-30 | Tama Tlo Kk | Action teaching apparatus and method |
JP2010527642A (en) * | 2007-04-13 | 2010-08-19 | ナイキ インターナショナル リミテッド | Inspection and training of visual cognitive ability and cooperative behavior |
KR20120107736A (en) * | 2011-03-22 | 2012-10-04 | 한국과학기술연구원 | Norm-based cognitive measuring and evaluation system |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160142597A (en) | 2015-06-03 | 2016-12-13 | 조선대학교산학협력단 | Health condition estimation apparatus and method for tracking through the person's hand function |
KR102123869B1 (en) * | 2019-02-12 | 2020-06-23 | 장성철 | Training device and method for improving cognitive response |
WO2022085853A1 (en) * | 2020-10-20 | 2022-04-28 | 빅픽쳐스 주식회사 | Method for determining whether object is recognized, and device using same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment | Payment date: 20170828; Year of fee payment: 4 |
FPAY | Annual fee payment | Payment date: 20180903; Year of fee payment: 5 |
FPAY | Annual fee payment | Payment date: 20190902; Year of fee payment: 6 |