KR101447563B1 - Evaluation system of cognitive ability based on physical object and method thereof - Google Patents


Info

Publication number
KR101447563B1
KR101447563B1
Authority
KR
South Korea
Prior art keywords
user
physical object
cognitive
cognitive ability
data
Prior art date
Application number
KR1020130037446A
Other languages
Korean (ko)
Inventor
박완주
조성업
권규현
박세형
김래현
Original Assignee
한국과학기술연구원 (Korea Institute of Science and Technology)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술연구원 (Korea Institute of Science and Technology)
Priority to KR1020130037446A
Application granted
Publication of KR101447563B1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 - Electrically-operated educational appliances
    • G09B 5/06 - Electrically-operated educational appliances with both visual and audible presentation of the material to be studied

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a physical-object-based cognitive ability evaluation system and a method thereof. The system comprises: the physical object; an operation data receiving unit that receives, from the physical object, operation data about the movement of a user manipulating the physical object; an operation element extracting unit that extracts at least one operation element from the operation data; and a cognitive ability evaluation unit that evaluates the user's cognitive ability level by applying the at least one extracted operation element to at least one cognitive ability evaluation element.

Description

TECHNICAL FIELD [0001] The present invention relates to a physical-object-based cognitive ability evaluation system and method.

The present invention relates to a cognitive capability evaluation system and method, and more particularly, to a cognitive capability evaluation system and method based on physical objects.

In an aging society, the evaluation of cognitive function, particularly in elderly people, has become increasingly important. However, conventional cognitive ability assessment techniques are standardized across age groups rather than developed for elderly people in consideration of their characteristics and tool-manipulation ability. Currently available cognitive ability measurement and training techniques are administered on paper, by computer, or through observation; examples include the Korea Computerized Neuropsychological Test (KCNT), the Computerized Neuropsychological Test (CNT), 3D virtual-reality capability measurement programs, and Internet-based cognitive stability indices.

However, paper-based methods of measuring cognitive ability are difficult to apply to elderly people who have trouble reading because of dyslexia, illiteracy, or physical impairments, and computer-based methods are unsuitable for elderly people who are unfamiliar with computers; evaluating a user's actual abilities through unfamiliar input tools (a mouse or keyboard) can therefore be inaccurate.

In addition, in conventional activities-of-daily-living (ADL) performance measurement methods, a nurse or therapist directly scores how well the subject performs a given task, so subjective factors can affect the outcome.

Patent Laid-Open Publication No. 10-2012-0077270

To solve the above problems, what is required is a system and method that evaluate cognitive ability from a user's operations while providing an interface that is easy for the user to operate, so that an objective evaluation result can be obtained without interference from environmental variables or measurement tools.

According to one aspect of the present invention, a physical-object-based cognitive ability evaluation system comprises: a physical object; an operation data receiving unit that receives, from the physical object, operation data about an operation of a user manipulating the physical object; an operation element extracting unit that extracts at least one operation element from the operation data; and a cognitive ability evaluation unit that evaluates the user's cognitive ability level by applying the extracted at least one operation element to at least one cognitive ability evaluation element.

In the physical-object-based cognitive ability evaluation system, the physical object may include a sensor unit capable of sensing at least one of acceleration, tilt, and pressure, and the sensor unit may sense the user's operation to generate the operation data.

The system may further include a camera; the camera captures an image of the user's operation to generate image data and transmits the image data to the operation data receiving unit, and the operation data may include the image data.

Also, the physical object may include a touch screen, and the user's operation may be to directly touch the touch screen.

In addition, the physical object may further include a hand-held tool, and the user's operation may be to touch the hand-held tool to the touch screen.

Also, the operation element extracting unit may analyze the operation data to divide the user's operation into a plurality of operation sections and extract operation elements in a predetermined operation section among the divided sections.

The predetermined operation section may be an operation section including an operation element corresponding to a cognitive ability evaluation element.

The system may further include a database storing data that defines at least one cognitive ability evaluation element corresponding to each operation element, and the cognitive ability evaluation unit may evaluate the user's cognitive ability by applying each extracted operation element to its corresponding cognitive ability evaluation element.

The cognitive ability evaluation elements may include psychomotor speed, attention, language ability, calculation, visuospatial function, memory, and execution.

The system may further include a situation suggestion unit that delivers, by voice or on a display, a message requesting a specific operation from the user, and an output unit that audibly or visually outputs the cognitive ability evaluation result.

According to another aspect of the present invention, a physical-object-based cognitive ability evaluation method comprises: receiving, from a physical object, operation data about an operation of a user manipulating the physical object; extracting at least one operation element from the operation data; and applying the extracted at least one operation element to at least one cognitive ability evaluation element to evaluate the user's cognitive ability level.

The method may further include sensing at least one of acceleration, tilt, and pressure of the physical object to generate the operation data.

The method may further include photographing the operation of the user manipulating the physical object to generate image data, and the operation data may include the image data.

Also, the physical object may include a touch screen, and the user's operation may be to directly touch the touch screen.

The physical object may further include a hand-held tool, and the user's operation may be to touch the hand-held tool to the touch screen.

The step of extracting the operation element may include analyzing the operation data to divide the user's operation into a plurality of operation sections and extracting operation elements in a predetermined operation section among the divided sections.

The predetermined operation section may be an operation section including an operation element corresponding to a cognitive ability evaluation element.

The method may further include defining at least one cognitive ability evaluation element corresponding to each operation element, and the step of evaluating the user's cognitive ability level may evaluate the user's cognitive ability by applying each extracted operation element to its corresponding cognitive ability evaluation element.

The cognitive ability evaluation elements may include psychomotor speed, attention, language ability, calculation, visuospatial function, memory, and execution.

The method may further include delivering, by voice or on a display, a message requesting a specific operation from the user, and audibly or visually outputting the cognitive ability evaluation result.

According to the present invention, because the user performs the cognitive ability evaluation using tools familiar from daily life, the evaluation result is objective and less influenced by environmental factors.

FIG. 1 is a configuration diagram of a physical-object-based cognitive ability evaluation system according to an embodiment of the present invention.
FIG. 2 is a configuration diagram of a physical object 100 according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a user operation through a touch screen according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a user's operation divided into a plurality of operation sections according to an embodiment of the present invention.
FIG. 5 is a diagram for explaining the correspondence between operation sections (operation elements) and cognitive ability evaluation elements according to an embodiment of the present invention.
FIG. 6 is a flowchart of a cognitive ability evaluation method according to an embodiment of the present invention.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having a meaning consistent with their meaning in the context of the relevant art, and are not to be construed as having an ideal or overly formal meaning unless expressly defined herein. Like reference numerals in the drawings denote like elements.

In the following description, well-known functions or constructions are not described in detail to avoid unnecessarily obscuring the subject matter of the present invention. In addition, the size of each component in the drawings may be exaggerated for the sake of explanation and does not mean a size actually applied.

Embodiments described herein may be wholly hardware, partly hardware and partly software, or entirely software. In this specification, a "unit," "module," "device," or "system" refers to a computer-related entity: hardware, a combination of hardware and software, or software. For example, a part, module, device, or system may be, but is not limited to, a process, a processor, an object, an executable, a thread of execution, a program, and/or a computer; both an application running on a computer and the computer itself may correspond to a part, module, device, or system of the present specification.

Embodiments have been described with reference to the flowcharts shown in the drawings. While the methods are shown and described as a series of blocks for simplicity, the invention is not limited to the order of the blocks; some blocks may occur in different orders from, or concurrently with, other blocks shown and described herein, and various other branches, flow paths, and orderings of blocks that achieve the same or similar results may be implemented. Not all illustrated blocks are necessarily required to implement the methods described herein. Furthermore, a method according to an embodiment of the present invention may be implemented as a computer program for performing a series of processes, and the computer program may be recorded on a computer-readable recording medium.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

FIG. 1 is a configuration diagram of a physical-object-based cognitive ability evaluation system according to an embodiment of the present invention. Referring to FIG. 1, a cognitive ability evaluation system 1000 according to an embodiment of the present invention includes a physical object 100, an operation data receiving unit 200, an operation element extracting unit 300, and a cognitive ability evaluation unit 400.

The physical object 100 can generate operation data (interaction information) according to the user's operation; that is, information about the user's movement of the physical object may be transmitted to the operation data receiving unit 200.

FIG. 2 is a configuration diagram of a physical object 100 according to an embodiment of the present invention. Referring to FIG. 2, the physical object 100 includes a sensor unit 110. The sensor unit 110 may be any sensor capable of sensing at least one of voice, acceleration, tilt, and pressure of the physical object 100, and may generate operation data by sensing the operation of the user manipulating the physical object.

For example, if a model knife and a model fruit are the physical objects, sensors may be attached to both. In this case, the sensor unit 110 may generate operation data for a user cutting the fruit with the knife, using the sensors attached to the knife and the fruit, and may transmit the generated operation data to the operation data receiving unit 200.
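As a concrete illustration, the stream of readings such a sensor unit might produce can be sketched as follows; the field names, units, and sample values are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One reading from a sensor attached to a physical object (illustrative)."""
    object_id: str      # e.g. "model_knife" or "fruit"
    timestamp_ms: int   # time of the reading
    accel: tuple        # (x, y, z) acceleration, m/s^2
    tilt_deg: float     # tilt angle of the object
    pressure: float     # grip/contact pressure, arbitrary units

# A cutting motion would appear as a stream of such samples:
stream = [
    SensorSample("model_knife", 0,   (0.0, 0.0, 9.8), 5.0, 0.1),
    SensorSample("model_knife", 50,  (0.3, -1.2, 9.6), 35.0, 0.8),
    SensorSample("model_knife", 100, (0.1, -2.0, 9.5), 60.0, 1.5),
]

# Downstream units can derive simple features, e.g. the duration of the motion:
duration_ms = stream[-1].timestamp_ms - stream[0].timestamp_ms
print(duration_ms)  # 100
```

A real sensor unit would forward such records over the Bluetooth or RFID module mentioned below.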

The physical object 100 and the sensor unit 110 may be equipped with a communication module (a Bluetooth module, an RFID sensor module, etc.) capable of exchanging data by wire or wirelessly. Such a communication module may also be provided to exchange data between any of the components included in the system of the present invention.

In another embodiment, the cognitive ability evaluation system 1000 may further include a camera 500. The camera 500 may be an ordinary camera using RGB information or an infrared (IR) camera.

The camera 500 may capture the motion of the user manipulating the physical object 100 and generate image data, and the operation element extracting unit 300 may analyze the user's operation using the image data together with operation data input through other sensing means.

In yet another embodiment, the physical object 100 may be a touch screen. In this case, the user can perform arbitrary operations by touching virtual objects displayed on the touch screen.

FIG. 3 is a diagram illustrating a user operation through a touch screen according to an exemplary embodiment of the present invention. Referring to FIG. 3, virtual objects 121-126 and a status message part 127 appear on the touch screen 120.

When the touch screen 120 is used as the physical object, as shown in FIG. 3, the user can press material container 1 (125) and drag it to position A (122), the original position of material container 1. A touch on the touch screen may be a direct touch using the user's body or a touch made with a touch pen or similar tool.

A touch input unit (not shown) of the touch screen 120 can generate the operation data by receiving the touch input. The operation data generated in this way may include information such as the time taken to place material container 1 (125) in its original position on the touch screen and whether it was moved to the correct position.
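The kind of summary record the touch input unit could derive from raw touch events might look like this; the target coordinates, tolerance, field names, and function name are assumed for illustration.

```python
def summarize_touch_operation(events):
    """Summarize raw touch events for one drag of an on-screen object.

    `events` is a list of (timestamp_ms, x, y) tuples; the target position
    and tolerance are illustrative assumptions.
    """
    TARGET = (100, 200)   # position A on the screen (assumed coordinates)
    TOLERANCE = 15        # acceptable placement error, pixels

    t_start, _, _ = events[0]
    t_end, x_end, y_end = events[-1]
    correct = (abs(x_end - TARGET[0]) <= TOLERANCE
               and abs(y_end - TARGET[1]) <= TOLERANCE)
    return {
        "elapsed_ms": t_end - t_start,   # time taken to place the object
        "placed_correctly": correct,     # moved to the correct position?
    }

# One drag from (320, 40) ending near position A:
drag = [(0, 320, 40), (400, 210, 120), (900, 104, 196)]
print(summarize_touch_operation(drag))
# {'elapsed_ms': 900, 'placed_correctly': True}
```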

In one embodiment, the physical object 100 may further include a hand-held tool. In this case, the user's action may be to touch the hand-held tool to the touch screen 120. For example, referring to FIG. 3, the material container 125 may be a physical object rather than a virtual object: the user can grasp the physical material container 125 by hand and place it at position A (122), its original position. The touch input unit of the touch screen may sense material container 1 (125) touching position A (122) and generate operation data.

The hand-held tool may be any physical object usable for cognitive ability assessment, including balls, cups, and model food. The camera, the touch screen, and sensor-equipped physical objects may be used together, in any combination, to determine the user's operation. When the physical object is a touch screen, the camera 500 may be installed integrally with the touch screen 120, or installed separately above, to the left of, or below the touch screen 120.

In one embodiment, the operation element extracting unit 300 analyzes the operation data to divide the user's operation into a plurality of operation sections (for example, task units) and extracts operation elements within them. To extract an operation element, the operation element extracting unit 300 can recognize the type, position, and motion of the physical object 100 from the received operation data.

Specifically, the operation element extracting unit 300 may divide the user's continuous operation into sections, select the operation sections required for evaluation among the divided sections, and extract motion information within those sections. Motion recognition technology can be used for this.
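A minimal sketch of this segmentation idea, assuming operation boundaries are detected as pauses in a scalar motion magnitude (a real system would use motion-recognition models, as the text notes):

```python
def split_into_intervals(samples, pause_threshold=0.2):
    """Split a stream of (timestamp, motion_magnitude) samples into
    operation sections, cutting wherever motion falls below a threshold.
    Illustrative only; the threshold and pause criterion are assumptions."""
    sections, current = [], []
    for t, magnitude in samples:
        if magnitude < pause_threshold:
            if current:            # a pause closes the current section
                sections.append(current)
                current = []
        else:
            current.append((t, magnitude))
    if current:
        sections.append(current)
    return sections

# Motion with two pauses yields three operation sections:
stream = [(0, 1.0), (1, 0.9), (2, 0.05), (3, 0.8), (4, 0.7), (5, 0.1), (6, 0.6)]
print(len(split_into_intervals(stream)))  # 3
```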

FIG. 4 is a diagram illustrating a user's operation divided into a plurality of operation sections according to an embodiment of the present invention; it shows the process of making a sandwich divided into a plurality of operation sections. The operations in FIG. 4 may be performed using a touch screen or using physical objects containing an acceleration sensor or the like; in the latter case, each model of bread, vegetable, and cheese needs its own sensor.

The operation element extracting unit 300 can extract operation elements from predetermined operation sections among the divided sections, because not all of the sections are necessarily used to evaluate the user's cognitive ability. Accordingly, a predetermined operation section is one containing an operation element that corresponds to an evaluation element required for the cognitive ability evaluation. For example, if lifting the ham (T4) in FIG. 4 is a user operation not used for evaluating cognitive ability, the operation element extracting unit 300 extracts operation elements from sections T1-T7 excluding section T4.

An operation element is information representing how the user manipulates the physical object: for example, moving the physical object a certain distance, making a specific selection, or assembling or disassembling the physical object.

In another embodiment, the cognitive ability evaluation system 1000 may further comprise a database (DB) 600 that stores data defining at least one cognitive ability evaluation element corresponding to each operation element. The situations used to assess the user's cognitive abilities can vary: making a sandwich, cooking, following suggested actions, and so on. Since the operation sections used to evaluate cognitive ability differ between situations, the database 600 can define and store, for each situation, the operation sections and the cognitive ability evaluation elements corresponding to the operation elements in each section.
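One way such a database mapping could be organized is sketched below; the situation name, task labels, and element names are hypothetical, chosen only to mirror the sandwich example.

```python
# Illustrative mapping for a "making a sandwich" situation: each
# evaluation-relevant operation section is associated with the cognitive
# ability evaluation elements it exercises.
EVALUATION_MAP = {
    "making_sandwich": {
        "T1_select_ingredients": ["memory", "visuospatial"],
        "T2_arrange_bread":      ["visuospatial", "psychomotor_speed"],
        "T3_stack_in_order":     ["memory", "execution"],
        # T4 (lifting the ham) is deliberately unmapped: not used for evaluation
        "T5_cut_sandwich":       ["psychomotor_speed"],
    },
}

def elements_for(situation, section):
    """Look up which evaluation elements an operation section feeds."""
    return EVALUATION_MAP.get(situation, {}).get(section, [])

print(elements_for("making_sandwich", "T3_stack_in_order"))  # ['memory', 'execution']
```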

FIG. 5 is a diagram for explaining the correspondence between operation sections (operation elements) and cognitive ability evaluation elements according to an embodiment of the present invention. In one embodiment, the cognitive ability evaluation unit 400 may evaluate the user's cognitive ability level by applying at least one operation element extracted by the operation element extracting unit 300 to at least one cognitive ability evaluation element.

As shown in FIG. 5, the cognitive ability evaluation unit 400 can evaluate a plurality of evaluation elements corresponding to the operation elements, based on data about the task. When memory and visuospatial ability correspond to an operation section (Task 1), the cognitive ability evaluation unit 400 can evaluate memory (11) and visuospatial ability (12) from the time and accuracy with which the user selected and arranged the ingredients required for cooking. For example, the cognitive ability evaluation unit 400 can check whether the subject selected only the ingredients needed to make a sandwich, followed the order in which a sandwich is made, and returned leftover ingredients and tools to their correct places after cooking, and can evaluate each action by comparing the time taken to perform it with a reference time. The reference time may be an experimental average for each age group or an arbitrary evaluation index.
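A toy scoring rule in the spirit of this comparison might look as follows; the 0-100 scale, the linear decay, and the zero score for an incorrect placement are illustrative assumptions, not taken from the patent.

```python
def score_operation(elapsed_s, correct, reference_s):
    """Score one user operation against a reference time (e.g. an age-group
    average). Full score at or below the reference time, decaying linearly
    to 0 at twice the reference time; an incorrect result scores 0."""
    if not correct:
        return 0.0
    ratio = elapsed_s / reference_s
    return max(0.0, min(100.0, (2.0 - ratio) * 100.0))

# Faster than the 10 s reference and correct: full score.
print(score_operation(8.0, True, 10.0))  # 100.0
```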

As described above, the cognitive ability evaluation elements handled by the cognitive ability evaluation unit 400 may include psychomotor speed, attention, language ability, calculation, visuospatial function, memory, and execution.

In another embodiment, the cognitive ability evaluation system 1000 may further include a situation suggestion unit 700, which delivers, by voice or on a display, a message requesting a specific action from the user, and an output unit 800, which outputs the cognitive ability evaluation result by voice or on a display.

The situation suggestion unit 700 and the output unit 800 may each be a speaker or a display device, and they may be separate devices or a single integrated device. When the physical object is a touch screen, the situation suggestion unit and the output unit may be included in the touch screen.

The situation suggestion unit 700 may present a situation for the user to handle, such as "making a sandwich" or "organizing the kitchen," or may provide messages instructing the user to perform individual specific actions step by step. By presenting a specific situation comprising a plurality of behaviors, the user can recognize the actions to perform; and because the cognitive ability evaluation system knows which action should be performed and which comes next, it can anticipate and evaluate the following operations.

The output unit 800 can output the cognitive ability evaluation result audibly or visually. The result may be provided per evaluation element, or as a chart that can be viewed on one screen.

Through the organic combination of the components described above, the cognitive ability evaluation system 1000 proposes user-friendly behaviors and evaluates cognitive ability from the user's manipulation of simple physical objects while performing familiar actions. Such a system also has the advantage of not requiring a separate human evaluator to assess cognitive performance.

FIG. 6 is a flowchart of a cognitive ability evaluation method according to an embodiment of the present invention. Referring to FIG. 6, the cognitive ability evaluation method includes receiving, from a physical object, operation data about an operation of a user manipulating the physical object (S10), extracting at least one operation element from the operation data (S20), and applying the extracted at least one operation element to at least one cognitive ability evaluation element to evaluate the user's cognitive ability level (S30).
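The three steps S10-S30 can be sketched end to end as follows; every callable here is a caller-supplied stand-in, since the patent does not prescribe concrete extraction or scoring algorithms.

```python
def evaluate_cognitive_ability(operation_data, extract, evaluation_map, score):
    """End-to-end sketch of the claimed method: receive operation data (S10),
    extract operation elements (S20), and apply them to cognitive ability
    evaluation elements (S30). All callables are illustrative stand-ins."""
    elements = extract(operation_data)                      # S20
    results = {}
    for name, measurement in elements.items():              # S30
        for eval_element in evaluation_map.get(name, []):
            results.setdefault(eval_element, []).append(score(measurement))
    # Average the per-operation scores for each evaluation element.
    return {k: sum(v) / len(v) for k, v in results.items()}

# Toy usage with trivial stand-ins:
out = evaluate_cognitive_ability(
    {"raw": "sensor or touch stream"},                      # S10 (received data)
    extract=lambda data: {"select": 1.0, "arrange": 0.5},
    evaluation_map={"select": ["memory"], "arrange": ["memory", "visuospatial"]},
    score=lambda m: m * 100,
)
print(out)  # {'memory': 75.0, 'visuospatial': 50.0}
```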

The cognitive ability evaluation method may further include sensing at least one of acceleration, tilt, and pressure of the physical object to generate the operation data. It may also include photographing the operation of the user manipulating the physical object to generate image data, which may be used as operation data.

In one embodiment, the physical object may be a touch screen, and the user's motion may be evaluated from operations that touch the touch screen. A touch on the touch screen may be made directly with the body or with a tool.

In another embodiment, the physical object may further comprise a hand-held tool in addition to the touch screen, and data about operations in which the user manipulates, moves, assembles, or disassembles the hand-held tool may be used. In this case, the coordinates and times at which the hand-held tool touches the touch screen can be used to generate operation data.

In another embodiment, the step of extracting the operation element (S20) includes analyzing the operation data to divide the user's operation into a plurality of operation sections and extracting operation elements in a predetermined operation section among the divided sections. Here, the predetermined operation section may be an operation section including an operation element corresponding to a cognitive ability evaluation element.

Further, the cognitive ability evaluation method may include defining at least one cognitive ability evaluation element corresponding to each operation element. The step of evaluating the user's cognitive ability level (S30) may evaluate the user's cognitive ability by applying each extracted operation element to its corresponding cognitive ability evaluation element.

In one embodiment, the method may further include delivering, by voice or on a display, a message requesting a specific action from the user, and outputting the cognitive ability evaluation result audibly or visually.

While the invention has been shown and described with reference to certain embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention, and such modifications fall within the technical scope of the present invention. Accordingly, the true scope of the invention should be determined by the technical idea of the appended claims.

Physical object: 100
Operation data receiving unit: 200
Operation element extracting unit: 300
Cognitive ability evaluation unit: 400
Camera: 500
Database: 600
Situation suggestion unit: 700
Output unit: 800

Claims (20)

Physical objects;
An operation data receiving unit operable to receive operation data on an operation of a user operating the physical object from the physical object;
An operation element extracting unit for extracting at least one operation element from the operation data; And
And a cognitive ability evaluation unit for evaluating a cognitive ability level of the user by applying the extracted at least one operation element to two or more cognitive ability evaluation elements,
Wherein the operation element extracting unit analyzes the operation data to divide the operation of the user into a plurality of operation sections and extracts operation elements in a predetermined operation section among the divided operation sections,
Wherein the predetermined operation section is an operation section including an operation element corresponding to the cognitive ability evaluation element.
The system according to claim 1,
Wherein the physical object includes a sensor portion capable of sensing at least one of acceleration, tilt, and pressure,
Wherein the sensor unit senses the operation of the user and generates the operation data.
The system according to claim 1,
Further comprising a camera,
The camera captures the operation of the user to generate image data, transmits the image data to the operation data receiver,
Wherein the operation data includes the image data.
The system according to claim 1,
Wherein the physical object comprises a touch screen,
Wherein the operation of the user directly touches the touch screen.
The system according to claim 4,
The physical object further includes a hand held tool,
Wherein the action of the user is to touch the handheld tool to the touch screen.
delete
delete
The system according to claim 1,
Further comprising a database storing data defining at least one cognitive ability evaluation element corresponding to each operation element,
Wherein the cognitive ability evaluation unit evaluates the cognitive ability of the user by applying the extracted operation element to the cognitive ability evaluation element corresponding to the extracted operation element.
The system according to claim 8,
The cognitive abilities evaluation element may include at least one of psychomotor speed, attention, linguistic ability, calculation, visuospatial function, memory, Wherein the physical object-based cognitive performance evaluation system comprises:
The system according to claim 1, further comprising:
A status presentation unit for delivering a message requesting a specific operation to the user by voice or display; and
An output unit for audibly or visually outputting a cognitive ability evaluation result.
Receiving, from a physical object, operation data on an operation of a user manipulating the physical object;
Extracting at least one operation element from the operation data; and
Applying the extracted at least one operation element to two or more cognitive ability evaluation elements to evaluate a cognitive ability level of the user,
Wherein the extracting of the operation element comprises:
Analyzing the operation data to divide the operation of the user into a plurality of operation sections and extracting an operation element in a predetermined operation section from the divided operation sections, and
Wherein the predetermined operation section is an operation section including an operation element corresponding to the cognitive ability evaluation element.
The method according to claim 11,
Further comprising sensing at least one of acceleration, tilt, and pressure of the physical object to generate the operation data.
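One standard way a 3-axis acceleration reading can yield the tilt this claim mentions is the gravity-vector computation below. This is a common accelerometer technique, not an implementation taken from the patent, and it assumes the object is held roughly still so gravity dominates the reading:

```python
import math

def tilt_from_acceleration(ax, ay, az):
    """Estimate pitch and roll (in degrees) from one 3-axis accelerometer
    sample (m/s^2), assuming gravity is the dominant acceleration."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Object lying flat: gravity entirely along the z axis, so no tilt.
pitch, roll = tilt_from_acceleration(0.0, 0.0, 9.81)
```

During fast user motion the gravity assumption breaks down, which is one reason a practical sensor unit would combine acceleration with other signals such as pressure.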
The method according to claim 11,
Further comprising photographing the operation of the user manipulating the physical object to generate image data,
Wherein the operation data includes the image data.
The method according to claim 11,
Wherein the physical object comprises a touch screen, and
Wherein the operation of the user is directly touching the touch screen.
The method according to claim 14,
Wherein the physical object further comprises a handheld tool, and
Wherein the operation of the user is touching the touch screen with the handheld tool.
(deleted)
(deleted)
The method according to claim 11, further comprising defining at least one cognitive ability evaluation element corresponding to each operation element,
Wherein evaluating the cognitive ability level of the user comprises evaluating the cognitive ability of the user by applying each extracted operation element to the cognitive ability evaluation element corresponding to that operation element.
The method according to claim 18,
Wherein the cognitive ability evaluation element includes at least one of psychomotor speed, attention, linguistic ability, calculation, visuospatial function, and memory.
The method according to claim 11, further comprising:
Delivering a message requesting a specific operation to the user by voice or display; and
Audibly or visually outputting a cognitive ability evaluation result.
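Taken together, the method claims describe one assessment cycle: present a task, receive operation data, extract elements, evaluate, and output the result. The sketch below wires those steps in that order; every function and stub is hypothetical, standing in for the claimed units rather than reproducing them:

```python
def run_assessment(prompt, receive, extract, evaluate, output):
    """Drive one assessment cycle in the order the method claims:
    present a task, collect operation data, extract operation elements,
    evaluate cognitive ability, and output the result."""
    prompt("Please stack the blocks in the order shown.")
    data = receive()
    elements = extract(data)
    result = evaluate(elements)
    output(result)
    return result

# Stub dependencies for demonstration; a real system would wire in the
# sensor/camera input, the extraction logic, and the voice/display output.
log = []
result = run_assessment(
    prompt=log.append,
    receive=lambda: [0.2, 0.3],
    extract=lambda d: {"reaction_delay": sum(d)},
    evaluate=lambda e: {"psychomotor speed": 1.0 - e["reaction_delay"]},
    output=log.append,
)
```

Passing the steps in as callables keeps the cycle's ordering (the substance of the method claims) separate from any particular sensor or output hardware.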
KR1020130037446A 2013-04-05 2013-04-05 Evaluation system of cognitive ability based on physical object and method thereof KR101447563B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130037446A KR101447563B1 (en) 2013-04-05 2013-04-05 Evaluation system of cognitive ability based on physical object and method thereof

Publications (1)

Publication Number Publication Date
KR101447563B1 true KR101447563B1 (en) 2014-10-08

Family

ID=51996643

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130037446A KR101447563B1 (en) 2013-04-05 2013-04-05 Evaluation system of cognitive ability based on physical object and method thereof

Country Status (1)

Country Link
KR (1) KR101447563B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160142597A (en) 2015-06-03 2016-12-13 조선대학교산학협력단 Health condition estimation apparatus and method for tracking through the person's hand function
KR102123869B1 (en) * 2019-02-12 2020-06-23 장성철 Training device and method for improving cognitive response
WO2022085853A1 (en) * 2020-10-20 2022-04-28 빅픽쳐스 주식회사 Method for determining whether object is recognized, and device using same

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006320424A (en) * 2005-05-17 2006-11-30 Tama Tlo Kk Action teaching apparatus and method
JP2010527642A (en) * 2007-04-13 2010-08-19 ナイキ インターナショナル リミテッド Inspection and training of visual cognitive ability and cooperative behavior
KR20120107736A (en) * 2011-03-22 2012-10-04 한국과학기술연구원 Norm-based cognitive measuring and evaluation system


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 20170828; year of fee payment: 4)
FPAY Annual fee payment (payment date: 20180903; year of fee payment: 5)
FPAY Annual fee payment (payment date: 20190902; year of fee payment: 6)