WO2013133583A1 - System and method for cognitive rehabilitation using tangible interaction - Google Patents

System and method for cognitive rehabilitation using tangible interaction

Info

Publication number
WO2013133583A1
WO2013133583A1 (application PCT/KR2013/001704)
Authority
WO
WIPO (PCT)
Prior art keywords
user
cognitive rehabilitation
sensory
information
display unit
Prior art date
Application number
PCT/KR2013/001704
Other languages
English (en)
Korean (ko)
Inventor
김래현
권규현
김건희
송교현
Original Assignee
한국과학기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술연구원 filed Critical 한국과학기술연구원
Publication of WO2013133583A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 - ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 - ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/63 - ICT specially adapted for the management or operation of medical equipment or devices for local operation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00 - Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09 - Rehabilitation or training

Definitions

  • The present invention relates to a cognitive rehabilitation system and method, and more particularly, to a new type of cognitive rehabilitation system and method that enables a user to use a cognitive rehabilitation program realistically and easily by means of a physical object and an interactive display.
  • Conventional human-computer interaction (HCI) takes place in real space through a user interface in which an object in the virtual space, that is, on the screen, responds to input from a keyboard or a mouse acting as a controller.
  • Such a user interface requires the user to become familiar with the operation of a controller such as a keyboard or a mouse, and a single controller of a fixed shape and operating scheme must perform many different functions.
  • As a result, the user cannot experience the feeling of touching or manipulating a real object associated with each function.
  • An immersive (tangible) user interface refers to an interface in which a specific physical object is manipulated directly at a location on the screen in the virtual space, without using a remote controller such as a keyboard, mouse, or pen-shaped mouse.
  • A representative example is an interface for drawing directly on the screen using a brush-type device.
  • Korean Patent Laid-Open Publication No. 10-2007-0041790, "Virtual Experiment Interface for Interfacing with Experimental Device", also discloses a simulation method that outputs an image through a data output unit based on data obtained from an experimental measurement device; because only an image is used, the sense of reality is reduced.
  • The present invention has been made to solve the above problems. It is an object of the present invention to provide a cognitive rehabilitation system and method that reacts in real time to the user's behavior and state as measured using a sensory object that the user can easily hold and manipulate, an interactive display that responds to the user's actions, and a complex sensor module that measures the user's state.
  • A cognitive rehabilitation system for achieving the above object includes a sensory object, formed of a physical object, that generates interaction information according to the user's manipulation; a display unit that displays information corresponding to the user's manipulation situation; and a controller configured to provide cognitive rehabilitation content to the user through the sensory object or the display unit. The controller receives the interaction information from the sensory object to determine the user's manipulation situation, generates a virtual object corresponding to the sensory object based on the determined manipulation situation, and displays the generated virtual object on the display unit.
  • The controller may further display event information related to cognitive rehabilitation on the display unit based on the determined manipulation situation.
  • The cognitive rehabilitation system may further include a photographing unit that generates image information of the user's manipulation of the sensory object and the display unit, and the controller may use the image information generated by the photographing unit to determine the user's manipulation situation.
  • The cognitive rehabilitation system may further include a biosignal recognition unit for recognizing biometric information of the user, and the controller may determine the physical condition of the user based on the biometric information recognized by the biosignal recognition unit.
  • The biosignal recognition unit may include one or more of an EEG measuring device, an EMG measuring device, a galvanic skin response measuring device, an eye movement measuring device, a blood pressure measuring device, and a body temperature measuring device.
  • the controller may adjust the cognitive rehabilitation content according to the determined body state of the user.
  • the sensory object, the display unit, the control unit, the photographing unit, and the biosignal recognition unit may each include a communication module for performing wired or wireless communication, and may transmit and receive information with each other through the provided communication module.
  • the controller may recognize the sensory object by receiving ID information of the physical object through the communication module.
  • the sensory object may include a sensor module capable of sensing any one or more of acceleration, tilt, pressure, and temperature.
  • the sensory object may further include an output module, and the controller may output digital feedback corresponding to the interaction information through the output module.
  • the display unit may include a touch screen device, and the controller may detect a contact location between the sensory object and the touch screen device and display the virtual object at the contact location.
  • the cognitive rehabilitation system may further include a projection device for projecting an image so that the virtual object is displayed on the display unit.
  • the display unit may further include a sound output unit, and the controller may provide a guide voice through the sound output unit based on the determined operation situation.
  • the sensory object and the virtual object may constitute one object.
  • A cognitive rehabilitation method includes providing cognitive rehabilitation content to a user through a sensory object or a display unit, receiving interaction information according to the user's manipulation of the sensory object from the sensory object, determining the user's manipulation situation from the interaction information, generating a virtual object corresponding to the sensory object based on the determined manipulation situation, and displaying the virtual object on the display unit.
  • The cognitive rehabilitation method may further include displaying event information related to cognitive rehabilitation on the display unit based on the determined manipulation situation, and outputting digital feedback corresponding to the interaction information through an output module of the sensory object.
  • the interaction information may be at least one of acceleration, tilt, pressure, and temperature sensed by a sensor module attached to the physical object.
  • The cognitive rehabilitation method may further include recognizing biometric information of the user and determining the physical condition of the user based on the recognized biometric information.
  • the cognitive rehabilitation method may further include the step of recognizing the sensory object by receiving ID information of the sensory object.
  • the generating of the virtual object may include displaying the virtual object at a contact position between the sensory object and the touch screen device formed on the display unit.
  • the cognitive rehabilitation method may further include adjusting cognitive rehabilitation content according to the determined physical condition of the user.
  • the biometric information may include one or more of EEG, EMG, skin electrical response, and eye information.
  • the cognitive rehabilitation method may further include providing a guide voice through a sound output unit based on the determined operation situation.
  • the physical object and the virtual object may constitute one object.
  • With the cognitive rehabilitation system and method using sensory interaction according to the present invention, a user can experience a variety of cognitive rehabilitation content more easily and realistically.
  • Existing computer-based cognitive rehabilitation systems focus on the results of tasks performed in a digital environment rather than on the process of performing them, whereas real cognitive rehabilitation therapists conduct cognitive rehabilitation and training with real tools and have proven its effectiveness.
  • The cognitive rehabilitation system and method using sensory interaction according to the present invention can combine the advantages of a computer-based program with those of cognitive rehabilitation using real tools.
  • FIG. 1 is a schematic configuration diagram of a cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of a sensory object of a cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 3 is a detailed configuration diagram illustrating the operation of the cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a cognitive rehabilitation method according to an embodiment of the present invention.
  • FIG. 5 is a view showing a topping container applied to the cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 6 is a view showing a syrup container applied to the cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 7 is a conceptual diagram illustrating a cognitive rehabilitation system according to an embodiment of the present invention.
  • FIGS. 8A and 8B are diagrams illustrating a cookie making and topping sprinkling screen provided by a cognitive rehabilitation system according to an embodiment of the present invention.
  • Embodiments described herein may have aspects that are wholly hardware, partly hardware and partly software, or wholly software.
  • The terms "unit", "module", "device", or "system" and the like refer to hardware, a combination of hardware and software, or a computer-related entity such as software.
  • For example, a part, module, device, or system herein may refer to a running process, a processor, an object, an executable, a thread of execution, a program, and/or a computer, but is not limited thereto.
  • both an application running on a computer and a computer may correspond to a part, module, device, system, or the like herein.
  • Embodiments have been described with reference to the flowchart presented in the drawings. Although the method is shown and described in a series of blocks for the sake of simplicity, the invention is not limited to the order of the blocks, and some blocks may occur in different order or simultaneously with other blocks than those shown and described herein. Various other branches, flow paths, and blocks may be implemented in order to achieve the same or similar results. In addition, not all illustrated blocks may be required for the implementation of the methods described herein. Furthermore, the method according to an embodiment of the present invention may be implemented in the form of a computer program for performing a series of processes, which may be recorded on a computer-readable recording medium.
  • FIG. 1 is a schematic configuration diagram of a cognitive rehabilitation system according to an embodiment of the present invention.
  • A cognitive rehabilitation system includes a photographing unit 10, a sensory object 20, a display unit 30, a biosignal recognition unit 40, and a controller 50.
  • The sensory object 20 is formed of a physical object and serves to generate interaction information according to the user's manipulation.
  • The sensory object 20 may be composed of a physical object applied to the cognitive rehabilitation system of the present invention, such as a topping container or a syrup container. The detailed configuration and operation of the sensory object 20 will be described later with reference to FIG. 2.
  • the display unit 30 serves to display cognitive rehabilitation content and information corresponding to a user's operation situation.
  • The display unit 30 may further include a sound output unit (not shown) that audibly guides the user's manipulation status.
  • The display unit 30 may be configured as a touch screen device, and when the sensory object 20 is placed on the touch screen device, the display unit 30 may transmit the touched coordinates to the controller 50 to be described later.
  • the photographing unit 10 generates image information by photographing a manipulation situation of the user's sensory object 20 and the display unit 30.
  • The photographing unit 10 may be formed of a camera; in detail, it may be a general camera that utilizes RGB information or an infrared (IR) camera.
  • The photographing unit 10 may be installed integrally with the display unit 30, or may be installed separately, to the left of or below the display unit 30, so as to photograph the user's manipulation situation and the display unit 30.
  • the biosignal recognition unit 40 serves to recognize a biosignal of a user.
  • the biosignal recognized by the biosignal recognition unit 40 is used by the controller 50 to be described later to determine the physical state of the user.
  • The biosignal recognition unit 40 may include various sensor modules such as an electroencephalography (EEG) measuring device, a galvanic skin reflex (GSR) measuring device, an eye movement measuring device, a blood pressure measuring device, and a body temperature measuring device.
  • The controller 50 determines the user's manipulation situation using the interaction information received from the sensory object 20, generates a virtual object corresponding to the sensory object 20 based on the determined manipulation situation, and displays it on the display unit 30.
  • The controller 50 controls the cognitive rehabilitation system as a whole and basically also serves to provide the user with cognitive rehabilitation content through the sensory object 20 and the display unit 30.
  • The controller 50 includes a cognitive rehabilitation content engine and a cognitive rehabilitation manager module as software, and can provide cognitive rehabilitation content and feedback to the user by displaying the content on the screen of the display unit 30 or by outputting light, vibration, and the like through the sensory object 20. The detailed operation of the controller 50 will be described later.
  • the sensory object 20 and the virtual object may constitute one specific object. That is, the sensory object 20 formed of a physical object corresponds to a part of a specific object, and the virtual object corresponds to a part of the specific object except for a part corresponding to the sensory object 20.
  • For example, when the sensory object 20 is a topping container used for making a cookie, only a part of the topping container may be configured as a physical object to form the sensory object 20, and the virtual object may correspond to the remaining part of the topping container, displayed when the sensory object 20 comes into contact with the display unit 30.
  • Alternatively, the sensory object 20 itself may completely constitute one physical object, and the virtual object may be configured in a preset shape corresponding to the sensory object 20.
  • In the following, an example is described in which a sensory object such as a topping container or a syrup container is configured, and the virtual object forms a cookie shape or a syrup shape.
  • FIG. 2 is a schematic structural diagram of a sensory object of a cognitive rehabilitation system according to an embodiment of the present invention.
  • The sensory object 20 includes a sensor module 22, an output module 24, a communication module 26, and a microcontroller unit (MCU) 28.
  • the sensor module 22 is installed in the sensory object 20 and serves to generate interaction information according to a user's manipulation.
  • 'Interaction information' refers to a series of information related to the user's manipulation of the sensory object 20 when the user manipulates the sensory object 20.
  • the sensor module 22 may be a communication module (Bluetooth module, RFID sensor module, GPS sensor module, etc.), an acceleration sensor module, a tilt sensor module, a pressure sensor module, a temperature sensor module, and the like.
  • The information sensed by the sensor module 22 is transferred to the microcontroller unit (MCU) 28 to be described later, and is used to determine the manipulation state of the sensory object 20.
  • The output module 24 is installed in the sensory object 20 and serves to output digital feedback suitable for the manipulation situation of the sensory object 20.
  • 'outputting digital feedback' refers to providing the user with information such as screen information, sound information, vibration, etc. according to the manipulation situation of the sensory object 20.
  • The output module 24 may be a display module, a sound output module, a light emitting module, an actuator module, or the like.
  • the display module may be configured such as an LCD panel to provide various types of information to the user as an image.
  • the sound output module may provide the user with specific sound information suitable for the situation, and the light emitting module may provide the user with light of a color suitable for the situation.
  • the actuator module may provide movement information such as vibration to the user.
  • the micro controller unit (MCU) 28 receives interaction information generated by the sensor module 22 and outputs digital feedback corresponding to the interaction information through the output module 24. That is, the MCU 28 uses the received interaction information to determine the type of digital feedback suitable for the operation situation of the current sensory object 20 and output the same through the output module 24.
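  • As a rough illustration of this feedback path, the following sketch (in Python) maps interaction information to a feedback type; the sensor fields, thresholds, and output-module methods are hypothetical assumptions for illustration, not part of the disclosed hardware design.

```python
# Illustrative sketch only: the sensor fields, thresholds, and output-module
# methods below are hypothetical, not part of the disclosed hardware design.
from dataclasses import dataclass

@dataclass
class InteractionInfo:
    acceleration: float   # acceleration magnitude, m/s^2 (assumed unit)
    tilt_deg: float       # tilt angle in degrees
    pressure: float       # normalized grip pressure, 0..1
    temperature_c: float  # object surface temperature

class SensoryObjectMCU:
    """Receives interaction information and selects matching digital feedback."""

    def __init__(self, output_module):
        # output_module is assumed to expose vibrate(), light() and sound()
        self.output = output_module

    def on_sensor_update(self, info: InteractionInfo) -> None:
        if info.acceleration > 15.0:
            # Shaking detected: haptic feedback, e.g. topping is being sprinkled.
            self.output.vibrate(duration_ms=200)
        elif info.tilt_deg > 45.0:
            # Strong tilt: light feedback, e.g. the syrup container is being poured.
            self.output.light(color="green")
        elif info.pressure > 0.3:
            # Grip detected: short confirmation sound.
            self.output.sound("picked_up")
```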
  • FIG. 3 is a detailed configuration diagram illustrating the operation of the cognitive rehabilitation system according to an embodiment of the present invention.
  • The photographing unit 10 may be formed of a camera, the sensory object 20 may be formed of an object such as a topping container, and the display unit 30 may be formed as a tabletop display device based on a touch screen device.
  • the sensory object 20 may include a communication module such as Bluetooth and a sensor module that may sense acceleration, tilt, pressure, temperature, and the like.
  • a display module, a sound output module, a light emitting module, an actuator module, and the like may also be installed as the output module. The role of each sensor module will be described later with reference to the operation of the controller 50.
  • The photographing unit 10 photographs the user's manipulation of the sensory object 20 and the display unit 30, generates image information, and transmits the image information to the image processing module 51 of the controller 50. That is, the photographing unit 10 captures interaction image information of the user's manipulation of the sensory object 20 and display image information of the display unit 30.
  • the image information generated by the photographing unit 10 is transmitted to the image processing module 51 of the controller 50.
  • the image processing module 51 recognizes the position and operation of the sensory object 20 in the image information through image recognition technology. That is, the controller 50 compares the manipulation image information of the sensory object 20 with the image information of the display 30 and tracks the position and operation of the sensory object 20 through the difference value.
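  • The difference-based tracking described above can be sketched as follows; this is a minimal illustration assuming OpenCV is available, and the threshold and minimum region size are arbitrary choices rather than values from the disclosure.

```python
# Minimal difference-based tracking sketch (assumes OpenCV; thresholds are illustrative).
import cv2
import numpy as np

def locate_object(display_frame: np.ndarray, interaction_frame: np.ndarray):
    """Return the (x, y) centre of the largest region that differs between the
    displayed image and the camera view of the user's manipulation."""
    diff = cv2.absdiff(cv2.cvtColor(display_frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(interaction_frame, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 100:  # ignore small noise regions
        return None
    m = cv2.moments(largest)
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```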
  • The controller 50 may additionally receive ID information, location information, and the like from various sensor modules, such as an RFID sensor module and a GPS sensor module installed in the sensory object 20, and thereby recognize the state of the sensory object 20 more accurately.
  • the interaction sensor information sensed by the various sensor modules attached to the sensory object 20 is transmitted to the sensor processing module 53 of the controller 50.
  • the interaction sensor information may be information about a position and a state of the sensory object 20 (released, held by a hand, inclination, movement, etc.).
  • the sensor processing module 53 may receive the interaction sensor information to recognize the type, location, and motion of the sensory object 20.
  • For example, the sensor processing module 53 may recognize the motion state of the sensory object 20 using the information received from the acceleration sensor module, its tilted state using the information received from the tilt sensor module, the pressure applied to the sensory object 20 using the information received from the pressure sensor module, and the temperature of the sensory object 20 using the information received from the temperature sensor module.
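  • A minimal sketch of this kind of interpretation is shown below; the field names, units, and thresholds are assumptions for illustration only.

```python
# Illustrative only: units, field names, and thresholds are assumptions.
def interpret_sensor_state(accel_mag: float, tilt_deg: float,
                           pressure: float, temperature_c: float) -> dict:
    """Map raw interaction-sensor readings to a coarse state of the sensory object."""
    return {
        "moving": accel_mag > 1.5,       # object is being moved or shaken
        "tilted": tilt_deg > 30.0,       # object is tipped, e.g. for pouring
        "held": pressure > 0.2,          # the user is gripping the object
        "warm": temperature_c > 30.0,    # object is likely in the user's hand
    }
```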
  • the image processing module 51 and the sensor processing module 53 are complementary to each other, and may be used together depending on the environment, and only one of them may be used alone.
  • The image processing module 51 and the sensor processing module 53 recognize the type, position, and motion of the sensory object 20 from the image information and the interaction sensor information, respectively, and transfer this information to the situation recognition module 55.
  • The biosignal recognition unit 40 serves to recognize the user's biosignal and may include various sensor modules such as the aforementioned electroencephalography (EEG) measuring device, galvanic skin reflex (GSR) measuring device, eye movement measuring device, blood pressure measuring device, and body temperature measuring device.
  • The body state information recognized by the biosignal recognition unit 40 is transmitted to the sensor processing module 53, and the situation recognition module 55 receives the body state information from the sensor processing module 53 and determines the user's current physical state, that is, whether the user is currently nervous or focused.
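  • A hedged sketch of such a determination is given below; the biosignal features, thresholds, and difficulty policy are hypothetical illustrations, not values from the disclosure.

```python
# Hypothetical biosignal features and thresholds, for illustration only.
def assess_user_state(gsr_level: float, eeg_beta_ratio: float) -> str:
    """Classify the user's current physical state from simple biosignal features."""
    if gsr_level > 0.7:
        return "nervous"      # elevated skin conductance suggests tension
    if eeg_beta_ratio > 0.5:
        return "focused"      # relatively high beta activity suggests focus
    return "relaxed"

def adjust_difficulty(current_level: int, state: str) -> int:
    """Adapt the cognitive rehabilitation content to the determined state."""
    if state == "nervous":
        return max(1, current_level - 1)   # ease off when the user is tense
    if state == "focused":
        return current_level + 1           # raise the challenge when focused
    return current_level
```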
  • The controller 50 of the cognitive rehabilitation system includes an image processing module 51, a sensor processing module 53, a situation recognition module 55, a decision module 57, and an output control module 59.
  • The situation recognition module 55 comprehensively determines the user's manipulation situation based on the information received from the image processing module 51 and the sensor processing module 53.
  • The situation recognition module 55 may also receive touch position information, that is, the contact point between the sensory object 20 and the display unit 30, from the display unit 30 and use it to determine the manipulation situation.
  • the output control module 59 receives a virtual object, event information, and digital feedback information from the decision module 57 and outputs the same to the display unit 30 and the sensory object 20.
  • the output control module 59 may output a virtual object at the touched position.
  • the virtual object may be displayed on the portion of the display unit 30 corresponding to the vertically lower position of the sensory object 20.
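  • A sketch of this placement step is shown below, under the assumption that the touch screen reports contact points as (x, y) pixel coordinates together with the contacting object's ID; the registry and shape names are hypothetical.

```python
# Sketch under assumptions: contact points arrive as (x, y) pixels plus an object ID.
from typing import Dict, Tuple

def place_virtual_object(object_id: str, contact_xy: Tuple[int, int],
                         registry: Dict[str, str]) -> dict:
    """Build a render command that draws the virtual counterpart of a sensory
    object directly beneath its contact point on the display."""
    shape = registry.get(object_id, "generic")  # e.g. "cookie", "syrup"
    x, y = contact_xy
    return {"shape": shape, "x": x, "y": y, "anchor": "under_object"}

# Example: a cookie mold touching the tabletop display at pixel (420, 310).
command = place_virtual_object("cookie_mold", (420, 310),
                               {"cookie_mold": "cookie", "syrup_container": "syrup"})
```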
  • The cognitive rehabilitation system may configure a topping container, a syrup container, and the like as sensory objects 20. As the user makes a cookie with the topping container and the syrup container in response to the cognitive rehabilitation content provided on the display unit 30, the user's cognitive abilities such as concentration can be measured and rehabilitation training can be performed.
  • FIG. 4 is a flowchart illustrating a cognitive rehabilitation method according to an embodiment of the present invention. Referring to FIG. 4, the operation of the cognitive rehabilitation system according to the present invention will be described below.
  • the control unit 50 of the cognitive rehabilitation system provides the cognitive rehabilitation content to the display unit 30 or the sensory object 20 (100). That is, the controller 50 may provide a cookie making game interface to the display 30 and output the event information associated with the cookie making game to the sensory object 20. That is, cognitive rehabilitation feedback may be provided to the user by generating light or vibrating the sensory object 20.
  • According to the cognitive rehabilitation content provided through the sensory object 20 or the display unit 30, the user moves the sensory object 20 and, if necessary, performs operations such as placing it on the display unit 30.
  • When the user moves, tilts, or otherwise manipulates the sensory object 20, such interaction information is detected by the various sensors installed in the sensory object 20 and transmitted to the controller 50, and the biosignal recognition unit 40 recognizes the user's biosignal and transmits it to the controller 50 (110).
  • The controller 50 determines the user's manipulation situation and physical state based on the interaction information and the user's biosignal (120), generates cognitive rehabilitation content corresponding to the virtual object, help and event information to be displayed on the display unit 30, and feedback for the user based on the determined result (130), and displays it on the sensory object 20 or the display unit 30 (140).
  • Through this process, the controller 50 can provide the user with continuous feedback information, and the user can check and train cognitive rehabilitation ability while feeling as if making a real cookie. A simplified sketch of this loop follows.
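  • The sketch below mirrors steps 100-140 of FIG. 4 as a simplified control loop; every interface (content, sensory object, display, biosignal unit) is a hypothetical stand-in rather than the disclosed implementation.

```python
# Simplified control loop mirroring steps 100-140 of FIG. 4 (interfaces are stand-ins).
def rehabilitation_loop(content, sensory_object, display, biosignal_unit):
    content.present(display, sensory_object)                    # step 100: provide content
    while not content.finished():
        interaction = sensory_object.read_interaction()         # step 110: interaction info
        biosignal = biosignal_unit.read()                       #           and biosignals
        situation = content.assess(interaction, biosignal)      # step 120: determine situation
        virtual_object, feedback = content.generate(situation)  # step 130: generate content
        display.show(virtual_object)                            # step 140: display result
        sensory_object.output(feedback)                         #           and feed back
```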
  • FIG. 5 shows a topping container 20m used for a cookie making game
  • FIG. 6 shows a syrup container 20n used for a cookie making game.
  • The topping container 20m and the syrup container 20n are equipped with a motion recognition sensor such as an acceleration sensor so that the user's motions of sprinkling topping or pouring syrup can be recognized, as in the sketch below.
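  • A simple way to recognize such a sprinkling motion from the acceleration sensor is sketched below; the sampling window and threshold are assumptions for illustration.

```python
# Illustrative shake detector; window size and threshold are assumed values.
from collections import deque

class ShakeDetector:
    """Flags a sprinkling gesture when the acceleration magnitude oscillates strongly."""

    def __init__(self, threshold: float = 12.0, window: int = 10):
        self.threshold = threshold
        self.samples = deque(maxlen=window)

    def update(self, accel_magnitude: float) -> bool:
        self.samples.append(accel_magnitude)
        if len(self.samples) < self.samples.maxlen:
            return False
        # A shake shows up as a large spread between the minimum and maximum
        # acceleration within the recent window of samples.
        return (max(self.samples) - min(self.samples)) > self.threshold
```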
  • FIG. 7 is a conceptual diagram illustrating a cognitive rehabilitation system according to an embodiment of the present invention.
  • FIG. 7 illustrates how the interactive display of the cognitive rehabilitation system according to an embodiment of the present invention, that is, the display unit 30, the sensory objects 20a-g, the biosignal recognition unit 40, and the controller 50, interoperate with each other.
  • The controller 50 includes, as software for providing cognitive rehabilitation content, a cognitive rehabilitation content engine 52 and a cognitive rehabilitation manager 54 that monitors the interaction information and the user's physical condition information and recommends content suitable for each situation.
  • As shown in FIG. 7, the controller 50 displays a cookie making game on the display unit 30 based on the cognitive rehabilitation content engine 52, and the user can perform operations corresponding to the cognitive rehabilitation content using the sensory objects 20a-g, such as a cookie mold, a topping container, and a syrup container.
  • The biosignal recognition unit 40 recognizes the user's current biosignal and transmits it to the controller 50, and the controller 50 determines the user's current physical state based on the biosignal.
  • FIGS. 8A and 8B are diagrams illustrating a cookie making and topping sprinkling screen provided by a cognitive rehabilitation system according to an embodiment of the present invention.
  • The user may proceed with the game by memorizing the type and number of cookies required, as presented through the cookie making game content provided on the display unit 30.
  • The virtual object is displayed on the display unit 30 in the shape of the cookie mold, thereby generating a cookie.
  • As shown in FIG. 8B, the finished cookie is moved along the conveyor belt 22, and when the cookie reaches the center, the user can shake the topping container 23 to sprinkle topping on it.
  • the cognitive game score may be provided based on the user's motion. In this way, users can realistically perform cognitive rehabilitation training and testing.
  • The cognitive rehabilitation system and method using sensory interaction according to the present invention can be widely used as a tool for cognitive rehabilitation, enabling various cognitive rehabilitation content to be used easily and realistically.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Developmental Disabilities (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Rehabilitation Tools (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a cognitive rehabilitation system and method that produce tangible interaction using a tangible object and an interactive display. The system according to the present invention comprises: a tangible object provided with various sensors; an interactive display that recognizes an object coming into contact with the screen and the user's touch, and displays cognitive rehabilitation content; a photographing unit that recognizes the user's behaviors and estimates the position of the object; a biosignal recognition unit for recognizing the user's biometric information; a cognitive rehabilitation manager that monitors the progress and results of a cognitive rehabilitation program as well as the user's actions and physical state, and that recommends cognitive rehabilitation content and manages its progress on the basis of the monitored information; and a cognitive rehabilitation content engine for allowing the user to undertake cognitive rehabilitation and training by interacting with the tangible object.
PCT/KR2013/001704 2012-03-05 2013-03-04 Système et procédé de réhabilitation cognitive par interaction tangible WO2013133583A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0022516 2012-03-05
KR1020120022516A KR101338043B1 (ko) 2012-03-05 2012-03-05 실감 인터랙션을 이용한 인지재활 시스템 및 방법

Publications (1)

Publication Number Publication Date
WO2013133583A1 (fr) 2013-09-12

Family

ID=49117006

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/001704 WO2013133583A1 (fr) 2012-03-05 2013-03-04 Système et procédé de réhabilitation cognitive par interaction tangible

Country Status (2)

Country Link
KR (1) KR101338043B1 (fr)
WO (1) WO2013133583A1 (fr)


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101686629B1 (ko) * 2015-01-30 2016-12-14 한국과학기술연구원 사용자의 압력 정보의 입력에 의해 지시되는 가상 공간 상의 위치를 결정하는 방법, 장치 및 이 방법을 실행하기 위한 컴퓨터 판독 가능한 기록 매체
WO2017069305A1 (fr) * 2015-10-22 2017-04-27 주식회사 싸이버메딕 Dispositif d'entrainement de rééducation à rétroaction biologique et à réalité virtuelle utilisant un module d'outil interactif, et son procédé de commande
KR20180045278A (ko) * 2016-10-25 2018-05-04 포항공과대학교 산학협력단 생체신호연동 가상현실 인지재활 시스템
KR102009753B1 (ko) * 2017-06-09 2019-08-12 동명대학교산학협력단 가상현실에 기반하는 객체 처리장치 및 그 동작 방법
US10290190B2 (en) * 2017-07-31 2019-05-14 Facebook, Inc. Providing temperature sensation to a user based on content presented to the user
WO2019066510A1 (fr) * 2017-09-28 2019-04-04 주식회사 네오펙트 Procédé d'entraînement de jeu sur plateau à chevilles et programme associé
TWI714887B (zh) * 2017-09-28 2021-01-01 南韓商耐奧飛特股份有限公司 釘板、康復訓練系統以及康復訓練方法
KR102019157B1 (ko) 2018-05-30 2019-09-09 (주)휴먼아이티솔루션 가상현실 기반 인지 재활 훈련 관리 시스템
KR102168977B1 (ko) * 2018-07-10 2020-10-22 연세대학교 원주산학협력단 인지재활 훈련 장치 및 방법

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100132592A (ko) * 2009-06-10 2010-12-20 연세대학교 산학협력단 감성인식장치의 개인별 최적화시스템 및 그 최적화 방법
KR20110054376A (ko) * 2009-11-17 2011-05-25 한국과학기술연구원 실감 인터랙션 시스템 및 방법

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010026818A (ja) 2008-07-18 2010-02-04 Geisha Tokyo Entertainment Inc 画像処理プログラム、画像処理装置及び画像処理方法
KR100985206B1 (ko) * 2008-09-17 2010-10-05 (주)지아트 증강 현실 수동 조작 도구 및 이를 이용한 증강 현실 저작 방법과 그 시스템


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9228412B2 (en) 2014-01-30 2016-01-05 Olympic Research, Inc. Well sealing via thermite reactions
US9394757B2 (en) 2014-01-30 2016-07-19 Olympic Research, Inc. Well sealing via thermite reactions
US9494011B1 (en) 2014-01-30 2016-11-15 Olympic Research, Inc. Well sealing via thermite reactions
CN109034389A (zh) * 2018-08-02 2018-12-18 黄晓鸣 信息推荐系统的人机交互式修正方法、装置、设备和介质

Also Published As

Publication number Publication date
KR20130101395A (ko) 2013-09-13
KR101338043B1 (ko) 2013-12-09

Similar Documents

Publication Publication Date Title
WO2013133583A1 (fr) Système et procédé de réhabilitation cognitive par interaction tangible
WO2018217060A1 (fr) Procédé et dispositif pouvant être porté permettant d'effectuer des actions à l'aide d'un réseau de capteurs corporels
US10838495B2 (en) Devices for controlling computers based on motions and positions of hands
WO2016028097A1 (fr) Dispositif pouvant être porté
WO2013133618A1 (fr) Procédé pour commander au moins une fonction d'un dispositif par action de l'œil et dispositif pour exécuter le procédé
WO2013055024A1 (fr) Appareil pour entraîner la capacité de reconnaissance à l'aide d'un robot et procédé associé
US20120268359A1 (en) Control of electronic device using nerve analysis
WO2017215223A1 (fr) Système vr, dispositif à porter sur soi permettant de commander un dispositif vr, et procédé associé
WO2020111344A1 (fr) Système et procédé de mise en œuvre de mouvement physique dans un espace virtuel en utilisant un signal électromyographique
TW202105129A (zh) 具有用於閘控使用者介面元件的個人助理元件之人工實境系統
WO2016036197A1 (fr) Dispositif et procédé de reconnaissance de mouvement de la main
WO2016129773A1 (fr) Procédé, dispositif et système pour fournir une rétroaction, et support d'enregistrement lisible par ordinateur non-transitoire
US20230095328A1 (en) Information processing apparatus, information processing method, computer program, and augmented reality system
WO2018038449A1 (fr) Procédé de réglage du niveau de difficulté d'un contenu d'apprentissage et dispositif électronique utilisant ce dernier
KR102277359B1 (ko) 가상 현실 트레이닝 관련 추가 정보 제공 방법 및 장치
WO2017150947A1 (fr) Dispositif et procédé destinés à fournir une interface utilisateur réactive
Mohammadi et al. A pilot study on a novel gesture-based tongue interface for robot and computer control
WO2021145068A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations, programme informatique et système de réalité augmentée
WO2019074169A1 (fr) Lunettes de réalité augmentée à détection de mouvement
Esiyok et al. Novel hands-free interaction techniques based on the software switch approach for computer access with head movements
WO2018034386A1 (fr) Système de carte à puce relié à des informations biométriques et son procédé
Walsh et al. Assistive pointing device based on a head-mounted camera
Rojas et al. Development of a sensing platform based on hands-free interfaces for controlling electronic devices
WO2023017890A1 (fr) Procédé, appareil et système de fourniture de métavers
KR101221048B1 (ko) 발 모션을 이용한 로봇핸드 제어 시스템 및 제어방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13757469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13757469

Country of ref document: EP

Kind code of ref document: A1