WO2013183822A1 - Social training apparatus and method therefor - Google Patents

Social training apparatus and method therefor

Info

Publication number
WO2013183822A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
trainee
training mode
training
robot
Prior art date
Application number
PCT/KR2012/008912
Other languages
English (en)
Korean (ko)
Inventor
최종석
박성기
김동환
강해용
Original Assignee
한국과학기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술연구원 filed Critical 한국과학기술연구원
Publication of WO2013183822A1 publication Critical patent/WO2013183822A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass

Definitions

  • The embodiments relate to a social training apparatus and a method thereof, and more particularly, to an apparatus and method for training the sociality of autistic children.
  • Autistic children are characterized by a lack of self-initiated interaction, turn-taking, imitation, emotion recognition, joint attention, and eye contact, which are basic functions for socialization and communication. In other words, their cognitive development is immature and they lack sociality. Autistic children with these characteristics find it difficult to socialize as adults; without proper treatment at the right time, social interaction with other people may become impossible.
  • Early detection and behavioral treatment of autism have been actively pursued at university hospitals and research institutes, and treatment has been carried out using robots (toys) whose emotional expression is simpler than that of humans. Recent research also suggests that robots can draw the attention of children with autism and improve their concentration.
  • Korean Patent No. 10-1061771 discloses a robot and a robot control system for play therapy, comprising:
  • a robot that outputs visual, auditory, tactile, and behavioral stimuli for play therapy;
  • a mobile terminal which receives, as response information, the child's response to the stimulus output by the robot, transmits it to the outside, and transmits a control command to the robot to output a stimulus at the operator's request; and
  • a server which receives the response information transmitted from the mobile terminal, determines whether the child's response to the stimulus output by the robot is preferred, non-preferred, or non-responsive, selects the next stimulus accordingly, and transmits a control command for it to the robot.
  • According to the embodiments, the behavior of autistic children can be analyzed from multiple angles.
  • In one aspect, there is provided a social training apparatus comprising: an input unit for receiving behavior information of a trainee; a trainee state analysis unit for analyzing the trainee's behavior information by classifying it into any of excitement information, pleasure information, and intimacy information, thereby analyzing the trainee's state; a training mode determination unit configured to determine a training mode according to the analyzed state of the trainee; and a robot controller for controlling a robot according to the determined training mode.
  • In another aspect, there is provided a social training apparatus in which the input unit receives at least one of the EEG, voice information, image information, and contact information of the trainee.
  • In another aspect, there is provided a social training apparatus in which the trainee state analysis unit analyzes the trainee's emotion information based on the pleasure information and the excitement information.
  • In another aspect, there is provided a social training apparatus in which the emotion information comprises six types: happiness, surprise, anger, fear, sadness, and disgust.
  • In another aspect, there is provided a social training apparatus in which the training mode determination unit determines the trainee's training mode based on the emotion information or the intimacy information.
  • In another aspect, there is provided a social training apparatus in which the trainee's training mode includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.
  • In another aspect, there is provided a social training apparatus in which the training mode determination unit extends the range of the first training mode in proportion to the intimacy information.
  • In another aspect, there is provided a social training apparatus in which the robot controller determines the speed of the robot in the training mode based on the excitement information.
  • In another aspect, there is provided a social training apparatus in which the robot controller determines the behavior range of the robot in the training mode based on the intimacy information.
  • In another aspect, there is provided a social training method comprising: receiving behavior information of a trainee; analyzing the trainee's behavior information by classifying it into any of excitement information, pleasure information, and intimacy information, thereby analyzing the trainee's state; determining a training mode according to the analyzed state of the trainee; and controlling a robot according to the determined training mode.
  • In another aspect, receiving the behavior information of the trainee comprises receiving at least one of the EEG, voice information, image information, and contact information of the trainee.
  • In another aspect, analyzing the trainee's state further includes analyzing the trainee's emotion information based on the pleasure information and the excitement information.
  • In another aspect, there is provided a social training method in which the emotion information comprises six types: happiness, surprise, anger, fear, sadness, and disgust.
  • In another aspect, determining the training mode comprises determining the trainee's training mode based on the emotion information or the intimacy information.
  • In another aspect, there is provided a social training method in which the trainee's training mode includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.
  • In another aspect, determining the training mode further includes extending the range of the first training mode in proportion to the intimacy information.
  • In another aspect, there is provided a social training method in which controlling the robot comprises determining the speed of the robot in the training mode based on the excitement information.
  • In another aspect, there is provided a social training method in which controlling the robot comprises determining the behavior range of the robot in the training mode based on the intimacy information.
  • FIG. 1 is a diagram illustrating a relationship between a social training apparatus 100, a robot 200, and an autistic child 300, according to an exemplary embodiment.
  • FIG. 2 is a diagram illustrating an internal configuration of the social training apparatus 100 according to an embodiment.
  • FIGS. 3A and 3B are diagrams illustrating a three-dimensional graph in which a trainee state analyzing unit 120 analyzes a trainee state according to one embodiment.
  • FIG. 4A is a diagram illustrating an emotional state and a training mode according to pleasure information and excitement information when z is constant, according to an embodiment.
  • FIGS. 4B to 4D are diagrams illustrating a change in training mode according to a change in z, according to an exemplary embodiment.
  • FIG. 5 is a flowchart illustrating a social training method according to an embodiment.
  • Referring to FIG. 1, the social training apparatus 100 analyzes various behaviors of the trainee 300, such as position, voice, brain waves, and touching of the robot 200, and transmits to the robot 200 a control signal derived from that analysis. The robot 200 operates based on the received control signal, and the operation of the robot 200 improves the sociality of the trainee 300.
  • The social training apparatus 100 receives the behavior information of the trainee 300, analyzes the state of the trainee 300, and determines the training mode based on the analyzed state.
  • the trainee 300 is trained by controlling the robot 200 according to the determined training mode. Details related to the sociality training device 100 will be described later.
  • the robot 200 performs a variety of actions under the control of the sociality training device 100.
  • The robot 200 may have a human-like appearance and shape: it can walk on two feet, has two arms, and can manipulate objects with its hands. That is, the robot 200 may be a humanoid robot with a plurality of joints. By adjusting each joint, the humanoid robot can change the posture of parts such as its arms and legs, and it can walk, throw objects, or kick a ball.
  • Alternatively, only part of the robot 200 may be humanoid. For example, the upper body may have the appearance and shape of a human body while the lower body is mounted on wheels for ease of movement.
  • The trainee 300 is the subject toward whom the robot 200 performs its operations. The trainee 300 may be a person with a social disability or anyone who wants to improve their sociality, such as an autistic child lacking social and communication skills; however, there is no particular limitation on the trainee 300.
  • the sociality training apparatus 100 may include an input unit 110, a trainee state analysis unit 120, a training mode determiner 130, and a robot controller 140.
  • the input unit 110 serves to receive behavior information of the trainee 300.
  • The behavior information of the trainee 300 may include the trainee 300's EEG, voice information, image information, and contact information.
  • EEG stands for electroencephalogram: an electrical recording obtained by measuring, on the scalp, potential changes in the cerebrum of a human or animal, or the brain currents they produce.
  • R. Caton (1875) first observed changes in the cortical potential of rabbits and monkeys, and H. Berger (1929) succeeded in recording such potentials from the human scalp; the brain waves he described became known as Berger waves (the Berger rhythm). The potentials induced on the scalp are on the order of several tens of microvolts and are amplified before being recorded.
  • The waveform is irregular, like noise, but contains a somewhat regular component around 10 Hz, the α (alpha) wave (larger amplitude, slower frequency), and the β (beta) wave, a component with a small irregular amplitude at 13 to 25 Hz.
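The two EEG components just described can be picked out of a sampled signal with a plain discrete Fourier transform. The sketch below is an illustration, not part of the patent: it finds the dominant frequency of a signal and labels it with conventional band boundaries (8–13 Hz for alpha, 13–25 Hz for beta), which are assumed textbook values.

```python
import cmath

def dominant_frequency(samples, fs):
    """Return the frequency (Hz) of the strongest non-DC DFT bin."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        s = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

def classify_band(freq_hz):
    """Label a frequency with the alpha/beta bands mentioned in the text."""
    if 8.0 <= freq_hz < 13.0:
        return "alpha"
    if 13.0 <= freq_hz <= 25.0:
        return "beta"
    return "other"
```

For a 10 Hz oscillation sampled at 100 Hz, `dominant_frequency` returns 10.0 and `classify_band` labels it alpha.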
  • From the EEG, the emotional state of the trainee can be estimated, and the emotional state can be expressed as pleasure information and excitement information.
  • Depending on the situation, a distinctive EEG may occur. For example, when the trainee 300 approaches the robot 200 versus when the trainee runs away from it, the difference in intimacy with the robot 200 (a positive versus a negative case) may be reflected in the EEG. It may also be expressed in various ways in the frequency domain.
  • the voice information is information related to sound waves generated by the trainee 300.
  • Experiments show that voice information exhibits distinct frequency characteristics for each of six cases: happiness, surprise, anger, fear, sadness, and disgust. Through these frequency characteristics, the emotional state of the trainee 300 can be classified into the six types.
  • the input unit 110 may directly or indirectly obtain the above-described information of the trainee 300.
  • Direct acquisition means that a sensor capable of collecting the various types of information is included in the social training apparatus 100 itself.
  • For example, the input unit may directly use a depth sensor to acquire image information.
  • The depth sensor may be configured as an infrared laser projector combined with a CMOS sensor: it projects a large number of infrared beams and measures their reflections with a single camera, allowing three-dimensional capture under any lighting conditions. Because the sensor measures distance in addition to horizontal and vertical position, it can determine where the trainee's body is and how it is moving.
  • A depth image is generally obtained from an infrared camera by the time-of-flight method, but depth can also be calculated by projecting patterned light onto an object and performing stereo matching.
  • The trainee 300 can be detected as a dynamic object with a depth discontinuity by applying a blob labeling technique to the stream sequence; the position of each dynamic object in space can then be estimated and an id assigned to it.
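The blob-labeling step can be illustrated with a toy connected-component pass over a depth grid. Everything below is a simplified sketch: real pipelines run on streaming depth frames, and the grid representation, the threshold, and 4-connectivity are assumptions made here for illustration.

```python
def label_blobs(depth, threshold):
    """Group 4-connected cells closer than `threshold` into blobs.

    `depth` is a 2-D list of distances. Returns a parallel grid of blob
    ids (0 = background) and the number of blobs found, mimicking the
    id assignment described in the text.
    """
    rows, cols = len(depth), len(depth[0])
    labels = [[0] * cols for _ in range(rows)]
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if depth[r][c] < threshold and labels[r][c] == 0:
                next_id += 1
                stack = [(r, c)]
                labels[r][c] = next_id
                while stack:                       # flood-fill one blob
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and depth[ny][nx] < threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = next_id
                            stack.append((ny, nx))
    return labels, next_id
```

Each blob id would then stand in for one tracked dynamic object, such as the trainee.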
  • the depth measuring sensor may be included in the robot 200 to allow the robot 200 to acquire information and transmit the information to the input unit 110.
  • the voice sensor for collecting the sound generated by the trainee 300 may be used directly or indirectly.
  • A touch sensor may also be used: the trainee 300's contact with the robot 200 can be captured through the robot 200 as input.
  • the touch sensor can detect not only single touch but also multi-touch.
  • The touch input may take forms such as the position of a touched point, the state of a point (newly touched, moved, or released), or gestures made with one or more touches, such as tap, double tap, panning, flicking, drag and drop, pinching, and stretching.
  • In addition, the contact area and the duration of each touch can be measured.
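To give a rough sense of how the touch events above might be turned into gesture labels, here is a toy classifier; the thresholds (10 px of travel, 0.3 s) and the label set are invented for illustration, since the patent does not specify them.

```python
def classify_touch(duration_s, moved_px, released):
    """Classify a single touch from its duration, travel, and end state.

    Thresholds are illustrative assumptions: short travel means tap/press,
    quick long travel means flick, slow long travel means drag-and-drop.
    """
    if not released:
        return "drag" if moved_px > 10 else "press"
    if moved_px <= 10:
        return "tap"
    return "flick" if duration_s < 0.3 else "drag-and-drop"
```

A double tap would be detected one level up, by timing two consecutive "tap" results.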
  • The trainee state analysis unit 120 analyzes the state of the trainee 300 by classifying the received behavior information of the trainee 300 into excitement information (arousal), pleasure information (pleasure), and intimacy information (intimacy).
  • Specifically, the trainee state analyzer 120 normalizes the behavior information of the trainee 300 so that each of the arousal, pleasure, and intimacy values is converted to lie between -1 and +1.
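The patent states only that the behavior information is normalized to the -1..+1 range, not how. One simple way to do it, assumed here for illustration, is min-max scaling with clamping:

```python
def to_unit_range(value, lo, hi):
    """Linearly map a raw sensor reading in [lo, hi] onto [-1.0, +1.0],
    clamping out-of-range readings."""
    if hi <= lo:
        raise ValueError("hi must exceed lo")
    x = (value - lo) / (hi - lo)        # scale to 0..1
    x = min(1.0, max(0.0, x))           # clamp stray readings
    return 2.0 * x - 1.0                # shift to -1..+1
```

For example, a reading of 5 on a 0..10 sensor maps to 0.0, and a reading of 12 clamps to +1.0.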
  • the waveform of the EEG, the waveform of the voice information, and the image information (face information) may be calculated as pleasure information.
  • Image information (speed information) may be calculated as excitement information.
  • the image information (speed information) may include a movement speed of a trainee's body, a movement speed of a hand, and a movement speed of a head.
  • The emotion information of the trainee may be analyzed based on the pleasure information and excitement information of the trainee 300.
  • Whether an emotion is positive or negative is evaluated first by the pleasure information, and the degree of excitement is then captured by the excitement information.
  • Each of the six emotion types above (happiness, surprise, anger, fear, sadness, and disgust) can be assigned pleasure and excitement values, so that emotions are expressed as numerical information.
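One way to realize "assigning pleasure and excitement values to each emotion" is a nearest-anchor lookup on the pleasure/excitement plane. The anchor coordinates below are invented for illustration; the patent does not specify them.

```python
import math

# Hypothetical (pleasure, excitement) anchors for the six emotion types.
EMOTION_ANCHORS = {
    "happiness": (0.8, 0.4),
    "surprise": (0.2, 0.9),
    "anger": (-0.6, 0.7),
    "fear": (-0.7, 0.5),
    "sadness": (-0.7, -0.4),
    "disgust": (-0.6, 0.1),
}

def nearest_emotion(pleasure, excitement):
    """Return the emotion whose anchor is closest to the given state."""
    return min(EMOTION_ANCHORS,
               key=lambda e: math.dist((pleasure, excitement),
                                       EMOTION_ANCHORS[e]))
```

The inverse mapping, from an emotion label back to representative pleasure/excitement values, is just a dictionary lookup on `EMOTION_ANCHORS`.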
  • The intimacy information can be inferred from how much the trainee 300 ignores the behavior of the robot 200: the less the robot is ignored, the higher the intimacy, and vice versa.
  • Intimacy can also be assigned a number according to its level, which can then be quantified separately.
  • the relationship between pleasure information, excitement information, and intimacy information is described through the graphs of FIGS. 3A to 3B.
  • FIG. 3A illustrates a three-dimensional graph in which the trainee state analyzer 120 analyzes the state of the trainee 300. The state is represented in a cylindrical coordinate system with the factors r, θ, and z, which can be taken as parameters of arousal, pleasure, and intimacy, and the relationship of Equation 1 holds between them.
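Equation 1 itself is not reproduced in this text. Assuming the standard relations for coordinates (r, θ, z) — pleasure = r·cos θ, arousal = r·sin θ, intimacy = z — the conversion would look like the sketch below; this reading of FIG. 3A is an assumption.

```python
import math

def to_cylindrical(pleasure, arousal, intimacy):
    """Map (pleasure, arousal, intimacy), each in -1..+1, to (r, theta, z).

    Under the assumed relations, r measures emotional intensity, theta the
    emotional quality (which emotion), and z the intimacy.
    """
    r = math.hypot(pleasure, arousal)       # radial distance
    theta = math.atan2(arousal, pleasure)   # angle in the emotion plane
    return r, theta, intimacy
```

Pure pleasure maps to θ = 0, pure arousal to θ = π/2, and intimacy passes through unchanged as z.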
  • the training mode determiner 130 determines a training mode according to the analysis state of the trainee 300.
  • the training mode determiner may determine the training mode based on emotion information (pleasure information, excitement information) or intimacy information.
  • the training mode may be a plurality of training modes according to a predetermined setting. Specifically, it may include a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.
  • The first training mode may include a conversation turn-taking mode for allowing the robot 200 and the trainee 300 to converse, an eye contact mode for having the trainee 300 and the robot 200 face each other, a joint attention mode for focusing the attention of the trainee 300, and an imitation mode in which the trainee 300 mimics the robot 200.
  • In one embodiment, the third training mode may be a pause mode in which the operation of the robot 200 is stopped when the excitement information is very high (e.g., when the trainee 300 makes a sudden movement). The modes are not limited to these; various training modes may be determined from the emotion information (pleasure information, excitement information) or the intimacy information.
  • A training mode may thus be specified based on the emotion information (pleasure information, excitement information) or the intimacy information.
  • In FIG. 4A, z is fixed at 0.5.
  • Emotions such as happiness, surprise, anger, fear, sadness, and disgust can be plotted on the graph, and the first training mode, which trains the trainee, can be designated for states with a considerable amount of pleasure information and low excitement information (gray area).
  • The third training mode, in which the trainee is not trained, can be designated as a point area.
  • Areas other than the first training mode and the third training mode may be designated as the second training mode (hatched area), which encourages the trainee to train, with the aim of shifting the trainee's emotional state from negative to positive.
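The region logic of FIG. 4A can be sketched as a small classifier. The thresholds below, and the exact way intimacy z widens the first-mode region, are illustrative assumptions; the patent states only that the first-mode region grows in proportion to intimacy.

```python
def training_mode(pleasure, excitement, z):
    """Return 1 (train), 2 (encourage), or 3 (do not train / pause).

    Inputs are assumed normalized to -1..+1 (z in 0..1 for simplicity).
    """
    if excitement > 0.9:                      # extreme arousal: pause
        return 3
    # First mode: fairly pleasant, fairly calm; intimacy z widens the zone.
    if pleasure > 0.4 - 0.2 * z and excitement < 0.5 + 0.2 * z:
        return 1
    return 2                                  # everything else: encourage
```

Note how the same mildly positive state falls into mode 2 at low intimacy but into mode 1 at high intimacy, mirroring FIGS. 4B to 4D.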
  • FIGS. 4B to 4D are diagrams illustrating a change in training mode according to a change in z.
  • The higher the intimacy, the wider the zone in which the trainee can be trained for the same emotional state.
  • In other words, the higher the intimacy, the higher the emotional level the trainee can tolerate.
  • the robot controller 140 controls the robot 200 according to the determined training mode.
  • the operation pattern of the robot 200 is defined according to the training mode, and the robot 200 is controlled by transmitting the defined operation pattern to the robot 200.
  • The speed or range of motion of the robot 200 executing the training mode may be changed by other conditions. For example, when the excitement information is high, the trainee 300 may react more quickly, so the training of the trainee 300 can proceed faster by increasing the speed of the robot 200.
  • When the intimacy information is high, the operation range of the robot 200 is expanded so that the robot 200 can operate in a large space, which increases the efficiency of the trainee 300's social training. If the intimacy information is low, the operation range of the robot 200 may be reduced, limiting the trainee 300's social training to a specific space.
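The scaling just described — robot speed driven by excitement, workspace radius driven by intimacy — could be realized linearly. The numeric bounds (speeds, radii) below are illustrative assumptions:

```python
def robot_motion_params(excitement, intimacy,
                        base_speed=0.2, max_speed=1.0,
                        base_range=0.5, max_range=2.0):
    """Return (speed, workspace radius) scaled from -1..+1 state values."""
    a = min(1.0, max(0.0, (excitement + 1.0) / 2.0))   # map -1..1 to 0..1
    i = min(1.0, max(0.0, (intimacy + 1.0) / 2.0))
    speed = base_speed + a * (max_speed - base_speed)   # faster when excited
    radius = base_range + i * (max_range - base_range)  # wider when intimate
    return speed, radius
```

The robot controller would then clip its planned trajectories to the returned radius and cap joint velocities at the returned speed.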
  • FIG. 5 is a flowchart illustrating a sociality training method according to an embodiment of the present invention.
  • input information is received from the trainee 300 (S401).
  • The behavior information of the trainee 300 may include the trainee 300's EEG, voice information, image information, and contact information.
  • the state of the trainee 300 is analyzed through the received input information (S402).
  • the state analysis of the trainee 300 may classify the action information of the trainee 300 into one of excitement information, pleasure information, and intimacy information to analyze the state of the trainee.
  • the training mode is determined according to the analyzed state of the trainee 300 (S403).
  • The training mode may be determined based on the emotion information (pleasure information and excitement information) or the intimacy information.
  • the training mode may be the first training mode, the second training mode, and the third training mode described above.
  • Next, the speed and the behavior range of the robot 200 in the training mode are determined (S404). When the excitement information is high, the speed of the robot 200 is increased; when the intimacy information is high, the behavior range may be extended.
  • the robot 200 is controlled by integrating these conditions (S405).
  • the robot 200 may be controlled by transmitting a control signal to the robot 200. Through this, the sociality of the trainee 300 can be trained and the sociality can be improved.
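Putting steps S401–S405 together, one control cycle might look like the sketch below. The state values are assumed to be already normalized to -1..+1, and every threshold and scale factor is an illustrative assumption, not taken from the patent:

```python
def run_training_step(pleasure, excitement, intimacy):
    """One pass through the flowchart: choose a mode (S403), derive the
    robot's speed and workspace radius (S404), and gate them (S405)."""
    # S403: choose a training mode from the analyzed state
    if excitement > 0.9:
        mode = 3                      # pause: do not train
    elif pleasure > 0.3 and excitement < 0.6:
        mode = 1                      # train
    else:
        mode = 2                      # encourage
    # S404: scale speed with excitement, workspace radius with intimacy
    speed = 0.2 + 0.8 * (excitement + 1.0) / 2.0
    radius = 0.5 + 1.5 * (intimacy + 1.0) / 2.0
    if mode == 3:
        speed = 0.0                   # S405: a paused robot does not move
    return {"mode": mode, "speed": speed, "radius": radius}
```

The S401/S402 stages (sensor input and state analysis) would feed the three normalized values into this function on every cycle.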

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manipulator (AREA)

Abstract

A social training apparatus and a corresponding method are disclosed. The social training apparatus comprises: an input unit for receiving behavior information of a trainee; a trainee state analysis unit for determining the trainee's state by analyzing and classifying the behavior information into any of excitement information, pleasure information, and intimacy information; a training mode determination unit for determining a training mode according to the analyzed state of the trainee; and a robot control unit for controlling a robot according to the determined training mode.
PCT/KR2012/008912 2012-06-07 2012-10-29 Appareil d'apprentissage de sociabilité et procédé correspondant WO2013183822A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120060697A KR20130137262A (ko) 2012-06-07 2012-06-07 사회성 훈련 장치 및 그 방법
KR10-2012-0060697 2012-06-07

Publications (1)

Publication Number Publication Date
WO2013183822A1 true WO2013183822A1 (fr) 2013-12-12

Family

ID=49712196

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/008912 WO2013183822A1 (fr) 2012-06-07 2012-10-29 Appareil d'apprentissage de sociabilité et procédé correspondant

Country Status (2)

Country Link
KR (1) KR20130137262A (fr)
WO (1) WO2013183822A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107283389A (zh) * 2017-08-31 2017-10-24 李景龙 用于辅助治疗自闭症的机器人
CN109623816A (zh) * 2018-12-19 2019-04-16 中新智擎科技有限公司 一种机器人回充方法、装置、存储介质及机器人

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070110158A (ko) * 2006-05-12 2007-11-16 고려대학교 산학협력단 감정 상태에 따른 피드백 장치, 그 방법 및 그 기록 매체
KR20090079655A (ko) * 2008-01-18 2009-07-22 주식회사 케이티 사용자 자극행동에 따른 로봇 반응행동 수행 방법 및 그로봇
KR101016381B1 (ko) * 2009-01-19 2011-02-21 한국과학기술원 사람과 상호작용이 가능한 감정 표현 로봇
KR101061771B1 (ko) * 2009-11-30 2011-09-05 재단법인대구경북과학기술원 놀이치료를 위한 로봇 및 로봇 제어 시스템



Also Published As

Publication number Publication date
KR20130137262A (ko) 2013-12-17

Similar Documents

Publication Publication Date Title
US20190108770A1 (en) System and method of pervasive developmental disorder interventions
Mousavi Hondori et al. A review on technical and clinical impact of microsoft kinect on physical therapy and rehabilitation
EP4079383A2 (fr) Procédé et système permettant d'interagir avec un environnement virtuel
Sucar et al. Gesture therapy: A vision-based system for upper extremity stroke rehabilitation
US20130158883A1 (en) Intention conveyance support device and method
Petric et al. Four tasks of a robot-assisted autism spectrum disorder diagnostic protocol: First clinical tests
Özcan et al. Transitional wearable companions: a novel concept of soft interactive social robots to improve social skills in children with autism spectrum disorder
KR101307783B1 (ko) 사회성 훈련 장치 및 그 방법
WO2018084170A1 (fr) Robot autonome identifiant des personnes
JP2009101057A (ja) 生体情報処理装置、生体情報処理方法及びプログラム
KR102355511B1 (ko) Vr기반 인지 훈련 시스템
Dias et al. Designing better spaces for people: virtual reality and biometric sensing as tools to evaluate space use
Park et al. Narratives and sensor driven cognitive behavior training game platform
Cheng et al. Mind-Wave controlled robot: An Arduino Robot simulating the wheelchair for paralyzed patients
KR20210076561A (ko) Vr콘텐츠를 이용한 치매 예방 인지 훈련 시스템
WO2013183822A1 (fr) Appareil d'apprentissage de sociabilité et procédé correspondant
Singh et al. Physiologically attentive user interface for robot teleoperation: real time emotional state estimation and interface modification using physiology, facial expressions and eye movements
Mihelj et al. Emotion-aware system for upper extremity rehabilitation
JP7107017B2 (ja) ロボット、ロボットの制御方法及びプログラム
KR20200071236A (ko) 개인 맞춤형 운동강도조절 기능을 갖는 멀티센서 기반 타이밍 피드백 시스템
Mishra et al. Does elderly enjoy playing bingo with a robot? a case study with the humanoid robot nadine
AlZoubi et al. Affect-aware assistive technologies
De Silva et al. A Multi-agent Based Interactive System Towards Child's Emotion Performances Quantified Through Affective Body Gestures
Popescu et al. A critical analysis of the technologies used for development of applications dedicated to persons with autism spectrum disorder
US11493994B2 (en) Input device using bioelectric potential

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12878476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12878476

Country of ref document: EP

Kind code of ref document: A1