WO2013183822A1 - Apparatus for training sociability and a method therefor - Google Patents

Apparatus for training sociability and a method therefor Download PDF

Info

Publication number
WO2013183822A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
trainee
training mode
training
robot
Prior art date
Application number
PCT/KR2012/008912
Other languages
French (fr)
Korean (ko)
Inventor
최종석
박성기
김동환
강해용
Original Assignee
한국과학기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술연구원 filed Critical 한국과학기술연구원
Publication of WO2013183822A1 publication Critical patent/WO2013183822A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass

Definitions

  • The embodiment relates to a social training apparatus and method, and more particularly, to an apparatus and method for training the sociality of autistic children.
  • Autistic children are characterized by a lack of self-initiated interaction, turn-taking, imitation, emotion recognition, joint attention, and eye contact, which are basic functions for socialization and communication. In other words, their cognitive development is immature and their sociality is lacking. Children with these characteristics find social activity difficult even as adults, and without appropriate treatment at the appropriate time, social interaction with other people becomes impossible.
  • In recent years, early detection and behavioral treatment of autism have been actively pursued at university hospitals and research institutes, and treatment has been carried out using robots (toys) whose emotional expression is relatively simpler than that of humans. Recent research also indicates that robots can draw the attention of children with autism and improve their concentration.
  • Korean Registered Patent 10-1061171 discloses a robot and a robot control system for play therapy, the system comprising:
  • a robot that outputs visual, auditory, tactile, and behavioral stimuli for play therapy;
  • a mobile terminal that receives, as response information entered by an operator, the child's reaction to the stimulus output by the robot, transmits it externally, and transmits to the robot a control command to output a stimulus at the operator's request; and
  • a server that receives the response information transmitted from the mobile terminal, determines whether the child's reaction to the stimulus output by the robot is preference, non-preference, or no response, selects the next stimulus accordingly, and transmits to the robot a control command to output the selected stimulus.
  • However, such a system can analyze the child's behavior only one-dimensionally, as preference, non-preference, or no response. According to an aspect of the present invention, in contrast, the behavior of autistic children can be analyzed from multiple angles.
  • According to an aspect of the invention, there is provided a social training apparatus comprising:
  • an input unit for receiving behavior information of a trainee;
  • a trainee state analysis unit for analyzing the received behavior information and classifying it as excitement information, pleasure information, or intimacy information to analyze the trainee's state;
  • a training mode determiner for determining a training mode according to the analyzed state of the trainee; and
  • a robot controller for controlling a robot according to the determined training mode.
  • According to another aspect, there is provided a social training apparatus in which the input unit receives at least one of the trainee's EEG, voice information, image information, and contact information.
  • According to another aspect, there is provided a social training apparatus in which the trainee state analysis unit analyzes the trainee's emotion information based on the pleasure information and the excitement information.
  • According to another aspect, there is provided a social training apparatus in which the emotion information includes six categories: happiness, surprise, anxiety, anger, sadness, and satisfaction.
  • According to another aspect, there is provided a social training apparatus in which the training mode determiner determines the trainee's training mode based on the emotion information or the intimacy information.
  • According to another aspect, there is provided a social training apparatus in which the training mode is one of a first training mode in which the trainee is trained, a second training mode in which the trainee is encouraged toward training, and a third training mode in which the trainee is not trained.
  • According to another aspect, there is provided a social training apparatus in which the training mode determiner extends the range of the first training mode in proportion to the intimacy information.
  • According to another aspect, there is provided a social training apparatus in which the robot controller determines the speed of the robot in the training mode based on the excitement information.
  • According to another aspect, there is provided a social training apparatus in which the robot controller determines the behavior range of the robot in the training mode based on the intimacy information.
  • According to another aspect of the invention, there is provided a social training method comprising: receiving behavior information of a trainee; analyzing the received behavior information and classifying it as excitement information, pleasure information, or intimacy information to analyze the trainee's state; determining a training mode according to the analyzed state of the trainee; and controlling a robot according to the determined training mode.
  • According to another aspect, the receiving of the behavior information comprises receiving at least one of the trainee's EEG, voice information, image information, and contact information.
  • According to another aspect, the analyzing of the trainee's state further comprises analyzing the trainee's emotion information based on the pleasure information and the excitement information.
  • According to another aspect, the emotion information includes six categories: happiness, surprise, anger, fear, sadness, and disgust.
  • According to another aspect, the determining of the training mode comprises determining the trainee's training mode based on the emotion information or the intimacy information.
  • According to another aspect, the training mode is one of a first training mode in which the trainee is trained, a second training mode in which the trainee is encouraged toward training, and a third training mode in which the trainee is not trained.
  • According to another aspect, the determining of the training mode further comprises extending the range of the first training mode in proportion to the intimacy information.
  • According to another aspect, the controlling of the robot comprises determining the speed of the robot in the training mode based on the excitement information.
  • According to another aspect, the controlling of the robot comprises determining the behavior range of the robot in the training mode based on the intimacy information.
  • FIG. 1 is a diagram illustrating a relationship between a social training apparatus 100, a robot 200, and an autistic child 300, according to an exemplary embodiment.
  • FIG. 2 is a diagram illustrating an internal configuration of the social training apparatus 100 according to an embodiment.
  • FIGS. 3A and 3B are diagrams illustrating a three-dimensional graph with which a trainee state analysis unit 120 analyzes the trainee's state, according to an embodiment.
  • FIG. 4A is a diagram illustrating emotional states and training modes according to pleasure information and excitement information when z is constant, according to an embodiment.
  • FIGS. 4B to 4D are diagrams illustrating changes in the training mode according to changes in z, according to an exemplary embodiment.
  • FIG. 5 is a flowchart illustrating a social training method according to an embodiment.
  • The social training apparatus 100 analyzes various behaviors of the trainee 300, such as position, voice, brain waves, and touching of the robot 200, and transmits to the robot 200 a signal for controlling the robot 200 according to this analysis. The robot 200 operates based on the received control signal, and its operation improves the sociality of the trainee 300.
  • The social training apparatus 100 receives the behavior information of the trainee 300, analyzes the state of the trainee 300, and determines the training mode based on that analysis.
  • The trainee 300 is then trained by controlling the robot 200 according to the determined training mode. Details of the social training apparatus 100 will be described later.
  • The robot 200 performs a variety of actions under the control of the social training apparatus 100.
  • In one embodiment, the robot 200 has a human-like appearance and shape: it can walk on two feet, has two arms, and can manipulate objects with its hands. That is, the robot 200 may be a humanoid robot with a plurality of joints. The humanoid robot can change the posture of parts such as its arms and legs by adjusting each joint, and can walk, throw objects, or kick a ball.
  • Alternatively, only part of the robot 200 may be humanoid. For example, the upper body may have the appearance and shape of a human body, while the lower body may be mounted on wheels for ease of movement.
  • The trainee 300 is the person toward whom the robot 200 performs its actions. The trainee 300 may be a person with a social disability, or anyone who wants to improve their sociality; for example, autistic children may lack social and communication skills. However, there is no particular limitation on who the trainee 300 may be.
  • the sociality training apparatus 100 may include an input unit 110, a trainee state analysis unit 120, a training mode determiner 130, and a robot controller 140.
  • the input unit 110 serves to receive behavior information of the trainee 300.
  • The behavior information of the trainee 300 may include the trainee's EEG, voice information, image information, and contact information.
  • EEG stands for electroencephalogram, the electrical recording obtained by picking up, on the scalp, potential changes in the cerebrum of a human or animal, or the brain currents they produce.
  • R. Caton (1875) first observed changes in the cortical potentials of rabbits and monkeys, and H. Berger (1929) succeeded in recording these potentials from the human scalp; the brain waves he recorded became known as the Berger rhythm. The signal picked up on the scalp is on the order of tens of microvolts and is amplified before being recorded.
  • The EEG waveform is irregular, like noise, but contains a somewhat regular component around 10 Hz, the α wave (larger amplitude, slower wavelength), and the β wave, an irregular component of smaller amplitude at 13 to 25 Hz.
  • From the EEG, the emotional state of the trainee, i.e., the pleasure information and excitement information, can be estimated.
  • A distinctive EEG may also occur in particular situations. For example, whether the trainee 300 approaches the robot 200 or runs away from it, reflecting a difference in intimacy with the robot 200 (a positive versus a negative case), may be expressed in the EEG. It may also be expressed in various ways in the frequency domain.
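  • As a concrete illustration of how excitement information might be derived from the EEG's frequency content, the sketch below compares α-band (around 10 Hz, here taken as 8–12 Hz) and β-band (13–25 Hz) power. Only the band limits come from the text; the scoring formula, sampling rate, and function names are illustrative assumptions.

```python
import math

def band_power(signal, fs, lo, hi):
    """Power of `signal` in the [lo, hi] Hz band, via a direct DFT
    (adequate for short illustrative signals)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += re * re + im * im
    return power

def arousal_from_eeg(signal, fs=256):
    """Crude arousal score in (-1, +1): relative beta (13-25 Hz)
    versus alpha (8-12 Hz) power. The formula is an assumption."""
    alpha = band_power(signal, fs, 8, 12)
    beta = band_power(signal, fs, 13, 25)
    return (beta - alpha) / (beta + alpha + 1e-12)

# Synthetic 2-second signals: a relaxed subject (dominant 10 Hz alpha)
# scores near -1, an excited one (dominant 20 Hz beta) scores near +1.
fs = 256
relaxed = [math.sin(2 * math.pi * 10 * t / fs) for t in range(2 * fs)]
excited = [math.sin(2 * math.pi * 20 * t / fs) for t in range(2 * fs)]
```

A real system would of course work on measured EEG and a properly windowed spectral estimate rather than pure sinusoids.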
  • the voice information is information related to sound waves generated by the trainee 300.
  • Experiments show that voice information exhibits distinct frequency characteristics for each of the six emotions of happiness, surprise, anger, fear, sadness, and disgust; through these frequency characteristics, the emotional state of the trainee 300 can be classified into the six types.
  • the input unit 110 may directly or indirectly obtain the above-described information of the trainee 300.
  • Direct acquisition refers to the case in which a sensor capable of collecting the various types of information is included in the social training apparatus 100.
  • For example, the input unit may directly use a depth sensor to acquire image information.
  • The depth sensor may be configured as an infrared laser projector combined with a CMOS sensor; by projecting countless infrared beams from the camera and capturing their reflections, it can capture three-dimensional images under any lighting conditions. In addition to horizontal and vertical position, the sensor can measure the distance from the sensor to each point, so that where the body is and how it behaves can be determined.
  • A depth image is generally obtained with an infrared camera using the time-of-flight method, but depth can also be calculated by projecting patterned light onto an object and performing stereo matching.
  • The trainee 300 may be detected as a dynamic object with a depth discontinuity by applying a blob labeling technique to the stream sequence; the position of each dynamic object in space may then be estimated and an id assigned to it.
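  • The blob labeling step can be illustrated with a minimal 4-connected component labeling pass over a binary foreground mask. Here the mask stands in for pixels already segmented by depth; a real system would run this per frame and track the assigned ids over time.

```python
def label_blobs(mask):
    """Assign an id (1, 2, ...) to each 4-connected foreground blob
    in a binary mask given as a list of rows. Returns (labels, count)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_id = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                next_id += 1                      # new blob found
                stack = [(y, x)]
                labels[y][x] = next_id
                while stack:                      # flood fill its pixels
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_id
                            stack.append((ny, nx))
    return labels, next_id

# Two separate foreground regions -> two ids.
mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 0, 1]]
```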
  • Alternatively, the depth sensor may be included in the robot 200, which acquires the information and transmits it to the input unit 110.
  • Similarly, a voice sensor for collecting sounds generated by the trainee 300 may be used directly or indirectly.
  • The contact made by the trainee 300 when touching the robot 200 may be input through a touch sensor on the robot 200.
  • the touch sensor can detect not only single touch but also multi-touch.
  • The format of a touch input may be the position of a touched point, or the state of a new, moved, or released point; it may also be a gesture made with one or more touches, such as a tap, double tap, panning, flicking, drag and drop, pinching, or stretching.
  • The area and duration of the touch can also be measured.
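  • A sketch of how a measured touch trace (duration, travel distance, speed) might be reduced to a few of the gestures listed above; the threshold values are purely illustrative assumptions, not taken from the patent.

```python
def classify_touch(duration_s, distance_px, speed_px_s):
    """Classify a single-finger touch trace into a coarse gesture.
    Thresholds are illustrative; a real device would tune them."""
    if distance_px < 10:                            # finger barely moved
        return "tap" if duration_s < 0.3 else "press"
    return "flicking" if speed_px_s > 500 else "drag"
```

For example, a short stationary contact classifies as a tap, while a long fast swipe classifies as flicking.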
  • The trainee state analysis unit 120 classifies the received behavior information of the trainee 300 as excitement information (arousal), pleasure information (pleasure), or intimacy information (intimacy), and analyzes the state of the trainee 300.
  • To do so, the trainee state analysis unit 120 normalizes the behavior information of the trainee 300 so that arousal, pleasure, and intimacy each take a value from −1 to +1.
  • The waveform of the EEG, the waveform of the voice information, and the image information (facial expression information) may be converted into pleasure information, while image information (speed information) may be converted into excitement information. The speed information may include the movement speed of the trainee's body, hands, and head.
  • The emotion information of the trainee may be analyzed based on the pleasure information and excitement information of the trainee 300. Whether the emotion is positive is evaluated first from the pleasure information, and its intensity is gauged from the excitement information.
  • Each of the six emotions of happiness, surprise, anger, fear, sadness, and disgust can be assigned particular pleasure and excitement values, so that each emotion is expressed as numerical information.
  • The intimacy information can be gauged by how much the trainee 300 ignores the behavior of the robot 200: if the degree of neglect is small, intimacy is high, and vice versa.
  • Intimacy can likewise be assigned a number according to its level and quantified separately.
  • The relationship among pleasure information, excitement information, and intimacy information is described through the graphs of FIGS. 3A and 3B.
  • FIG. 3A illustrates a three-dimensional graph with which the trainee state analysis unit 120 analyzes the state of the trainee 300. It is represented in a cylindrical coordinate system with factors r, θ, and z, which can be taken as parameters of arousal, pleasure, and intimacy, and a relationship as in Equation 1 holds.
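  • Equation 1 itself is not reproduced in this text. The natural reading of the (r, θ, z) parameterization is that pleasure and arousal span the horizontal plane while intimacy is the vertical axis; the sketch below is written under that assumption only.

```python
import math

def to_cylindrical(pleasure, arousal, intimacy):
    """Map normalized (pleasure, arousal, intimacy), each in [-1, +1],
    to cylindrical coordinates (r, theta, z), assuming
    pleasure = r*cos(theta), arousal = r*sin(theta), intimacy = z.
    This is one plausible reading of Equation 1, not the patent's formula."""
    r = math.hypot(pleasure, arousal)       # radial distance in the emotion plane
    theta = math.atan2(arousal, pleasure)   # angle encodes the emotion category
    return r, theta, intimacy
```

Under this reading, r measures emotional intensity, θ selects the emotion (e.g., high pleasure with low arousal versus the reverse), and z carries the intimacy level.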
  • the training mode determiner 130 determines a training mode according to the analysis state of the trainee 300.
  • The training mode determiner 130 may determine the training mode based on the emotion information (pleasure information and excitement information) or the intimacy information.
  • There may be a plurality of training modes according to a predetermined setting; specifically, a first training mode in which the trainee is trained, a second training mode in which the trainee is encouraged toward training, and a third training mode in which the trainee is not trained.
  • In one embodiment, the first training mode may include a turn-taking mode in which the robot 200 and the trainee 300 converse, an eye contact mode in which the trainee 300 and the robot 200 face each other, a joint attention mode for focusing the attention of the trainee 300, and an imitation mode in which the trainee 300 mimics the robot 200.
  • The third training mode may be, in one embodiment, a pause mode in which the operation of the robot 200 is stopped when the excitement information is very high (e.g., when the trainee 300 acts suddenly). The modes are not limited to these; various training modes may be determined from the emotion information (pleasure information and excitement information) or the intimacy information.
  • The training mode determiner 130 may specify the training mode regions based on the emotion information (pleasure information and excitement information) or the intimacy information. In FIG. 4A, z is fixed at 0.5.
  • The six emotions of happiness, surprise, anger, fear, sadness, and disgust can be plotted on the graph, and a first training mode region (gray area), in which the trainee is trained, can be designated where the pleasure information is considerable and the excitement information is low.
  • A third training mode region (dotted area), in which the trainee is not trained, can likewise be designated.
  • Areas other than the first and third training mode regions may be designated as the second training mode (hatched area), which encourages the trainee toward training, changing the trainee's state from negative to positive.
  • FIGS. 4B to 4D are diagrams illustrating the change in the training mode according to the change in z. The higher the intimacy, the wider the zone in which the trainee can be trained in the same emotional state; that is, the higher the intimacy, the higher the emotional level the trainee can tolerate.
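  • The region logic of FIGS. 4A to 4D can be sketched as a simple threshold rule. The numeric thresholds below are illustrative assumptions (the patent defines the regions graphically); the only structural facts taken from the text are that considerable pleasure with low excitement selects the first mode, very high excitement selects the third (pause) mode, the second mode covers the remainder, and the first-mode region widens with intimacy z.

```python
def decide_mode(pleasure, arousal, intimacy):
    """Return 1 (train), 2 (encourage), or 3 (pause) from normalized
    values in [-1, +1]. Thresholds are illustrative assumptions."""
    if arousal > 0.9:                       # sudden, highly excited behavior
        return 3                            # pause mode: do not train
    margin = 0.3 * intimacy                 # higher intimacy -> wider training zone
    if pleasure > (0.4 - margin) and arousal < (0.5 + margin):
        return 1                            # train
    return 2                                # encourage toward training
```

Note how the same emotional state (pleasure 0.3, arousal 0.2) falls in the second mode at low intimacy but in the first mode once intimacy is high, mirroring FIGS. 4B to 4D.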
  • the robot controller 140 controls the robot 200 according to the determined training mode.
  • the operation pattern of the robot 200 is defined according to the training mode, and the robot 200 is controlled by transmitting the defined operation pattern to the robot 200.
  • The speed or range of motion of the robot 200 executing the training mode may be changed by other conditions. For example, when the excitement information is high, the trainee 300 can react more quickly, so the training of the trainee 300 can be accelerated by increasing the speed of the robot 200.
  • When the intimacy information is high, the operating range of the robot 200 is expanded so that it can operate in a large space, increasing the efficiency of the trainee 300's social training. When the intimacy information is low, the operating range of the robot 200 may be reduced, limiting the trainee 300's social training to a specific space.
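  • A minimal sketch of the robot controller's two mappings: speed grows with the excitement information and behavior range grows with the intimacy information. The units, limits, linear form, and parameter names are all illustrative assumptions.

```python
def robot_motion_params(arousal, intimacy,
                        base_speed=0.2, max_speed=1.0,
                        base_range=0.5, max_range=2.0):
    """Map normalized arousal and intimacy (each in [-1, +1]) to a robot
    speed (m/s) and behavior range (m). Values are illustrative only."""
    a = (arousal + 1) / 2                   # rescale [-1, +1] -> [0, 1]
    i = (intimacy + 1) / 2
    speed = base_speed + a * (max_speed - base_speed)
    reach = base_range + i * (max_range - base_range)
    return speed, reach
```

A calm, unfamiliar trainee thus gets a slow robot confined to a small space, while an excited, intimate trainee gets a faster robot with the full operating range.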
  • FIG. 5 is a flowchart illustrating a sociality training method according to an embodiment of the present invention.
  • input information is received from the trainee 300 (S401).
  • The behavior information of the trainee 300 may include the trainee's EEG, voice information, image information, and contact information.
  • the state of the trainee 300 is analyzed through the received input information (S402).
  • In the state analysis, the behavior information of the trainee 300 may be classified as excitement information, pleasure information, or intimacy information to analyze the trainee's state.
  • the training mode is determined according to the analyzed state of the trainee 300 (S403).
  • The determination of the training mode may be made based on the emotion information (pleasure information and excitement information) or the intimacy information.
  • the training mode may be the first training mode, the second training mode, and the third training mode described above.
  • the speed and the behavior range of the robot 200 in the training mode are determined (S404).
  • When the excitement information is high, the speed of the robot 200 is increased, and when the intimacy information is high, the behavior range may be extended.
  • the robot 200 is controlled by integrating these conditions (S405).
  • The robot 200 may be controlled by transmitting a control signal to it. Through this, the sociality of the trainee 300 can be trained and improved.
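  • The flow of FIG. 5 (S401 to S405) can be summarized end to end. The behavior values are assumed to be already normalized to [−1, +1] as described earlier; the thresholds and output units are illustrative assumptions, not the patent's values.

```python
def training_step(behavior):
    """One cycle of FIG. 5: S401 receive input, S402 analyze state,
    S403 decide mode, S404 set speed/range, S405 emit the control command.
    `behavior` maps 'pleasure', 'arousal', 'intimacy' to values in [-1, +1]."""
    p, a, z = behavior["pleasure"], behavior["arousal"], behavior["intimacy"]  # S401/S402
    if a > 0.9:
        mode = 3                            # S403: pause mode
    elif p > 0.4 - 0.3 * z and a < 0.5 + 0.3 * z:
        mode = 1                            # S403: training mode
    else:
        mode = 2                            # S403: encouragement mode
    speed = 0.2 + 0.8 * (a + 1) / 2         # S404: faster when excited
    reach = 0.5 + 1.5 * (z + 1) / 2         # S404: wider when intimate
    return {"mode": mode, "speed": speed, "range": reach}  # S405: sent to robot

# Example: a pleased, calm, moderately intimate trainee is trained (mode 1).
command = training_step({"pleasure": 0.8, "arousal": 0.2, "intimacy": 0.5})
```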

Abstract

Disclosed is an apparatus for training sociability and a method therefor. The apparatus for training sociability includes: an input part for receiving the behavioral information of a trainee; a trainee state analysis part for determining the state of the trainee by analyzing and classifying the behavioral information into one of excitement information, pleasure information and intimacy information; a training mode decision part for determining a training mode according to the analyzed state of the trainee; and a robot control part for controlling a robot according to the training mode determined.

Description

사회성 훈련 장치 및 그 방법Social training device and method
실시예는 사회성 훈련 장치 및 그 방법에 관한 것으로, 보다 구체적으로 자폐아동의 사회성을 훈련하는 장치 및 그 방법에 관한 것이다. The embodiment relates to a social training apparatus and a method thereof, and more particularly, to an apparatus and a method for training the sociality of autistic children.
자폐아동은 사회성과 의사소통을 위한 기본 기능이라 말할 수 있는 Self-initiated interactions, Turn-taking, Imitation, Emotion Recognition, Joint Attention 그리고 Eye Contact의 결핍이라는 특징을 가지고 있다. 즉, 인지발달이 미숙한 상태로 사회성이 결여되어있다. 이러한 특징을 가진 자폐아동은 성인이 되어서도 사회활동에 어렵다. 적절한 시기에 적절한 치료가 이루어 지지 않는다면, 사람들과의 사회적 상호작용이 불가능하게 된다. 최근에는 대학병원, 연구기관을 중심으로 자폐증의 조기발견과 행동학적 치료가 활발히 진행되고 있으며, 비교적 사람의 감정표현보다 복잡하지 않는 로봇(장난감)을 이용하여 치료가 이루어지고 있다. 근래의 연구동향 또한, 로봇이 자폐아동의 주의집중을 이끌어 낼 수 있으며 또한 그 집중도의 향상되었다는 결과들이 도출되고 있다.Autistic children are characterized by the lack of self-initiated interactions, turn-taking, imitation, emotion recognition, joint attention, and eye contact, which are basic functions for social and communication. In other words, cognitive development is immature and lacks sociality. Autistic children with these characteristics are difficult to socialize as adults. Without proper treatment at the right time, social interaction with people becomes impossible. In recent years, early detection and behavioral treatment of autism has been actively conducted in university hospitals and research institutes, and treatment has been carried out using robots (toys) that are not more complicated than human emotions. Recent research trends also suggest that robots can draw attention to children with autism and that their concentration has improved.
등록특허 10-1061171는 놀이치료를 위한 로봇 및 로봇 제어 시스템을 개시한다. 놀이치료를 위한 시각적 및 청각적 및 촉각적 및 행동적 자극을 출력하는 로봇; 조작자에 의해 상기 로봇이 출력하는 자극에 따른 아동의 반응을 반응정보로서 입력받아 외부로 전송함과 아울러, 상기 조작자의 요청에 따른 자극을 출력하도록 하는 제어명령을 상기 로봇으로 전송하는 이동 단말기; 상기 이동 단말기가 전송하는 반응정보를 입력받아 상기 로봇이 출력하는 자극을 선호 또는 비선호 또는 무반응 하는지를 판단하고, 상기 자극의 선호 또는 비선호 또는 무반응 판단에 따라 다음 자극을 선정하고, 선정된 다음 자극을 출력하도록 하는 제어명령을 상기 로봇으로 전송하는 서버를 구비하는 것을 특징으로 하는 놀이치료를 위한 로봇 제어 시스템을 개시하고 있다. 그러나 아동의 행동을 선호, 비선호, 무반응의 일차원적인 행동만을 분석할 수 있을 뿐으로, 아동의 행동을 다각도로 분석하여 그에 따라 로봇을 제어하여 아동의 사회성을 훈련시킬 수는 없었다.Patent 10-1061171 discloses a robot and a robot control system for play therapy. A robot that outputs visual, auditory and tactile and behavioral stimuli for play therapy; A mobile terminal which receives a child's response according to the stimulus output by the operator as response information and transmits it to the outside, and transmits a control command to the robot to output a stimulus according to the request of the operator; Receives response information transmitted from the mobile terminal and determines whether the robot outputs a stimulus, whether it is preferred, unfavorable, or unresponsive, and selects the next stimulus according to the preference, non-preferred or non-response of the stimulus, and then selects the next stimulus Disclosed is a robot control system for play therapy, characterized in that it comprises a server for transmitting a control command to the robot. However, it was only possible to analyze the one-dimensional behaviors of the children's behaviour, preference, non-response and non-response, and the children's behavior could not be trained by analyzing the children's behavior from various angles.
본 발명의 일 측면에 따르면, 자폐 아동의 행동을 다각도로 분석할 수 있다. According to an aspect of the present invention, the behavior of autistic children can be analyzed in multiple angles.
본 발명의 일 측면에 따르면, 자폐 아동의 행동에 따라 로봇을 제어함으로써 능동적이고 효율적으로 사회성을 훈련시킬 수 있다. According to an aspect of the present invention, by controlling the robot in accordance with the behavior of the child with autism, it is possible to train sociality actively and efficiently.
본 발명의 일 측면에 따르면, 피훈련자의 행동정보를 입력받는 입력부; 상기 입력받은 피훈련자의 행동정보를 분석하여 상기 피훈련자의 행동정보를 흥분정보, 즐거움정보, 친밀도정보 중 어느 하나로 분류하여 상기 피훈련자의 상태를 분석하는 피훈련자 상태 분석부; 상기 피훈련자의 분석 상태에 따라 훈련 모드를 결정하는 훈련 모드 결정부; 상기 결정된 훈련모드에 따라 로봇을 제어하는 로봇 제어부를 포함하는 사회성 훈련 장치가 제공된다.According to an aspect of the invention, the input unit for receiving the behavior information of the trainee; A trainee status analysis unit for analyzing the trainee's behavior information and classifying the trainee's behavior information into any of excitement information, pleasure information, and intimacy information to analyze the trainee's status; A training mode determiner configured to determine a training mode according to an analysis state of the trainee; There is provided a social training apparatus including a robot controller for controlling a robot according to the determined training mode.
본 발명의 다른 측면에 따르면, 상기 입력부는, 상기 피훈련자의 EEG, 음성정보, 영상정보, 접촉정보 중 적어도 어느 하나를 입력받는, 사회성 훈련 장치가 제공된다.According to another aspect of the invention, the input unit, there is provided a social training apparatus that receives at least one of the EEG, voice information, image information, contact information of the trainee.
본 발명의 다른 측면에 따르면, 상기 피훈련자 상태 분석부는, 상기 즐거움정보와 흥분정보에 기반하여 피훈련자의 감정정보를 분석하는, 사회성 훈련 장치가 제공된다.According to another aspect of the present invention, the trainee state analysis unit is provided, the sociality training device for analyzing the emotion information of the trainees based on the enjoyment information and excitement information.
본 발명의 다른 측면에 따르면, 상기 감정정보는 행복, 놀램, 불안, 분노, 슬픔, 만족의 여섯가지 정보를 포함하는, 사회성 훈련 장치가 제공된다.According to another aspect of the invention, the emotional information is provided with a social training apparatus, including six information of happiness, surprise, anxiety, anger, sadness, satisfaction.
본 발명의 다른 측면에 따르면, 상기 훈련 모드 결정부는, 상기 감정정보 또는 친밀도정보에 기반하여 상기 피훈련자의 훈련모드를 결정하는, 사회성 훈련 장치가 제공된다.According to another aspect of the present invention, the training mode determination unit is provided, the sociality training device for determining the training mode of the trainees based on the emotion information or intimacy information.
본 발명의 다른 측면에 따르면, 상기 피훈련자의 훈련모드는 피훈련자를 훈련시키는 제1훈련모드, 피훈련자에게 훈련을 시킬 수 있도록 독려하는 제2훈련모드, 피훈련자를 훈련시키지 않는 제3훈련모드를 포함하는, 사회성 훈련 장치가 제공된다.According to another aspect of the invention, the training mode of the trainee is a first training mode to train the trainees, a second training mode to encourage the trainees to train, a third training mode not training the trainees Including a social training device is provided.
본 발명의 다른 측면에 따르면, 상기 훈련 모드 결정부는, 상기 친밀도정보에 비례하여 상기 제1훈련모드의 범위를 확장시키는, 사회성 훈련 장치가 제공된다.According to another aspect of the present invention, the training mode determination unit is provided with a social training apparatus, which extends the range of the first training mode in proportion to the intimacy information.
본 발명의 다른 측면에 따르면, 상기 로봇 제어부는, 상기 훈련모드에 따른 로봇의 속도를 상기 흥분정보에 기반하여 결정하는, 사회성 훈련 장치가 제공된다.According to another aspect of the present invention, the robot control unit is provided, the social training apparatus for determining the speed of the robot according to the training mode based on the excitation information.
본 발명의 다른 측면에 따르면, 상기 로봇 제어부는, 상기 훈련모드에 따른 로봇의 행동범위를 상기 친밀도정보에 기반하여 결정하는, 사회성 훈련 장치가 제공된다.According to another aspect of the present invention, the robot control unit is provided, the social training apparatus for determining the behavior range of the robot according to the training mode based on the familiarity information.
본 발명의 다른 측면에 따르면, 피훈련자의 행동정보를 입력받는 단계; 상기 입력받은 피훈련자의 행동정보를 분석하여 상기 피훈련자의 행동정보를 흥분정보, 즐거움정보, 친밀도정보 중 어느 하나로 분류하여 상기 피훈련자의 상태를 분석하는 단계; 및 상기 피훈련자의 분석 상태에 따라 훈련 모드를 결정하는 단계; 상기 결정된 훈련 모드에 따라 로봇을 제어하는 단계를 포함하는 사회성 훈련 방법이 제공된다.According to another aspect of the invention, the step of receiving the behavior information of the trainee; Analyzing the trainee's behavior information to classify the trainee's behavior information into any of excitement information, pleasure information, and intimacy information to analyze the trainee's state; And determining a training mode according to an analysis state of the trainee. There is provided a social training method comprising the step of controlling the robot according to the determined training mode.
본 발명의 다른 측면에 따르면, 상기 피훈련자의 행동정보를 입력받는 단계는, 상기 피훈련자의 EEG, 음성정보, 영상정보, 접촉정보 중 적어도 어느 하나를 입력받는 단계를 포함하는, 사회성 훈련 방법이 제공된다.According to another aspect of the invention, the step of receiving the training information of the trainee, the social training method comprising the step of receiving at least one of the EEG, voice information, image information, contact information of the trainee Is provided.
본 발명의 다른 측면에 따르면, 상기 피훈련자의 상태를 분석하는 단계는 상기 즐거움정보와 흥분정보에 기반하여 피훈련자의 감정정보를 분석하는 단계를 더 포함하는, 사회성 훈련 방법이 제공된다.According to another aspect of the present invention, analyzing the trainee's state further includes analyzing the trainee's emotion information based on the enjoyment information and the excitement information.
본 발명의 다른 측면에 따르면, 상기 감정정보는 행복, 놀램, 분노, 공포, 슬픔, 혐오의 여섯 가지 정보를 포함하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein the emotion information comprises six categories: happiness, surprise, anger, fear, sadness, and disgust.
본 발명의 다른 측면에 따르면, 상기 훈련 모드를 결정하는 단계는, 상기 감정정보 또는 친밀도정보에 기반하여 상기 피훈련자의 훈련모드를 결정하는 단계를 포함하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein determining the training mode comprises determining the training mode of the trainee based on the emotion information or the intimacy information.
본 발명의 다른 측면에 따르면, 상기 피훈련자의 훈련모드는 피훈련자를 훈련시키는 제1훈련모드, 피훈련자에게 훈련을 시킬 수 있도록 독려하는 제2훈련모드, 피훈련자를 훈련시키지 않는 제3훈련모드를 포함하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein the training modes of the trainee comprise a first training mode in which the trainee is trained, a second training mode in which the trainee is encouraged so that training becomes possible, and a third training mode in which the trainee is not trained.
본 발명의 다른 측면에 따르면, 상기 훈련 모드를 결정하는 단계는, 상기 친밀도정보에 비례하여 상기 제1훈련모드의 범위를 확장시키는 단계를 더 포함하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein determining the training mode further comprises expanding the range of the first training mode in proportion to the intimacy information.
본 발명의 다른 측면에 따르면, 상기 로봇을 제어하는 단계는, 상기 훈련모드에 따른 로봇의 속도를 상기 흥분정보에 기반하여 결정하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein controlling the robot comprises determining the speed of the robot in the training mode based on the arousal information.
본 발명의 다른 측면에 따르면, 상기 로봇을 제어하는 단계는, 상기 훈련모드에 따른 로봇의 행동범위를 상기 친밀도정보에 기반하여 결정하는, 사회성 훈련 방법이 제공된다. According to another aspect of the present invention, there is provided a sociability training method wherein controlling the robot comprises determining the range of motion of the robot in the training mode based on the intimacy information.
본 발명의 일 측면에 따르면, 자폐 아동과 로봇과의 친밀도 및 다양한 감정들을 분석하게 되는 효과가 있다. According to an aspect of the present invention, the intimacy between an autistic child and a robot, as well as the child's various emotions, can be analyzed.
본 발명의 일 측면에 따르면, 자폐 아동의 행동에 따라 능동적으로 로봇을 제어함으로써 종래 기술에 비하여 효율적으로 자폐 아동의 사회성을 훈련시킬 수 있는 효과가 있다. According to an aspect of the present invention, by actively controlling the robot according to the behavior of the autistic child, the sociability of the autistic child can be trained more efficiently than in the prior art.
도 1은 일 실시예에 따라, 사회성 훈련 장치(100)와 로봇(200) 및 자폐 아동(300)과의 관계를 나타낸 도면이다. FIG. 1 is a diagram illustrating the relationship among a sociability training apparatus 100, a robot 200, and an autistic child 300, according to an embodiment.
도 2는 일 실시예에 따라, 사회성 훈련 장치(100)의 내부 구성을 나타낸 도면이다. FIG. 2 is a diagram illustrating the internal configuration of the sociability training apparatus 100, according to an embodiment.
도 3a 및 도 3b는 일 실시예에 따라, 피훈련자 상태 분석부(120)가 피훈련자의 상태를 분석하는 3차원 그래프를 나타낸 도면이다. FIGS. 3A and 3B are diagrams illustrating a three-dimensional graph in which a trainee state analysis unit 120 analyzes the state of a trainee, according to an embodiment.
도 4a는 일 실시예에 따라, z가 일정할 때, 즐거움정보와 흥분정보에 따른 감정상태 및 훈련모드를 나타낸 도면이다. FIG. 4A is a diagram illustrating emotional states and training modes according to pleasure information and arousal information when z is constant, according to an embodiment.
도 4b 내지 도 4d는 일 실시예에 따라, z의 변화에 따른 훈련모드의 변화를 나타낸 도면이다. FIGS. 4B to 4D are diagrams illustrating changes in the training modes according to changes in z, according to an embodiment.
도 5는 일 실시예에 따른, 사회성 훈련 방법을 순서도로 나타낸 도면이다. FIG. 5 is a flowchart illustrating a sociability training method according to an embodiment.
후술하는 본 발명에 대한 상세한 설명은, 본 발명이 실시될 수 있는 특정 실시예를 예시로서 도시하는 첨부 도면을 참조한다. 이들 실시예는 당업자가 본 발명을 실시할 수 있기에 충분하도록 상세히 설명된다. 본 발명의 다양한 실시예는 서로 다르지만 상호 배타적일 필요는 없음이 이해되어야 한다. 예를 들어, 여기에 기재되어 있는 특정 형상, 구조 및 특성은 일 실시예에 관련하여 본 발명의 정신 및 범위를 벗어나지 않으면서 다른 실시예로 구현될 수 있다. 또한, 각각의 개시된 실시예 내의 개별 구성요소의 위치 또는 배치는 본 발명의 정신 및 범위를 벗어나지 않으면서 변경될 수 있음이 이해되어야 한다. 따라서, 후술하는 상세한 설명은 한정적인 의미로서 취하려는 것이 아니며, 본 발명의 범위는, 적절하게 설명된다면, 그 청구항들이 주장하는 것과 균등한 모든 범위와 더불어 첨부된 청구항에 의해서만 한정된다. 도면에서 유사한 참조부호는 여러 측면에 걸쳐서 동일하거나 유사한 기능을 지칭한다.DETAILED DESCRIPTION The following detailed description of the invention refers to the accompanying drawings that show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different but need not be mutually exclusive. For example, certain shapes, structures, and characteristics described herein may be embodied in other embodiments without departing from the spirit and scope of the invention with respect to one embodiment. In addition, it is to be understood that the location or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the invention. The following detailed description, therefore, is not to be taken in a limiting sense, and the scope of the present invention, if properly described, is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled. Like reference numerals in the drawings refer to the same or similar functions throughout the several aspects.
도 1은 본 발명의 일 실시예에 따른, 사회성 훈련 장치(100)와 로봇(200) 및 피훈련자(300)와의 관계를 나타낸 도면이다. 사회성 훈련 장치(100)는 피훈련자(300)의 다양한 행동들, 예를 들어 위치, 음성, 뇌파, 로봇(200)을 만지는 행위 등의 다양한 행동들을 분석하여서 이에 맞게 로봇(200)을 제어하는 신호를 로봇(200)에 전송한다. 로봇(200)은 전송받은 제어신호에 기반하여 동작하며, 이러한 로봇(200)의 동작이 피훈련자(300)의 사회성을 향상시킨다. FIG. 1 is a diagram illustrating the relationship among a sociability training apparatus 100, a robot 200, and a trainee 300 according to an embodiment of the present invention. The sociability training apparatus 100 analyzes various behaviors of the trainee 300, for example, position, voice, brain waves, and touching of the robot 200, and transmits to the robot 200 a signal that controls the robot 200 accordingly. The robot 200 operates based on the received control signal, and this operation of the robot 200 improves the sociability of the trainee 300.
사회성 훈련 장치(100)는 피훈련자(300)의 행동정보를 입력받아, 피훈련자(300)의 상태를 분석하고, 이에 의거하여 훈련 모드를 결정한다. 결정된 훈련 모드에 따라 로봇(200)을 제어함으로써 피훈련자(300)를 훈련시킨다. 사회성 훈련 장치(100)에 관련된 상세한 사항은 후술한다. The sociability training apparatus 100 receives the behavior information of the trainee 300, analyzes the state of the trainee 300, and determines a training mode accordingly. It trains the trainee 300 by controlling the robot 200 according to the determined training mode. Details of the sociability training apparatus 100 are described later.
로봇(200)은 사회성 훈련 장치(100)의 제어를 받아 다양한 행동을 하는 역할을 한다. 일 실시예에서 로봇(200)은 인간과 같은 외관과 형태를 가지고 두 발로 걸을 수 있으며 두 팔이 달려 있어 손으로 뭔가를 조작할 수 있는 기능을 갖는다. 즉 로봇(200)은 휴머노이드 로봇일 수 있으며 복수의 관절을 가지고 있을 수 있다. 휴머노이드 로봇은 각 관절을 조절하는 것에 의해 팔 다리 등의 부위의 자세를 바꿀 수도 있다. 또한 걸음을 걷거나 물건을 던지거나 공을 차는 등의 동작을 할 수도 있다. 다른 실시예에서 로봇(200)은 로봇 구성요소의 일부만 휴머노이드 구성을 할 수도 있다. 예를 들어 상체 부위는 인간의 외관과 형태이나, 하체 부위는 바퀴를 연결하여 이동의 편이성을 용이하게 할 수도 있다. The robot 200 serves to perform various actions under the control of the sociability training apparatus 100. In one embodiment, the robot 200 has a human-like appearance and shape, can walk on two feet, and has two arms so that it can manipulate objects with its hands; that is, the robot 200 may be a humanoid robot and may have a plurality of joints. A humanoid robot can change the posture of its arms, legs, and other parts by adjusting each joint, and can also walk, throw objects, kick a ball, and so on. In another embodiment, only part of the robot 200 may have a humanoid configuration: for example, the upper body may have a human appearance and shape while the lower body is fitted with wheels to ease movement.
피훈련자(300)는 로봇(200)이 동작을 표현하는 대상이 된다. 피훈련자(300)는 사회성에 장애가 있는 사람일 수 있으며, 사회성의 향상을 원하는 사람일 수도 있다. 예를 들어, 사회성과 의사소통 능력이 부족한 자폐아동일 수 있다. 그러나 특별히 피훈련자(300)에 제한을 두지는 아니한다. The trainee 300 is the person toward whom the robot 200 expresses its actions. The trainee 300 may be a person with a social impairment, or a person who wishes to improve his or her sociability, for example an autistic child lacking social and communication skills. However, the trainee 300 is not particularly limited.
도 2는 본 발명의 일 실시예에 따른, 사회성 훈련 장치(100)의 내부 구성을 나타낸 도면이다. 사회성 훈련 장치(100)는 입력부(110), 피훈련자 상태 분석부(120), 훈련모드 결정부(130), 로봇 제어부(140)를 포함할 수 있다. FIG. 2 is a diagram illustrating the internal configuration of the sociability training apparatus 100 according to an embodiment of the present invention. The sociability training apparatus 100 may include an input unit 110, a trainee state analysis unit 120, a training mode determiner 130, and a robot controller 140.
입력부(110)는 피훈련자(300)의 행동정보를 입력 받는 역할을 한다. 일 실시예에서, 피훈련자(300)의 행동정보는 피훈련자(300)의 EEG, 음성정보, 영상정보, 접촉정보일 수 있다. EEG는 뇌전도(electroencephalogram)이며 사람 또는 동물의 대뇌에 일어나는 전위변동, 또는 그것에 의하여 일어나는 뇌전류(brain current)를 두피(頭皮) 상에서 유도하여 기록한 전기기록도를 일컫는다. R. Caton(1875)이 토끼, 원숭이 등의 대뇌피질 전위변동을 관찰하면서 시작되어 H. Berger(1929)가 생물체의 두피상 유도를 성공하면서 활발한 연구가 진행되었다. 이것에 대한 뇌파를 베르거리듬이라고 부르게 되었다. 두피 상에서 유도되는 것은 수십㎶ 정도이며 이것을 증폭시켜 기록한다. 그 파형은 잡음과 같이 불규칙한 것이며 10Hz 전후의 약간 규칙적인 성분, 즉 α파(진폭이 크고 느린 파장)에 13~25Hz의 불규칙한 진폭이 작은 성분인 β파가 섞여 있다. 감각자극 또는 정신활동, 예를 들면 암산할 때에는 α파가 소실된다. 수면, 마취시 등에는 파의 변동이 크고 완만하여 δ파라고 한다. 또한 수면시나 간질(癲澗) 등에는 각각 특이한 파형이 나타난다. 어떠한 메커니즘으로 이러한 파가 여러 형을 나타내는지에 관해서는 여러 주장이 있으나 일치하지 않는다. 그러나 일 실시예에서, 행복, 놀램, 분노, 공포, 슬픔, 혐오의 6가지의 경우 각각의 주파수 특성을 나타내는 EEG를 나타내며, 주파수 특성을 통하여 피훈련자(300)의 감정 상태(즐거움정보 및 흥분정보)를 6가지로 구분할 수 있다. 또한 피훈련자(300)가 특정 행동을 하는 경우에도 고유의 EEG가 발생할 수 있다. 예를 들어, 로봇(200)에 피훈련자(300)가 다가가는 경우와 도망가는 경우는 로봇(200)과의 친밀도에서의 차이가 긍정적인 경우와 부정적인 경우이며, EEG에서 표현될 수 있다. 또한, 주파수 영역을 통하여 다양하게 표현될 수도 있다. The input unit 110 serves to receive the behavior information of the trainee 300. In one embodiment, the behavior information of the trainee 300 may be the trainee's EEG, voice information, image information, or contact information. EEG stands for electroencephalogram: a record, taken on the scalp, of the potential fluctuations occurring in the cerebrum of a human or animal, or of the brain currents caused by them. The field began with R. Caton (1875) observing cortical potential fluctuations in rabbits and monkeys, and active research followed once H. Berger (1929) succeeded in recording from the scalp of a living subject; the brain waves he recorded came to be called the Berger rhythm. The signal induced on the scalp is on the order of tens of microvolts and is amplified before being recorded. The waveform is irregular, like noise: a somewhat regular component around 10 Hz, the α wave (large amplitude, slow), is mixed with the β wave, an irregular small-amplitude component at 13 to 25 Hz. During sensory stimulation or mental activity such as mental arithmetic, the α wave disappears. During sleep or anesthesia the fluctuation becomes large and slow and is called the δ wave, and distinctive waveforms also appear during sleep, epilepsy, and so on. Several competing, unreconciled theories exist as to the mechanism by which these waves take their various forms. In one embodiment, however, each of the six cases of happiness, surprise, anger, fear, sadness, and disgust exhibits its own EEG frequency characteristics, and through these frequency characteristics the emotional state of the trainee 300 (pleasure information and arousal information) can be classified into the six categories. A distinctive EEG can also arise when the trainee 300 performs a specific action: for example, the trainee 300 approaching the robot 200 versus running away from it corresponds to a positive versus a negative difference in intimacy with the robot 200, which can be expressed in the EEG and represented in various ways in the frequency domain.
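The α/β/δ band components described above can be separated numerically by band power. A minimal sketch using NumPy, with an assumed sampling rate and a synthetic signal; the band edges used here are the conventional ones, not values specified in the publication:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power of `signal` within [f_lo, f_hi) Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return spectrum[mask].mean()

fs = 256                          # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic "relaxed" EEG: a dominant 10 Hz alpha component (tens of
# microvolts) plus a weak 20 Hz beta component.
eeg = 50e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(eeg, fs, 8, 13)   # large-amplitude ~10 Hz component
beta = band_power(eeg, fs, 13, 25)   # small irregular 13-25 Hz component
print(alpha > beta)  # True: alpha dominates in this synthetic signal
```

A real emotion classifier would compare such band features against per-emotion templates learned from experiments, as the text describes.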
음성 정보는 피훈련자(300)가 발생시키는 음파에 관련된 정보이다. 일 실시예에서, EEG와 마찬가지로 음성정보 또한 실험을 통하여 행복, 놀램, 분노, 공포, 슬픔, 혐오의 6가지의 경우 각각의 주파수 특성을 나타내며, 주파수 특성을 통하여 피훈련자(300)의 감정 상태를 6가지로 구분할 수 있다. The voice information is information related to the sound waves produced by the trainee 300. In one embodiment, as with the EEG, experiments show that the voice information also exhibits distinct frequency characteristics for each of the six cases of happiness, surprise, anger, fear, sadness, and disgust, and through these frequency characteristics the emotional state of the trainee 300 can be classified into the six categories.
입력부(110)는 실시예에 따라, 피훈련자(300)의 전술한 정보들을 직접 획득할 수도 있고, 간접적으로 획득할 수 있다. 직접 획득하는 경우는 사회성 훈련 장치(100) 내부에 각종 정보를 수집할 수 있는 센서가 포함된 경우이다. 일 실시예에서, 입력부는 영상 정보를 획득하기 위하여 깊이 측정 센서(Depth Sensor)를 직접 이용할 수 있다. 깊이 측정 센서는 CMOS 센서와 결합된 적외선 레이저 프로젝터(Infrared laser projector)로 구성될 수 있으며 하나의 카메라에서 무수한 적외선 빔을 쏘고 반사되는 적외선을 통하여 어떠한 밝기 조건에서도 3차원을 감지할 수 있다. 가로와 세로 방향뿐만 아니라 센서와 멀고 가까운 거리까지 감지해 온몸이 어디 있는지 어떻게 행동하는지 파악할 수 있다. 적외선 카메라로 깊이 영상을 얻어내는 방법은 Time of Flight 방식이 일반적이나, 패턴(Structured light)을 대상물에 투영하고 스테레오 매칭(Stereo matching)을 통해 깊이를 계산할 수도 있다. 다른 실시예에서 Stream sequence에서 Blob labeling 기법을 이용하여 거리 단차가 발생하는 동적 물체에 대하여 피훈련자(300)를 검출하며, 공간상의 위치를 추정하고 각 동적 물체에 대하여 id를 부여할 수도 있다. 다른 실시예에서 이러한 깊이 측정 센서는 로봇(200) 내부에 포함되어 로봇(200)이 정보를 취득하고, 입력부(110)로 전송하게 할 수도 있다. 또 다른 실시예에서, 피훈련자(300)가 내는 소리를 수집하는 음성 센서를 직접 또는 간접적으로 사용할 수도 있다. 또 다른 실시예에서, 피훈련자(300)가 로봇(200)을 접촉하는 내용을 입력하기 위한 터치 센서를 로봇(200)을 통해서 사용할 수도 있을 것이다. 터치 센서는 싱글(single) 터치뿐 아니라 멀티(multi) 터치의 감지도 가능하다. 터치 입력의 형식은 터치한 점의 위치나 새로운 점, 이동한 점, 놓은(release) 점 등의 상태와 같은 것이거나, 탭(tab), 더블 탭(double tab), 패닝(panning), 플리킹(Flicking), 드래그 앤드 드랍(drag and drop), 핀칭(Pinching) 및 스트레칭(Stretching) 등의 터치를 통한 몇 가지 제스쳐와 같은 것일 수 있다. 또한 터치의 면적과 터치의 시간 등을 측정할 수 있음은 물론이다. Depending on the embodiment, the input unit 110 may obtain the aforementioned information about the trainee 300 either directly or indirectly. Direct acquisition refers to the case where sensors capable of collecting the various kinds of information are included inside the sociability training apparatus 100. In one embodiment, the input unit may directly use a depth sensor to obtain image information. The depth sensor may consist of an infrared laser projector combined with a CMOS sensor; by emitting countless infrared beams from a single camera and measuring the reflected infrared light, it can sense in three dimensions under any lighting conditions. It senses not only the horizontal and vertical directions but also the distance from the sensor, so it can determine where the whole body is and how it is behaving. A common way to obtain a depth image with an infrared camera is the time-of-flight method, but depth can also be computed by projecting structured light onto the object and performing stereo matching. In another embodiment, the trainee 300 may be detected as a dynamic object exhibiting a depth discontinuity by applying a blob-labeling technique to the stream sequence; the position in space may then be estimated and an id assigned to each dynamic object. In another embodiment, such a depth sensor may be included inside the robot 200, so that the robot 200 acquires the information and transmits it to the input unit 110. In yet another embodiment, a voice sensor that collects the sounds made by the trainee 300 may be used directly or indirectly. In yet another embodiment, a touch sensor may be used through the robot 200 to capture how the trainee 300 touches the robot 200. The touch sensor can detect not only single touches but also multi-touch. The touch input may take the form of the position and state of a touched point (new, moved, released, and so on), or of gestures such as tap, double tap, panning, flicking, drag and drop, pinching, and stretching. The touch area and touch duration can of course also be measured.
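The blob-labeling step mentioned for detecting dynamic objects in the stream sequence can be sketched as 4-connected component labeling over a binary mask of pixels whose depth changed between frames; the mask below and its contents are purely illustrative:

```python
from collections import deque

def label_blobs(mask):
    """4-connected component labeling of a binary mask; returns a count and
    a grid of blob ids (0 = background), one id per detected dynamic object."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_id = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                next_id += 1                      # new dynamic object, new id
                queue = deque([(y, x)])
                labels[y][x] = next_id
                while queue:                      # breadth-first flood fill
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_id
                            queue.append((ny, nx))
    return next_id, labels

# Pixels where the depth changed between frames (1 = moving surface).
mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
count, labels = label_blobs(mask)
print(count)  # 2 separate moving objects
```

Each labeled blob would then be assigned an id and tracked over the stream, as the text describes.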
피훈련자 상태 분석부(120)는 입력받은 피훈련자(300)의 행동정보를 흥분정보(Arousal), 즐거움정보(Pleasure), 친밀도정보(Intimacy) 중 어느 하나로 분류하여 상기 피훈련자(300)의 상태를 분석하는 역할을 한다. 일 실시예에서, 피훈련자 상태 분석부(120)는 피훈련자(300)의 행동정보를 정규화하여 -1 내지 +1의 흥분정보(Arousal), 즐거움정보(Pleasure), 친밀도정보(Intimacy) 값을 취하도록 변환시킬 수 있다. The trainee state analysis unit 120 serves to classify the received behavior information of the trainee 300 into one of arousal information, pleasure information, and intimacy information, and thereby to analyze the state of the trainee 300. In one embodiment, the trainee state analysis unit 120 may normalize the behavior information of the trainee 300 so that the arousal, pleasure, and intimacy values each lie between -1 and +1.
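The normalization step described above can be sketched as a linear rescaling into [-1, +1]; the raw feature choices and their ranges below are illustrative assumptions, not values from the publication:

```python
def normalize(value, lo, hi):
    """Linearly map a raw sensor reading from [lo, hi] to [-1.0, +1.0]."""
    if hi <= lo:
        raise ValueError("invalid range")
    clamped = max(lo, min(hi, value))  # clip out-of-range readings
    return 2.0 * (clamped - lo) / (hi - lo) - 1.0

# Hypothetical raw readings and ranges (illustrative only).
arousal = normalize(0.9, 0.0, 3.0)        # e.g. hand speed in m/s
pleasure = normalize(120.0, 60.0, 180.0)  # e.g. a voice-pitch feature in Hz
intimacy = normalize(0.5, 0.0, 2.0)       # e.g. an inverse-distance feature
print(round(arousal, 3), round(pleasure, 3), round(intimacy, 3))  # -0.4 0.0 -0.5
```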
전술한 입력 정보의 경우 EEG의 파형, 음성정보의 파형, 영상정보(얼굴정보)가 즐거움정보로 계산될 수 있다. Among the aforementioned input information, the EEG waveform, the voice waveform, and the image information (face information) may be used to compute the pleasure information.
영상정보(속도정보)는 흥분정보로 계산될 수 있다. 구체적으로 영상정보(속도정보)는 피훈련자 몸의 움직임 속도, 손의 움직임 속도, 머리의 움직임 속도가 있을 수 있다. The image information (speed information) may be used to compute the arousal information; specifically, it may include the movement speed of the trainee's body, hands, and head.
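The speed features above can be sketched as frame-to-frame displacement over time from tracked positions; the positions and frame rate below are assumed values:

```python
import math

def movement_speed(p_prev, p_curr, dt):
    """Speed (distance per second) between two tracked 3-D positions."""
    return math.dist(p_prev, p_curr) / dt

# Hypothetical head positions from a depth sensor, sampled 1/30 s apart.
speed = movement_speed((0.0, 1.5, 2.0), (0.05, 1.5, 2.0), 1.0 / 30.0)
print(round(speed, 2))  # 1.5 m/s
```

The same computation could be applied per body part (body, hands, head) before normalizing the result into the arousal value.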
로봇(200)과 피훈련자(300)와의 거리정보, 자세정보, 얼굴정보, 접촉정보는 친밀도정보로 계산될 수 있다. 일 실시예에서, 피훈련자(300)의 즐거움정보와 흥분정보에 기반하여 피훈련자의 감정정보를 분석할 수 있다. 감정이 긍정적인지 긍정적이지 아닌지의 기준을 즐거움정보라는 기준으로 먼저 평가하고, 그 정도가 어느 정도인지를 흥분정보로써 파악하게 된다. 전술한 행복, 놀램, 분노, 공포, 슬픔, 혐오의 6가지 정보들을 각각의 즐거움정보와 흥분정보로써 수치를 할당할 수 있으며, 이에 따라 감정들은 각각의 수치화된 감정 정보로써 표현이 된다. 친밀도정보는 로봇(200)의 행위를 피훈련자(300)가 얼마나 무시하는지의 여부로 알 수 있다. 무시하는 정도가 적은 경우에는 친밀도가 높은 경우이고 그 반대의 경우는 친밀도가 낮다고 할 수 있다. 친밀도 역시 그 수준에 따라 수치를 할당할 수 있으며, 이에 따라 각각 수치화될 수 있다. 일 실시예에서, 즐거움정보와 흥분정보 그리고 친밀도정보와의 관계는 도 3a 내지 도 3b의 그래프를 통해서 설명한다. Distance information between the robot 200 and the trainee 300, posture information, face information, and contact information may be used to compute the intimacy information. In one embodiment, the emotion information of the trainee may be analyzed based on the pleasure information and arousal information of the trainee 300. Whether an emotion is positive or not is evaluated first by the criterion of pleasure information, and its intensity is then measured as arousal information. Numerical values can be assigned, as pleasure and arousal values, to each of the six kinds of information mentioned above (happiness, surprise, anger, fear, sadness, disgust), so the emotions are expressed as quantified emotion information. The intimacy information can be inferred from how much the trainee 300 ignores the actions of the robot 200: when the degree of ignoring is small, intimacy is high, and vice versa. Intimacy can likewise be assigned a value according to its level and thus be quantified. In one embodiment, the relationship among pleasure information, arousal information, and intimacy information is explained through the graphs of FIGS. 3A and 3B.
도 3a는 피훈련자 상태 분석부(120)가 피훈련자(300)의 상태를 분석하는 3차원 그래프를 나타낸 도면이다. 원통형 좌표계로 나타내고 r, θ, z의 인자를 가지며, 이를 흥분정보(Arousal), 즐거움정보(Pleasure), 친밀도정보(Intimacy)의 매개 변수로 취할 수 있고, 수학식 1과 같은 관계가 성립된다. FIG. 3A shows a three-dimensional graph in which the trainee state analysis unit 120 analyzes the state of the trainee 300. It is represented in a cylindrical coordinate system with the parameters r, θ, and z, which can be taken as the arousal, pleasure, and intimacy parameters, and the relationship of Equation 1 holds.
수학식 1 (Equation 1)
Figure PCTKR2012008912-appb-M000001
단 0≤r≤1, 0≤θ≤2 π, 0≤z≤1 의 조건을 만족하는 원통형 좌표계이다.However, it is a cylindrical coordinate system which satisfies the conditions of 0≤r≤1, 0≤θ≤2π, and 0≤z≤1.
도 3b를 참조하면 r=1로 일정하다고 가정 하는 경우, z에 따라 원통의 높이가 달라지는 원통형 좌표계를 확인할 수 있다. 후술할 예정이지만, z에 따라서 훈련모드의 영역이 달라질 수 있다.Referring to FIG. 3B, when it is assumed that r = 1 is constant, a cylindrical coordinate system in which the height of the cylinder varies according to z can be confirmed. Although it will be described later, the area of the training mode may vary according to z.
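The publication gives the body of Equation 1 only as an image, but the surrounding description (pleasure and arousal spanning the plane of FIG. 4A, intimacy as the height z) suggests the conventional mapping pleasure = r·cos θ, arousal = r·sin θ, intimacy = z. A sketch under that assumption:

```python
import math

def to_cylindrical(pleasure, arousal, intimacy):
    """Map normalized (pleasure, arousal, intimacy) to (r, theta, z),
    assuming pleasure = r*cos(theta), arousal = r*sin(theta), intimacy = z."""
    r = min(1.0, math.hypot(pleasure, arousal))            # 0 <= r <= 1
    theta = math.atan2(arousal, pleasure) % (2 * math.pi)  # 0 <= theta < 2*pi
    z = max(0.0, min(1.0, intimacy))                       # 0 <= z <= 1
    return r, theta, z

r, theta, z = to_cylindrical(0.6, 0.6, 0.4)
print(round(r, 3), round(theta, 3), z)  # 0.849 0.785 0.4
```

Under this mapping, the angle θ encodes the kind of emotion, r its strength, and z the intimacy, which matches how FIGS. 3B and 4A to 4D are described.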
훈련모드 결정부(130)는 상기 피훈련자(300)의 분석 상태에 따라 훈련 모드를 결정하는 역할을 한다. 일 실시예에서, 훈련모드 결정부는 감정정보(즐거움정보, 흥분정보) 또는 친밀도정보에 기반하여 훈련모드를 결정할 수 있다. 훈련 모드는 소정의 설정에 따라 복수의 훈련모드일 수 있다. 구체적으로, 피훈련자를 훈련시키는 제1훈련모드, 피훈련자에게 훈련을 시킬 수 있도록 독려하는 제2훈련모드, 피훈련자를 훈련시키지 않는 제3훈련모드를 포함할 수 있다. 제1훈련모드는 로봇(200)과 피훈련자(300)가 대화하게끔 하는 대화차례지키기(Turn-taking) 모드, 피훈련자(300)와 로봇(200)이 서로 눈을 마주보게 하는 눈맞춤(Eye contact) 모드, 피훈련자(300)의 주의를 집중시키는 공동 관심(Joint attention) 모드, 피훈련자(300)가 로봇(200)을 흉내내는 흉내내기(Imitation) 모드일 수 있다. 제2훈련모드는 피훈련자(300)의 즐거움정보가 매우 낮은 경우(예를 들어, 피훈련자(300)의 감정 상태가 슬픔인 경우) 피훈련자(300)의 로봇(200) 행동 따라하기를 독려하는 독려모드일 수 있다. 제3훈련모드는 일 실시예에서, 흥분정보가 매우 높은 경우(예를 들어, 피훈련자(300)의 돌발 행동이 있는 경우) 로봇(200)의 동작이 정지되는 일시 정지모드일 수 있다. 이에 한정되지 않고, 다양한 훈련모드는 감정정보(즐거움정보, 흥분정보) 또는 친밀도정보에 의해서 결정될 수 있다. The training mode determiner 130 serves to determine the training mode according to the analyzed state of the trainee 300. In one embodiment, the training mode determiner may determine the training mode based on the emotion information (pleasure information, arousal information) or the intimacy information. There may be a plurality of training modes according to a predetermined setting: specifically, a first training mode in which the trainee is trained, a second training mode in which the trainee is encouraged so that training becomes possible, and a third training mode in which the trainee is not trained. The first training mode may be a turn-taking mode in which the robot 200 and the trainee 300 converse, an eye-contact mode in which the trainee 300 and the robot 200 look into each other's eyes, a joint-attention mode that focuses the attention of the trainee 300, or an imitation mode in which the trainee 300 imitates the robot 200. The second training mode may be an encouragement mode that urges the trainee 300 to follow the actions of the robot 200 when the pleasure information of the trainee 300 is very low (for example, when the emotional state of the trainee 300 is sadness). The third training mode may be, in one embodiment, a pause mode in which the operation of the robot 200 is stopped when the arousal information is very high (for example, when the trainee 300 behaves erratically). Without being limited to these, various training modes may be determined by the emotion information (pleasure information, arousal information) or the intimacy information.
도 4a를 참조하면, 일 실시예에서, 감정정보(즐거움정보, 흥분정보) 또는 친밀도정보에 의하여 훈련모드가 지정될 수 있다. 도 4a는 z=0.5로 고정된 경우를 가정한다. 즐거움정보와 흥분정보의 양에 따라 행복, 놀램, 분노, 슬픔, 혐오의 감정을 그래프상으로 표현할 수 있으며, 상당한 양의 즐거움정보와 낮은 흥분정보를 갖는 경우 피훈련자를 훈련시키는 제1훈련모드(회색 영역)로 지정할 수 있다. 피훈련자의 훈련을 위해서는 기본적으로 피훈련자가 긍정적인 안정상태를 유지하게끔 하기 위함이다. 또한, 즐거움정보와 흥분정보의 제곱의 합이 일정범위 이하인 경우는 피훈련자를 훈련시키지 않는 제3훈련모드(점 영역)로 지정할 수 있다. 피훈련자가 극히 부정적인 상태인 경우는 피훈련자가 하고자 하는 일을 하도록 내버려 두는 것을 자폐증 아동의 치료 심리학에서 제시하고 있는 바, 이를 반영하였다. 제1훈련모드와 제3훈련모드 이외의 영역은 피훈련자에게 훈련을 시킬 수 있도록 독려하는 제2훈련모드(빗금 영역)로 지정할 수 있다. 피훈련자가 부정적인 상태에서 긍정적인 상태로 변하도록 독려하기 위함이다. Referring to FIG. 4A, in one embodiment, the training mode may be designated based on the emotion information (pleasure information, arousal information) or the intimacy information. FIG. 4A assumes that z is fixed at 0.5. Emotions such as happiness, surprise, anger, sadness, and disgust can be expressed on the graph according to the amounts of pleasure information and arousal information, and the region with a considerable amount of pleasure information and low arousal information can be designated as the first training mode (gray area), in which the trainee is trained; to train the trainee, the trainee should basically be kept in a positive, stable state. In addition, when the sum of the squares of the pleasure information and the arousal information falls below a certain range, the region can be designated as the third training mode (dotted area), in which the trainee is not trained; this reflects the treatment psychology of autistic children, which suggests leaving a trainee in an extremely negative state free to do what he or she wants. The regions other than the first and third training modes can be designated as the second training mode (hatched area), which encourages the trainee so that training becomes possible, in order to urge the trainee to change from a negative state to a positive one.
도 4b 내지 도 4d는 z의 변화에 따른 훈련모드의 변화를 나타낸 도면이다. 도 4b는 z=0인 경우, 도 4c는 z=0.4인 경우, 도 4d는 z=1인 경우를 가정하였다. 친밀도가 높을수록, 같은 감정 상태에서도 피훈련자가 훈련을 받을 수 있는 한계영역이 넓어지도록 설정될 수 있다. 친밀도가 높을수록 피훈련자가 감당할 수 있는 정서적 수준이 높아지기 때문이다. FIGS. 4B to 4D show the change in the training modes according to the change in z; FIG. 4B assumes z=0, FIG. 4C assumes z=0.4, and FIG. 4D assumes z=1. The system may be set so that the higher the intimacy, the wider the region in which the trainee can be trained becomes, even in the same emotional state, because the higher the intimacy, the higher the emotional level the trainee can bear.
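The region logic of FIGS. 4A to 4D can be sketched as follows; the publication defines the regions only qualitatively, so every threshold below is an illustrative assumption:

```python
# Illustrative thresholds; the publication specifies the regions of
# FIGS. 4A-4D only qualitatively, so these numbers are assumptions.
P_MIN_BASE = 0.4   # minimum pleasure for mode 1 when intimacy z = 0
A_MAX_BASE = 0.4   # maximum |arousal| for mode 1 when z = 0
R_NEG = 0.3        # radius of the "do not train" region (mode 3)

def training_mode(pleasure, arousal, z):
    """Return 1 (train), 2 (encourage), or 3 (leave alone).
    The mode-1 region widens with intimacy z, per FIGS. 4B-4D."""
    if pleasure ** 2 + arousal ** 2 <= R_NEG ** 2:
        return 3                      # sum of squares below the threshold
    scale = 1.0 + z                   # region grows with intimacy
    if pleasure >= P_MIN_BASE / scale and abs(arousal) <= A_MAX_BASE * scale:
        return 1                      # positive and stable: train
    return 2                          # otherwise: encourage

print(training_mode(0.7, 0.1, 0.5))   # 1: pleasant and calm
print(training_mode(0.1, 0.1, 0.5))   # 3: near the origin
print(training_mode(-0.5, 0.8, 0.5))  # 2: negative and agitated
```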
로봇 제어부(140)는 상기 결정된 훈련 모드에 따라 로봇(200)을 제어하는 역할을 한다. 일 실시예에서, 훈련모드에 따라 로봇(200)의 동작 패턴이 정의되고, 정의된 동작 패턴을 로봇(200)에 전송하는 방식으로 로봇(200)을 제어하게 된다. 다른 실시예에서는, 동일한 훈련모드라고 할지라도 훈련모드를 실행하는 로봇(200)의 속도 또는 행동범위가 다른 조건에 의해서 변경될 수도 있다. 예를 들어 흥분정보가 높은 경우, 피훈련자(300)가 좀 더 빠른 반응을 할 수 있기 때문에, 로봇(200)의 속도를 증가시켜 피훈련자(300)와의 훈련을 좀 더 빠르게 진행할 수 있다. 또한 친밀도정보가 높은 경우 로봇(200)의 동작범위는 확대되어, 로봇(200)이 넓은 공간에서 동작할 수 있고 이에 따라 피훈련자(300)의 사회성 훈련 효율이 증대되는 효과를 확보할 수 있다. 친밀도정보가 낮은 경우 로봇(200)의 동작 범위를 축소시켜 특정 공간에 한정된 피훈련자(300)의 사회성 훈련을 할 수 있을 것이다. The robot controller 140 serves to control the robot 200 according to the determined training mode. In one embodiment, an operation pattern of the robot 200 is defined for each training mode, and the robot 200 is controlled by transmitting the defined operation pattern to the robot 200. In another embodiment, even within the same training mode, the speed or range of motion of the robot 200 executing the mode may be changed by other conditions. For example, when the arousal information is high, the trainee 300 can react more quickly, so the speed of the robot 200 can be increased to proceed with the training more quickly. Also, when the intimacy information is high, the range of motion of the robot 200 is expanded so that the robot 200 can operate in a wider space, which increases the efficiency of the sociability training of the trainee 300. When the intimacy information is low, the range of motion of the robot 200 may be reduced so that the sociability training of the trainee 300 is confined to a specific space.
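The speed and range adjustments described above can be sketched as simple linear scalings; all physical numbers are illustrative assumptions, not values from the publication:

```python
def robot_motion_params(arousal, intimacy,
                        base_speed=0.2, max_speed=0.6,
                        base_range=1.0, max_range=3.0):
    """Scale robot speed with arousal and range of motion with intimacy.
    Inputs are normalized to [-1, +1]; the speeds (m/s) and ranges (m)
    are illustrative, not values from the publication."""
    a = (arousal + 1.0) / 2.0    # map to [0, 1]
    i = (intimacy + 1.0) / 2.0   # map to [0, 1]
    speed = round(base_speed + a * (max_speed - base_speed), 3)
    reach = round(base_range + i * (max_range - base_range), 3)
    return speed, reach

speed, reach = robot_motion_params(arousal=1.0, intimacy=-1.0)
print(speed, reach)  # 0.6 1.0: fast reactions, but a tightly confined space
```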
도 5는 본 발명의 일 실시예에 따른, 사회성 훈련 방법을 순서도로 나타낸 도면이다. 먼저 피훈련자(300)로부터 입력 정보를 수신한다(S401). 일 실시예에서, 피훈련자(300)의 행동정보는 피훈련자(300)의 EEG, 음성정보, 영상정보, 접촉정보일 수 있다. 수신한 입력정보를 통해 피훈련자(300)의 상태를 분석한다(S402). 일 실시예에서 피훈련자(300)의 상태 분석은 피훈련자(300)의 행동정보를 흥분정보, 즐거움정보, 친밀도정보 중 어느 하나로 분류하여 상기 피훈련자의 상태를 분석할 수 있다. 그 후 분석된 피훈련자(300)의 상태에 따라 훈련 모드를 결정한다(S403). 일 실시예에서 훈련 모드의 결정은 감정정보(즐거움정보와 흥분정보) 또는 친밀도정보에 기반하여 이루어질 수 있다. 훈련모드는 전술한 제1훈련모드, 제2훈련모드, 및 제3훈련모드일 수 있다. 또한, 훈련 모드의 다변화를 주기 위하여 훈련 모드 내에서의 로봇(200)의 속도 및 행동 범위를 결정하게 된다(S404). 일 실시예에서 흥분정보가 높은 경우 로봇(200)의 속도를 증가시키며, 친밀도정보가 높은 경우 행동 범위를 확장할 수 있다. 이러한 조건을 통합하여 로봇(200)을 제어한다(S405). 일 실시예에서 제어신호를 로봇(200)에 전송하여 로봇(200)을 제어할 수 있다. 이를 통해 피훈련자(300)의 사회성을 훈련시키며 사회성을 향상시킬 수 있다. FIG. 5 is a flowchart of a sociability training method according to an embodiment of the present invention. First, input information is received from the trainee 300 (S401); in one embodiment, the behavior information of the trainee 300 may be the trainee's EEG, voice information, image information, or contact information. The state of the trainee 300 is then analyzed from the received input information (S402); in one embodiment, the state analysis classifies the behavior information of the trainee 300 into one of arousal information, pleasure information, and intimacy information, and thereby analyzes the trainee's state. A training mode is then determined according to the analyzed state of the trainee 300 (S403); in one embodiment, the training mode may be determined based on the emotion information (pleasure and arousal information) or the intimacy information, and may be the first, second, or third training mode described above. In addition, to diversify the training, the speed and range of motion of the robot 200 within the training mode are determined (S404): in one embodiment, the speed of the robot 200 is increased when the arousal information is high, and the range of motion may be expanded when the intimacy information is high. These conditions are integrated to control the robot 200 (S405); in one embodiment, the robot 200 may be controlled by transmitting a control signal to it. In this way, the sociability of the trainee 300 can be trained and improved.
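The S401 to S405 flow of FIG. 5 can be sketched end to end; the sensor reading, classifier, and thresholds below are all illustrative assumptions:

```python
# A minimal sketch of the S401-S405 flow of FIG. 5. The sensor reading,
# classifier, and every threshold here are illustrative assumptions.

def analyze_state(behavior):                       # S402
    """Reduce raw behavior info to normalized (pleasure, arousal, intimacy)."""
    return behavior["pleasure"], behavior["arousal"], behavior["intimacy"]

def decide_mode(pleasure, arousal, intimacy):      # S403
    if pleasure ** 2 + arousal ** 2 <= 0.09:
        return 3                                   # do not train
    if pleasure > 0.3 and abs(arousal) < 0.4 * (1.0 + intimacy):
        return 1                                   # train
    return 2                                       # encourage

def control_robot(mode, arousal, intimacy):        # S404-S405
    speed = round(0.2 + 0.4 * max(0.0, arousal), 3)   # faster when aroused
    reach = round(1.0 + 2.0 * max(0.0, intimacy), 3)  # wider when intimate
    return {"mode": mode, "speed": speed, "range": reach}

behavior = {"pleasure": 0.6, "arousal": 0.2, "intimacy": 0.5}  # S401 (assumed)
p, a, z = analyze_state(behavior)
command = control_robot(decide_mode(p, a, z), a, z)
print(command)  # {'mode': 1, 'speed': 0.28, 'range': 2.0}
```

In a real system the resulting command would be serialized and transmitted to the robot 200 as the control signal of S405.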
이상에서 본 발명이 구체적인 구성요소 등과 같은 특정 사항들과 한정된 실시예 및 도면에 의해 설명되었으나, 이는 본 발명의 보다 전반적인 이해를 돕기 위해서 제공된 것일 뿐, 본 발명이 상기 실시예들에 한정되는 것은 아니며, 본 발명이 속하는 기술분야에서 통상적인 지식을 가진 자라면 이러한 기재로부터 다양한 수정 및 변형을 꾀할 수 있다. Although the present invention has been described above with reference to specific matters such as concrete components, limited embodiments, and drawings, these are provided only to aid a more general understanding of the invention. The invention is not limited to the above embodiments, and a person of ordinary skill in the art to which the invention pertains can devise various modifications and variations from this description.

Claims (18)

  1. 피훈련자의 행동정보를 입력받는 입력부; A sociability training apparatus comprising: an input unit for receiving behavior information of a trainee;
    상기 입력받은 피훈련자의 행동정보를 분석하여 상기 피훈련자의 행동정보를 흥분정보, 즐거움정보, 친밀도정보 중 어느 하나로 분류하여 상기 피훈련자의 상태를 분석하는 피훈련자 상태 분석부; a trainee state analysis unit for analyzing the received behavior information of the trainee, classifying it into one of arousal information, pleasure information, and intimacy information, and thereby analyzing the state of the trainee;
    상기 피훈련자의 분석 상태에 따라 훈련 모드를 결정하는 훈련 모드 결정부; 및 a training mode determiner for determining a training mode according to the analyzed state of the trainee; and
    상기 결정된 훈련모드에 따라 로봇을 제어하는 로봇 제어부를 포함하는 사회성 훈련 장치. a robot controller for controlling a robot according to the determined training mode.
  2. 제 1항에 있어서, The apparatus of claim 1,
    상기 입력부는, 상기 피훈련자의 EEG, 음성정보, 영상정보, 접촉정보 중 적어도 어느 하나를 입력받는, 사회성 훈련 장치. wherein the input unit receives at least one of the trainee's EEG, voice information, image information, and contact information.
  3. 제 1항에 있어서, The apparatus of claim 1,
    상기 피훈련자 상태 분석부는, 상기 즐거움정보와 흥분정보에 기반하여 피훈련자의 감정정보를 분석하는, 사회성 훈련 장치. wherein the trainee state analysis unit analyzes emotion information of the trainee based on the pleasure information and the arousal information.
  4. 제 3항에 있어서, The apparatus of claim 3,
    상기 감정정보는 행복, 놀램, 분노, 공포, 슬픔, 혐오의 여섯 가지 정보를 포함하는, 사회성 훈련 장치. wherein the emotion information comprises six categories: happiness, surprise, anger, fear, sadness, and disgust.
  5. 제 3항에 있어서,The method of claim 3,
    상기 훈련 모드 결정부는, 상기 감정정보 또는 친밀도정보에 기반하여 상기 피훈련자의 훈련모드를 결정하는, 사회성 훈련 장치.The training mode determining unit determines the training mode of the trainee based on the emotion information or intimacy information.
  6. The apparatus of claim 5,
    wherein the trainee's training mode includes a first training mode that trains the trainee, a second training mode that encourages the trainee so that training can take place, and a third training mode that does not train the trainee.
  7. The apparatus of claim 6,
    wherein the training mode determination unit expands the range of the first training mode in proportion to the intimacy information.
  8. The apparatus of claim 1,
    wherein the robot control unit determines the speed of the robot according to the training mode based on the excitement information.
  9. The apparatus of claim 1,
    wherein the robot control unit determines the behavior range of the robot according to the training mode based on the intimacy information.
  10. A sociability training method comprising:
    receiving behavior information of a trainee;
    analyzing the received behavior information and classifying it as one of excitement information, pleasure information, and intimacy information, thereby analyzing the trainee's state;
    determining a training mode according to the analyzed state of the trainee; and
    controlling a robot according to the determined training mode.
  11. The method of claim 10,
    wherein receiving the behavior information of the trainee comprises receiving at least one of EEG, voice information, image information, and contact information of the trainee.
  12. The method of claim 10,
    wherein analyzing the trainee's state further comprises analyzing emotion information of the trainee based on the pleasure information and the excitement information.
  13. The method of claim 11,
    wherein the emotion information includes six kinds of information: happiness, surprise, anger, fear, sadness, and disgust.
  14. The method of claim 12,
    wherein determining the training mode comprises determining the trainee's training mode based on the emotion information or the intimacy information.
  15. The method of claim 14,
    wherein the trainee's training mode includes a first training mode that trains the trainee, a second training mode that encourages the trainee so that training can take place, and a third training mode that does not train the trainee.
  16. The method of claim 15,
    wherein determining the training mode further comprises expanding the range of the first training mode in proportion to the intimacy information.
  17. The method of claim 10,
    wherein controlling the robot comprises determining the speed of the robot according to the training mode based on the excitement information.
  18. The method of claim 10,
    wherein controlling the robot comprises determining the behavior range of the robot according to the training mode based on the intimacy information.
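Claims 3 and 4 (and the parallel method claims 12 and 13) derive the trainee's emotion information from the pleasure and excitement information, which amounts to reading emotion off a two-axis valence-arousal plane. As an illustrative sketch only, a simple threshold grid over those two axes could map onto the six emotion labels of claim 4; the function name, the 0.3 thresholds, and the exact grid layout are assumptions made for this sketch, not values from the patent:

```python
def classify_emotion(pleasure: float, excitement: float) -> str:
    """Map a (pleasure, excitement) pair, each in [-1, 1], to one of the six
    emotion labels listed in claim 4. Thresholds are illustrative assumptions."""
    if excitement >= 0.0:              # high arousal
        if pleasure > 0.3:
            return "happiness"
        if pleasure < -0.3:
            return "anger"
        return "surprise"              # aroused but valence-neutral
    if pleasure > 0.3:                 # low arousal, positive valence
        return "satisfaction"
    if pleasure < -0.3:                # low arousal, negative valence
        return "sadness"
    return "anxiety"                   # low arousal, valence-neutral
```

Note that claim 13 lists a different set of six emotions (happiness, surprise, anger, fear, sadness, disgust), so the label set in such a sketch would depend on which claim is being implemented.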
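Claims 5 through 9 (and method claims 14 through 18) tie the training mode to the emotion and intimacy information, widen the first training mode's range in proportion to intimacy, and scale the robot's speed and behavior range by the excitement and intimacy information. The controller below is a hypothetical sketch of that flow; the mode constants, the linear scalings, and the intimacy threshold are illustrative assumptions rather than the patented implementation:

```python
from dataclasses import dataclass

# First, second, and third training modes of claim 6 (names are assumptions).
TRAIN, ENCOURAGE, REST = 1, 2, 3

@dataclass
class RobotCommand:
    mode: int
    speed: float         # m/s, scaled by excitement (claim 8)
    action_range: float  # m, scaled by intimacy (claim 9)

def decide_mode(emotion: str, intimacy: float) -> int:
    """Pick a training mode from emotion and intimacy (claim 5). Higher
    intimacy lowers the entry threshold for the first mode, so that mode's
    range expands in proportion to intimacy (claim 7)."""
    threshold = max(0.2, 0.8 - 0.6 * intimacy)
    if emotion in ("happiness", "satisfaction") and intimacy >= threshold:
        return TRAIN
    if emotion in ("anger", "anxiety", "sadness"):
        return REST          # third mode: do not train
    return ENCOURAGE         # second mode: encourage toward training

def control(emotion: str, excitement: float, intimacy: float) -> RobotCommand:
    """Combine mode decision with excitement- and intimacy-based scaling."""
    clamp = lambda x: max(0.0, min(1.0, x))
    speed = 0.1 + 0.4 * clamp(excitement)       # claim 8: speed from excitement
    rng = 0.5 + 1.5 * clamp(intimacy)           # claim 9: range from intimacy
    return RobotCommand(decide_mode(emotion, intimacy), speed, rng)
```

In this sketch a happy, familiar trainee gets the first (training) mode with a faster, wider-ranging robot, while a distressed trainee drops the robot back to the third (non-training) mode regardless of intimacy.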
PCT/KR2012/008912 2012-06-07 2012-10-29 Apparatus for training sociability and a method therefor WO2013183822A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0060697 2012-06-07
KR1020120060697A KR20130137262A (en) 2012-06-07 2012-06-07 Sociability training apparatus and method thereof

Publications (1)

Publication Number Publication Date
WO2013183822A1 true WO2013183822A1 (en) 2013-12-12

Family

ID=49712196

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/008912 WO2013183822A1 (en) 2012-06-07 2012-10-29 Apparatus for training sociability and a method therefor

Country Status (2)

Country Link
KR (1) KR20130137262A (en)
WO (1) WO2013183822A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107283389A * 2017-08-31 2017-10-24 李景龙 Robot for assisting in the treatment of autism
CN109623816A * 2018-12-19 2019-04-16 中新智擎科技有限公司 Robot recharging method, apparatus, storage medium, and robot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070110158A * 2006-05-12 2007-11-16 Korea University Industry-Academic Cooperation Foundation Apparatus for feedback according to emotional state, method thereof and recording medium thereof
KR20090079655A * 2008-01-18 2009-07-22 KT Corporation Method of executing robot behavior based on user behavior and its robot
KR101016381B1 * 2009-01-19 2011-02-21 Korea Advanced Institute of Science and Technology Emotion expression robot capable of interacting with humans
KR101061771B1 * 2009-11-30 2011-09-05 Daegu Gyeongbuk Institute of Science and Technology Robot and robot control system for play therapy

Also Published As

Publication number Publication date
KR20130137262A (en) 2013-12-17

Similar Documents

Publication Publication Date Title
US20190108770A1 (en) System and method of pervasive developmental disorder interventions
Mousavi Hondori et al. A review on technical and clinical impact of Microsoft Kinect on physical therapy and rehabilitation
EP4079383A2 (en) Method and system for interacting with a virtual environment
Sucar et al. Gesture therapy: A vision-based system for upper extremity stroke rehabilitation
CN109620185A Autism auxiliary diagnosis system, device, and medium based on multi-modal information
US20130158883A1 (en) Intention conveyance support device and method
Petric et al. Four tasks of a robot-assisted autism spectrum disorder diagnostic protocol: First clinical tests
Özcan et al. Transitional wearable companions: a novel concept of soft interactive social robots to improve social skills in children with autism spectrum disorder
KR101307783B1 (en) sociability training apparatus and method thereof
WO2018084170A1 (en) Autonomous robot that identifies persons
JP2009101057A (en) Biological information processing apparatus, biological information processing method and program
KR102355511B1 (en) VR-Based Recognition Training System
Dias et al. Designing better spaces for people: virtual reality and biometric sensing as tools to evaluate space use
Cheng et al. Mind-Wave controlled robot: An Arduino Robot simulating the wheelchair for paralyzed patients
KR20210076561A (en) Recognition Training System For Preventing Dementia Using Virtual Reality Contents
WO2013183822A1 (en) Apparatus for training sociability and a method therefor
Singh et al. Physiologically attentive user interface for robot teleoperation: real time emotional state estimation and interface modification using physiology, facial expressions and eye movements
Mihelj et al. Emotion-aware system for upper extremity rehabilitation
JP7107017B2 (en) Robot, robot control method and program
KR20200071236A (en) Timing feedback system based on multi-sensor with adjusting intensity of exercise
Mishra et al. Does elderly enjoy playing bingo with a robot? a case study with the humanoid robot nadine
AlZoubi et al. Affect-aware assistive technologies
De Silva et al. A Multi-agent Based Interactive System Towards Child's Emotion Performances Quantified Through Affective Body Gestures
Cipresso et al. Contactless bio-behavioral technologies for virtual reality
Popescu et al. A critical analysis of the technologies used for development of applications dedicated to persons with autism spectrum disorder

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12878476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12878476

Country of ref document: EP

Kind code of ref document: A1