KR20130137262A - Sociability training apparatus and method thereof - Google Patents

Sociability training apparatus and method thereof

Info

Publication number
KR20130137262A
Authority
KR
South Korea
Prior art keywords
information
trainee
training mode
training
robot
Prior art date
Application number
KR1020120060697A
Other languages
Korean (ko)
Inventor
최종석
박성기
김동환
강해용
Original Assignee
한국과학기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술연구원
Priority to KR1020120060697A
Priority to PCT/KR2012/008912
Publication of KR20130137262A

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass

Abstract

Disclosed are a sociability training apparatus and a method thereof. The apparatus includes: an input unit that receives behavior information of a trainee; a trainee state analysis unit that analyzes the trainee's behavior information, classifies it into excitement information, pleasure information, or intimacy information, and analyzes the trainee's state; a training mode determiner that determines a training mode according to the analyzed state of the trainee; and a robot controller that controls a robot according to the determined training mode.

Description

SOCIABILITY TRAINING APPARATUS AND METHOD THEREOF

The embodiment relates to a sociability training apparatus and a method thereof, and more particularly, to an apparatus and method for training the sociability of autistic children.

Autistic children characteristically lack self-initiated interaction, turn-taking, imitation, emotion recognition, joint attention, and eye contact, which are basic functions for social interaction and communication. In other words, their cognitive development is immature and their sociability is lacking. Autistic children with these characteristics find it difficult to socialize as adults, and without proper treatment at the right time, social interaction with other people becomes impossible. In recent years, early detection and behavioral treatment of autism have been actively pursued in university hospitals and research institutes, and treatment has been carried out using robots (toys) whose emotional expression is simpler than that of humans. Recent research also suggests that robots can draw the attention of children with autism and improve their concentration.

Korean Patent No. 10-1061771 discloses a robot and a robot control system for play therapy: a robot that outputs visual, auditory, tactile, and behavioral stimuli for play therapy; a mobile terminal that receives a child's response to the stimulus output by the robot as response information, transmits it to the outside, and transmits a control command to the robot to output a stimulus at the request of an operator; and a server that receives the response information transmitted from the mobile terminal, determines whether the child's reaction to the stimulus output by the robot is preference, non-preference, or non-response, selects the next stimulus accordingly, and transmits a corresponding control command to the robot. However, this system could only analyze the child's behavior one-dimensionally, as preference, non-preference, or non-response, and could not train the child by analyzing the child's behavior from various angles.

Korean Patent No. 10-1061771

According to an aspect of the present invention, the behavior of autistic children can be analyzed from multiple angles.

According to an aspect of the present invention, by controlling the robot in accordance with the behavior of the autistic child, sociability can be trained actively and efficiently.

According to an aspect of the invention, there is provided a sociability training apparatus including: an input unit that receives behavior information of a trainee; a trainee state analysis unit that analyzes the trainee's behavior information, classifies it into excitement information, pleasure information, or intimacy information, and analyzes the trainee's state; a training mode determiner that determines a training mode according to the analyzed state of the trainee; and a robot controller that controls a robot according to the determined training mode.

According to another aspect of the invention, there is provided a sociability training apparatus in which the input unit receives at least one of the trainee's EEG, voice information, image information, and contact information.

According to another aspect of the present invention, there is provided a sociability training apparatus in which the trainee state analysis unit analyzes the trainee's emotion information based on the pleasure information and the excitement information.

According to another aspect of the invention, there is provided a sociability training apparatus in which the emotion information includes six kinds of information: happiness, surprise, anxiety, anger, sadness, and satisfaction.

According to another aspect of the present invention, there is provided a sociability training apparatus in which the training mode determiner determines the training mode of the trainee based on the emotion information or the intimacy information.

According to another aspect of the invention, there is provided a sociability training apparatus in which the training mode of the trainee includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.

According to another aspect of the present invention, there is provided a sociability training apparatus in which the training mode determiner extends the range of the first training mode in proportion to the intimacy information.

According to another aspect of the present invention, there is provided a sociability training apparatus in which the robot controller determines the speed of the robot according to the training mode based on the excitement information.

According to another aspect of the present invention, there is provided a sociability training apparatus in which the robot controller determines the behavior range of the robot according to the training mode based on the intimacy information.

According to another aspect of the invention, there is provided a sociability training method including: receiving behavior information of a trainee; analyzing the trainee's behavior information, classifying it into excitement information, pleasure information, or intimacy information, and analyzing the trainee's state; determining a training mode according to the analyzed state of the trainee; and controlling a robot according to the determined training mode.

According to another aspect of the invention, there is provided a sociability training method in which the receiving of the behavior information of the trainee includes receiving at least one of the trainee's EEG, voice information, image information, and contact information.

According to another aspect of the present invention, the analyzing of the trainee's state further includes analyzing the trainee's emotion information based on the pleasure information and the excitement information.

According to another aspect of the present invention, there is provided a sociability training method in which the emotion information includes six kinds of information: happiness, surprise, anger, fear, sadness, and disgust.

According to another aspect of the present invention, the determining of the training mode includes determining the training mode of the trainee based on the emotion information or the intimacy information.

According to another aspect of the invention, there is provided a sociability training method in which the training mode of the trainee includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.

According to another aspect of the present invention, the determining of the training mode further includes extending the range of the first training mode in proportion to the intimacy information.

According to another aspect of the invention, there is provided a sociability training method in which the controlling of the robot includes determining the speed of the robot according to the training mode based on the excitement information.

According to another aspect of the invention, there is provided a sociability training method in which the controlling of the robot includes determining the behavior range of the robot according to the training mode based on the intimacy information.

According to an aspect of the present invention, the intimacy between an autistic child and a robot, as well as a variety of the child's emotions, can be analyzed.

According to an aspect of the present invention, by actively controlling the robot according to the behavior of the autistic child, the sociability of the child can be trained more effectively than in the prior art.

1 is a diagram illustrating a relationship between a social training apparatus 100, a robot 200, and an autistic child 300, according to an exemplary embodiment.
2 is a diagram illustrating an internal configuration of the social training apparatus 100 according to an embodiment.
3A and 3B are diagrams illustrating a three-dimensional graph in which a trainee state analysis unit 120 analyzes a trainee's state, according to one embodiment.
4A is a diagram illustrating an emotional state and a training mode according to pleasure information and excitement information when z is constant according to an embodiment.
4B to 4D are diagrams illustrating a change in training mode according to a change in z, according to an exemplary embodiment.
5 is a flowchart illustrating a social training method according to an embodiment.

The following detailed description of the invention refers to the accompanying drawings, which illustrate, by way of example, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, certain features, structures, and characteristics described herein in connection with one embodiment may be implemented in other embodiments without departing from the spirit and scope of the invention. It is also to be understood that the position or arrangement of individual components within each disclosed embodiment may be varied without departing from the spirit and scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is to be limited only by the appended claims, along with the full scope of equivalents to which such claims are entitled. In the drawings, like reference numerals refer to the same or similar functions throughout the several views.

1 is a diagram illustrating the relationship between a sociability training apparatus 100, a robot 200, and a trainee 300 according to an exemplary embodiment of the present invention. The sociability training apparatus 100 analyzes various behaviors of the trainee 300, such as position, voice, brain waves, and the act of touching the robot 200, and transmits to the robot 200 a signal for controlling the robot 200 according to the analysis. The robot 200 operates based on the received control signal, and the operation of the robot 200 improves the sociability of the trainee 300.

The sociability training apparatus 100 receives the behavior information of the trainee 300, analyzes the state of the trainee 300, and determines the training mode based on the analyzed state. The trainee 300 is trained by controlling the robot 200 according to the determined training mode. Details of the sociability training apparatus 100 will be described later.

The robot 200 performs a variety of actions under the control of the sociability training apparatus 100. In one embodiment, the robot 200 has a human-like appearance and shape, can walk on two feet, has two arms, and can manipulate objects with its hands. That is, the robot 200 may be a humanoid robot and may have a plurality of joints. The humanoid robot can change the posture of parts such as its arms and legs by adjusting each joint, and can walk, throw an object, or kick a ball. In another embodiment, only part of the robot 200 may be humanoid; for example, the upper body may have the appearance and shape of a human body while the lower body is connected to wheels for ease of movement.

The trainee 300 is the subject toward whom the robot 200 performs its operations. The trainee 300 may be a person with a social disability or a person who wants to improve their sociability; for example, an autistic child who lacks social and communication skills. However, there is no particular limitation on who the trainee 300 may be.

2 is a diagram showing the internal configuration of the sociability training apparatus 100, according to an embodiment of the present invention. The sociability training apparatus 100 may include an input unit 110, a trainee state analysis unit 120, a training mode determiner 130, and a robot controller 140.

The input unit 110 serves to receive the behavior information of the trainee 300. In one embodiment, the behavior information of the trainee 300 may be the trainee's EEG, voice information, image information, or contact information. EEG stands for electroencephalogram and refers to an electrical recording of the potential changes of the cerebrum of a human or animal, or of the brain currents caused by them, induced on the scalp. R. Caton (1875) first observed changes in the cortical potential of rabbits and monkeys, and H. Berger (1929) succeeded in recording brain waves from the human scalp; these waves were called the Berger rhythm. What is induced on the scalp is on the order of several tens of microvolts and is amplified before being recorded. The waveform is irregular, like noise, but has a somewhat regular component around 10 Hz, the α wave (larger amplitude and slower), and the β wave, an irregular component of smaller amplitude at 13 to 25 Hz. During sensory stimulation or mental activity such as mental arithmetic, the α waves disappear. During sleep and anesthesia, large, slow fluctuations called δ waves appear, and during sleep, epilepsy, and similar conditions, distinctive waveforms appear. Several mechanisms have been proposed to explain these waves, but no consensus has been reached. In one embodiment, however, each of the six cases of happiness, surprise, anger, fear, sadness, and disgust exhibits its own EEG frequency characteristics, and the emotional state (pleasure information and excitement information) of the trainee 300 can be classified into these six types through the frequency characteristics. In addition, a characteristic EEG may occur even when the trainee 300 performs a specific action; for example, whether the trainee 300 approaches the robot 200 or runs away from it, that is, a positive case and a negative case differing in intimacy with the robot 200, may be expressed in the EEG and may appear in various ways in the frequency domain.

The voice information is information related to the sound waves generated by the trainee 300. In one embodiment, like the EEG, the voice information exhibits distinct frequency characteristics for each of the six cases of happiness, surprise, anger, fear, sadness, and disgust, as determined through experiments, and the emotional state of the trainee 300 can be classified into the six types through these frequency characteristics.
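
The description states only that each of the six emotions exhibits characteristic frequency content in the EEG and voice signals; no concrete classifier is disclosed. The following minimal sketch (Python is used for all sketches in this description) assumes band-power features and nearest-prototype matching, both illustrative choices rather than the disclosed method:

    import numpy as np
    from scipy.signal import welch

    EMOTIONS = ("happiness", "surprise", "anger", "fear", "sadness", "disgust")

    def band_powers(signal, fs, bands=((4, 8), (8, 13), (13, 25), (25, 45))):
        # Mean spectral power in each frequency band (theta, alpha, beta, gamma).
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), int(fs) * 2))
        return np.array([psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands])

    def classify_emotion(signal, fs, prototypes):
        # `prototypes` maps an emotion name to a band-power profile learned
        # offline (assumed); the nearest profile wins.
        feat = band_powers(signal, fs)
        feat = feat / (np.linalg.norm(feat) + 1e-12)  # scale-invariant profile
        return min(prototypes, key=lambda e: np.linalg.norm(feat - prototypes[e]))

The same routine can be applied to an EEG channel or to a voice recording, since both are treated here as one-dimensional signals whose band profile carries the emotional cue.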

According to an embodiment, the input unit 110 may obtain the above-described information of the trainee 300 directly or indirectly. Direct acquisition refers to the case in which a sensor capable of collecting the various kinds of information is included in the sociability training apparatus 100. In an embodiment, the input unit may directly use a depth sensor to acquire image information. The depth sensor can be configured as an infrared laser projector combined with a CMOS sensor: by emitting countless infrared beams and capturing their reflections with a single camera, it can capture three-dimensional scenes under any lighting conditions. In addition to the horizontal and vertical directions, the sensor can measure the distance from the sensor, so it can determine where the trainee's body is and how it behaves. A depth image is generally obtained with an infrared camera by the time-of-flight method, but depth can also be calculated by projecting patterned light onto an object and performing stereo matching. In another embodiment, the trainee 300 may be detected as a dynamic object with a distinct depth step by applying a blob-labeling technique to the stream sequence; the position of each dynamic object in space may then be estimated and an id assigned to it, as in the sketch below. In another embodiment, the depth sensor may be included in the robot 200, so that the robot 200 acquires the information and transmits it to the input unit 110. In another embodiment, a voice sensor for collecting the sounds generated by the trainee 300 may be used directly or indirectly. In another embodiment, the contact of the trainee 300 with the robot 200 may be input through a touch sensor on the robot 200. The touch sensor can detect not only single touch but also multi-touch. The format of the touch input may be the position of a touched point, the state of a point (newly touched, moved, or released), or a gesture composed of several touches, such as a tap, double tap, panning, flicking, drag and drop, pinching, or stretching. In addition, the area and the duration of a touch can be measured.
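
A minimal sketch of the blob-labeling step mentioned above; the depth-change threshold, the minimum blob area, and the function name are illustrative assumptions, not part of the disclosure:

    import numpy as np
    from scipy import ndimage

    def detect_dynamic_objects(depth_prev, depth_curr, step=0.15, min_area=200):
        # Label connected regions ("blobs") whose depth changed by more than
        # `step` metres between frames; each blob is a candidate dynamic object.
        moving = np.abs(depth_curr - depth_prev) > step
        labels, n = ndimage.label(moving)
        objects = []
        for oid in range(1, n + 1):
            mask = labels == oid
            if mask.sum() < min_area:
                continue  # discard small noise blobs
            row, col = np.argwhere(mask).mean(axis=0)  # blob centroid in the image
            objects.append((oid, float(row), float(col), float(depth_curr[mask].mean())))
        return objects

Each returned tuple carries an id and an estimated position, matching the description of assigning an id to each detected dynamic object.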

The trainee state analysis unit 120 classifies the received behavior information of the trainee 300 into excitement information (arousal), pleasure information (pleasure), or intimacy information (intimacy) and analyzes the state of the trainee 300. In an embodiment, the trainee state analysis unit 120 normalizes the behavior information of the trainee 300 so that the arousal, pleasure, and intimacy values can each be converted to take a value between -1 and +1.

Among the aforementioned input information, the waveform of the EEG, the waveform of the voice information, and the image information (face information) may be converted into pleasure information.

Image information (speed information) may be converted into excitement information. More specifically, the image information (speed information) may include the movement speed of the trainee's body, the movement speed of the hands, and the movement speed of the head.
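
A minimal sketch of turning tracked positions into excitement information; the equal weighting of body, hand, and head speeds and the saturation speed v_max are illustrative assumptions:

    import numpy as np

    def movement_speed(positions, dt):
        # Mean speed (m/s) of one tracked point over a sequence of 3-D positions.
        positions = np.asarray(positions, dtype=float)
        return float(np.mean(np.linalg.norm(np.diff(positions, axis=0), axis=1)) / dt)

    def arousal_from_speeds(body, hand, head, v_max=2.0):
        # Average the three speeds and map them linearly onto [-1, +1];
        # v_max is the (assumed) speed at which arousal saturates.
        v = (body + hand + head) / 3.0
        return float(np.clip(2.0 * v / v_max - 1.0, -1.0, 1.0))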

Distance information, posture information, face information, and contact information between the robot 200 and the trainee 300 may be converted into intimacy information. In an embodiment, the emotion information of the trainee may be analyzed based on the pleasure information and the excitement information of the trainee 300. Whether the emotion is positive is evaluated first against the pleasure information, and the intensity of the emotion is grasped from the excitement information. The six kinds of information mentioned above (happiness, surprise, anger, fear, sadness, and disgust) can each be assigned values of pleasure information and excitement information, so that emotions are expressed as numerical information. The intimacy information can be derived from how much the trainee 300 ignores the behavior of the robot 200: if the degree of neglect is small, the intimacy is high, and vice versa. Intimacy can also be assigned a number according to its level and thus quantified separately. In one embodiment, the relationship among pleasure information, excitement information, and intimacy information is described through the graphs of FIGS. 3A and 3B. 3A illustrates a three-dimensional graph in which the trainee state analysis unit 120 analyzes the state of the trainee 300. The state is represented in a cylindrical coordinate system with the factors r, θ, and z, which can be taken as parameters of arousal, pleasure, and intimacy.

[Equation 1: the trainee state expressed in the cylindrical coordinates (r, θ, z)]

Here, the coordinate system is a cylindrical coordinate system satisfying the conditions 0 ≤ r ≤ 1, 0 ≤ θ ≤ 2π, and 0 ≤ z ≤ 1.

Referring to FIG. 3B, when r = 1 is assumed constant, a cylindrical coordinate system in which the height of the cylinder varies according to z can be seen. As will be described later, the area of each training mode may vary according to z.
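
A minimal sketch of assembling the normalized values into the cylindrical state (r, θ, z). Placing pleasure on the x-axis and arousal on the y-axis follows the usual circumplex convention and is an assumption here, since the mapping itself appears only as Equation 1:

    import math

    def to_cylindrical(pleasure, arousal, intimacy):
        # Pleasure on the x-axis, arousal on the y-axis (assumed convention);
        # intimacy is the cylinder height z.
        r = min(math.hypot(pleasure, arousal), 1.0)        # emotional intensity
        theta = math.atan2(arousal, pleasure) % (2 * math.pi)
        z = min(max(intimacy, 0.0), 1.0)
        return r, theta, z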

The training mode determiner 130 determines a training mode according to the analyzed state of the trainee 300. In one embodiment, the training mode determiner may determine the training mode based on the emotion information (pleasure information and excitement information) or the intimacy information. There may be a plurality of training modes according to a predetermined setting. Specifically, they may include a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee. The first training mode may include a turn-taking mode in which the robot 200 and the trainee 300 communicate in turn, an eye-contact mode in which the trainee 300 and the robot 200 face each other, a joint-attention mode for focusing the attention of the trainee 300, and an imitation mode in which the trainee 300 mimics the robot 200. The second training mode may be, in one embodiment, an encouragement mode in which the trainee 300 is encouraged to follow the action of the robot 200 when the pleasure information of the trainee 300 is very low (for example, when the emotional state of the trainee 300 is sadness). The third training mode may be, in one embodiment, a pause mode in which the operation of the robot 200 is stopped when the excitement information is very high (for example, when there is a sudden action by the trainee 300). The training modes are not limited to these; various training modes may be determined from the emotion information (pleasure information and excitement information) or the intimacy information.

Referring to FIG. 4A, in one embodiment, the training mode determiner may designate the training mode based on the emotion information (pleasure information and excitement information) or the intimacy information. FIG. 4A assumes that z is fixed at 0.5. According to the amounts of pleasure information and excitement information, the emotions of happiness, surprise, anger, sadness, and disgust can be placed on the graph, and the region in which the trainee can be trained may be designated as the first training mode (green area): to train a trainee, it is fundamental to keep the trainee in a positive, stable state. In addition, when the sum of the squares of the pleasure information and the excitement information is equal to or less than a predetermined range, a third training mode (pink area) in which the trainee is not trained can be designated; when the trainee is in a very negative state, the treatment psychology of autistic children recommends leaving the trainee alone to do what he or she wants. Areas other than the first and third training modes may be designated as a second training mode (blue area) that encourages the trainee to train, in order to shift the trainee from a negative state to a positive one.

4B to 4D are diagrams illustrating the change in the training mode according to the change in z. 4B illustrates the case of z = 0, 4C the case of z = 0.4, and 4D the case of z = 1. The higher the intimacy, the wider the margin zone within which the trainee can be trained in the same emotional state; that is, the higher the intimacy, the higher the emotional level the trainee can tolerate.
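
A compact sketch of the mode decision implied by FIGS. 4A to 4D; the specific thresholds and the linear widening of the first-mode region with z are illustrative assumptions, the description stating only that the first training mode widens in proportion to the intimacy information:

    import math

    TRAIN, ENCOURAGE, PAUSE = 1, 2, 3  # first, second, and third training modes

    def decide_mode(r, theta, z, r_pause=0.3, base_halfwidth=math.pi / 4):
        # Pink region: the combined pleasure/excitement magnitude is below a
        # threshold, so the trainee is left alone (third training mode).
        if r <= r_pause:
            return PAUSE
        # Green region: positive, stable emotion; its angular half-width grows
        # with intimacy z, mirroring FIGS. 4B to 4D.
        pleasure = r * math.cos(theta)
        angle = math.atan2(math.sin(theta), math.cos(theta))  # wrap theta to (-pi, pi]
        if pleasure > 0 and abs(angle) <= base_halfwidth * (1.0 + z):
            return TRAIN
        return ENCOURAGE  # blue region: encourage a shift toward the positive side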

The robot controller 140 controls the robot 200 according to the determined training mode. In one embodiment, an operation pattern of the robot 200 is defined according to the training mode, and the robot 200 is controlled by transmitting the defined operation pattern to the robot 200. In another embodiment, even in the same training mode, the speed or range of motion of the robot 200 executing the training mode may be changed by other conditions. For example, when the excitement information is high, the trainee 300 may react more quickly, so the training of the trainee 300 can proceed faster by increasing the speed of the robot 200. In addition, when the intimacy information is high, the operation range of the robot 200 is expanded so that the robot 200 can operate in a large space, which increases the sociability training efficiency of the trainee 300. If the intimacy information is low, the operation range of the robot 200 may be reduced so that the sociability training of the trainee 300 is limited to a specific space.
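
A minimal sketch of deriving the robot command from the analyzed state; the linear scalings and the numeric speed and range limits are illustrative placeholders, the description stating only that speed follows the excitement information and operating range follows the intimacy information:

    PAUSE = 3  # third training mode, as in the previous sketch

    def robot_command(mode, arousal, intimacy,
                      v_min=0.1, v_max=0.6, range_min=0.5, range_max=3.0):
        # Speed rises with excitement information; the allowed operating radius
        # widens with intimacy information.
        if mode == PAUSE:
            return {"speed": 0.0, "radius": 0.0}  # stop the robot in pause mode
        a = (arousal + 1.0) / 2.0                 # rescale arousal from [-1, 1] to [0, 1]
        speed = v_min + (v_max - v_min) * a
        radius = range_min + (range_max - range_min) * intimacy
        return {"speed": speed, "radius": radius}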

5 is a flowchart illustrating a sociability training method according to an embodiment of the present invention. First, input information is received from the trainee 300 (S401); in one embodiment, the behavior information of the trainee 300 may be the trainee's EEG, voice information, image information, or contact information. The state of the trainee 300 is analyzed from the received input information (S402); in an embodiment, the state analysis classifies the behavior information of the trainee 300 into excitement information, pleasure information, or intimacy information. The training mode is then determined according to the analyzed state of the trainee 300 (S403); in one embodiment, the training mode may be determined based on the emotion information (pleasure information and excitement information) or the intimacy information, and may be the first, second, or third training mode described above. In addition, to diversify the training mode, the speed and the behavior range of the robot 200 in the training mode are determined (S404); in one embodiment, the speed of the robot 200 is increased when the excitement information is high, and the behavior range is extended when the intimacy information is high. The robot 200 is then controlled by integrating these conditions (S405); in an embodiment, the robot 200 is controlled by transmitting a control signal to it. In this way, the sociability of the trainee 300 can be trained and improved.
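
Tying steps S401 to S405 together, a minimal end-to-end sketch that reuses the helper functions sketched above (all of them illustrative assumptions rather than the disclosed implementation):

    def training_step(eeg, fs, positions, dt, intimacy, prototypes):
        # S401-S402: sense the trainee and analyze the state.
        emotion = classify_emotion(eeg, fs, prototypes)
        v = movement_speed(positions, dt)
        arousal = arousal_from_speeds(v, v, v)  # one tracked speed as a stand-in
        pleasure = 1.0 if emotion in ("happiness", "surprise") else -0.5  # assumed valence map
        r, theta, z = to_cylindrical(pleasure, arousal, intimacy)
        # S403: decide the training mode; S404-S405: derive the robot command.
        mode = decide_mode(r, theta, z)
        return mode, robot_command(mode, arousal, intimacy)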

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments; those skilled in the art will appreciate that various modifications, additions, and substitutions are possible without departing from the scope and spirit of the invention as set forth in the accompanying claims.

Therefore, the spirit of the present invention should not be limited to the embodiments described above; the claims below, as well as everything equal or equivalent to the claims, fall within the scope of the spirit of the present invention.

100: sociability training apparatus
110: input unit
120: trainee state analysis unit
130: training mode determiner
140: robot controller
200: robot
300: trainee

Claims (18)

An input unit configured to receive behavior information of a trainee;
a trainee state analysis unit configured to analyze the trainee's behavior information, classify it into excitement information, pleasure information, or intimacy information, and analyze the trainee's state;
a training mode determiner configured to determine a training mode according to the analyzed state of the trainee;
and a robot controller configured to control a robot according to the determined training mode.
The apparatus of claim 1,
wherein the input unit receives at least one of the trainee's EEG, voice information, image information, and contact information.
The apparatus of claim 1,
wherein the trainee state analysis unit analyzes the trainee's emotion information based on the pleasure information and the excitement information.
The apparatus of claim 3,
wherein the emotion information includes six kinds of information: happiness, surprise, anxiety, anger, sadness, and satisfaction.
The apparatus of claim 3,
wherein the training mode determiner determines the training mode of the trainee based on the emotion information or the intimacy information.
The apparatus of claim 5,
wherein the training mode of the trainee includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.
The apparatus of claim 6,
wherein the training mode determiner extends the range of the first training mode in proportion to the intimacy information.
The apparatus of claim 1,
wherein the robot controller determines the speed of the robot according to the training mode based on the excitement information.
The apparatus of claim 1,
wherein the robot controller determines the behavior range of the robot according to the training mode based on the intimacy information.
Receiving behavior information of a trainee;
analyzing the trainee's behavior information, classifying it into excitement information, pleasure information, or intimacy information, and analyzing the trainee's state;
determining a training mode according to the analyzed state of the trainee; and
controlling a robot according to the determined training mode.
The method of claim 10,
wherein the receiving of the behavior information of the trainee includes receiving at least one of the trainee's EEG, voice information, image information, and contact information.
The method of claim 10,
wherein the analyzing of the trainee's state further includes analyzing the trainee's emotion information based on the pleasure information and the excitement information.
The method of claim 11,
wherein the emotion information includes six kinds of information: happiness, surprise, anger, fear, sadness, and disgust.
The method of claim 12,
wherein the determining of the training mode includes determining the training mode of the trainee based on the emotion information or the intimacy information.
The method of claim 14,
wherein the training mode of the trainee includes a first training mode for training the trainee, a second training mode for encouraging the trainee to train, and a third training mode for not training the trainee.
The method of claim 15,
wherein the determining of the training mode further includes extending the range of the first training mode in proportion to the intimacy information.
The method of claim 10,
wherein the controlling of the robot includes determining the speed of the robot according to the training mode based on the excitement information.
The method of claim 10,
wherein the controlling of the robot includes determining the behavior range of the robot according to the training mode based on the intimacy information.
KR1020120060697A 2012-06-07 2012-06-07 Sociability training apparatus and method thereof KR20130137262A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020120060697A KR20130137262A (en) 2012-06-07 2012-06-07 Sociability training apparatus and method thereof
PCT/KR2012/008912 WO2013183822A1 (en) 2012-06-07 2012-10-29 Apparatus for training sociability and a method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120060697A KR20130137262A (en) 2012-06-07 2012-06-07 Sociability training apparatus and method thereof

Publications (1)

Publication Number Publication Date
KR20130137262A (en) 2013-12-17

Family

ID=49712196

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120060697A KR20130137262A (en) 2012-06-07 2012-06-07 Sociability training apparatus and method thereof

Country Status (2)

Country Link
KR (1) KR20130137262A (en)
WO (1) WO2013183822A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107283389B (en) * 2017-08-31 2021-03-16 李景龙 Robot for assisting in treating autism
CN109623816A (en) * 2018-12-19 2019-04-16 中新智擎科技有限公司 A kind of robot recharging method, device, storage medium and robot

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070110158A (en) * 2006-05-12 2007-11-16 고려대학교 산학협력단 Apparatus for feedback according to emotional state, method thereof and recording medium thereof
KR100919095B1 (en) * 2008-01-18 2009-09-28 주식회사 케이티 Method of executing robot behavior based on user behavior and its robot
KR101016381B1 (en) * 2009-01-19 2011-02-21 한국과학기술원 The emotion expression robot which can interact with human
KR101061771B1 (en) * 2009-11-30 2011-09-05 재단법인대구경북과학기술원 Robot and Robot Control System for Play Therapy

Also Published As

Publication number Publication date
WO2013183822A1 (en) 2013-12-12

Similar Documents

Publication Publication Date Title
US10593349B2 (en) Emotional interaction apparatus
US10176725B2 (en) System and method of pervasive developmental disorder interventions
TWI658377B (en) Robot assisted interaction system and method thereof
US10258760B1 (en) Computer system for determining a state of mind and providing a sensory-type antidote to a subject
WO2018016461A1 (en) Autonomous-behavior-type robot that understands emotional communication through physical contact
JP4481682B2 (en) Information processing apparatus and control method thereof
Melo et al. Project INSIDE: towards autonomous semi-unstructured human–robot social interaction in autism therapy
CN108290070A (en) Method and system for interacting with virtual environment
KR101307783B1 (en) sociability training apparatus and method thereof
Kotsia et al. Affective gaming: A comprehensive survey
JP2014528806A (en) Emotion control method and apparatus
Özcan et al. Transitional wearable companions: a novel concept of soft interactive social robots to improve social skills in children with autism spectrum disorder
WO2018084170A1 (en) Autonomous robot that identifies persons
Lin et al. An Adaptive Interface Design (AID) for enhanced computer accessibility and rehabilitation
JP2009101057A (en) Biological information processing apparatus, biological information processing method and program
JP6935774B2 (en) Estimating system, learning device, learning method, estimation device and estimation method
KR102525394B1 (en) Platform for Identification of Biomarkers Using Navigation Tasks and Treatments Using Navigation Tasks
Zhao et al. Virtual avatar-based life coaching for children with autism spectrum disorder
KR20130137262A (en) Sociability training apparatus and method thereof
Singh et al. Physiologically attentive user interface for robot teleoperation: real time emotional state estimation and interface modification using physiology, facial expressions and eye movements
Bastos et al. Development of a Socially Assistive Robot Controlled by Emotions Based on Heartbeats and Facial Temperature of Children with Autistic Spectrum Disorder
KR101727941B1 (en) Interaction training device and method, and system thereof
McMahan et al. Adaptive Virtual Environments using Machine Learning and Artificial Intelligence
Costa Affective robotics for socio-emotional development in children with autism spectrum disorders
KR102048546B1 (en) System and Method for rehabilitation training using Virtual reality device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment