KR20150137683A - System for interaction between robot and its use - Google Patents

System for interaction between robot and its use

Info

Publication number
KR20150137683A
Authority
KR
South Korea
Prior art keywords
unit
information
user
robot
topic
Prior art date
Application number
KR1020140065973A
Other languages
Korean (ko)
Inventor
오미현
Original Assignee
유진주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 유진주식회사 filed Critical 유진주식회사
Priority to KR1020140065973A priority Critical patent/KR20150137683A/en
Publication of KR20150137683A publication Critical patent/KR20150137683A/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/06 Control stands, e.g. consoles, switchboards
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06 Safety devices
    • B25J19/061 Safety devices with audible signals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention relates to a system that enables efficient interaction between a robot and a user by influencing the user's decision making in consideration of the user's economic level and purpose of visit. By setting a specific goal for the robot to achieve and having the robot perform tasks toward that goal through interaction with a human, the system can increase sales and provide satisfaction to the user.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a system for interaction between a robot and a user.

More particularly, the present invention relates to a system that enables interaction between a robot and a user by influencing the user's decision making in consideration of the user's economic level and purpose of visit.

With the development of electronic technology, interest in robots is increasing. Robots that can be driven according to signals input by a person have been created, and robots that perform work according to preset patterns or functions, such as robot cleaners, are emerging. In this way, robots perform tasks in place of human labor. In recent years, social robots and emotional robots have appeared, and interest in the field of interaction between robots and users is growing in particular.

In this regard, Korean Patent Laid-Open Publication No. 10-2013-0039578 (hereinafter the 'prior art document') discloses an intelligent robot that uses a fusion of multi-modal sensor signals and a data-association technique to provide services efficiently in its environment, together with a system for interaction between the intelligent robot and users and a method for user interaction with the intelligent robot. The prior art document performs location tracking and behavior-pattern analysis for users, and the intelligent robot determines which users to interact with, so that it can provide services efficiently to many people.

As described above, the prior art can analyze users' behavior patterns through multiple sensors and stored data, but it has the limitation that the robot cannot interact with users on an emotional basis.

Korean Patent Laid-Open Publication No. 10-2013-0039578 (Apr. 22, 2013: Intelligent Robot, System for Interaction between Intelligent Robot and User and Method for Interaction of Intelligent Robot with User)

An object of the present invention is to provide a system for interaction between a robot and a user in which the robot performs tasks appropriate to itself while interacting with the user.

Another object of the present invention is to provide a system for robot-user interaction in which a specific goal to be achieved is set in the robot and the robot performs tasks toward that goal through interaction with a human.

According to an aspect of the present invention, a system for interaction between a robot and a user comprises an administrator terminal 100 including an ontology unit 110 for setting a topic tree for a specific domain on the basis of ontology modeling and generating a topic forest in which the set topic trees are grouped, and a dialogue generation unit 120 for generating at least one piece of dialogue information to be used in the generated topic forest; and a robot terminal 200 including a receiving unit 210 for receiving the dialogue information from the administrator terminal 100 and user information from a user terminal, a database unit 220 for storing the dialogue information and the user information, a user's degree extracting unit 230 for estimating a budget line from the user information, a task setting unit 240 for setting the presentation state to be executed according to the proximity between the budget line and preset reference information, and a task execution unit 250 for executing the dialogue information corresponding to the set presentation state.

The administrator terminal 100 according to the present invention further includes an administrator communication unit 130 for receiving the dialogue information being executed from the robot terminal 200, and a monitoring unit 140 for displaying the dialogue information received through the administrator communication unit 130.

The task execution unit 250 includes a voice output unit for outputting voice information corresponding to the extracted dialogue information, and an operation unit for performing a predetermined motion corresponding to the extracted dialogue information.

The present invention has the effect of helping the user make decisions by enabling the robot to perform tasks appropriate to itself in the process of interacting with humans.

By setting a specific goal for the robot to achieve and performing tasks toward that goal through interaction with humans, the present invention has the effect of increasing sales and providing satisfaction to the user.

FIG. 1 is a block diagram of a system for interaction between a robot and a user according to the present invention.
FIG. 2 is a diagram for explaining a budget line and indifference curves according to the present invention.
FIG. 3 is a diagram for explaining a method of setting the presentation state of a robot terminal according to the present invention.
FIG. 4 is a diagram for explaining an embodiment of monitoring according to the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of related known arts will be omitted when they may unnecessarily obscure the gist of the present invention.

Referring to FIG. 1, a system for interaction between a robot and a user according to the present invention may include an administrator terminal 100 and a robot terminal 200.

As shown in FIG. 1, the administrator terminal 100 includes an ontology unit 110, a dialogue generation unit 120, an administrator communication unit 130, and a monitoring unit 140.

The ontology unit 110 is a device for setting a topic tree for a specific domain on the basis of ontology modeling and generating a topic forest in which the set topic trees are grouped.

First, an ontology is a conceptual, computer-expressible model of what people see, hear, feel, and think about the world, formed through discussion with one another. Because an ontology represents agreed-upon knowledge, it is not limited to any individual; it is a conceptualization on which the members of a group all agree. It must also be expressed formally, since a program must be able to understand it. In computer science and information science, an ontology is defined as a data model representing a specific domain: a set of formal vocabularies describing the concepts belonging to that domain and the relationships between those concepts.

For example, if the specific domain for which the ontology is generated is a restaurant, each topic used in the restaurant is set as a topic tree. If the top topic of a topic tree is 'order', sub-topics such as steak, salad, appetizer, and dessert can be linked to it, and the type of dressing can be linked as a sub-topic of salad. For another subject, if the top topic is 'seat', sub-topics such as window seat and non-smoking seat can be linked to it. The topic forest is then created by grouping the topic tree for orders and the topic tree for seats.
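The topic tree and topic forest described above can be sketched as a simple nested structure. This is an illustrative sketch only; the class and method names below are assumptions, not part of the patent.

```python
# Hypothetical sketch of a topic tree / topic forest for the restaurant domain.
# Node names follow the example in the text; the API is invented for illustration.

class TopicNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def find(self, name):
        """Depth-first search for a topic by name; None if absent."""
        if self.name == name:
            return self
        for child in self.children:
            found = child.find(name)
            if found:
                return found
        return None

# Topic tree rooted at "order": steak, salad (with a dressing sub-topic), ...
order_tree = TopicNode("order", [
    TopicNode("steak"),
    TopicNode("salad", [TopicNode("dressing")]),
    TopicNode("appetizer"),
    TopicNode("dessert"),
])

# Topic tree rooted at "seat": window seat, non-smoking seat.
seat_tree = TopicNode("seat", [
    TopicNode("window"),
    TopicNode("non-smoking"),
])

# The topic forest groups the per-subject trees of one domain (the restaurant).
topic_forest = [order_tree, seat_tree]
```

A dialogue generation unit could then attach dialogue strings to nodes found via `find`, one set of utterances per topic.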

The dialogue generation unit 120 is a device for generating at least one piece of dialogue information to be used in the generated topic forest.

For example, to generate dialogue information for a topic tree in the topic forest, dialogue information for the 'order' topic may be generated, such as 'Please choose a menu item', and for the sub-topic 'steak', dialogue information such as 'How would you like your steak cooked?' or 'How many would you like to order?' may be generated.

The manager communication unit 130 is a device for transmitting the generated dialogue information to the robot terminal 200 and receiving the dialogue information being executed from the robot terminal 200.

The monitoring unit 140 is a device for displaying the dialogue information received through the manager communication unit 130. The monitoring will be described later.

The robot terminal 200 may include a receiving unit 210, a database unit 220, a user's degree extracting unit 230, a task setting unit 240, and a task executing unit 250.

The receiving unit 210 is a device for receiving dialogue information from the administrator terminal 100 and user information from the user terminal. Here, the user information means personal information about the user, such as the user's address, vehicle, and mobile phone information. The receiving unit 210 may use short-range wireless communication, a mobile communication network, or wired communication.

The database unit 220 is a device for storing dialog information and user information.

The user's degree extracting unit 230 is a device for estimating a budget line from the received user information in order to identify the consumable area. Two factors are used to deduce the budget line: the economic level and the purpose of the visit. The economic level is determined from the received user information (housing type, type of mobile phone, presence of a vehicle). Bayesian probability can be used as a way of estimating the budget line.

First, Bayes' theorem describes the relationship between the conditional and marginal probabilities of two random variables A and B, and can be expressed as Equation (1).

[Equation 1]

P(A|B) = P(B|A) P(A) / P(B)

Here, P(A) is the prior (or marginal) probability of A, considered without regard to event B. P(A|B) is the conditional probability of A given B, and is called the posterior probability. P(B) acts as a normalizing constant.
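Equation (1) can be checked with a small numeric example; all probability values below are illustrative, not from the patent.

```python
# Numeric check of Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).
# All probabilities are made-up illustrative values.

p_a = 0.3          # prior P(A)
p_b_given_a = 0.8  # likelihood P(B|A)
p_b_given_not_a = 0.2

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # = 0.38

# Posterior P(A|B), the normalizing constant P(B) in the denominator
p_a_given_b = p_b_given_a * p_a / p_b  # 0.24 / 0.38 ≈ 0.6316
```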

The economic level can be determined using Equation (2) below.

[Equation 2]

P(E|car, phone, add) = P(car|E) P(phone|E) P(add|E) P(E) / (P(car) P(phone) P(add))

Here, car denotes the presence of a vehicle, phone the type of mobile phone, and add the residential (housing) type.

Therefore, combining the economic level and the purpose of the visit by the Bayesian method gives Equation (3) below.

[Equation 3]

P(B|E, V) = P(B, E, V) / P(E, V)

In Equation (3), B denotes the psychological budget available per person, E the economic level, and V the purpose of the visit. Equation (3) determines the posterior probability of the psychologically available budget given the economic level and the purpose of the visit.

In this way, the user's degree extracting unit 230 determines the economic level through Equation (2) and deduces the budget line through Equation (3).
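The Equation (2) style estimate can be sketched as a naive-Bayes computation. All prior and likelihood tables below are invented for illustration; the patent does not specify them, and the two-level split of E is an assumption.

```python
# Hedged sketch of the naive-Bayes economic-level estimate of Equation (2):
# P(E | car, phone, add) is proportional to P(car|E) P(phone|E) P(add|E) P(E).
# Every number below is an illustrative assumption.

PRIOR = {"high": 0.3, "low": 0.7}  # P(E)

LIKELIHOOD = {  # P(feature value | E)
    "car":   {"high": {"yes": 0.9, "no": 0.1},
              "low":  {"yes": 0.4, "no": 0.6}},
    "phone": {"high": {"premium": 0.7, "basic": 0.3},
              "low":  {"premium": 0.2, "basic": 0.8}},
    "add":   {"high": {"house": 0.6, "flat": 0.4},
              "low":  {"house": 0.2, "flat": 0.8}},
}

def economic_level(car, phone, add):
    """Return the normalized posterior P(E | car, phone, add) for each level E."""
    unnorm = {}
    for level, prior in PRIOR.items():
        unnorm[level] = (prior
                         * LIKELIHOOD["car"][level][car]
                         * LIKELIHOOD["phone"][level][phone]
                         * LIKELIHOOD["add"][level][add])
    # The sum over levels plays the role of the normalizing denominator.
    total = sum(unnorm.values())
    return {level: p / total for level, p in unnorm.items()}

posterior = economic_level(car="yes", phone="premium", add="house")
# posterior["high"] ≈ 0.91 for these illustrative tables
```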

The task setting unit 240 is a device for setting the presentation state to be executed according to the proximity between the budget line and preset reference information. In addition, the task setting unit 240 sets the presentation state corresponding to the dialogue information transmitted from the administrator terminal 100.

Since budget lines and indifference curves are used to determine the presentation state, they will be described with reference to FIG. 2 below.

Referring to FIG. 2, budget lines and indifference curves are plotted on the X, Y coordinates.

Indifference curves, also called utility curves, show the various combinations of two goods that yield the same satisfaction. The slope of the budget line is determined by the prices of goods X and Y, and the line rotates as the prices of the goods change.

The axioms of the indifference curve are as follows. If the following axioms are assumed to hold, a utility function u = u(x, y) can be constructed. With the quantity x of good X, the quantity y of good Y, and the utility value expressing the preference for their combination as the three axes, u = u(x, y) can be represented in three-dimensional space.

The indifference curves have the following characteristics.

Completeness: for any two presented combinations, the consumer can judge that one is preferred or that there is no difference between them. This means that the consumer has the rational ability to compare and rank any pair of combinations.

Transitivity: for three combinations of goods X, Y, and Z, if X is preferred to or indifferent to Y, and at the same time Y is preferred to or indifferent to Z, then X is preferred to or indifferent to Z.

Continuity: preferences change continuously, with no sudden jumps. When these three assumptions are met, the preference system can be represented by a continuous utility function. For analytical convenience, the axiom of strong monotonicity is also added.

Strong monotonicity means that, between two proposed combinations, if each item in one combination is at least as large as in the other and at least one item is strictly larger, then the larger combination is preferred. However, the axiom of strong monotonicity may not hold for undesirable objects such as garbage.

FIG. 2 is a graph showing a budget line and indifference curves. The indifference curves IC1, IC2, and IC3 indicate that the user obtains greater utility on higher-numbered curves. The budget line indicates the area of combinations of goods X and Y that the user can afford to consume. FIG. 2 thus shows that the user consumes at point C, obtaining the greatest satisfaction attainable on the given budget line. In this way, the budget line and the indifference curves can be used to find the point of maximum satisfaction and maximum consumption expenditure on the budget line.
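The maximum-satisfaction point on a budget line, like point C in FIG. 2, can be illustrated with a simple grid search. The Cobb-Douglas utility function and all prices below are assumptions for illustration; the patent does not specify a utility form.

```python
# Sketch: find the maximum-utility bundle on the budget line Px*x + Py*y = B
# by grid search, using an assumed Cobb-Douglas utility u = x**0.5 * y**0.5.

def best_bundle(px, py, budget, steps=1000):
    """Search points on the budget line and return (utility, x, y) at the best one."""
    best = (0.0, 0.0, 0.0)
    for i in range(steps + 1):
        x = budget / px * i / steps   # spend fraction i/steps of the budget on X
        y = (budget - px * x) / py    # the rest of the budget goes to Y
        u = (x ** 0.5) * (y ** 0.5)
        if u > best[0]:
            best = (u, x, y)
    return best

u, x, y = best_bundle(px=2.0, py=1.0, budget=10.0)
# For Cobb-Douglas with equal exponents the optimum splits the budget evenly:
# analytically x = B/(2*Px) = 2.5 and y = B/(2*Py) = 5.0.
```

The tangency of the highest reachable indifference curve with the budget line is exactly this maximizer.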

Hereinafter, a method for setting the presentation state of the robot terminal will be described with reference to FIG.

As shown in FIG. 3, the X and Y axes (quantities of goods), the yellow line (indifference curve), and the black line (budget line) are shown. The first square point (1, 1) is the position of the items currently being purchased through the interaction, and the second square point (0, 4) is the current minimum-cost consumption point. Accordingly, the first square point reflects the desire of the robot terminal 200, and the second square point reflects the human's desire.

Therefore, the task setting unit 240 determines the presentation state according to the values in Table 1 below.

Condition       | Criterion
Much more       | B - Px*x - Py*y > Px*1.5 + Py*1.5
More            | Px*0.3 + Py*0.3 < B - Px*x - Py*y < Px*1.5 + Py*1.5
Little more     | Px*0.1 + Py*0.1 < B - Px*x - Py*y < Px*0.3 + Py*0.3
Normal          | -(Px*0.1 + Py*0.1) < B - Px*x - Py*y < Px*0.1 + Py*0.1
Very very Good  | -(Px*0.3 + Py*0.3) < B - Px*x - Py*y < -(Px*0.1 + Py*0.1)
Very Good       | -(Px*1.5 + Py*1.5) < B - Px*x - Py*y < -(Px*0.3 + Py*0.3)
Good            | B - Px*x - Py*y < -(Px*1.5 + Py*1.5)

The formula for the budget line is defined as Px*x + Py*y = B, where x and y are the quantities of the goods, Px and Py their prices, and B the psychologically available budget.
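The decision in Table 1 amounts to bucketing the surplus B - Px*x - Py*y (the distance of the interaction point from the budget line) into a presentation state. The sketch below follows the reconstructed thresholds of the table; the function name and band boundaries are illustrative assumptions.

```python
# Sketch of the Table 1 decision: bucket the budget surplus into a
# presentation state. Threshold bands mirror the reconstructed Table 1.

def presentation_state(b, px, py, x, y):
    surplus = b - px * x - py * y          # distance from the budget line
    unit = lambda k: px * k + py * k       # threshold helper Px*k + Py*k
    if surplus > unit(1.5):
        return "much more"
    if surplus > unit(0.3):
        return "more"
    if surplus > unit(0.1):
        return "little more"
    if surplus > -unit(0.1):
        return "normal"
    if surplus > -unit(0.3):
        return "very very good"
    if surplus > -unit(1.5):
        return "very good"
    return "good"

# Example: budget 10, prices (2, 1), current bundle (1, 1) -> surplus 7
state = presentation_state(b=10.0, px=2.0, py=1.0, x=1.0, y=1.0)
```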

Referring again to FIG. 1, the task execution unit 250 is a device for extracting and executing dialogue information corresponding to a presentation state.

The task execution unit 250 may include an audio output unit 251 for outputting audio information corresponding to dialogue information and an operation unit 252 for operating predetermined motion corresponding to dialogue information.

Table 2 below is an example of the presentation states and dialogue types expressed through the task execution unit 250. Table 2 assumes that the specific domain is a restaurant.

Condition       | Expression behavior | Number of proposals | Dialogue type
Much more       | Dislike + surprise  | 4                   | Order proposal and promotion
More            | Dislike             | 2                   | Order proposal
Little more     | Dislike             | 1                   | Order proposal
Normal          | Stop                | 0                   | Business completion
Good            | Greeting            | 1                   | Thanks for the order result
Very Good       | Like                | 1                   | Thanks for the order result
Very very Good  | Like                | 1                   | Thanks for the order result

In this manner, when the presentation state is 'very very good', the task execution unit 250 expresses thanks for the order result: the operation unit 252 executes the motion corresponding to the predetermined 'like' expression, and the voice output unit 251 outputs a voice message thanking the user for the order result.

The number of proposals is also set according to the state. If the presentation state is 'much more' as shown in Table 2, the voice output unit 251 proposes a voice message corresponding to the order proposal and promotion four times, and the operation unit 252 repeats the motion corresponding to 'dislike + surprise' four times. In this way, the robot terminal 200 influences the user's decision making in consideration of the user's economic level and purpose of visit, thereby realizing effective interaction between the robot and the user.
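The Table 2 behavior can be sketched as a lookup from presentation state to an expression, a proposal count, and a dialogue type. The dictionary mirrors the table above; the function name and return format are illustrative assumptions.

```python
# Sketch of the Table 2 mapping: presentation state ->
# (expression behavior, number of proposals, dialogue type).

BEHAVIOR = {
    "much more":      ("dislike + surprise", 4, "order proposal and promotion"),
    "more":           ("dislike",            2, "order proposal"),
    "little more":    ("dislike",            1, "order proposal"),
    "normal":         ("stop",               0, "business completion"),
    "good":           ("greeting",           1, "thanks for the order result"),
    "very good":      ("like",               1, "thanks for the order result"),
    "very very good": ("like",               1, "thanks for the order result"),
}

def execute_task(state):
    """One motion + voice output per proposal; zero proposals means no output."""
    expression, proposals, dialogue = BEHAVIOR[state]
    return [f"{expression}: {dialogue}"] * proposals
```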

FIG. 4 is a diagram for explaining monitoring. In the left part of FIG. 4, the presentation state of the robot terminal is displayed through a budget line and an indifference curve. The current state of the dialogue information is displayed in another part of the figure. The dialogue information displayed on this screen is output based on the topic forest, and the administrator can check the current status of the user through the output dialogue information. Referring to FIG. 4, the dialogue information indicates steak, so the administrator can confirm that the user has selected the steak menu. In addition, the monitoring unit 140 may include an interface unit (not shown) through which the administrator directly operates the dialogue. Accordingly, the monitoring unit 140 may be configured as a touch panel, or may receive signals from external devices such as a keyboard and a mouse.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from its spirit or scope. The present invention is not limited to the embodiments shown in the drawings.

100: administrator terminal 110: ontology unit
120: dialogue generation unit 130: administrator communication unit
140: monitoring unit 200: robot terminal
210: receiving unit 220: database unit
230: user's degree extracting unit 240: task setting unit
250: task execution unit 251: voice output unit
252: operation unit

Claims (4)

1. A system for interaction between a robot and a user, comprising:
an administrator terminal (100) including an ontology unit (110) for setting a topic tree for a specific region on the basis of ontology modeling and generating a topic forest in which the set topic trees are grouped, and a dialogue generation unit (120) for generating at least one piece of dialogue information to be used in the generated topic forest; and
a robot terminal (200) including a receiving unit (210) for receiving the dialogue information from the administrator terminal (100) and at least one piece of user information from a user terminal, a user's degree extracting unit (230) for estimating a budget line from the user information and identifying the consumable area, a task setting unit (240) for setting a presentation state to be executed according to the proximity between predetermined reference information and the budget line, and a task execution unit (250) for executing the dialogue information corresponding to the set presentation state.

2. The system according to claim 1, wherein the administrator terminal (100) includes an administrator communication unit (130) for receiving the dialogue information being executed from the robot terminal (200), and a monitoring unit (140) for displaying the dialogue information received through the administrator communication unit (130).

3. The system according to claim 1, wherein the task execution unit (250) includes a voice output unit for outputting voice information corresponding to the extracted dialogue information.

4. The system according to claim 1, wherein the task execution unit (250) includes an operation unit for operating a predetermined motion corresponding to the extracted dialogue information.
KR1020140065973A 2014-05-30 2014-05-30 System for interaction between robot and its use KR20150137683A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140065973A KR20150137683A (en) 2014-05-30 2014-05-30 System for interaction between robot and its use

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140065973A KR20150137683A (en) 2014-05-30 2014-05-30 System for interaction between robot and its use

Publications (1)

Publication Number Publication Date
KR20150137683A true KR20150137683A (en) 2015-12-09

Family

ID=54873580

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140065973A KR20150137683A (en) 2014-05-30 2014-05-30 System for interaction between robot and its use

Country Status (1)

Country Link
KR (1) KR20150137683A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180058563A (en) 2016-11-24 2018-06-01 동아대학교 산학협력단 Module for moral decision making, robot comprising the same, and method for moral decision making


Similar Documents

Publication Publication Date Title
Svenningsson et al. Artificial intelligence in conversational agents: A study of factors related to perceived humanness in chatbots
US9063565B2 (en) Automated avatar creation and interaction in a virtual world
WO2018000259A1 (en) Method and system for generating robot interaction content, and robot
US10248721B2 (en) Management, evaluation and visualization method, system and user interface for discussions and assertions
WO2009076203A1 (en) System and methods for facilitating collaboration of a group
CN110249325A (en) Input system with traffic model
CN106982394A (en) The unread message reminding method and device of a kind of network direct broadcasting
US9299178B2 (en) Generation of animated gesture responses in a virtual world
Kim et al. Identifying affect elements based on a conceptual model of affect: A case study on a smartphone
US10068029B2 (en) Visualizing relationships in survey data
CN111512617B (en) Device and method for recommending contact information
CN104428804A (en) Method and apparatus for rating objects
CN106462255A (en) A method, system and robot for generating interactive content of robot
CN110209778A (en) A kind of method and relevant apparatus of dialogue generation
US20230273685A1 (en) Method and Arrangement for Handling Haptic Feedback
CN104866275A (en) Image information acquisition method and device
Kuo et al. Motion generation and virtual simulation in a digital environment
CN105095080A (en) Method and device for evaluating to-be-tested application
CN107976919A (en) A kind of Study of Intelligent Robot Control method, system and electronic equipment
CN106537293A (en) Method and system for generating robot interactive content, and robot
KR20150137683A (en) System for interaction between robot and its use
Noh et al. Virtual companion based mobile user interface: an intelligent and simplified mobile user interface for the elderly users
Phadnis et al. The work avatar face-off: Knowledge worker preferences for realism in meetings
Kjeldskov et al. Combining ethnography and object-orientation for mobile interaction design: Contextual richness and abstract models
CN106537425A (en) Method and system for generating robot interaction content, and robot

Legal Events

Date Code Title Description
N231 Notification of change of applicant
WITN Withdrawal due to no request for examination