KR20150137683A - System for interaction between robot and its use - Google Patents
System for interaction between robot and its use
- Publication number
- KR20150137683A (application KR1020140065973A)
- Authority
- KR
- South Korea
- Prior art keywords
- unit
- information
- user
- robot
- topic
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/06—Control stands, e.g. consoles, switchboards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/06—Safety devices
- B25J19/061—Safety devices with audible signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1669—Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The present invention relates to a system that enables efficient interaction between a robot and a user by influencing the user's decision making in consideration of the user's economic level and purpose of visit. By setting a specific goal to be achieved within the robot and performing operations to achieve that goal through interaction with a human, the system can increase sales and provide satisfaction to the user.
Description
The present invention relates to a system capable of interaction between a robot and a user, and more particularly, to a system that enables interaction between a robot and a user by influencing the user's decision making in consideration of the user's economic level and purpose of visit.
With the development of electronic technology, interest in robots is increasing. Robots that can be driven by signals input by a person have been created, and robots that perform work according to preset patterns or functions, such as robot cleaners, are emerging. In this way, robots perform tasks in place of human labor. In recent years, social robots and emotional robots have appeared, and interest in the field of interaction between robots and users is growing in particular.
In this regard, patent document 10-2013-0039578 (hereinafter the 'prior art document') discloses a technique that uses fusion of multi-modal sensor signals and a data-association technique to provide an intelligent robot configured to efficiently provide services in such an environment, a system for interacting with the intelligent robot, and a method for user interaction with the intelligent robot. The prior art document performs location tracking and behavior-pattern analysis for users, and the intelligent robot determines which users to interact with, so that it can efficiently provide services to various people.
As described above, the prior art can analyze users' behavior patterns through multiple sensors and stored data, but is limited in that the robot cannot interact with users on the basis of emotion.
The object of the present invention is to provide a system for interaction between a robot and a user in which the robot can perform tasks suited to the user in the course of the interaction.
It is another object of the present invention to provide a system for interaction with a robot that sets a specific goal to be achieved in the robot and performs tasks for achieving that goal through interaction with a human.
According to an aspect of the present invention, there is provided a system capable of interaction between a robot and a user, the system configured to set a topic tree for a specific region on the basis of ontology modeling, and a
The
The
The present invention has the effect of helping the user make decisions by enabling the robot to carry out tasks appropriate to the user in the course of interacting with humans.
The present invention also has the effect of increasing sales and providing satisfaction to the user by setting a specific goal to be achieved within the robot and performing tasks for achieving that goal through interaction with a human.
FIG. 1 is a block diagram of a system for interaction between a robot and a user according to the present invention.
FIG. 2 is a diagram for explaining a budget line and an indifference curve according to the present invention.
FIG. 3 is a diagram for explaining a method of setting the presentation state of a robot terminal according to the present invention.
FIG. 4 is a diagram for explaining an embodiment of monitoring according to the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of related art are omitted where it is determined that they would unnecessarily obscure the gist of the present invention.
Referring to FIG. 1, a system for interaction between a robot and a user according to the present invention may include an
1, the
The
First, an ontology is a conceptual, computer-processable model of what people see, hear, feel, and think about the world, arrived at through discussion with each other. Because an ontology represents a consensus of knowledge, it is not limited to any individual but is a concept on which the members of a group all agree. It must also be formally specified so that a program can understand it. In computer science and information science, an ontology is defined as a data model representing a specific domain: a set of formal vocabulary describing the concepts belonging to that domain and the relationships between them.
For example, if the specific domain for which the ontology is generated is a restaurant, each topic used in a restaurant is set in a topic tree. If the top topic of a topic tree is 'order', sub-topics such as steak, salad, appetizer, and dessert can be linked under it, and the type of dressing can be linked as a sub-topic of salad. In another tree, if the top topic is 'seat', sub-topics such as window seat and non-smoking seat can be linked under it. The topic forest is then created by grouping the topic tree for ordering and the topic tree for seating.
The
For example, to generate dialog information for a topic tree in the topic forest, dialog information for the order topic may be generated, such as 'Please choose a menu item', and for the steak sub-topic, dialog information such as 'How would you like your steak cooked?' or 'How many would you like to order?' may be generated.
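The topic forest described above can be sketched as a simple nested data structure. The Python sketch below is illustrative only: the topic names and dialog strings are assumptions drawn from the restaurant example, not structures defined in the patent.

```python
# A hypothetical topic forest for the restaurant domain: each top-level
# key is the root of one topic tree, dicts link sub-topics, and lists
# hold the dialog information generated for a leaf topic.
topic_forest = {
    "order": {                      # top topic of the first topic tree
        "steak": ["How would you like your steak cooked?",
                  "How many would you like to order?"],
        "salad": {"dressing": ["Which dressing would you like?"]},
        "appetizer": [],
        "dessert": [],
    },
    "seat": {                       # top topic of the second topic tree
        "window": [],
        "non-smoking": [],
    },
}

def subtopics(forest, topic):
    """Return the immediate sub-topics linked under a given top topic."""
    node = forest.get(topic, {})
    return list(node) if isinstance(node, dict) else []

print(subtopics(topic_forest, "order"))
```

Grouping independent trees under one mapping mirrors the patent's description of a forest built from separate order and seat trees.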
The
The
The
The
The
The user's
First, Bayes' theorem describes the relationship between the conditional probabilities and marginal probabilities of two random variables A and B, and can be used as in Equation (1) below.
[Equation 1]
P(A│B) = P(B│A) P(A) / P(B)
Here, P(A) is the prior or marginal probability of A, taking no account of event B. P(A│B) is the conditional probability of A given B and is called the posterior probability. P(B) serves as a normalizing constant.
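As a minimal illustration of Equation (1), the following sketch computes the posterior from assumed prior, likelihood, and evidence values; the numbers are invented for the example and do not come from the patent.

```python
def posterior(p_b_given_a, p_a, p_b):
    """Bayes' rule, Equation (1): P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Assumed example values: P(B|A) = 0.8, P(A) = 0.3, P(B) = 0.4
print(posterior(0.8, 0.3, 0.4))  # -> 0.6
```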
The economic level can be determined using Equation (2) below.
[Equation 2]
P(E│car, phone, add) = (P(car│E) P(phone│E) P(add│E) P(E)) / (P(car) P(phone) P(add))
Here, car denotes car ownership, phone denotes the cell phone, and add denotes the residence type.
Therefore, the Bayesian formulation combining the economic level and the visit purpose is as follows.
[Equation 3]
P(B│E, V) = P(B, E, V) / P(E, V)
In Equation (3), B represents the psychological budget available per person, E represents the economic level, and V represents the purpose of the visit. Equation (3) determines the posterior probability of the psychologically available budget given the economic level and visit conditions.
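Equation (2) amounts to a naive-Bayes style posterior over the user's economic level given the observed attributes. The sketch below assumes conditional independence of the attributes given E; all attribute names, levels, and probability tables are invented for illustration and are not values from the patent.

```python
def economic_level_posterior(priors, likelihoods, observed):
    """Return P(E | observed attributes) for each economic level E,
    per Equation (2) under a naive-Bayes independence assumption."""
    scores = {}
    for level, p_e in priors.items():
        score = p_e
        for attr, value in observed.items():
            score *= likelihoods[level][attr][value]
        scores[level] = score
    # Normalising by the total plays the role of P(car)P(phone)P(add).
    total = sum(scores.values())
    return {e: s / total for e, s in scores.items()}

# Hypothetical probability tables (assumed, not from the patent):
priors = {"high": 0.3, "low": 0.7}
likelihoods = {
    "high": {"car":   {"yes": 0.9, "no": 0.1},
             "phone": {"premium": 0.7, "basic": 0.3},
             "add":   {"apartment": 0.6, "other": 0.4}},
    "low":  {"car":   {"yes": 0.3, "no": 0.7},
             "phone": {"premium": 0.2, "basic": 0.8},
             "add":   {"apartment": 0.4, "other": 0.6}},
}
observed = {"car": "yes", "phone": "premium", "add": "apartment"}
print(economic_level_posterior(priors, likelihoods, observed))
```

With these assumed tables, a user with a car, a premium phone, and an apartment is estimated to be at the high economic level with high posterior probability.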
In this way, the user's
The
Since budget lines and indifference curves are used to determine the presentation state, they are described below with reference to FIG. 2.
Referring to FIG. 2, budget lines and indifference curves are plotted on the X, Y coordinates.
An indifference curve, also called a utility curve, shows the various combinations of two goods that yield the same satisfaction. The slope of the budget line is determined by the prices of goods X and Y, and the line rotates as the price of a good changes.
The axioms of the indifference curve are as follows. If the following axioms are set and assumed true, a utility function u = u(x, y) can be constructed. With the quantity of good x, the quantity of good y, and the utility value expressing the preference for each combination of the goods as axes, u = u(x, y) can be represented in three-dimensional space.
The indifference curves have the following characteristics.
Completeness: for any two presented combinations, the consumer can judge that one is preferred or that there is no difference between them. This means the consumer has the rational ability to make a judgment when comparing any pair of combinations.
Transitivity: for combinations X, Y, and Z, if X is preferred at least as much as Y, and at the same time Y is preferred at least as much as Z, then X is preferred at least as much as Z.
Continuity: preferences change continuously, with no sudden jumps. When these three assumptions are met, the preference system admits a continuous utility function. For analytical convenience, the axiom of strong monotonicity is also added.
Strong monotonicity means that, between two proposed combinations, if each item in one combination is at least as plentiful as in the other and at least one item is more plentiful, that combination is preferred. However, the strong monotonicity axiom may not hold for undesirable objects such as garbage.
FIG. 2 is a graph showing the budget line and indifference curves. The indifference curves IC1, IC2, and IC3 indicate that the user obtains greater utility on higher-numbered curves. The budget line indicates the region of combinations of goods X and Y that the user can consume. FIG. 2 therefore shows that the user consumes at point C and obtains the greatest satisfaction attainable on the given budget line. In this way, the budget line and the indifference curves can be used to find the point of maximum satisfaction and maximum consumption expenditure on the budget line.
Hereinafter, a method for setting the presentation state of the robot terminal will be described with reference to FIG. 3.
In FIG. 3, the X and Y axes (quantities of goods), the yellow line (indifference curve), and the black line (budget line) are shown. The first square point (1 x 1) is the position of the item being purchased through the interaction, and the second square point (0 x 4) is the current minimum-cost point of consumption. Accordingly, the first square point reflects the desire of the
Therefore, the
The formula for the budget line is defined as Px*x + Py*y = B, where x and y are the quantities of the goods, Px and Py their prices, and B is the psychologically available budget.
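Given the budget-line formula Px*x + Py*y = B, the maximum-satisfaction bundle can be found by searching along the budget line. The sketch below assumes a Cobb-Douglas utility u(x, y) = x^a * y^(1-a) as a stand-in for the indifference curves; the patent does not specify a particular utility function, so both the utility form and the numbers are illustrative.

```python
def best_bundle(px, py, budget, a=0.5, steps=1000):
    """Search bundles on the budget line Px*x + Py*y = B for the
    highest (assumed Cobb-Douglas) utility; returns (u, x, y)."""
    best = (0.0, 0.0, 0.0)
    for i in range(steps + 1):
        x = budget / px * i / steps   # spend fraction i/steps on good X
        y = (budget - px * x) / py    # remainder of the budget on good Y
        u = (x ** a) * (y ** (1 - a))
        if u > best[0]:
            best = (u, x, y)
    return best

u, x, y = best_bundle(px=2.0, py=1.0, budget=10.0)
print(round(x, 2), round(y, 2))
```

With Px = 2, Py = 1, B = 10, and a = 0.5, the search settles at the analytic Cobb-Douglas optimum x = a*B/Px = 2.5, y = (1-a)*B/Py = 5, i.e. the tangency point of the highest reachable indifference curve and the budget line (point C in FIG. 2).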
Referring again to FIG. 1, the
The
Table 2 below is an example of the presentation states and dialogue types expressed through the
In this manner, when the presentation state is very very good, the
Also, the number of suggestions is set according to the state. If the presentation state is much more as shown in Table 2, the voice output unit 251 suggests a voice message corresponding to the order proposal and the promotion four times, and the
FIG. 4 is a diagram for explaining monitoring. In the left part of FIG. 4, the presentation state of the robot terminal is displayed through a budget line and an indifference curve. The current state of the dialogue information is displayed on the right side of the drawing. The dialogue information displayed on this screen is output based on the topic forest, and the administrator can check the current status of the user through the output dialogue information. Referring to FIG. 4, the dialogue information indicates steak, so the administrator can confirm that the user has selected the steak menu. In addition, the
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. The present invention is not limited to the drawings.
100: administrator terminal 110: ontology section
120: Dialog generation unit 130: Manager communication unit
140: Monitoring unit 200: Robot terminal
210: Receiving unit 220:
230: user's degree extracting unit 240: job setting unit
250: Task execution unit 251: Voice output unit
252:
Claims (4)
A receiving unit 210 for receiving dialog information from the manager terminal 100 and at least one user information from the user terminal; A user's estimating unit for estimating a budget line for the user information and identifying a consumable area; A task setting unit (240) for setting a presentation state to be executed according to a proximity of the predetermined reference information and the budget line; And a task execution unit (250) for executing the dialog information corresponding to the set expression state.
And a voice output unit for outputting voice information corresponding to the extracted dialogue information, wherein the task execution unit (250) is capable of interacting with a user.
Wherein the task execution unit (250) includes an operation unit for operating a predetermined motion corresponding to the extracted dialogue information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140065973A KR20150137683A (en) | 2014-05-30 | 2014-05-30 | System for interaction between robot and its use |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140065973A KR20150137683A (en) | 2014-05-30 | 2014-05-30 | System for interaction between robot and its use |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150137683A true KR20150137683A (en) | 2015-12-09 |
Family
ID=54873580
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140065973A KR20150137683A (en) | 2014-05-30 | 2014-05-30 | System for interaction between robot and its use |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150137683A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180058563A (en) | 2016-11-24 | 2018-06-01 | 동아대학교 산학협력단 | Module for moral decision making, robot comprising the same, and method for moral decision making |
-
2014
- 2014-05-30 KR KR1020140065973A patent/KR20150137683A/en not_active Application Discontinuation
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Svenningsson et al. | Artificial intelligence in conversational agents: A study of factors related to perceived humanness in chatbots | |
US9063565B2 (en) | Automated avatar creation and interaction in a virtual world | |
WO2018000259A1 (en) | Method and system for generating robot interaction content, and robot | |
US10248721B2 (en) | Management, evaluation and visualization method, system and user interface for discussions and assertions | |
WO2009076203A1 (en) | System and methods for facilitating collaboration of a group | |
CN110249325A (en) | Input system with traffic model | |
CN106982394A (en) | The unread message reminding method and device of a kind of network direct broadcasting | |
US9299178B2 (en) | Generation of animated gesture responses in a virtual world | |
Kim et al. | Identifying affect elements based on a conceptual model of affect: A case study on a smartphone | |
US10068029B2 (en) | Visualizing relationships in survey data | |
CN111512617B (en) | Device and method for recommending contact information | |
CN104428804A (en) | Method and apparatus for rating objects | |
CN106462255A (en) | A method, system and robot for generating interactive content of robot | |
CN110209778A (en) | A kind of method and relevant apparatus of dialogue generation | |
US20230273685A1 (en) | Method and Arrangement for Handling Haptic Feedback | |
CN104866275A (en) | Image information acquisition method and device | |
Kuo et al. | Motion generation and virtual simulation in a digital environment | |
CN105095080A (en) | Method and device for evaluating to-be-tested application | |
CN107976919A (en) | A kind of Study of Intelligent Robot Control method, system and electronic equipment | |
CN106537293A (en) | Method and system for generating robot interactive content, and robot | |
KR20150137683A (en) | System for interaction between robot and its use | |
Noh et al. | Virtual companion based mobile user interface: an intelligent and simplified mobile user interface for the elderly users | |
Phadnis et al. | The work avatar face-off: Knowledge worker preferences for realism in meetings | |
Kjeldskov et al. | Combining ethnography and object-orientation for mobile interaction design: Contextual richness and abstract models | |
CN106537425A (en) | Method and system for generating robot interaction content, and robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
N231 | Notification of change of applicant | ||
WITN | Withdrawal due to no request for examination |