KR20130064270A - Hri integrated framework apparatus and method for providing hri service using the same - Google Patents


Info

Publication number
KR20130064270A
KR20130064270A (application number KR1020110130803A)
Authority
KR
South Korea
Prior art keywords
user
service
unit
information
robot interaction
Prior art date
Application number
KR1020110130803A
Other languages
Korean (ko)
Inventor
김도형
이재연
윤호섭
지수영
김계경
김혜진
윤우한
윤영우
김재홍
손주찬
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to KR1020110130803A
Publication of KR20130064270A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Manipulator (AREA)

Abstract

PURPOSE: An HRI (Human Robot Interaction) integrated framework device and an HRI service providing method using the same are provided to supply satisfactory recognition results by continuously and permanently monitoring otherwise volatile recognition results, in addition to integrating individual recognition components. CONSTITUTION: A tracking unit (110) detects user information through sensors and tracks the user's state based on the detected information. An ID recognizing unit (120) recognizes the user's ID based on the detected information. A behavior recognizing unit (130) recognizes the user's gesture based on the detected information. A multi integrating unit (140) integrates the user's state, ID, and gesture, and obtains HRI service information based on the integrated information. An interlinking unit (150) receives an HRI service request from a service application and provides the HRI service to the user through the application. [Reference numerals] (110) Tracking unit; (120) ID recognizing unit; (130) Behavior recognizing unit; (140) Multi integrating unit; (145) Database; (150) Interlinking unit; (210) Camera; (220) Microphone

Description

HRI INTEGRATED FRAMEWORK APPARATUS AND METHOD FOR PROVIDING HRI SERVICE USING THE SAME

The present invention relates to a human robot interaction (HRI) integrated framework device and a method for providing a human robot interaction service using the same. More specifically, it relates to a human robot interaction integrated framework device that enables various types of human robot interaction services by fusing information from various sensors to provide continuous user information on who the user is, where the user is, and what the user is doing, and to a method for providing a human robot interaction service using the device.

Human-Robot Interaction (also referred to as "HRI") technology enables robots to communicate and interact naturally with humans, just as humans interact with each other.

Intelligent service robots need to be able to interact with humans in a human-like way in order to provide necessary services while living with humans in everyday environments. HRI technology, which addresses the human-robot interaction problem, is therefore the first technology that must be solved before intelligent service robots can be put to practical use, and it is key to the success of robot industrialization.

HRI technology is the core intelligent-robot technology for designing and implementing robotic systems and interaction environments that allow humans and robots to interact cognitively and emotionally through various communication channels. It differs from human-computer interaction (HCI) in, for example, the variety of interactions and the level of control involved.

Because R&D in the HRI field has mainly focused on improving the performance of individual element technologies, it has not led to improved HRI capability in robots, which requires optimized coupling between those element technologies.

Interaction between a robot and a user takes place continuously, but recognition by conventional individual user recognition components happens only once at a time. A service application that wants to provide a service using these components therefore cannot exploit all of the useful information available, and the recognition performance provided is unsatisfactory. In addition, for certain applications it is necessary to improve performance by introducing approaches beyond video and audio, which are highly dependent on the environment.

An object of the present invention is to provide a human robot interaction integrated framework apparatus that enables various types of human robot interaction services by fusing information from various sensors to supply continuous user information on who the user is, where the user is, and what the user is doing, together with a method for providing a human robot interaction service using the same.

A human robot interaction integrated framework device according to an embodiment of the present invention for solving the above problems includes:

a tracking unit that detects a user's information using specific sensors and tracks the user's state based on the detected user information; an ID recognition unit that recognizes the user's ID based on the detected user information; a behavior recognition unit that recognizes the user's gesture based on the detected user information; a multi integration unit that integrates the user's state, ID, and gesture, and obtains information on the human robot interaction service based on the integrated result; and an interworking unit that receives a request for a human robot interaction service from a service application and provides the corresponding service to the user through the service application.

According to an embodiment of the present invention, the human robot interaction integrated framework device and the method for providing a human robot interaction service using the same not only integrate individual recognition components, but also continuously and persistently monitor recognition results that were conventionally one-off and volatile, and can therefore provide recognition results that satisfy real users.

In addition, the present invention may not only carry out a user's commands, but may also provide a foundation for an optimal service by letting the robot assess the situation on its own.

FIG. 1 is a block diagram schematically illustrating an environment for providing a human robot interaction service using a human robot interaction integrated framework device according to an embodiment of the present invention.
FIG. 2 is a block diagram showing a human robot interaction integrated framework device according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method for providing a human robot interaction service according to an embodiment of the present invention.

The present invention will now be described in detail with reference to the accompanying drawings. Repeated descriptions, and detailed descriptions of known functions and configurations that may obscure the gist of the present invention, are omitted below. Embodiments of the present invention are provided to describe the present invention more fully to those skilled in the art. Accordingly, the shapes and sizes of elements in the drawings may be exaggerated for clarity.

First, in the related art, a service application that intended to provide a human robot interaction service directly drove a user recognition engine at a specific point in time, received image and voice data from a robot, analyzed them, and used the analysis result to provide the human robot interaction service. The service application developer had to work hard to keep the user recognition engine at optimal performance, and running the user recognition engine alongside the service application inevitably consumed computing resources.

In contrast, with the human robot interaction integrated framework device according to an embodiment of the present invention, a service application can receive information about a specific user merely by registering the corresponding event, without the burden of performing user recognition directly.

Whereas a user application has no choice but to run the recognition engine at a specific point in time to provide a human robot interaction service, the human robot interaction integrated framework device according to the embodiment of the present invention performs recognition in advance, whenever conditions are favorable for recognition and independently of the application's requests, and can therefore provide highly reliable user information. In addition, it can provide high recognition performance by fusing multiple recognizers and their recognition results.

Hereinafter, a human robot interaction integrated framework device and a method for providing a human robot interaction service using the same according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram schematically illustrating an environment for providing a human robot interaction service using a human robot interaction integrated framework device according to an embodiment of the present invention.

Referring to FIG. 1, an environment for providing a human robot interaction service according to an embodiment of the present invention includes a human robot interaction integrated framework device 100, a robot 10, and a service application 20. Here, the human robot interaction integrated framework device 100, which performs the human robot interaction, is also referred to as "HRIDemon".

The human robot interaction integrated framework device 100 continuously receives video and audio data from the robot 10, analyzes them, and continuously accumulates the analysis results. Here, the analysis results include information, derived from the video and audio data, about who is doing what and where.

In addition, the human robot interaction integrated framework device 100 provides a human robot interaction service corresponding to the request of the service application 20 by using the accumulated analysis result.

The service application 20 may register a specific event with the human robot interaction integrated framework device 100, or may request specific recognition information and services from it.

For example, the service application 20 registers with the human robot interaction integrated framework device 100 a notification event for notifying it when the user's father appears. The human robot interaction integrated framework device 100 then analyzes the video and audio data continuously received from the robot 10, and when the analysis result corresponds to the notification event, delivers a notification to the service application 20.

In addition, the service application 20 may request a specific service based on the user recognition information, for example "come to me" or "follow me". The human robot interaction framework device 100 then performs the specific service immediately when the requested service corresponds to a registered event.
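The event-registration flow just described (register a notification event, then receive a callback and request a service) can be sketched in Python. The class and method names below (`HRIDemon`, `register_event`, `request_service`) are illustrative assumptions; the patent does not specify an API.

```python
# Hypothetical sketch of the event/service interaction described above.
# All names here are assumptions, not the patent's actual interface.

class HRIDemon:
    """Minimal stand-in for the HRI integrated framework device (100)."""

    def __init__(self):
        self._events = {}  # event name -> callback of the service application

    def register_event(self, name, callback):
        self._events[name] = callback

    def on_analysis_result(self, result):
        # Called as video/audio analysis results accumulate; fires any
        # registered event that matches the analysis result.
        callback = self._events.get(result.get("event"))
        if callback:
            callback(result)

    def request_service(self, service, user_info):
        # Performs a requested service such as "come to me" immediately.
        return f"executing '{service}' for {user_info['user']}"


demon = HRIDemon()
notifications = []

# The service application (20) registers a notification event for when
# the user's father appears, as in the example above.
demon.register_event("father_appeared", notifications.append)

# The framework's continuous analysis later produces a matching result,
# so the registered callback is invoked.
demon.on_analysis_result({"event": "father_appeared", "user": "father"})

print(notifications)
print(demon.request_service("come to me", {"user": "father"}))
```

The callback-based design mirrors the patent's point that the application never drives recognition itself; it only reacts to events the framework raises.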

Next, the human robot interaction integrated framework device 100 will be described in detail with reference to FIG. 2.

FIG. 2 is a block diagram showing a human robot interaction integrated framework device according to an embodiment of the present invention.

First, the human robot interaction integrated framework device 100 according to an embodiment of the present invention may take the form of a single executing process in which the components responsible for human robot interaction are combined, but it is not limited thereto.

Referring to FIG. 2, the human robot interaction integrated framework device 100 includes a camera 210, a microphone 220, a tracking unit 110, an ID recognition unit 120, a behavior recognition unit 130, a multi integration unit 140, a database 145, and an interworking unit 150. Although the camera 210 and the microphone 220 are illustrated as located inside the human robot interaction integrated framework device 100, they may be located outside it, for example on the robot 10. In addition, the human robot interaction integrated framework device 100 may use any of various types of sensors capable of recognizing a user, not only the camera 210 and the microphone 220.

The tracking unit 110 detects user information, such as the user's face, through various types of sensors such as the camera 210 and the microphone 220, and tracks the user's state based on the detected user information.

The ID recognition unit 120 recognizes the user's ID based on the user's face and voice information detected by the camera 210 and the microphone 220 in the tracking unit 110.

The behavior recognition unit 130 recognizes the user's gesture based on the user information detected by the tracking unit 110.

The multi integration unit 140 integrates the results tracked and acquired by the tracking unit 110, the ID recognition unit 120, and the behavior recognition unit 130, and based on the integrated result obtains human robot interaction service information about who the user is, where the user is, and what the user is doing.

The database 145 stores the human robot interaction service information obtained from the multi integration unit 140.

The interworking unit 150 operates in conjunction with the service application 20 to receive a human robot interaction service request and to allow the service application 20 to provide the corresponding human robot interaction service to the user.

As such, the human robot interaction framework device 100 described above, composed of components for human robot interaction such as the tracking unit 110, ID recognition unit 120, behavior recognition unit 130, multi integration unit 140, database 145, and interworking unit 150, is just one design example of HRIDemon. In other words, HRIDemon can be designed differently depending on what kind of human robot interaction service is to be provided.
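As a rough illustration of the FIG. 2 pipeline, the sketch below fuses the outputs of hypothetical tracking, ID recognition, and behavior recognition functions into a single who/where/what record and accumulates it, in the way the multi integration unit (140) and database (145) are described as doing. The function names and data shapes are assumptions for illustration only.

```python
# Illustrative sketch of the FIG. 2 pipeline: tracking (110), ID
# recognition (120), and behavior recognition (130) results are fused
# by the multi integration unit (140) and stored in the database (145).
# All names and record fields below are assumptions.

def track(sensor_data):
    return {"state": sensor_data["position"]}       # where the user is

def recognize_id(sensor_data):
    return {"id": sensor_data["face"]}              # who the user is

def recognize_behavior(sensor_data):
    return {"gesture": sensor_data["motion"]}       # what the user does

def multi_integrate(sensor_data, database):
    # Fuse the three recognizers' outputs into one who/where/what record.
    record = {}
    record.update(track(sensor_data))
    record.update(recognize_id(sensor_data))
    record.update(recognize_behavior(sensor_data))
    database.append(record)  # accumulated continuously, not volatile
    return record

database = []
result = multi_integrate(
    {"position": "living room", "face": "father", "motion": "waving"},
    database,
)
print(result)
```

Appending each fused record to the database, rather than discarding it after a single query, is what lets later service requests draw on accumulated rather than one-off recognition results.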

Next, a method of providing a human robot interaction service using a human robot interaction integrated framework device will be described in detail with reference to FIG. 3.

FIG. 3 is a flowchart illustrating a method for providing a human robot interaction service according to an embodiment of the present invention.

First, for the service application 20 to interwork with the human robot interaction integrated framework device 100, specific events and protocols are required. For example, in the present invention these may be classified into event, query, and request, but are not limited thereto.

(1) Event: The service application 20 requests the human robot interaction integrated framework device 100 to register an event, and when the registered event occurs, the human robot interaction integrated framework device 100 calls the callback function set by the service application 20.

(2) Query: The service application 20 requests user related information from the human robot interaction integrated framework device 100.

(3) Request: The service application 20 requests a specific service from the human robot interaction integrated framework device 100.
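The three interworking message types above might be dispatched as follows. This is a minimal sketch under stated assumptions: the patent only names the three categories, so the message format and handler names are invented for illustration.

```python
# Sketch of dispatching the three interworking message types (event,
# query, request) named above. Message and handler shapes are assumed.

from enum import Enum

class MessageType(Enum):
    EVENT = "event"      # register an event; framework calls back on occurrence
    QUERY = "query"      # ask for user-related information
    REQUEST = "request"  # ask the framework to perform a specific service

def dispatch(message, framework):
    if message["type"] is MessageType.EVENT:
        return framework["register"](message["name"], message["callback"])
    if message["type"] is MessageType.QUERY:
        return framework["query"](message["name"])
    if message["type"] is MessageType.REQUEST:
        return framework["request"](message["name"])
    raise ValueError(f"unknown message type: {message['type']}")

# Toy framework backend, standing in for device 100.
framework = {
    "register": lambda name, cb: f"registered {name}",
    "query": lambda name: {"user": name, "location": "unknown"},
    "request": lambda name: f"performing {name}",
}

print(dispatch({"type": MessageType.QUERY, "name": "father"}, framework))
```

Keeping the three categories as an explicit enumeration makes it easy to extend the protocol later, consistent with the text's note that the classification is not limited to these three.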

Referring to FIG. 3, the human robot interaction framework device 100 receives, through the robot 10, the results detected by the various sensors included in the robot 10 (the sensor detection results) (S300). Here, the various sensors may be any sensors capable of recognizing the user, not only the camera 210 and the microphone 220. The human robot interaction integrated framework device 100 then analyzes the received sensor detection results, for example video and audio data, and continuously accumulates the analysis results. The analysis results include information, derived from the video and audio data, about who is doing what and where.

The service application 20 requests the human robot interaction integrated framework device 100 to register the event it wants to be notified of (S310).

The human robot interaction integrated framework device 100 detects occurrence of the registered event based on the analysis results of the sensor detection results received in step S300 (S320).

When the registered event is detected in step S320, the human robot interaction framework device 100 calls the callback function set by the service application 20 (S330).

After the callback function is called, the service application 20 requests user-related information, that is, user information, from the human robot interaction integrated framework device 100 (S340). The service application 20 then requests a specific service corresponding to the user information (S350).

The human robot interaction integrated framework device 100 enables the service application 20 to provide the user with the human robot interaction service corresponding to its request (S360).
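The steps S300 to S360 of FIG. 3 can be walked through in a short sketch. The class and method names are illustrative assumptions; the step comments map each call to the corresponding step in the flowchart.

```python
# Hedged sketch of the FIG. 3 flow (S300-S360); all names are assumed.

class Framework:
    """Stand-in for the HRI integrated framework device (100)."""

    def __init__(self):
        self.results = []   # S300: continuously accumulated analysis results
        self.events = {}    # registered events -> application callbacks

    def analyze(self, sensor_data):               # S300
        self.results.append(sensor_data)

    def register_event(self, name, callback):     # S310
        self.events[name] = callback

    def detect_and_notify(self):                  # S320-S330
        # Detect registered events in the accumulated results and call
        # the application's callback for each match.
        for result in self.results:
            callback = self.events.get(result["event"])
            if callback:
                callback(result)

    def query_user_info(self):                    # S340
        return self.results[-1]

    def perform(self, service, user_info):        # S350-S360
        return f"{service} for {user_info['who']}"


log = []
fw = Framework()
fw.analyze({"event": "user_appeared", "who": "father"})   # S300
fw.register_event("user_appeared", log.append)            # S310
fw.detect_and_notify()                                    # S320-S330
info = fw.query_user_info()                               # S340
print(fw.perform("follow me", info))                      # S350-S360
```

Note that analysis (S300) runs before and independently of the application's registration (S310), matching the text's point that recognition is performed in advance rather than on demand.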

As such, the human robot interaction integrated framework device and the method for providing a human robot interaction service using the same according to an exemplary embodiment of the present invention fuse information obtained from various sensors to provide continuous user information on who the user is, where the user is, and what the user is doing, and can thereby support various types of human robot interaction services.

As described above, the best embodiments have been disclosed in the drawings and the specification. Although specific terms have been employed herein, they are used for purposes of illustration only and are not intended to limit the scope of the invention defined in the claims. Those skilled in the art will therefore understand that various modifications and equivalent embodiments are possible. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

10: Robot; 20: Service application
100: Human robot interaction integrated framework device
210: Camera; 220: Microphone
110: Tracking unit; 120: ID recognition unit
130: Behavior recognition unit; 140: Multi integration unit
145: Database; 150: Interworking unit

Claims (1)

A human robot interaction framework device comprising:
a tracking unit that detects a user's information using specific sensors and tracks the user's state based on the detected user information;
an ID recognition unit that recognizes the user's ID based on the detected user information;
a behavior recognition unit that recognizes the user's gesture based on the detected user information;
a multi integration unit that integrates the user's state, the user's ID, and the user's gesture, and obtains information on the human robot interaction service based on the integrated result; and
an interworking unit that receives a request for a human robot interaction service from a service application and provides the human robot interaction service corresponding to the request to the user through the service application.
KR1020110130803A 2011-12-08 2011-12-08 Hri integrated framework apparatus and method for providing hri service using the same KR20130064270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110130803A KR20130064270A (en) 2011-12-08 2011-12-08 Hri integrated framework apparatus and method for providing hri service using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110130803A KR20130064270A (en) 2011-12-08 2011-12-08 Hri integrated framework apparatus and method for providing hri service using the same

Publications (1)

Publication Number Publication Date
KR20130064270A true KR20130064270A (en) 2013-06-18

Family

ID=48861270

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110130803A KR20130064270A (en) 2011-12-08 2011-12-08 Hri integrated framework apparatus and method for providing hri service using the same

Country Status (1)

Country Link
KR (1) KR20130064270A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190076380A (en) * 2017-12-22 2019-07-02 한국전자통신연구원 Apparatus and method for detecting action based on multi-modal
KR20210071851A (en) * 2019-12-06 2021-06-16 주식회사 원더풀플랫폼 Helper system using helper robot


Similar Documents

Publication Publication Date Title
US9900685B2 (en) Creating an audio envelope based on angular information
TWI734406B (en) Method for organizing a network of nodes
US10528816B2 (en) System and method for retrieving and displaying supplemental information and pertinent data using augmented reality
CN103226436A (en) Man-machine interaction method and system of intelligent terminal
CN107170446A (en) Semantic processing server and method for semantic processing
CN103514075B (en) The method and apparatus that monitoring api function is called in the terminal
US20150103776A1 (en) Event driven anonymous device identifier generation
US20170305009A1 (en) Control system and control method
US11514600B2 (en) Information processing device and information processing method
CN105843503B (en) Using open method, device and terminal device
WO2015128954A1 (en) Device information providing system, and device information providing method
CN110825594B (en) Data reporting and issuing method, client and server
WO2019228236A1 (en) Method and apparatus for human-computer interaction in display device, and computer device and storage medium
US20240022633A1 (en) Method and system for providing web content in virtual reality environment
CN107195301A (en) Intelligent robot semantic processing method and device
US20160379071A1 (en) User Identification Method and Electronic Device
CN110784523B (en) Target object information pushing method and device
WO2012111252A1 (en) Information processing device
KR20130064270A (en) Hri integrated framework apparatus and method for providing hri service using the same
US20140218516A1 (en) Method and apparatus for recognizing human information
US8625468B2 (en) Systems and methods for granting feature control based on user location
CN106533798A (en) Detection method and device
KR102302029B1 (en) System for recogniting multiple input based on artificial intelligent
KR20110096372A (en) Method for providing user interface of terminal with projecting function
CN105159181A (en) Control method and device for intelligent equipment

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination