JP2007152443A - Clearing-away robot


Info

Publication number
JP2007152443A
Authority
JP
Japan
Prior art keywords
robot
article
user
tidying
person
Prior art date
2005-11-30
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2005347322A
Other languages
Japanese (ja)
Inventor
Ken Onishi
Shigetoshi Shiotani
Tetsuya Tomonaka
Original Assignee
Mitsubishi Heavy Ind Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2005-11-30
Filing date
2005-11-30
Publication date
2007-06-21
Application filed by Mitsubishi Heavy Ind Ltd
Priority to JP2005347322A
Publication of JP2007152443A
Application status: Withdrawn

Abstract

PROBLEM TO BE SOLVED: To provide a clearing-away robot that keeps arbitrary daily goods deposited at an arbitrary position and clears them away into an arbitrary vacant space, either according to a person's instruction or spontaneously.
SOLUTION: The voice, name and position of a person are input from a person condition measuring part 51 to a standby position determination part 54. The position of the clearing-away robot is input from a clearing-away robot condition measuring part 52 to the standby position determination part 54, and the traveling path of the clearing-away robot is input from an environment map 53. From these inputs, the standby position determination part 54 determines the standby position, and the clearing-away robot moves itself to that place. An instruction confirming part 55 interprets the person's spoken instruction to confirm the working instruction, and inputs a clearing-away command (a clearing-away instruction) or a fetching command (a return instruction) to an execution part 56.
COPYRIGHT: (C)2007,JPO&INPIT

Description

  The present invention relates to a tidying robot, and more particularly, to a tidying robot that can carry small items such as daily items in a general household, record the place where the item is tidied up, and respond when asked about the place.

As a technique close to the above-described tidying robot, there is, for example, the "conveyance support tidying robot" disclosed in Patent Document 1. This robot first receives the ID and room number of a care recipient from a remote monitoring operation unit, recognizes the meal tray bearing a recognition label in the delivery wagon, stores the tray in its transport wagon, and carries it to the room. It then holds a simple conversation with the care recipient and places the tray with the meal on the table. When the meal is finished, it detects an empty space in the delivery wagon and returns the meal tray to it. When the clearing-away work for the several people in one room is completed, the tidying robot moves to the next room. The tidying robot moves on four freely rotatable drive wheels, following a route plan while recognizing and avoiding obstacles on the traveling path.
JP-A-9-267276 (FIG. 1, paragraph 0011)

  However, the "conveyance support tidying robot" of Patent Document 1 delivers a specific object at a specific position, whereas a person who wants a tidying service in a general household is not at a specific position. There is also a demand for a service that clears away not only specific items such as tableware but all kinds of everyday articles, and it is troublesome to tag many daily necessities.

  Furthermore, there is a demand for a service in which a tidying robot provides memory support when a user forgets what was put away or where it was put away.

  Therefore, an object of the present invention is to enable a clearing robot to deposit an arbitrary everyday article at an arbitrary position in accordance with a person's instruction and clear it into an arbitrary empty space.

  Another object of the present invention is to enable a tidying-up robot to perform tidying-up / returning, etc., voluntarily based on the experience of a life support service performed by itself.

  Further, an object of the present invention is to make it possible to remember with the assistance of a tidying-up robot even when a person cannot remember the storage location of an object.

A first means for solving the above-described problem is a tidying robot that receives an article from a user and stores it in a storage place, comprising a camera that photographs the article, a hand that grips the article, a microphone that detects the user's speech, a speech recognition device that recognizes the contents of the user's utterances, a speaker with which the tidying robot speaks to the user, and an article management database that stores the user name, the place where the article was received and the place where it was stored, the dates of receipt and storage, an image of the article, and the article name; the robot detects the user's position, moves to the user's side and waits, recognizes the contents of the user's utterance, extends its hand to receive the article, photographs the article, recognizes the user's utterance of the user name and the article name, and stores the article in the nearest storage location.

  The second means is the first means which further recognizes the user's utterance of an article return request, searches the article management database, displays images of candidate articles to be returned to the user, and returns the article specified by the user to the user.

  The third means is that in the second means, the article return request is based on the ambiguous memory of the user.

  The fourth means is the first means further including a history database that stores, for each article, date/time and place, the frequency of storage or return; the tidying robot voluntarily utters through the speaker an offer to provide a service for any article, date/time or place whose frequency exceeds a predetermined threshold value.

  A fifth means is the fourth means further comprising a tidying-up robot schedule table for storing service offers in time order.

  The sixth means is the first means further comprising a message database that records the user's message about the storage date/time and storage location of an article put away by the user, and presents images of candidate articles in response to a message reproduction request uttered by the user.

  A seventh means is that in the sixth means, the message reproduction request is ambiguous.

  According to the first means, an arbitrary article can be cleared to the nearest storage place by requesting the cleaning robot.

  According to the second means, the cleared article can be returned based on a request by a human voice.

  According to the third means, even if the user is unsure where or when an article was deposited with the tidying robot, the tidying robot searches the article management DB and displays candidate articles, so the article can still be returned.

  According to the fourth means, for a frequently used tidying service, the tidying robot voluntarily provides the service at the dates, times and places of high use frequency, so the user can make use of frequently used articles with little effort.

  According to the fifth means, the life support service can be voluntarily performed according to the schedule.

  According to the sixth means, even when the user forgets where an article he or she put away was stored, the tidying robot can remind the user of it.

  According to the seventh means, even if the user's message reproduction request is ambiguous or the user forgets the storage location of the article that the user has cleaned up, the cleaning-up robot reminds the user.

  According to the present invention, the tidying robot can, in accordance with a person's instruction, take custody of any daily article at any position and put it away in any empty space. Moreover, even when the robot is asked to return an article and the person's memory of it is vague, a database search presents candidate images of the relevant articles to the person, so that the article can be identified and returned.

  Further, according to the present invention, the tidying-up robot can voluntarily offer the provision of a life support service with high provision frequency according to the schedule.

  In addition, according to the present invention, even when a person cannot remember the storage location of things that have been cleaned up by himself / herself, it can be remembered with the help of a cleaning robot.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings. However, the dimensions, materials, shapes, relative arrangements and the like of the components described in these embodiments, even where specifically stated, do not limit the scope of the present invention.

[Outline of clean-up robot]
FIG. 1 is a front view of an example of a tidying robot used in the present invention. On the head 2 of the tidying robot, a top camera 3 is mounted, and a pair of left and right front cameras 5 and a pair of microphones 4 are arranged on the lower part of the forehead at the position of human eyebrows. A pair of left and right speakers 7 are arranged on the chest 6, away from the vertical centre line. The elbows and shoulders of both arms 9 have a joint structure, and each hand 10 grips objects; the neck also has a joint structure. The lower body of this tidying robot is a cover shaped like a skirt 12 that houses the electric circuitry and the like, and the robot moves on the left and right wheels 14 at the bottom of the lower body. A pair of left and right obstacle sensors 13 (infrared or ultrasonic detection) on the upper part of the skirt lets the robot avoid obstacles automatically and move autonomously to its destination.

  FIG. 2 is a block diagram of the tidying robot. The outputs of the top camera 3, the front cameras 5, the microphones 4, the obstacle sensors 13, and the data input/output 20 are input to the CPU 21. The CPU 21 includes a self-position recognition unit 22, an instruction point recognition unit 23, a voice recognition unit 24, a neck/arm/hand motion recognition unit 25, a body direction recognition unit 26, and a past motion/position database 27, and drives the neck joint drive unit 28, the speaker 7, the wrist joint drive unit 29, the arm joint drive unit 30, and the vehicle drive unit 31. The communication abilities of this computer-controlled tidying robot are: the ability to detect a moving person or detect a face; a personal identification ability that detects facial features and identifies a predetermined number of users including one owner; speech recognition abilities such as word recognition, the continuous word recognition required for conversation scenarios, and recognition of the human voice using an acoustic echo canceller; and a speech synthesis ability whose settings, such as the text-to-speech method and loudness, can be changed. As for mobility, the robot is given the ability to search for and approach a person in order to support daily life. As for communication, it has a homepage dedicated to the tidying robot, through which family names can be registered, behaviour patterns changed, remote operations performed, and images viewed. For example, the robot can be made to come and meet a person at a specified time by remote control, or the home can be checked from a distance by viewing its video. Not only current video but also past video can be browsed using the memory of the CPU 21. The images include colour images obtained by the front cameras 5 and a monochrome panoramic image produced from the omnidirectional image obtained by the top camera 3. The tidying robot itself can also search web pages on the Internet and deliver the information to the family.

  The services provided by the tidying robot include a service in which the tidying robot takes custody of an article from a person (user) and puts it away, and a service in which the tidying robot returns a deposited article to the person.

[Outline of clean-up service]
FIG. 3 is a flowchart outlining the tidying service. First, in S30, the position of the user who requests tidying is detected. The tidying request is based on, for example, a user's utterance. The tidying robot that has recognized the utterance responds, for example, "What should I put away?", receives the user's answer, and confirms the article name. The robot also detects the user's face with the front camera and thereby detects the user's position. When a plurality of face images are detected, the user may be identified by checking the direction of the voice or confirming the user's clothes.

  Next, in S32, the robot moves to the identified user and waits. The waiting place is as close to the user as possible within the range in which an article can be handed over, and is a place where the article can be received easily without obstructing the passage of that user or other users.

  Next, in S34, when the person requests tidying by utterance, the tidying robot recognizes the speech, extends its hand to receive the article, and then photographs the article. Receiving and photographing may be done in the reverse order or simultaneously.

  Next, in S36, the received article is stored in the nearest storage location (such as a shelf). If the nearest storage location is not empty, the article is stored in the next closest one.

  Next, in S38, the image of the deposited article, the date and time, the place where it was received from the user, and the like are stored in the article management DB of the tidying robot. This completes the tidying by the tidying robot.
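  Steps S36 and S38 amount to picking the nearest storage location that still has space and then writing a custody record to the article management DB. The following is only an illustrative Python sketch of that logic; the function and field names (choose_storage, record_deposit, and so on) are assumptions made for the example, not terms used in the patent.

```python
import math
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StorageLocation:
    name: str          # e.g. "living-room shelf"
    position: tuple    # (x, y) coordinates on the environment map
    occupied: bool     # storage information: is something already there?

def choose_storage(robot_pos, locations):
    """S36: take the nearest storage location; if it is full, take the next closest."""
    for loc in sorted(locations, key=lambda l: math.dist(robot_pos, l.position)):
        if not loc.occupied:
            return loc
    return None  # no empty storage space was found

def record_deposit(article_db, user_name, article_name, image, receive_place, storage):
    """S38: store the article image, date/time, receiving place and storage place."""
    article_db.append({
        "user": user_name,
        "article": article_name,
        "image": image,
        "received_at": receive_place,   # (x, y) where the article was handed over
        "stored_in": storage.name,
        "datetime": datetime.now(),
    })
    storage.occupied = True             # the chosen storage space is now occupied
```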

[Outline of return service]
FIG. 4 is a flowchart of an overview of the return service. First, in S40, in response to a call from a person, the position of the person is detected.

  Next, in S42, the tidying-up robot moves near the person.

  Next, in S44, when the person makes a request with an ambiguous expression such as "Bring me the item I deposited the other day", the tidying robot extracts a plurality of candidate article images from the article management DB and displays them. More specifically, the date and time indicated in the person's utterance are collated with the storage dates and times recorded in the article management DB, and the candidate articles are displayed. The person selects the item he or she wants returned.

  Next, in S46, the tidying-up robot moves to the storage location, picks up the stored item, and gives it to the person.

  Finally, in S48, the tidying-up robot updates the article management DB and ends the return service.

[Embodiment 1 (cleaning-up / returning service for a clean-up robot based on human instructions)]
FIG. 5 is a block diagram of a tidying robot that executes a tidying / returning service based on a human instruction. From the person state measurement unit 51, the person's position, the person's name, and the person's voice are input to the standby position determination unit 54. Further, the position of the tidying robot is input from the tidying robot state measuring unit 52 to the standby position determining unit 54. Further, from the environment map 53, the travel route of the tidying robot is input to the standby position determination unit 54. Based on these inputs, the standby position determination unit 54 determines the standby position, and the tidying robot itself moves to that location. The instruction confirmation unit 55 interprets a person's voice command to confirm a work instruction from the person, and inputs a clearing command (cleaning command) or a fetching command (return command) to the execution unit 56.

  The execution unit 56 that has received a tidying instruction takes custody of the item, records the custody information, searches the storage management database for an available storage space, moves to the storage location along the travel route in the environment map 53, and puts the item into the storage space.

  FIG. 6 is an example of the environment map 53. The illustrated floor is divided into several sections (a veranda, a living room, a Japanese-style room, and so on), and the predetermined travel route of the tidying robot and the storage locations for articles are stored in the map. The standby location of the tidying robot is on the travel route, chosen as the point closest to the person within the area that faces the person's current position and from which an article can be handed over; it is also preferable to exclude areas that would obstruct people's passage.
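  As a rough illustration, the standby rule just described could be implemented as a search over the points of the stored travel route. This is only a sketch of one possible reading of the rule; the names used here (route_points, blocks_passage, HANDOVER_RANGE) are hypothetical and not part of the patent.

```python
import math

HANDOVER_RANGE = 0.8  # assumed radius (m) within which the hand can pass an article

def choose_standby_position(route_points, person_pos, blocks_passage):
    """Pick the point on the tidying robot's travel route that is closest to the
    person, preferring points within hand-over range and skipping points that
    would obstruct people's passage."""
    usable = [p for p in route_points if not blocks_passage(p)]
    if not usable:                      # fall back to the full route if needed
        usable = route_points
    near = [p for p in usable if math.dist(p, person_pos) <= HANDOVER_RANGE]
    candidates = near or usable
    return min(candidates, key=lambda p: math.dist(p, person_pos))
```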

  On the other hand, when a return instruction is received, the deposited items are looked up in the article management DB, images of the articles are displayed, and the person selects one. Next, the location of the selected item is retrieved from the article management DB, and the robot moves to that location along the travel route in the environment map 53 and picks the item up. It then returns to the waiting place and hands the item to the person.

  When tidying or returning is completed, the DB update unit 57 updates the custody information, or the return information and the storage information. Here, the custody information comprises the name of the person who deposited the item with the tidying robot, an image of the item, the item name, and the place and date and time at which the tidying robot took custody of it. The person's name is confirmed by face recognition; if the face cannot be recognized, the tidying robot asks for the name and memorizes it. The return information is the name of the person to whom the item was returned and the date and time of return. The storage information indicates whether or not an object is present in each storage space.

  FIG. 7 shows a specific example of the item confirmation process in the execution unit 56 when executing the return service (the process by which the user (person) confirms, from a plurality of candidates, the item to be returned). From the person's utterance, the instruction confirmation unit 55 recognizes the name 61 of the deposited item, the place 62 where it was deposited ("I deposited it around here"), and the date and time 63 when it was deposited ("about one year ago"). Candidates for the item to be retrieved are obtained by searching the article management DB for items matching the name 61. The candidates are then narrowed down to items deposited by that person, further narrowed to items whose deposit position 62 is close to the current position of the tidying robot, and then to items whose storage date and time are close to the stated date and time 63. A number is attached to the image of each article in the search result, and the person makes the selection by speaking the desired number.
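  The narrowing-down above can be read as a staged filter over the article management DB: match the spoken name, prefer the speaker's own deposits, then rank by how close the deposit place 62 is to the robot's current position and how close the storage date is to the stated date 63. The sketch below assumes the custody records are simple dicts with the field names shown; the way distance is weighted against days is a design choice for the example, not something the patent specifies.

```python
import math

def find_return_candidates(article_db, spoken_name, speaker, robot_pos,
                           spoken_datetime, max_candidates=5):
    """Return numbered candidate records for an ambiguous return request."""
    # 1. records whose article name matches the spoken name (61)
    hits = [r for r in article_db if spoken_name in r["article"]]
    # 2. prefer items deposited by the person who is asking
    own = [r for r in hits if r["user"] == speaker]
    if own:
        hits = own
    # 3. rank by distance from the deposit place (62) to the robot's current
    #    position, and by closeness of the storage date to the stated date (63)
    def score(r):
        dist = math.dist(r["received_at"], robot_pos)
        days = abs((r["datetime"] - spoken_datetime).days)
        return dist + days              # simple combined score (assumed weighting)
    hits.sort(key=score)
    # 4. number the candidate images so the person can answer with a number
    return list(enumerate(hits[:max_candidates], start=1))
```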

  According to the first embodiment described above, a person can, by voice instruction, have the tidying robot come nearby, deposit an item with it, and have the item put away in a pre-registered storage location. Moreover, even if the person's memory of the deposited item, the date, the place, and so on is vague, the item to be returned can be determined from the ambiguous instruction.

[Embodiment 2 (Spontaneous cleanup / return service by cleanup robot)]
FIG. 8 is a block diagram of a tidying robot that voluntarily executes a tidying/returning service. Blocks that are identical to those in FIG. 5 also show, in parentheses, the reference numerals used in FIG. 5. From the person state measuring unit 71 (51), the position of the person, the name of the person, and the voice of the person are input to the standby position determining unit 74 (54). The position of the tidying robot is input from the tidying robot state measuring unit 72 (52) to the standby position determining unit 74 (54), and the travel route of the tidying robot is input from the environment map 73 (53). Based on these inputs, the standby position determining unit 74 (54) determines the standby position, and the tidying robot moves itself to that location. The offer unit 75 (not present in FIG. 5) inputs the service content that the tidying robot voluntarily offers to the execution unit 76 (which differs in part from the execution unit 56 in FIG. 5).

  The second embodiment is different from the first embodiment in that a history DB 79 is provided in addition to the article management DB 78 (58) so that the tidying-up robot can execute a voluntary service.

  The history DB 79 stores, for each article that has been put away, the number of executions, the storage location, the execution time, and the name of the user. Specifically, the support execution frequency is kept as history information for each hour of the day (for example, article A was deposited a times and article B b times), for each week or each day of the week, and for each room such as the living room or the bedroom.

  The offer unit 75 refers to the history DB 79 and inputs offers to the execution unit 76 for articles, times and places with high service frequency: for example, an offer to tidy up at a time of high service frequency, an offer to tidy up at a place of high service frequency, or an offer concerning an article of high service frequency. The execution unit stores these offers in a tidying robot activity schedule and performs services such as tidying up, returning, and preparing items at the predetermined time and place. Specifically, the service frequency over a predetermined period is totalled using the history DB, and when the frequency exceeds a preset threshold, the combination is registered, for example, as an article and time with a high frequency of tidying, and the robot acts on it spontaneously. For example, when the user comes near a place where items are frequently stored, the robot asks "Shall I fetch what you always use?", displays the article image, and fetches the article if the user accepts. Likewise, at a time of day with high frequency it asks "Is there anything to put away?", and on a particular day of the week with high frequency it asks "Do you need the thing you always use?"
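  The threshold rule could be pictured as the tally below: count how often each article has been serviced per room and per time slot over the chosen period, and turn every combination above the threshold into a schedule entry. The record fields, the time-slot granularity and the THRESHOLD value are all assumptions made for this sketch.

```python
from collections import Counter

THRESHOLD = 3   # assumed: an offer is registered once a combination occurs this often

def build_offers(history_db):
    """Tally service frequency per (article, room, weekday/hour) and return
    the combinations that exceed the threshold as spontaneous offers."""
    counts = Counter()
    for rec in history_db:                            # rec: {"article", "room", "datetime", "user"}
        slot = rec["datetime"].strftime("%a %Hh")     # e.g. "Mon 08h"
        counts[(rec["article"], rec["room"], slot)] += 1
    offers = []
    for (article, room, slot), n in counts.items():
        if n >= THRESHOLD:
            offers.append({
                "article": article, "room": room, "time": slot,
                "prompt": f"Shall I fetch the {article} you always use?",
            })
    return offers   # the execution unit stores these in the activity schedule
```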

  According to the second embodiment described above, the tidying robot can voluntarily put away items with high service frequency, providing an attentive service that matches the person's lifestyle.

[Embodiment 3 (memory support service by tidying robot)]
FIG. 9 is a block diagram of a tidying robot that performs a memory support service. Blocks that are identical to those in FIG. 5 also show, in parentheses, the reference numerals used in FIG. 5. From the person state measurement unit 81 (51), the position of the person, the name of the person, and the voice of the person are input to the standby position determination unit 84 (54). The position of the tidying robot is input from the tidying robot state measuring unit 82 (52) to the standby position determination unit 84 (54), and the travel route of the tidying robot is input from the environment map 83 (53). Based on these inputs, the standby position determination unit 84 (54) determines the standby position, and the tidying robot moves itself to that location. A message reproduction unit 85 (not present in FIG. 5) gives an instruction to reproduce a previous message or to add a new message, and based on these instructions a message search unit 86 (not present in FIG. 5) searches for or adds messages.

  Further, the third embodiment differs from the first and second embodiments in that a message DB 87 is provided so that the tidying robot can perform memory support. Here, the message DB 87 holds voice information of the message content, the date and time of the message, the position of the tidying robot when it received the message, the position of the person who left the message, and an image of what the person showed to the tidying robot at the time of the message.

  A message reproduction instruction to the message reproduction unit 85 is given by the person's utterance, for example "Do you remember anything around here?" or "Do you remember anything from about a year ago?". Adding a message means recording a new message in the message DB 87.

  When the message retrieval unit 86 receives a message reproduction instruction from the message reproduction unit 85, the message retrieval unit 86 retrieves the message DB 87 and reproduces the message, and also displays an object image. On the other hand, when a message addition instruction is received, a message is added to the message DB (in other words, the message DB is updated).

  FIG. 10 is a flowchart of memory support. First, in S1, the tidying robot identifies the person and measures the person's position. Next, in S2, it waits beside the person. Next, in S3, it waits for an instruction by voice or from a remote controller.

  When a human instruction is recognized, the process proceeds to a storage mode (adding a new message) or an information presentation mode (reproduction of a past message) according to the content of the instruction.

  When the process proceeds to the storage mode S40, the object to be stored is photographed in S41 and the image is stored in the message DB 87. Next, in S42, after the person puts the object away, the person states the storage location by voice. Finally, in S43, the stored image, the recorded message voice (for example, "the scarf was put away in the third drawer from the top of the chest of drawers"), the position of the tidying robot when it received the message, and the name of the person who left the message are stored in the message DB.

  When the process proceeds to the information providing mode S50, in S51 the tidying robot asks the person which message he or she wants to hear, such as a message left in this neighbourhood, a message from one month ago, or a message from one year ago. If the person wishes to search by position, then in S52 the tidying robot collates the robot position recorded when each message was received with its current position, extracts the messages whose recorded position is nearby, and proceeds to S54. If the person wishes to search by date, then in S53 the robot extracts the messages whose date and time are close to those designated by the user, and proceeds to S54.
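  The two search paths (by place in S52, by date in S53) could be sketched as a single lookup over the message DB. The field names, the 2 m "neighbourhood" radius and the date window below are illustrative assumptions only.

```python
import math
from datetime import timedelta

NEARBY = 2.0                        # assumed radius (m) for "around here"
DATE_WINDOW = timedelta(days=45)    # assumed window for "about one month ago"

def search_messages(message_db, robot_pos=None, around=None):
    """S52/S53: extract candidate messages either by place or by date."""
    if robot_pos is not None:       # search by position information (S52)
        hits = [m for m in message_db
                if math.dist(m["robot_pos"], robot_pos) <= NEARBY]
    elif around is not None:        # search by date information (S53)
        hits = [m for m in message_db
                if abs(m["datetime"] - around) <= DATE_WINDOW]
    else:
        hits = list(message_db)
    return hits                     # shown with their object images in S54
```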

  In S54, each extracted message is displayed together with its object image. Next, in S55, the person selects, by voice or by remote controller, the image of the object whose storage location he or she wants to know. Finally, in S56, the tidying robot reproduces the message corresponding to that object image.

  According to the third embodiment, even when a person cannot remember the storage location of the items that he / she has cleaned up, he / she can remember them with the assistance of the cleaning robot.

  INDUSTRIAL APPLICABILITY: The present invention can be used for a tidying robot that supports a person's daily life, in particular a tidying robot that tidies up or returns daily necessities according to a person's instruction or voluntarily, and a tidying robot that provides memory support to a person who forgets where daily necessities have been stored.

FIG. 1 is a front view of a tidying robot. FIG. 2 is a block diagram of the tidying robot. FIG. 3 is a schematic flowchart of the tidying process. FIG. 4 is a schematic flowchart of the return process. FIG. 5 is a block diagram of a tidying robot that works on a human instruction. FIG. 6 is an environment map storing the travel route, waiting places, etc. of the tidying robot. FIG. 7 is a table showing the method of having a person select the article to be returned. FIG. 8 is a block diagram of a tidying robot that works spontaneously. FIG. 9 is a block diagram of a memory-support tidying robot. FIG. 10 is a flowchart of the memory support process.

Explanation of symbols

2 Head, 3 Top camera, 4 Microphone, 5 Front camera, 6 Chest, 7 Speaker, 9 Arm, 10 Hand, 12 Skirt, 13 Obstacle sensor, 14 Wheel, 20 Data input/output, 21 CPU,
22 Self-position recognition unit, 23 Instruction point recognition unit, 24 Voice recognition unit, 25 Neck/arm/hand motion recognition unit, 26 Body direction recognition unit, 27 Past motion/position database, 28 Neck joint drive unit, 29 Wrist joint drive unit, 30 Arm joint drive unit, 31 Vehicle drive unit, 51 Person state measurement unit, 52 Tidying robot state measurement unit, 53 Environment map, 54 Standby position determination unit, 55 Instruction confirmation unit, 56 Execution unit, 57 DB update unit, 58 Article management DB

Claims (7)

  1. In a tidying-up robot that receives items from users and stores them in a storage location,
    A camera for photographing the article;
    A hand for gripping the article;
    A microphone that detects user utterances;
    A speech recognition device for recognizing the user's utterance content;
    A speaker that the tidying robot speaks to a user;
    An article management database that stores a user name, a receiving place and a storage place of the article, a reception and storage date, an image of the article, and an article name;
    Detect the user's location, move to the side and wait,
    Recognizing the user's utterance content and taking out the hand to receive the article,
    Photograph the article,
    Recognizing the user name and article name utterance by the user,
    A tidying robot that stores the article in the nearest storage location.
  2. In claim 1,
    Recognizing the utterance of the item return request by the user,
    Search the article management database and display images of candidate articles to be returned to the user;
    A tidying robot, wherein the article specified by the user is returned to the user.
  3. In claim 2,
    The tidying-up robot, wherein the article return request is based on ambiguous memory of the user.
  4. In claim 1,
    Further comprising a history database that stores, for each article, date/time and place, the frequency of storage or return,
    A tidying robot characterized by voluntarily uttering an offer for providing a service related to an article, date and place, each of which has a frequency exceeding a predetermined threshold.
  5. In claim 4,
    A tidying robot further comprising a tidying robot schedule table for storing the service offer in time order.
  6. In claim 1,
    Further comprising a message database for recording the user's message about the storage date and time and the storage location of the article stored by the user;
    A tidying robot that presents an image of a candidate for the article based on the message reproduction request by the user's utterance.
  7. In claim 6,
    A tidying-up robot, wherein the message reproduction request has ambiguous content.
JP2005347322A 2005-11-30 2005-11-30 Clearing-away robot Withdrawn JP2007152443A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005347322A JP2007152443A (en) 2005-11-30 2005-11-30 Clearing-away robot


Publications (1)

Publication Number Publication Date
JP2007152443A true JP2007152443A (en) 2007-06-21

Family

ID=38237439

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005347322A Withdrawn JP2007152443A (en) 2005-11-30 2005-11-30 Clearing-away robot

Country Status (1)

Country Link
JP (1) JP2007152443A (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009297880A (en) * 2008-06-17 2009-12-24 Panasonic Corp Article management system, article management method, and article management program
US8751048B2 (en) 2011-07-25 2014-06-10 Sony Corporation Robot device, method of controlling the same, computer program, and robot system
US10293487B2 (en) 2011-07-25 2019-05-21 Sony Corporation Robot device, method of controlling the same, computer program, and robot system
US9463575B2 (en) 2011-07-25 2016-10-11 Sony Corporation Robot device, method of controlling the same, computer program, and robot system
US9908241B2 (en) 2011-07-25 2018-03-06 Sony Corporation Robot device, method of controlling the same, computer program, and robot system
JP2014168824A (en) * 2013-03-01 2014-09-18 Advanced Telecommunication Research Institute International Robot control system and robot control method
JP2015196600A (en) * 2014-03-31 2015-11-09 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Object management system and transport robot
JP2017010518A (en) * 2015-06-24 2017-01-12 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド Control system, method, and device for intelligent robot based on artificial intelligence
US10223638B2 (en) 2015-06-24 2019-03-05 Baidu Online Network Technology (Beijing) Co., Ltd. Control system, method and device of intelligent robot based on artificial intelligence
WO2019187834A1 (en) * 2018-03-30 2019-10-03 ソニー株式会社 Information processing device, information processing method, and program


Legal Events

Date Code Title Description

2009-02-03 A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300