JP4713846B2 - Robot remote control system - Google Patents

Robot remote control system

Info

Publication number
JP4713846B2
Authority
JP
Japan
Prior art keywords
robot
autonomous mobile
mobile robot
remote control
control system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2004136531A
Other languages
Japanese (ja)
Other versions
JP2005313303A (en)
Inventor
Masanori Sugisaka
Kageo Akizuki
Original Assignee
Japan Science and Technology Agency
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Japan Science and Technology Agency
Priority to JP2004136531A
Publication of JP2005313303A
Application granted
Publication of JP4713846B2
Status: Expired - Fee Related

Description

  The present invention relates to technology for controlling a remote autonomous mobile robot from a mobile phone or a personal computer via the Internet, and also to controlling a plurality of autonomous mobile robots simultaneously.

A background-art robot remote control system is disclosed in Japanese Patent Laid-Open No. 2003-006532 and is outlined below. The system comprises a robot control server that is connected to a network and decodes a user's control signal to control a movable robot; a camera-equipped movable robot that is connected to the network and operates according to control signals from the robot control server; a terminal that is connected to the network and outputs commands for controlling the movable robot; and billing means that checks each user's usage of the robot control server and charges accordingly. According to this background art, the movable robot can be controlled by commands from a user terminal connected to the network via the control server, and the user terminal can receive camera images from the robot. The user can therefore operate the robot from a remote location and judge the control result from the transmitted images. Furthermore, because the control programs for the robot are managed on the control server, the service provider can maintain them centrally, and the user can receive the latest functions at a lower cost than when control programs are individually loaded on the robot. Since the server operator develops and maintains the software, the user need not install a server or develop programs independently and can receive the service inexpensively. Moreover, because the server-side processor runs the server software, the load on the robot-side processor is light, so a small and inexpensive processor can be used.
Japanese Patent Laid-Open No. 2003-006532

  However, the background art configured as described above is not scalable: the robot itself must be modified substantially to perform operations that its original configuration cannot realize, which takes time and prevents a quick response. The robot could be replaced with one of a new configuration, but this is not practical in terms of cost. Furthermore, the background art remotely controls only a single robot; a plurality of robots cannot be remotely controlled, and even if they could be, the robots could not perform joint work.

  The present invention has been made to solve the above problems, and an object of the present invention is to provide a robot remote control system that is versatile and can respond to various requests of an operator. Another object of the present invention is to provide a robot remote control system that realizes joint work by a plurality of robots under remote control.

A robot remote control system according to the present invention comprises a control terminal having input means for character input, voice input, or both, and terminal communication means for communication; and a plurality of autonomous mobile robots, each having drive means consisting of a motor, an engine, or both, an equipped device unit of devices arranged in advance, a common connection part that is a general-purpose interface to which devices can be connected, an additional device unit of devices connected to the common connection part as necessary, and robot communication means for communication, the robots operating on the basis of control commands from the control terminal via the Internet. The plurality of autonomous mobile robots communicate with one another according to the control commands from the control terminal and operate in cooperation on the basis of robot state information that includes the position information of each autonomous mobile robot, work progress information, and task information. In the relationship between any one autonomous mobile robot and any other autonomous mobile robot, when the task of the one autonomous mobile robot is proceeding smoothly without delay and it is determined from the task information of the other autonomous mobile robot that the one autonomous mobile robot can execute a task the other autonomous mobile robot is scheduled to perform, the one autonomous mobile robot assists the other autonomous mobile robot.
In the present invention configured as above, which comprises a control terminal and autonomous mobile robots, the autonomous mobile robots operate autonomously on the basis of control commands from the control terminal, so an operator at a remote location can freely operate the mobile robots through the control terminal. In addition, not only devices arranged in advance but also additional devices can be attached as necessary, so operations matching user requirements can be performed all the more smoothly. In particular, the interface for connecting devices is general-purpose, and devices can be added easily. The communication performed by the communication means may be of any kind: communication using a mobile phone, communication via the Internet, wireless communication in a specific band, and so on are all applicable. A plurality of communication channels can also be supported, with one used as a main line and another as an auxiliary line, so that availability can be improved.
Since the autonomous mobile robots operate in cooperation with one another, work can be performed quickly and reliably. Instead of concentrating various devices on one autonomous mobile robot, the devices can be distributed among the robots and operated there. By arranging identical devices on several of the autonomous mobile robots, operation can continue without interruption even if a device with a high failure rate actually breaks down.

  In addition, the robot remote control system according to the present invention includes, as necessary, a server that receives control commands from the control terminal and controls the autonomous mobile robots on the basis of those commands. Because the server controls the autonomous mobile robots, information processing related to control can be borne on the server side even if the processing capacity of the control terminal is low. Further, even when there are a plurality of control terminals, their control can be processed without conflict, and smooth operation can be realized.

In the robot remote control system according to the present invention, the server further incorporates, as necessary, software such as software for detecting abnormal patterns and data analysis software. By adding such functions to the server, the autonomous mobile robots can be controlled more smoothly in response to control commands from the control terminal; information acquired from the autonomous mobile robots can be processed by the added functions and provided to the control terminal; and the processing results can be fed back to the operation of the autonomous mobile robots.

  Further, in the robot remote control system according to the present invention, the server holds, as necessary, robot state information such as the current position of each autonomous mobile robot. Since the server holds this information, the robots can be operated efficiently using it. For example, in a series of work steps it often happens that none of the work can proceed to the next step until a particular step is completed; with the robot state information, such work can be coordinated smoothly.

  By giving commands to the autonomous mobile robot Taro and its group through indirect remote control from a personal computer or mobile phone via the Internet, the robots and the server can share information, and the robots can share and execute tasks. Task execution by the robot and its group according to the present invention can be expected to find use in the various fields already described and can accomplish highly complex work, so its effect is very great.

(First embodiment of the present invention)
A robot remote control system according to a first embodiment of the present invention will be described with reference to FIGS. 1 to 9. FIG. 1 is a schematic diagram of the robot remote control system according to the present embodiment, FIG. 2 is a front view and a side view of the autonomous mobile robot according to the present embodiment, FIG. 3 is a hardware configuration diagram of the autonomous mobile robot, FIG. 4 is a hardware configuration diagram of the autonomous mobile robot from another viewpoint, FIG. 5 is a software configuration diagram of the autonomous mobile robot, FIGS. 6 and 7 are operation explanatory diagrams of the autonomous mobile robot, FIG. 8 is a connection diagram of the autonomous mobile robot, and FIG. 9 shows an example of a grammar file of the command execution system according to the present embodiment.

  The robot remote control system according to the present embodiment comprises a control terminal having input means for character input, voice input, and the like and terminal communication means for communication; autonomous mobile robots 20, each having drive means including a motor, an engine, and the like, an equipped device unit of devices such as a TV camera arranged in advance, a common connection part that is a general-purpose interface to which devices can be connected, an additional device unit of devices connected to the common connection part as necessary, and robot communication means for communication; and a server 40 that receives control commands from the control terminal and controls the autonomous mobile robots 20 on the basis of those commands. The autonomous mobile robots operate on the basis of the control commands from the control terminal.

  The control terminal corresponds to a mobile phone 11 or a personal computer 12. However, the control terminal is not limited to the mobile phone 11 or the personal computer 12 as long as it can at least accept input and communicate. The processing capability of the mobile phone 11 has improved enough for this use, and high-load processing can be distributed to the server 40 or the autonomous mobile robot 20. For example, the mobile phone 11 can perform only the low-load task of transmitting a control command to the server 40 via HTTP (HyperText Transfer Protocol); the server 40 identifies the control command and has the autonomous mobile robot 20 perform the corresponding control. In the case of the mobile phone 11, buttons such as the dial buttons and the microphone serve as the input means, and the communication function of the mobile phone 11 serves as the terminal communication means. The Internet connection from the mobile phone 11 is made via a base station and an exchange and through the gateway server of the telecommunications carrier.
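  As an illustration of this division of labor, the following is a minimal sketch in Python of the indirect control path: the mobile phone 11 issues a low-load HTTP request, and the server 40 interprets the command and relays it to the autonomous mobile robot 20. The parameter name "cmd", the command set, and the relay function are hypothetical; the patent does not specify the actual message format.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

KNOWN_COMMANDS = {"advance", "stop", "turn_left", "turn_right"}  # assumed command set

def relay_to_robot(command: str) -> None:
    # Placeholder for the server-to-robot link (the RDIS/LT-08 path in the patent).
    print(f"relaying '{command}' to autonomous mobile robot 20")

class CommandHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The terminal only sends the command; identification and control stay server-side.
        command = parse_qs(urlparse(self.path).query).get("cmd", [""])[0]
        if command in KNOWN_COMMANDS:
            relay_to_robot(command)
            self.send_response(200)
        else:
            self.send_response(400)  # unknown command from the control terminal
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), CommandHandler).serve_forever()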

  As shown in FIG. 2, the autonomous mobile robot 20 comprises an indicator LED 21 (Light Emitting Diode), a monitor 22, an audio output unit 23, a communication personal computer 24, an image processing personal computer 25, a control personal computer 26, DC (Direct Current) motors 27, driving wheels 28, auxiliary wheels 29, a lithium-ion battery 30, CCD (Charge Coupled Device) cameras 31, a transceiver 32 (RDIS/LT-08, Furukawa Machine Metal Co., Ltd.), and an exterior body 33 that supports and protects them. The DC motors 27 and the driving wheels 28 and auxiliary wheels 29 driven by them constitute the drive means; the indicator LED 21, the audio output unit 23, and the CCD cameras 31 constitute the equipped device unit; the RDIS/LT-08 transceiver 32 corresponds to the common connection part; and the RDIS/LT-08 transceiver 32 and the communication personal computer 24 correspond to the robot communication means. Although the additional device unit is not explicitly shown in FIG. 2, a dedicated arm, a radioactivity measurement sensor, a toxic gas sensor, a biosensor, a landmine detection sensor, and the like are applicable.

  The server 40 is a computer (or sometimes the software on it) that provides functions and data to the control terminal and the autonomous mobile robot 20 over the network; in this embodiment it is implemented on a personal computer. The server 40 can be configured to incorporate various kinds of software, and it may also be composed of a plurality of servers 40, with each providing particular functions. The user may browse, from the control terminal, the results obtained by incorporating such software into the server 40, or the software may be incorporated into the control terminal so that the data acquired from the autonomous mobile robot is processed there directly and browsed by the user.

  In FIG. 1, this robot remote control system consists of autonomous mobile robots 20 (referred to as "Taro"), each equipped with an artificial brain and an RDIS/LT-08 transceiver 32 and placed in a remote danger area; a mobile phone 11 and a personal computer 12 (the control terminals) located at an operation center or any other place; and a server 40.

  The Taro robots 20 are spatially dispersed (see FIG. 3); each has a communication function in its artificial brain and can exchange commands and data, so together they form a spatially distributed network system. Work commands (control commands) can be transmitted to these autonomous mobile robots 20 from the mobile phone 11 and the personal computer 12 via the Internet to have work performed. Examples of such work, as shown in FIG. 1, include measuring data with the various sensors mounted on the autonomous mobile group robots (CCD cameras, radioactivity measurement sensors, toxic gas sensors, biosensors, landmine detection sensors, and so on), transmitting the data and any abnormal values, and other tasks such as patrol and garbage collection.

  The mobile phone 11 and the personal computer 12 can transmit, via the Internet, signals instructing the work to be performed by the autonomous mobile robot 20, and can receive the data measured by the autonomous mobile robot 20 and the status of the work performed. To make this possible, the mobile phone 11 and the personal computer 12 have, for example, the following three software programs: (I) control software for making the autonomous mobile robot 20 work, which issues commands to the robot for tasks such as measurement, movement, face recognition, and voice recognition and synthesis; (II) data analysis software for analyzing and graphing the measured data; and (III) abnormal-pattern detection software, consisting of software that recognizes abnormal patterns using an artificial neural network that has previously learned abnormal patterns. As described above, this software can also be incorporated into the server 40.
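  A minimal sketch of item (III) follows, assuming a small feedforward network with placeholder weights; the patent does not disclose the network structure, so the layer sizes, the sensor channels, and the threshold are illustrative only. A network trained beforehand on abnormal patterns would score an incoming sensor vector and flag it as abnormal when the score exceeds the threshold.

import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)   # 8 sensor channels -> hidden layer
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)    # hidden layer -> abnormality score

def is_abnormal(sensor_vector: np.ndarray, threshold: float = 0.5) -> bool:
    # Forward pass of the (hypothetically pre-trained) artificial neural network.
    hidden = np.tanh(sensor_vector @ W1 + b1)
    score = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output in [0, 1]
    return bool(score[0] > threshold)

reading = np.array([0.2, 0.1, 0.0, 0.9, 0.3, 0.1, 0.0, 0.2])  # e.g. gas, radiation, ...
print("abnormal" if is_abnormal(reading) else "normal")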

  For the Internet connection and some functions of the server 40, a server provided by Furukawa Machine Metal Co., Ltd. (existing on the Internet) is used. This server, managed by Furukawa Machine Metal Co., Ltd., can be used by obtaining the management and monitoring IDs and passwords, so a description of it is omitted. To remotely control the autonomous mobile robot, in addition to the RDIS/LT-08 transceiver provided by Furukawa Machine Metal Co., Ltd. and the program for operating the transceiver, an original connection circuit between the RDIS/LT-08 and the autonomous mobile robot 20 (see FIG. 8) and an original program for operating it are used.

  Next, the navigation based on vision and the recognition system will be described as the operations of the robot remote control system according to the present embodiment. First, navigation (running) based on vision is described. Two types of landmarks are used: a continuous mark (guideline) determines the path, and common landmarks (circles, triangles, etc.) trigger specific actions such as rotating or stopping. The autonomous mobile robot 20 extracts the guideline and the common marks from the images captured by the two CCD cameras 31, each capable of capturing 24 images per second, recognizes them, and acts on the result. The autonomous mobile robot has a maximum speed of 12 cm/sec and is able to find the target mark as quickly as possible while moving.

  The autonomous mobile robot 20 has a visual tracking skill learning function, so it can adapt to environmental changes and learn new skills through interaction. It is therefore unnecessary to rewrite the program of the autonomous mobile robot 20 whenever the environment changes. The outline and results of the visual tracking skill learning are shown in FIG. 6. From the image captured by the CCD camera 31 the target object is extracted; the error e(t) between the position y of the object and the target yd (in the case of guideline tracking, y is the center of gravity of the guideline and yd is the center of the image) and its rate of change de/dt are taken as the state input, the control signal u(t) is selected as the output, and learning is performed with an artificial neural network at each step. The learning result is applied to the actuators 27, the autonomous mobile robot 20 is run, and learning is repeated until the tracking skill is acquired. FIG. 6 compares the line-tracking trajectories and errors of the learned control (forward and reverse runs) of guideline tracking with those of preset control such as PID (the experimental results, the line, and the errors are distinguished by color; a color reference drawing is submitted separately). For details, see the literature (A. A. Loukianov, M. Sugisaka, "An approach for learning a visual tracking skill on mobile robot", Proceedings of the SICE/ICASE Workshop "Control Theory and Application", Nagoya, Japan, pp. 83-87, 2001).
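  The loop below is a minimal sketch of this learning scheme: the state is the error e(t) between the guideline centroid y and the image center yd together with its change de/dt, the output is a steering signal u(t), and an adaptive element is corrected on-line. A single linear layer and the fixed learning rate are simplifying assumptions; the patent and the cited paper use an artificial neural network.

import numpy as np

def guideline_centroid(binary_mask: np.ndarray) -> float:
    # Horizontal center of gravity of the pixels extracted as the guideline.
    cols = np.nonzero(binary_mask)[1]
    return float(cols.mean()) if cols.size else binary_mask.shape[1] / 2.0

w = np.zeros(2)    # adaptive weights acting on (e, de/dt)
eta = 0.01         # learning rate (assumed value)
prev_e = 0.0

def control_step(binary_mask: np.ndarray, yd: float) -> float:
    # One frame of the learn-while-running loop: observe, act, correct.
    global prev_e, w
    e = yd - guideline_centroid(binary_mask)
    state = np.array([e, e - prev_e])   # (e, de/dt), with dt equal to one frame
    u = float(w @ state)                # control signal u(t) sent to the actuators 27
    w += eta * e * state                # crude on-line correction that pushes e toward 0
    prev_e = e
    return u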

  In addition, the autonomous mobile robot 20 has a function for estimating its own position from sensor data in the environment. For details, see the literature (A. A. Loukianov, M. Sugisaka, "A hybrid method for mobile robot probabilistic localization using a single camera", Proceedings of the International Conference on Control, Automation and Systems, Jeju, Korea, pp. 284-287, 2001). The autonomous mobile robot 20 also has a function for finding how to move to a desired place based on a topological map of the environment. For details, see the literature (T. Kubik, M. Sugisaka, "Rule based robot navigation system working in an indoor environment", Proceedings of the XIV International Conference on Systems Science, Wroclaw, Poland, pp. 212-219, 2001). In addition, moment invariants are used to recognize the common landmarks.

  Next, the recognition system will be described. The IBM ViaVoice SDK (Software Developer's Kit) is used for speech recognition, and the IBM ViaVoice TTS (Text to Speech) SDK software is used for speech synthesis. An example of the grammar file of the command execution system is shown in the table of FIG. 9; extensions of it can easily be created. With this system the robot understands Japanese and English commands, answers, and asks questions.
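  A minimal dispatch sketch follows, showing how a recognized utterance (Japanese or English) could be mapped to an action by a grammar like that of FIG. 9; the word lists and action names are illustrative assumptions and do not reproduce the actual grammar file.

ACTIONS = {
    ("自己紹介", "self introduction"): "introduce_self",
    ("顔認識", "face recognition"): "recognize_face",
    ("停止", "stop"): "stop",
    ("制御", "control"): "enter_control_mode",
}

def dispatch(utterance: str) -> str:
    # Map a recognized phrase to an internal action; otherwise ask the speaker again.
    text = utterance.strip().lower()
    for words, action in ACTIONS.items():
        if text in words:
            return action
    return "ask_again"

print(dispatch("Stop"))       # -> stop
print(dispatch("自己紹介"))    # -> introduce_self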

  In Japanese, the following can be done. Self-introduction, for example, proceeds as follows: speak "self-introduction" into the microphone of the autonomous mobile robot 20; the robot understands it, answers "Yes" in Japanese, and then introduces itself (Taro 20) in Japanese, for example, "My name is Taro. I was born on August 28, 2001 in the Sugisaka Laboratory of Oita University ...", and so on.

  Speak "face recognition" into the microphone of the autonomous mobile robot 20; the robot understands it, answers "Yes" in Japanese, recognizes the face of the person in front of it with its own face-recognition software (it can also count how many faces it has recognized), and answers "Hello, ##", where ## is the person's name, for example "Tokuda". When another person comes, ## becomes that person's name.

  Speak "recognition" into the microphone of the robot 20; the robot understands it and answers "Yes" in Japanese. If a red (or any other color) triangular figure is then shown in front of the CCD camera 31 of the robot 20, the robot answers "This is a triangle." If a round figure is shown instead of a triangle, it answers "This is a circle." Figures are recognized using the moment invariants of learned figures.

  Speak "line tracking" into the microphone of the robot 20; the robot understands it, answers "Yes" in Japanese, and follows the colored markers on the floor with its camera. The robot 20 stops when the line runs out.

  When "stop" is spoken into the microphone of the robot 20, the robot understands it, answers "Yes" in Japanese, and stops the action it is performing.

  When "control" is spoken into the microphone of the robot 20, the robot understands it and answers "robot control" in Japanese. After that, if "advance 2 meters" is spoken into the microphone of the robot 20, the robot understands it and asks "Shall I advance 2 meters?" in Japanese. If the answer is "Yes", the robot 20 advances 2 meters; if the answer is "No", the robot 20 does not execute the command.

  If "turn your head to the left/right" is spoken into the microphone of the robot 20, the robot understands it, answers "Yes" in Japanese, and executes the command.

  If "straighten your head" is spoken into the microphone of the robot 20, the robot understands it, answers "Yes" in Japanese, and executes the command.

  If you say "Look up" to the microphone of the robot, you will understand it, answer "Yes" in Japanese and execute the command. Similarly, if they speak to the microphone of the robot 20 such as “look down” and “look in front”, they understand them, answer “yes” in Japanese, and execute those commands.

  When "turn right ### degrees" is spoken into the microphone of the robot 20, the robot understands it, answers "Yes" in Japanese, and executes the command. The same applies to turning left.

  Speaking "No" into the microphone of the robot 20 makes it understand and cancel the command just given.

  An English version of the above commands has also been developed. In English, the commands are "Self introduction", "Face recognition", "Recognition", "Line tracking", "Stop", and "Control". As an example, only "Control" will be described. Speak "Go 2 meters" into the microphone of the robot 20; the robot understands it and asks in English, "Shall I execute 'Go 2 meters' command?". If the answer is "Yes please", the robot 20 advances 2 meters. If the answer is "No", the robot 20 answers "Command is canceled" and does not execute the command. The same applies to the other commands, for example "Turn your head to the left/right", "Straighten your head", "Look up/down/straight", "Turn left/right ### degrees", and "No" (for canceling a command), where ### means a number.

  The above functions are combined into one system and regarded as the artificial brain of the autonomous mobile robot 20. FIG. 3 shows the conceptual hardware, FIG. 5 the software, and FIG. 7 the behavior table. With this artificial brain system, the autonomous mobile robot 20 can move autonomously on its internal map based on the images of the CCD cameras 31 and the ultrasonic sensor signals, act on commands recognized by speech recognition, detect and recognize faces, read out the person's name, and talk with the other party.

  Further, as described above, a command is sent from the mobile phone 11 and the personal computer 12 to the RDIS/LT-08 transceiver 32, and after the data from the various sensors mounted on the autonomous mobile robot 20 (CCD camera, radioactivity measurement sensor, toxic gas sensor, biosensor, landmine detection sensor, etc.) and any abnormal values have been measured and transmitted, the server 40 analyzes and graphs the measurement data with its data analysis software. Furthermore, as already described in Japanese Patent No. 30060601 (title of invention: "self-supporting vehicle"), the server also has software that recognizes abnormal patterns using an artificial neural network that has previously learned abnormal patterns, that is, software for detecting abnormal patterns. For details, see Japanese Patent No. 30060601.

  Below, the part concerning the remote control of the autonomous mobile robot 20 will be described. The RDIS/LT-08 is provided by Furukawa Machine Metal Co., Ltd.; what belongs to the present invention is the original connection circuit needed to remotely control the autonomous mobile robot "Taro" 20, so the specification of the RDIS/LT-08 is described first. The RDIS/LT-08 transceiver 32 has 8 digital input ports, 8 digital output ports, 2 analog input ports, and an RS232 data port. These ports are connected to the autonomous mobile robot 20, so that the control from the server 40 reaches the autonomous mobile robot 20 via the RDIS/LT-08 transceiver 32.
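  As an illustration only, the sketch below shows how a high-level command could be expressed as levels on the eight digital output ports; the patent states the port counts but not the bit assignments, so the mapping here is hypothetical.

COMMAND_BITS = {
    "initialize":      0b00000001,
    "turn_head_left":  0b00000010,
    "turn_head_right": 0b00000100,
    "head_straight":   0b00001000,
    "advance":         0b00010000,
    "stop":            0b00000000,
}

def port_levels(command: str) -> list:
    # Levels for digital output ports 1..8 (least significant bit first).
    pattern = COMMAND_BITS[command]
    return [(pattern >> i) & 1 for i in range(8)]

print(port_levels("turn_head_left"))   # -> [0, 1, 0, 0, 0, 0, 0, 0]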

  There are two methods of remote control: one is indirect remote control from the mobile phone 11 or the personal computer 12, and the other is direct control. First, indirect remote control is described. The connection circuit between the RDIS/LT-08 transceiver 32 and the autonomous mobile robot 20 is shown in FIG. 8. The RS232 data terminal is connected to the various sensors mounted on Taro 20.

  The autonomous mobile robot Taro 20 can be controlled by combinations of signals from the eight output ports of the RDIS/LT-08, while the state of Taro 20 and the status of the various sensors mounted on it are sent to the eight input ports of the RDIS/LT-08. An example is shown in the table of FIG. 7.

When an event occurs at an input port of the RDIS/LT-08, the next action can be taken or various complicated commands can be executed depending on the state of Taro. The following is an example of such a program.
if (O5 == 0 && O6 == 0) then
outr 1 10;
elseif (O5 == 1 && O6 == 0) then
outr 1 20;
elseif (O5 == 0 && O6 == 1) then
outr 1 30;
elseif (O5 == 1 && O6 == 1) then
outr 1 00;
else
outr 1 00;
endif
In this example, when Taro signals through the input ports of the RDIS/LT-08 that its current action is finished, the program responds according to the table above: if Taro is in the initialization state, the command "turn the head to the left" is given; if the head has been turned to the left, the command "turn the head to the right" is given; if the head has been turned to the right, the command "turn the head straight" is given; and if the head is straight, the "initialization" command is given, and this is repeated indefinitely. Of course, a given job can also be performed by combining various other operations, so the mechanism is versatile.

  Direct remote control is a method of switching on and off the switches and relays of the various drivers of the autonomous mobile robot Taro 20 according to commands from the mobile phone 11 or the personal computer 12. The drivers include the two DC motors 27 that move the traveling wheels, the stepping motor that moves the head, and the drivers of the other mounted measuring instruments and devices. If the autonomous mobile robot Taro 20 cannot cope with changes in the environment, the operator controls Taro directly and remotely using the mobile phone 11 or the personal computer 12.

  Usually, Taro 20 runs along lines and landmarks laid out on the route in advance and checks the state of the environment with its various sensors. If these sensors measure abnormal values or find a suspicious person, the information is immediately sent to the mobile phone 11 and the personal computer 12. If the autonomous mobile robot Taro 20 cannot cope with the situation by autonomous action, control is switched to a mode in which the operator remotely controls the robot 20 directly while viewing the image data from the robot 20, so that the changing situation can be dealt with quickly. By constructing such a control system, a homeland security system using robots 20 remotely controlled over the Internet, as shown in FIG. 1, can be built.

  In this way, the autonomous mobile robot 20 can be controlled remotely, directly or indirectly, from the mobile phone 11 or the personal computer 12 via the Internet from anywhere on earth, and the robot 20 can perform a given job. As examples of the work, the robot measures and sends data with its various sensors, recognizes and detects abnormal values of the measurements, sends alarms, recognizes people, talks to them, and guides them. The type of work depends on the functions of the robot 20, but the remote control system of the present invention can naturally be applied to robots 20 with different functions. Moreover, not only the data from the sensors of the robot 20 but also the data measured by measuring instruments mounted on the robot 20 can be received and analyzed directly by the personal computer 12 or the mobile phone 11. A new autonomous mobile robot remote control system with versatility, novelty, originality, marketability, and high expandability can therefore be constructed, and remote control can be performed easily from anywhere.

  In practice, experiments were performed with the autonomous mobile robot Taro 20, and various operations and tasks were carried out from the mobile phone 11 and the personal computer 12. The RDIS/LT-08 programs include a program for indirectly controlling the autonomous mobile robot Taro 20 from the personal computer 12 and a program for indirect control from the mobile phone 11. Since the personal computer 12 has a large LCD (Liquid Crystal Display) screen, it can draw various figures, whereas the LCD screen of the mobile phone 11 is small, so the display method differs although the essential remote control functions are the same. It is therefore necessary not only to keep high-load processing off the mobile phone 11 but also to devise the display, and a user interface that is as simple as possible is desirable.

(Second embodiment of the present invention)
A robot remote control system according to a second embodiment of the present invention will be described with reference to FIGS. 10 to 13. FIG. 10 is a schematic diagram of the robot remote control system according to the present embodiment, FIG. 11 is a simplified diagram of the robot remote control system, and FIGS. 12 and 13 are information exchange diagrams of the robot remote control system according to the present embodiment.

  As shown in FIGS. 10 and 11, the robot remote control system according to the present embodiment is configured in the same way as the robot remote control system according to the first embodiment, but differs in that it includes a plurality of autonomous mobile robots 20 that operate in cooperation with one another. In addition, the server 40 may be configured to hold robot state information such as the current position of each autonomous mobile robot 20.

  The robot state information is information related to each autonomous mobile robot, such as the position of the autonomous mobile robot 20, what it is currently doing (running, stopping, detecting, communicating, etc.), the remaining charge of its lithium-ion battery, and its temperature. This robot state information is updated by uploads from each autonomous mobile robot and is therefore always current. The information is also recorded continuously and can be used as a verification log when a problem occurs in an autonomous mobile robot 20.
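  The record below is a minimal sketch of the robot state information the server 40 might keep for each robot, based on the items listed above; the field names, types, and update scheme are assumptions for illustration.

from dataclasses import dataclass, field
import time

@dataclass
class RobotState:
    robot_id: str
    position: tuple              # (x, y) on the shared map
    activity: str                # "running", "stopped", "detecting", "communicating", ...
    battery_remaining: float     # remaining lithium-ion battery charge, 0.0 .. 1.0
    temperature_c: float
    updated_at: float = field(default_factory=time.time)

state_table = {}   # latest state per robot, overwritten on every upload
history = []       # every upload is also kept as a verification log

def upload(state: RobotState) -> None:
    state_table[state.robot_id] = state
    history.append(state)

upload(RobotState("20A", (3.0, 1.5), "running", 0.82, 36.5))
print(state_table["20A"].activity)   # -> running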

  The server 40 can realize cooperative operation by referring to the robot state information it records. Job schedulers that, given a set of components whose possible operations have been set in advance, automatically execute various tasks when the desired tasks are entered are already common in fields such as FA (Factory Automation). Similarly, in this embodiment the autonomous mobile robots 20 can be operated cooperatively by having the server 40 act as the job scheduler; this can be implemented easily by incorporating existing work management software into the server 40 (a certain amount of modification of the software is necessary). As a function added to the job scheduler, when one autonomous mobile robot 20 passes another autonomous mobile robot 20 and certain conditions are satisfied, the passing robot assists the operation of the other robot 20. The conditions are, for example, that the progress of the work the passing autonomous mobile robot 20 is currently performing is not behind schedule and that it can execute the task to be assisted.
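  A minimal sketch of the added scheduler rule follows: when one robot passes another, the server lets it assist only if its own work is on schedule and it is capable of one of the tasks the other robot still has queued. The data structure and the notion of "on schedule" are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RobotJob:
    robot_id: str
    progress: float            # fraction of its own current work completed
    expected_progress: float   # fraction the schedule says should be completed by now
    queued_tasks: list         # tasks the robot is still scheduled to perform
    capabilities: set          # task types the robot is able to execute

def task_to_assist(passer: RobotJob, other: RobotJob):
    # Return a queued task of `other` that `passer` should take over, or None.
    if passer.progress < passer.expected_progress:
        return None                        # the passer's own work is delayed: no assistance
    for task in other.queued_tasks:
        if task in passer.capabilities:
            return task                    # first queued task the passer can execute
    return None

a = RobotJob("20A", 0.9, 0.8, ["place_roof"], {"place_roof", "carry_second_floor"})
b = RobotJob("20B", 0.3, 0.6, ["carry_second_floor"], {"carry_second_floor"})
print(task_to_assist(a, b))   # -> carry_second_floor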

  Next, a specific example of the operation of the robot remote control system according to the present embodiment will be described. The joint task is to assemble building blocks and create a block house. There are three robots, denoted 20A, 20B, and 20C (see FIG. 12).

  The job scheduler of the server 40 grasps the current position of each robot, has the robot 20A carry the roof part to a predetermined position, and similarly has the robots 20B and 20C carry the second-floor part and the first-floor part, respectively. Next, the first-floor part carried by the robot 20C is placed as the base, then the robot 20B assembles the second-floor part on top of the first-floor part, and finally the roof part carried by the robot 20A is placed on top to complete the block house. This description assumes that the positions of the blocks to be carried are already known, but even if the positions are not known, each robot 20 can find the blocks with its CCD camera 31 and then carry them in the same way. If the arrival of the first-floor part is delayed, the roof part can be assembled onto the second-floor part first, allowing more efficient scheduling.

  The robot remote control system according to the present embodiment has a configuration in which the robots 20 operate under the control of the server 40, but the control can also be left to the robots 20 themselves by having the robots 20 communicate with each other and exchange the robot state information (see FIG. 13). By exchanging the robot state information in this way, each robot can grasp the state of the other robots 20, issue commands to them, or ask them for help; if a problem cannot be solved by the robots 20 alone, the server 40 or the user can be consulted. When the block-building example is applied to the configuration in which the robots 20 exchange robot state information, each robot 20 first receives the block-building task from the server 40. Each robot 20 grasps the positions of the robots and declares to the other robots 20 that it will carry the block nearest to it. If the declarations do not conflict, the work is started; if there is a conflict, the efficiency of carrying the remaining blocks is estimated and the more efficient robot 20 is assigned the block. After that, the block house is completed in the same way as described above. In this way, the work can proceed quickly without placing a load on the server 40. A leader can also be chosen among the robots 20, and the leader's decisions can be given priority when judging the work. The leader robot 20 can also prompt a robot 20 whose work is delayed, and the prompted robot 20 either increases its work rate or asks another robot 20 for help.
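  The following sketch illustrates the decentralised variant just described: each robot declares the block nearest to it, and a conflict is resolved by giving the block to the robot that can reach it more efficiently, here simply the closer one. Using distance as the efficiency estimate, and the coordinates themselves, are illustrative assumptions.

import math

def assign_blocks(robot_positions: dict, blocks: list) -> dict:
    remaining = list(blocks)
    unassigned = dict(robot_positions)
    assignment = {}
    while remaining and unassigned:
        # Each unassigned robot declares the remaining block nearest to it.
        declarations = {rid: min(remaining, key=lambda b: math.dist(pos, b))
                        for rid, pos in unassigned.items()}
        for block in set(declarations.values()):
            claimants = [r for r, b in declarations.items() if b == block]
            # Conflict resolution: the robot with the shortest distance keeps the claim.
            winner = min(claimants, key=lambda r: math.dist(unassigned[r], block))
            assignment[winner] = block
            remaining.remove(block)
            del unassigned[winner]
    return assignment

print(assign_blocks({"20A": (0, 0), "20B": (5, 0), "20C": (9, 1)},
                    [(1, 1), (6, 0), (8, 2)]))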

  In this embodiment the server 40 controls each robot 20 according to the conditions, but as a method of further improving work efficiency, the part of the server 40 that controls the robots 20 can be implemented with genetic programming, so that the cooperative work evolves and its efficiency improves. Likewise, in the configuration in which the robots 20 exchange robot state information, the part of each robot that decides its work can be implemented with genetic programming to achieve the same improvement in efficiency.

FIG. 1 is a schematic diagram of a robot remote control system according to a first embodiment of the present invention.
FIG. 2 is a front view and a side view of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 3 is a hardware configuration diagram of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 4 is a hardware configuration diagram, from another viewpoint, of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 5 is a software configuration diagram of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 6 is an operation explanatory diagram of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 7 is an operation explanatory diagram of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 8 is a connection diagram of the autonomous mobile robot according to the first embodiment of the present invention.
FIG. 9 is an example of the grammar file of the command execution system according to the first embodiment of the present invention.
FIG. 10 is a schematic diagram of a robot remote control system according to a second embodiment of the present invention.
FIG. 11 is a simplified diagram of the robot remote control system according to the second embodiment of the present invention.
FIG. 12 is an information exchange diagram of the robot remote control system according to the second embodiment of the present invention.
FIG. 13 is an information exchange diagram of the robot remote control system according to the second embodiment of the present invention.

Explanation of symbols

11 Mobile phone
12 Personal computer
20 Autonomous mobile robot
21 Indicator LED
22 Monitor
23 Audio output unit
24 Communication personal computer
25 Image processing personal computer
26 Control personal computer
27 DC motor
28 Driving wheel
29 Auxiliary wheel
30 Lithium-ion battery
31 CCD camera
32 Transceiver
33 Exterior body
40 Server

Claims (4)

  1. A robot remote control system comprising:
    a control terminal having input means for one or both of character input and voice input and terminal communication means for communication; and
    a plurality of autonomous mobile robots, each having drive means consisting of one or both of a motor and an engine, an equipped device unit of devices arranged in advance, a common connection part that is an interface to which devices can be connected in a general-purpose manner, an additional device unit of devices additionally connected to the common connection part as necessary, and robot communication means for communication, the autonomous mobile robots operating on the basis of control commands from the control terminal via the Internet,
    wherein the plurality of autonomous mobile robots communicate with one another according to the control commands from the control terminal and operate in cooperation on the basis of robot state information including position information of each autonomous mobile robot, work progress information, and task information, and
    wherein, in the relationship between any one autonomous mobile robot and any other autonomous mobile robot, when the task of said one autonomous mobile robot is proceeding smoothly without delay and it is determined from the task information of said other autonomous mobile robot that said one autonomous mobile robot can execute a task that said other autonomous mobile robot is scheduled to perform, said one autonomous mobile robot assists said other autonomous mobile robot.
  2. The robot remote control system according to claim 1, further comprising a server that receives the control commands from the control terminal and controls the autonomous mobile robots on the basis of the control commands.
  3. The robot remote control system according to claim 2, wherein the server further incorporates software for detecting abnormal patterns and data analysis software.
  4. The robot remote control system according to claim 2 or 3, wherein the server holds robot state information including the current position of each autonomous mobile robot.
JP2004136531A 2004-04-30 2004-04-30 Robot remote control system Expired - Fee Related JP4713846B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004136531A JP4713846B2 (en) 2004-04-30 2004-04-30 Robot remote control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004136531A JP4713846B2 (en) 2004-04-30 2004-04-30 Robot remote control system

Publications (2)

Publication Number Publication Date
JP2005313303A JP2005313303A (en) 2005-11-10
JP4713846B2 true JP4713846B2 (en) 2011-06-29

Family

ID=35441253

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004136531A Expired - Fee Related JP4713846B2 (en) 2004-04-30 2004-04-30 Robot remote control system

Country Status (1)

Country Link
JP (1) JP4713846B2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5033994B2 (en) * 2006-01-19 2012-09-26 株式会社国際電気通信基礎技術研究所 Communication Robot
TWI338588B (en) * 2007-07-31 2011-03-11 Ind Tech Res Inst Method and apparatus for robot behavior series control based on rfid technology
JP4839487B2 (en) * 2007-12-04 2011-12-21 本田技研工業株式会社 Robot and task execution system
US8731714B2 (en) * 2010-09-22 2014-05-20 GM Global Technology Operations LLC Concurrent path planning with one or more humanoid robots
JP5912451B2 (en) * 2011-11-25 2016-04-27 学校法人千葉工業大学 Remote control system for unmanned vehicle
CN102561294A (en) * 2011-12-13 2012-07-11 河海大学 Telerobot-based geotechnical engineering parameter mobile test system and control system thereof
JP5296898B2 (en) * 2012-03-21 2013-09-25 株式会社国際電気通信基礎技術研究所 Android control system
KR101504699B1 (en) * 2013-04-09 2015-03-20 얄리주식회사 Phonetic conversation method and device using wired and wiress communication
JP6352151B2 (en) * 2014-11-07 2018-07-04 ソニー株式会社 Information processing apparatus, information processing system, and information processing method
JP6532279B2 (en) 2015-04-28 2019-06-19 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Movement control method and movement control device
JP6380469B2 (en) * 2016-06-23 2018-08-29 カシオ計算機株式会社 Robot, robot control method and program

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01295772A (en) * 1988-05-19 1989-11-29 Mitsubishi Heavy Ind Ltd Robot for space
JPH10166286A (en) * 1996-12-06 1998-06-23 Sony Corp Robot device, connecting device and actuator module
JPH11156765A (en) * 1997-11-30 1999-06-15 Sony Corp Robot device
JP2002154081A (en) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, its facial expression method and detecting method for step difference and lifting-up state
JP2003001578A (en) * 2001-06-26 2003-01-08 Casio Comput Co Ltd Robot, robot management system, robot control program, robot management processing program, robot management method, and instrument management system, instrument management processing program, instrument management method
JP2003006532A (en) * 2001-06-27 2003-01-10 Fujitsu Ltd Movable robot and service providing system through server using the acquired image
JP2003181783A (en) * 2001-12-17 2003-07-02 Fuji Photo Film Co Ltd Information communication apparatus
JP2003291083A (en) * 2002-03-28 2003-10-14 Toshiba Corp Robot device, robot controlling method, and robot delivery system
JP2003340762A (en) * 2002-05-24 2003-12-02 Mitsubishi Heavy Ind Ltd Robot and robot system
JP2003345435A (en) * 2002-05-24 2003-12-05 Mitsubishi Heavy Ind Ltd Robot and robot system
JP2005514213A (en) * 2001-10-17 2005-05-19 ウィリアム・マーシュ・ライス・ユニバーシティ Autonomous robot crawler for in-pipe inspection

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11109847A (en) * 1997-10-01 1999-04-23 Sony Corp Cell and multicellular robot

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01295772A (en) * 1988-05-19 1989-11-29 Mitsubishi Heavy Ind Ltd Robot for space
JPH10166286A (en) * 1996-12-06 1998-06-23 Sony Corp Robot device, connecting device and actuator module
JPH11156765A (en) * 1997-11-30 1999-06-15 Sony Corp Robot device
JP2002154081A (en) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, its facial expression method and detecting method for step difference and lifting-up state
JP2003001578A (en) * 2001-06-26 2003-01-08 Casio Comput Co Ltd Robot, robot management system, robot control program, robot management processing program, robot management method, and instrument management system, instrument management processing program, instrument management method
JP2003006532A (en) * 2001-06-27 2003-01-10 Fujitsu Ltd Movable robot and service providing system through server using the acquired image
JP2005514213A (en) * 2001-10-17 2005-05-19 ウィリアム・マーシュ・ライス・ユニバーシティ Autonomous robot crawler for in-pipe inspection
JP2003181783A (en) * 2001-12-17 2003-07-02 Fuji Photo Film Co Ltd Information communication apparatus
JP2003291083A (en) * 2002-03-28 2003-10-14 Toshiba Corp Robot device, robot controlling method, and robot delivery system
JP2003340762A (en) * 2002-05-24 2003-12-02 Mitsubishi Heavy Ind Ltd Robot and robot system
JP2003345435A (en) * 2002-05-24 2003-12-05 Mitsubishi Heavy Ind Ltd Robot and robot system

Also Published As

Publication number Publication date
JP2005313303A (en) 2005-11-10

Similar Documents

Publication Publication Date Title
Fong et al. Vehicle teleoperation interfaces
AU2006306522B2 (en) Networked multi-role robotic vehicle
US20120095619A1 (en) Remote Vehicle Missions and Systems for Supporting Remote Vehicle Missions
EP2194435A2 (en) Garment worn by the operator of a semi-autonomous machine
JP2004306242A (en) Home robot control system and home robot application method of it
US20090037024A1 (en) Robot Operator Control Unit Configuration System and Method
JP2004160653A (en) Home robot control system and home robot operating method
JP2006172462A (en) Communication method and communication system
Pinto et al. The LSTS toolchain for networked vehicle systems
Fong et al. Collaborative control: A robot-centric model for vehicle teleoperation
Grange et al. Effective vehicle teleoperation on the world wide web
US20130054024A1 (en) Universal Payload Abstraction
EP2677384B1 (en) Remote control of an autonomous vehicle
Kohlbrecher et al. Human‐robot teaming for rescue missions: Team ViGIR's approach to the 2013 DARPA Robotics Challenge Trials
US6009381A (en) Remote control measuring system
KR20110064861A (en) Mobile robot based on a crowed intelligence, method for controlling the same and watching robot system
JP2003514294A (en) The systems and methods associated directivity to marked and the selected information technical components subject
KR20170057084A (en) Apparatus and method for traning model for autonomous driving, autonomous driving apparatus
US9292015B2 (en) Universal construction robotics interface
Rothermich et al. Distributed localization and mapping with a robotic swarm
Yoshimi et al. Development of a concept model of a robotic information home appliance, ApriAlpha
US20090157228A1 (en) User interface device of remote control system for robot device and method using the same
CN102141797B (en) Airport terminal service robot and control method thereof
US20160116912A1 (en) System and method for controlling unmanned vehicles
CN105058389A (en) Robot system, robot control method, and robot

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070418

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100608

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100805

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110111

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110223

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110322

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110325

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

LAPS Cancellation because of no payment of annual fees