WO2002030626A1 - Robot control system and robot control method - Google Patents
Robot control system and robot control method
- Publication number
- WO2002030626A1 (PCT/JP2001/008846)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- control
- information
- control module
- control unit
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40304—Modular structure
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/99941—Database schema or data structure
- Y10S707/99943—Generating database or data structure, e.g. via user interface
Definitions
- The present invention relates to a robot control system that controls an articulated robot, such as a legged walking robot, using a software program, and in particular to the software for an articulated robot whose hardware configuration may change significantly due to the attachment, detachment, or replacement of each operation unit, such as the legs and head.
- More specifically, the present invention relates to a robot control system that controls an articulated robot using software consisting of a combination of a software layer highly dependent on the hardware configuration and a software layer independent of the hardware configuration, dynamically changing that combination, and to a program interface between the software layers.
- A mechanical device that performs motions resembling human motions by using electric or magnetic action is called a "robot". The word "robot" is said to derive from the Slavic word ROBOTA (slave machine).
- Robots began to spread in the late 1960s, but most of them were industrial robots, such as manipulators and transport robots, aimed at automating and making unmanned the production operations in factories.
- Recently, progress has been made in research and development of legged mobile robots, such as pet robots that mimic the body mechanism and movements of four-legged animals like cats, and humanoid robots that mimic the body mechanism and movements of bipedal walking animals such as humans and monkeys.
- One use of the legged mobile robot is to perform, on behalf of humans, various difficult tasks in industrial and production activities: for example, substituting for dangerous or difficult work such as maintenance at nuclear power plants, thermal power plants, or petrochemical plants, transport and assembly of parts at manufacturing plants, cleaning of high-rise buildings, and rescue at fire sites.
- Another use of the legged mobile robot is not work support as described above, but life-oriented use, that is, "symbiosis" with humans or "entertainment".
- This type of robot emulates the movement mechanisms and rich emotional expression of relatively intelligent legged animals, such as humans and dogs (pets), using its four limbs. It is also required not merely to execute pre-entered motion patterns faithfully, but to realize lively responsive expression that reacts dynamically to words and attitudes received from the user (or another robot), such as "praise", "scolding", and "slapping".
- Intelligent robots are equipped with behavior models and learning models for movement, and determine their movements by changing these models based on input information such as external sounds, images, and tactile sensations, thereby realizing autonomous thinking and motion control.
- The robot can express autonomous behavior according to its own emotions and instincts.
- The robot is equipped with an image input device and a voice input/output device and performs image recognition processing and voice recognition processing, which also makes it possible to realize realistic communication with humans at a higher intellectual level.
- Recent legged mobile robots have high information processing ability, and such intelligent robots can themselves be regarded as a kind of computer system.
- Robots model and maintain various rules related to movement, such as emotion models, behavior models, and learning models, and respond to external factors, such as the user's actions, according to these models.
- The action plan can then be embodied through the driving of each joint actuator and through voice output, providing feedback to the user.
- The drafting of such an action plan and the robot operation control that embodies the action plan on the robot body are implemented in the form of program code (for example, an application) executed on a computer system.
- For example, conceivable configurations include a mobile robot in which a head, legs, and a tail are attached to the torso, and a robot that consists of only a torso and wheels.
- A control software layer, such as middleware, that performs hardware operation depends heavily on the hardware configuration. For example, the stability criteria used when moving and walking are completely different depending on whether the moving means is movable legs or wheels, and whether two or four legs are used. Therefore, the operating environment for executing an application varies greatly between systems.
- Hardware-independent software is, for example, application layer software that performs processes with little relation to hardware operations, such as emotion models, behavior models, and learning models.
- Hardware-dependent software is, for example, middleware-layer software composed of a collection of software modules that provide the basic functions of the robot; the configuration of each module is affected by hardware attributes such as the robot's mechanical and electrical characteristics, specifications, and shape.
- The middleware can be broadly divided into recognition middleware, which processes input from the sensors of each unit, recognizes it, and notifies the upper-level application, and output middleware, which performs hardware drive control such as driving each joint actuator in accordance with commands issued by the application.
- By introducing middleware that conforms to the hardware configuration into a robot, the same application can be executed on robots with various hardware configurations.
- new software can be supplied via removable media, or software can be downloaded via a network.
- New software, such as middleware, can be easily introduced into the robot body by simply loading the media into a slot.
- An object of the present invention is to provide an excellent robot control system that controls a multi-joint robot such as a legged walking type using a software program.
- A further object of the present invention is to provide an excellent robot control system that uses a software program to control an articulated robot whose hardware configuration may change drastically with the attachment, detachment, or replacement of each motion unit, such as the legs and head.
- A further object of the present invention is to provide an excellent robot control system that controls an articulated robot using a software program consisting of a combination of a software layer highly dependent on the hardware configuration and a software layer independent of the hardware configuration, and to provide a program interface between the software layers.
- A further object of the present invention is to provide an excellent robot control system that controls an articulated robot by dynamically changing the combination of a hardware-dependent software layer, such as middleware, and a hardware-independent software layer, such as an application, and to provide a program interface between the respective software layers.
- A first aspect of the present invention is a robot control system that controls the operation of a robot composed of a combination of a plurality of hardware components, the system comprising:
- a first control unit that performs processing independent of the hardware configuration information of the robot;
- a second control unit that performs processing dependent on the hardware configuration information of the robot; and
- a communication unit that performs communication between the first and second control units.
- Here, "system" refers to a logical assembly of multiple devices (or functional modules that realize specific functions); whether each device or functional module is within a single housing does not particularly matter.
- The first control unit mentioned here is implemented by application-layer software independent of the hardware configuration. The second control unit is implemented by middleware-layer software highly dependent on the hardware configuration. The communication unit can be implemented in the form of a program interface that realizes data exchange between the application and the middleware.
- The communication unit is an interface between the application layer and the middleware layer; by establishing a format for exchanging data and commands between the software layers, any combination of application and middleware can be permitted.
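- As a rough illustration of this three-part structure, a minimal C++ sketch might look as follows; all type names and signatures here are invented for illustration and are not the patent's actual interface:

```cpp
#include <cstdint>

// Hardware-independent application layer (first control unit).
struct IApplication {
    virtual void onInformation(uint32_t informationID, double value) = 0;
    virtual ~IApplication() = default;
};

// Hardware-dependent middleware layer (second control unit).
struct IMiddleware {
    virtual void execute(uint32_t commandID) = 0;
    virtual ~IMiddleware() = default;
};

// The communication unit forwards detected information upward and commands
// downward in an agreed format, so any application can be paired with any
// middleware that supports the same interface.
struct CommunicationUnit {
    IApplication* app = nullptr;
    IMiddleware*  mw  = nullptr;
    void notify(uint32_t informationID, double value) { app->onInformation(informationID, value); }
    void command(uint32_t commandID)                  { mw->execute(commandID); }
};
```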
- the first control unit is realized, for example, by executing application software that determines a robot action sequence using a model that abstracts the configuration and operation of the robot.
- The application software has, for example, an emotion model that models the robot's emotions, an instinct model that models its instincts, a learning model that sequentially stores the causal relationships between external events and the actions taken by the robot, and a behavior model that models behavior patterns.
- The second control unit is realized by executing, for example, middleware software that provides the basic functions on the robot body.
- The middleware software consists of, for example, a recognition-system processing unit that receives input data detected from the robot hardware via the system control layer and detects external factors, such as distance detection, posture detection, and contact, while taking the hardware configuration into account, and an output-system processing unit that processes operation control on the robot body based on commands from the application.
- the communication unit notifies the first control unit of information detected by the recognition system processing unit, and transfers a command by the first control unit to the output system processing unit.
- The communication unit may include an information communication interface through which information is notified from the second control unit to the first control unit, and a command interface through which the first control unit controls the second control unit.
- the communication unit may include an information database for the first control unit to specify, on a semantic basis, information to be acquired from the second control unit.
- In response, the second control unit can transfer the corresponding information to the first control unit.
- the communication unit may include a command database for specifying a command that the first control unit wants to issue to the second control unit on a semantic basis.
- The first control unit can select a command on a semantic basis by using the command database.
- The communication unit may have a feedback interface that notifies the second control unit of a recognition result obtained by the first control unit, and notifies the first control unit of the relationship between the recognition result and the actions that the second control unit can perform.
- The first control unit and the second control unit may be configured so that they can be handled independently of each other.
- the communication unit may notify the first control unit of a system event detected by the second control unit.
- The communication unit may include means for notifying the first control unit of a shutdown factor detected by the second control unit, and means for notifying the second control unit of a resume condition for the shutdown set by the first control unit. It may further include means for notifying the first control unit of a recommended resume condition set by the second control unit.
- According to the robot control system, a semantic-based program interface and databases are provided between the middleware layer, which depends on the hardware configuration of the robot, and the application layer, which does not, so that normal operation is always ensured even if the combination of middleware and applications installed on the robot is changed.
- A second aspect of the present invention is a robot control method for controlling the operation of a robot composed of a combination of a plurality of hardware components, using a first control module that performs processing that does not depend on the hardware configuration information of the robot and a second control module that performs processing that does depend on the hardware configuration information of the robot, the method comprising a communication step of performing communication between the first control module and the second control module.
- the first control module mentioned here is implemented by application layer software independent of the hardware configuration.
- the second control module is implemented by middleware layer software that is highly dependent on the hardware configuration.
- the communication step can be implemented in the form of a program interface that realizes data exchange processing between the application and the middleware.
- That is, an interface between the application layer and the middleware layer is realized, establishing a format for exchanging data and commands between the software layers.
- The first control module is implemented by application software that determines the robot's behavior sequence using models that abstract the robot's configuration and operation.
- The application software consists of, for example, an emotion model that models the robot's emotions, an instinct model that models its instincts, a learning model that sequentially stores the causal relationships between external events and the actions taken by the robot, and a behavior model that models behavior patterns.
- The second control module is implemented by middleware software that provides the basic functions on the robot body.
- The middleware consists of, for example, a recognition-system processing module that receives input data detected from the robot hardware via the system control layer and detects external factors, such as distance detection, posture detection, and contact, while taking the hardware configuration into account, and an output-system processing module that processes operation control on the robot body based on commands from the application.
- The communication step may include notifying the first control module of detection information obtained by executing the recognition processing module, and transferring commands issued by executing the first control module to the output processing module.
- an information communication interface that notifies the first control module of information from the second control module may be used.
- the first control module may use a command interface for controlling the second control module.
- In the communication step, the first control module may use an information database for specifying, on a semantic basis, the information to be acquired from the second control module.
- In response, the corresponding information can be transferred from the second control module to the first control module.
- In the communication step, the first control module may use a command database for specifying, on a semantic basis, a command to be issued to the second control module, and may select the command on a semantic basis.
- In the communication step, a feedback loop may be executed that notifies the second control module of a recognition result by the first control module, and notifies the first control module of the relationship between the recognition result and the actions possible in the second control module.
- The first control module and the second control module may be configured so that they can be handled independently of each other.
- the communication step may include a sub-step of notifying the first control module of a system event detected by the second control module.
- The communication step may include a sub-step of notifying the first control module of a shutdown factor detected by the second control module, and a sub-step of notifying the second control module of a resume condition for the shutdown set by the first control module.
- the communication step may further include a sub-step of notifying the first control module of a recommended resume condition set by the second control module.
- According to the robot control method, a semantic-based interface and databases are provided between the middleware layer, which depends on the hardware configuration of the robot, and the application layer, which does not, so that normal operation is always ensured even if the combination of middleware and applications installed on the robot is changed.
- the application can obtain appropriate input data via the middleware or issue an appropriate command.
- A third aspect of the present invention is a robot control system configured by an object-oriented program, comprising an information database in which information exchanged between objects is registered.
- In the information database, the registered information may be described hierarchically in terms of its semantic aspects. Further, it is preferable that the information database format includes at least an information identification information field, a classification field, and a sensor identification information field.
- FIG. 1 is a diagram schematically showing a hardware configuration of a robot used for carrying out the present invention.
- FIG. 2 is a diagram showing an example in which the configuration of the drive system subsystem 50 is changed by attaching / detaching / replacement of CPC components.
- FIG. 3 is a diagram showing an example in which the configuration of the drive system subsystem 50 is changed by attaching / detaching / replacement of a CPC component.
- FIG. 4 is a diagram showing an example of the format of the information database.
- FIG. 5 is a diagram showing an example of record entries in the information database.
- FIG. 6 is a diagram showing an example of record entries in the information database.
- FIG. 7 is a diagram schematically showing a state in which information used by an application is registered, and a state in which the registered information is notified via middleware.
- FIG. 8 is a diagram showing a mechanism by which an application object requests sensor value information from a middleware object.
- FIG. 9 is a chart schematically showing the exchange between the objects for the application object to acquire the sensor value information.
- FIG. 10 is a diagram schematically showing a data structure of sensor value information.
- FIG. 11 is a diagram schematically showing a data structure of header information.
- FIG. 12 is a diagram schematically showing the data structure of sensor information (SensorInfo).
- FIG. 13 is a diagram showing how the sensor position is converted to the reference coordinates by Tmatrix.
- FIG. 14 is a diagram showing a manner in which the application object issues a command according to the command information notified by the feedback loop using the feedback interface.
- FIG. 15 is a diagram schematically showing the data structure of target information (TargetInfo) used in the feedback loop.
- FIG. 16 is a diagram schematically showing the structure of the TargetStatus indicating the status of the target.
- FIG. 17 is a diagram schematically showing the Size3D data structure, in which a target's size is represented by a rectangular parallelepiped.
- FIG. 18 is a diagram schematically showing the data structure of the variable Enable3D in FIG. 17.
- FIG. 19 is a diagram schematically showing the data structure of command information (CommandInfo) used in the feedback loop.
- FIG. 20 is a diagram schematically showing the data structure of Status indicating the relationship with the target.
- FIG. 21 is a diagram schematically illustrating processing of a system event via an application interface.
- FIG. 22 is a diagram schematically showing a data structure of a system event (SysEvent).
- FIG. 23 is a diagram showing an example of use of the aid.
- FIG. 24 is a diagram showing an example of using aid.
- FIG. 25 is a diagram showing an example of processing for notifying a shutdown cause and setting a resume condition via an application interface.
- FIG. 26 is a diagram schematically showing the data structure of the shutdown event PowerEvent that the middleware object notifies to the application object.
- FIG. 27 is a diagram schematically showing the data structure of the resume condition ResumeCondition that the application object transfers to the middleware object.
- FIG. 28 is a diagram showing an example of the format of the command database.
- FIG. 29 is a diagram showing an example of record entries in the command database.
- FIG. 30 is a diagram showing an example of record entries in the command database.
- FIG. 31 is a diagram showing a mechanism for issuing commands and notifying statuses between an application object and a middleware object.
- FIG. 32 is a diagram schematically showing, on a chart, exchanges between the respective objects for executing the commands issued by the application objects.
- FIG. 33 is a diagram schematically showing a data structure of a command issued by the application object.
- FIG. 34 is a diagram schematically showing a data structure of TargetID.
- FIG. 35 is a diagram schematically showing the data structure of the status that the middleware object returns to the application object.
- FIG. 36 is a diagram schematically showing the data structure of resource information (ResourceInfo).
BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 schematically illustrates a hardware configuration of a robot used for implementing the present invention.
- The robot hardware is composed of a control subsystem 10 and a drive subsystem 50.
- the control subsystem of the robot includes a CPU (Central Processing Unit) 11, a main memory 12, a fixed memory device 13, and a replaceable memory device 14.
- Under the control of the system control software, the CPU 11, as the main controller, executes hardware-independent programs, such as applications, and hardware-dependent programs, such as middleware, and controls the overall operation of the robot.
- The CPU 11 is bus-connected to memory, other circuit components, and peripheral devices. Each device on the bus is assigned a unique address (memory address or I/O address), and the CPU 11 can communicate with a specific device on the bus by specifying its address.
- The bus is a common signal transmission path that includes an address bus, a data bus, and a control bus.
- The main memory 12 is usually a volatile storage device composed of a plurality of DRAM (Dynamic Random Access Memory) chips, and is used to load the program code executed by the CPU 11 and to temporarily store its work data.
- Program code, such as applications and middleware, supplied from the fixed memory device 13 or the replaceable memory device 14 is loaded into the main memory 12, that is, mapped into the memory space.
- the fixed memory device 13 is a non-replaceable non-volatile storage device fixedly attached to the robot body.
- The fixed memory device 13 can be configured using a nonvolatile memory element that is programmable by applying a write voltage, such as flash memory.
- The fixed memory device 13 is used for storing program code such as applications for controlling the robot's operation and thinking and middleware for hardware operation. However, since the fixed memory device 13 is fixedly installed in the machine, for hardware-dependent software such as middleware, it is preferable to store in the fixed memory device 13 a version suitable for the hardware configuration at the time of robot shipment (default), that is, the standard hardware configuration.
- the replaceable memory device 14 is a non-volatile storage device that is detachably attached to the robot and that is replaceable.
- For example, the replaceable memory device 14 is configured using a cartridge-type storage medium, such as a memory card or memory stick, and is provided for interchangeable use by being loaded into a predetermined memory slot on the machine.
- Like the fixed memory device 13, the replaceable memory device 14 is used to store program code such as applications for controlling the robot's operation and thinking and middleware for hardware operation. However, since the replaceable memory device 14 is attached to the robot body so that it can be attached, detached, and replaced, and is assumed to be used across models with different hardware configurations, it can be used to supply the latest software to the machine.
- For hardware-dependent software such as middleware, there is little need to store on the replaceable memory device a version compatible with the hardware configuration at the time of robot shipment (default), that is, the standard hardware configuration. Rather, it is preferable that middleware capable of providing an operating environment for the hardware configuration assumed by an application be stored in the replaceable memory device 14 in combination with that application.
- The drive subsystem 50 of the robot includes drive units such as joint actuators, their drive control circuits, encoders for motion detection, and various sensors (not shown) such as a camera and contact sensors.
- the drive system subsystem 50 is handled in units of drive units such as the head, the body, and the legs.
- each physical component is provided with unique identification information, that is, a component ID.
- The CPU 11 of the control subsystem 10 (more specifically, the system control software running on the CPU 11) accesses each mounted physical component via the bus, can transfer control commands to each physical component, and can acquire the component ID of each.
- the detected combination of component IDs becomes the current hardware configuration information in the robot.
- FIG. 2 shows an example in which the configuration of the drive system subsystem 50 is changed by detaching and replacing the CPC component.
- In FIG. 2, multiple physical components, such as the head, legs, and tail, are attached to the torso (that is, the hardware configuration information is {torso ID, head ID, leg ID, tail ID}).
- In FIG. 3, wheels are attached to the torso instead of legs (that is, the hardware configuration information is {torso ID, wheel ID}).
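- As a minimal sketch of this idea (the IDs and function names are invented for illustration), the hardware configuration information can be modeled as a set of component IDs:

```cpp
#include <cstdint>
#include <set>

using ComponentID = uint32_t;

// The current hardware configuration information is simply the set of
// component IDs read back from the attached physical components.
std::set<ComponentID> detectConfiguration(const std::set<ComponentID>& idsOnBus) {
    return idsOnBus;  // e.g. {torsoID, headID, legID, tailID} or {torsoID, wheelID}
}

// Comparing sets tells the system control software whether components were
// attached, detached, or replaced.
bool configurationChanged(const std::set<ComponentID>& before,
                          const std::set<ComponentID>& now) {
    return before != now;
}
```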
- The robot control software is composed of an application layer that does not depend on the hardware configuration of the robot, a middleware layer that does depend on the hardware configuration, and device drivers in the lowest layer.
- The software of each layer controls the robot hardware, that is, the drive subsystem 50 described above, under the control of a predetermined operating system (OS).
- object-oriented programming can be adopted in the software design of each layer.
- In object-oriented software, processing is handled in units of "objects", each of which integrates data and the processing procedures for that data.
- Communication between the application layer and the middleware layer is performed via a predetermined programming interface (hereinafter referred to as the "application interface").
- Various combinations of applications and middleware are permitted by introducing software via the replaceable memory device 14 or the like.
- Compatibility between each application and middleware is provided by this application's interface.
- The mechanism of the application interface will be explained in detail later.
- The application layer and the middleware layer each consist of multiple objects.
- the application layer and the middleware layer are under the control of an object management system.
- Communication between the middleware layer and the lowest-level device drivers, that is, the exchange of data, is performed via a predetermined programming interface (hereinafter referred to as the "device driver interface").
- That is, hardware operations, such as issuing control commands to the drive joints and inputting the detection value of each sensor, are not performed directly by the middleware, but through the device driver corresponding to each hardware component.
- The application consists of an emotion model that models the robot's emotions, an instinct model that models its instincts, a learning model that sequentially stores the causal relationships between external events and the actions taken by the robot, and a behavior model that models behavior patterns, and outputs the action determined by the behavior model based on sensor input information, that is, external factors.
- the emotion model and instinct model have recognition results and action histories as inputs, respectively, and manage emotion values and instinct values.
- the behavior model can refer to these emotion values and instinct values.
- the learning model updates the action selection probability based on the learning instruction from the outside (operator), and supplies the updated content to the action model.
- An application is hardware-independent software that is not affected by hardware attributes, because it performs its computations using models that abstract the robot's configuration and operation.
- the middleware layer is a collection of software modules that provide basic functions on the robot body.
- The configuration of each module is affected by hardware attributes such as the robot's mechanical and electrical characteristics, specifications, and shape; that is, the middleware is hardware-dependent software.
- the middleware layer can be functionally divided into recognition middleware and output middleware.
- The recognition middleware receives raw data from the hardware, such as image data, audio data, and detection data obtained from other sensors, via the system control layer, and processes it. That is, based on the various types of input information, it performs processing such as voice recognition, distance detection, posture detection, contact detection, motion detection, and color recognition, and obtains recognition results (for example: ball detected, fall detected, stroked, hit, the do-mi-so scale heard, moving object detected, obstacle detected, obstacle recognized). The recognition results are notified to the upper application layer and used for action planning and the like.
- The output middleware provides functions such as walking, reproducing motions, synthesizing output sounds, and controlling the lighting of the LEDs that correspond to the eyes. That is, it receives commands related to the action plan drafted in the application layer and, for each robot function, generates the command values for each joint, the output sounds, and the output light (LEDs), which are then demonstrated on the robot body via a virtual robot.
- In other words, by giving abstract action commands (for example: move forward, move backward, rejoice, bark, sleep, do gymnastics, be surprised, track), the application can control the operation of the robot's joint actuators and other output units on the robot body.
- In the present embodiment, information communication on a semantic basis is prepared between the application and the middleware, and databases can be shared between them.
- These databases include an information database (InformationDB) that applications use to identify input information on a semantic basis, and a command database (CommandDB) that applications use to select execution commands on a semantic basis. The details of each database will be described later.
- Each layer of software for robot control is provided in the robot 1 by a fixed memory device 13 or a replaceable memory device 14.
- the software of each layer is deployed on the main memory 12, that is, used by being mapped on the memory space.
- an application interface is defined between a middle layer that depends on the hardware configuration of the robot and an application layer that does not depend on the hardware configuration such as action selection.
- As a result, robots with various hardware configurations can be handled by a single application. (Users can move, distribute, or sell applications they own (or have "raised") between robot bodies.)
- By configuring middleware and applications so that they can be handled independently, various applications can be supported by middleware on a robot body with a single hardware configuration. Since the reusability of middleware increases, application development efficiency improves.
- Each software development vendor can concentrate on developing functions in its area of specialty. For example, a vendor specializing in control systems can focus on middleware development while various applications are used with it.
- the application interface according to the present embodiment includes the following function realizing means.
- The application can obtain input information for learning (for example, being "patted" by the user) without being aware of the actual location of the sensor that is the information source. Even if the assumed sensor is absent, an alternative sensor (for example, some kind of switch on the head) can be substituted to perceive being "stroked", and this can be connected to the relevant emotion (for example, a pleasant emotion).
- the middleware prepares an information database (described later) to identify input information on a semantic basis. By registering the information used in the InformationDB, the application can acquire the input information on a semantic basis.
- The application can execute a required action without being aware of what motions or effects the middleware actually activates.
- The middleware prepares a command database (CommandDB, described later) to realize command selection on a semantic basis.
- A HeaderInfo area (described later) is defined within the sensor information to provide information on the robot's posture and moving speed.
- The TargetPos of the Command provides the application layer with information such as whether a robot with the current hardware configuration can pass through a detected gap.
- The application layer can perform processing that varies with the structure of the middleware, such as a recovery operation from a fall, without having to understand the details, and the states of the middleware and the application can be kept consistent.
- The method of recovering from a fall posture depends on the types of postures supported by the middleware and, of course, on the performance of the middleware.
- Shutdown factors and resume conditions are closely related. For example, if the battery temperature rises too high during discharge, the time required for the temperature to fall back to an appropriate level depends on characteristics of the machine, such as the heat radiation performance of the hardware. The middleware notifies the application of the next startable time as a recommended resume condition at that point. The application sets the final resume condition based on the recommended condition and the application's own convenience, and notifies the middleware at the time of shutdown. (Shutdown/Resume Condition)
- The middleware's software module in charge of tracking can obtain the recognition result specified by a command in the application layer directly from the recognition-result service, so processing can be sped up. (TargetID of TargetInfo/Command)
- The system searches the ROM mounted on each physical component and notifies the information.
- Information Database (InformationDB):
- The information used by an application can be selected on a semantic basis from among the data supplied through the application interface.
- The middleware prepares an information database. By registering, on a semantic basis, the information it intends to use in the information database, the application can obtain the necessary input information.
- That is, the application can obtain input information for learning (for example, being "patted" by the user) without being aware of the actual location of the sensor that is the information source. Even if the assumed sensor is absent, an alternative sensor can be substituted to perceive being "stroked", and this can be connected to the relevant emotion (for example, a pleasant emotion).
- FIG. 4 shows an example of the format of the information database.
- One record of the information database is composed of an information identification information (InformationID) field, a classification field, a sensor value information field, and a sensor identification information (SensorID) field.
- Information identification information is a unique identifier that is uniquely assigned to a record, and can be determined by middleware.
- Classification fields are further subdivided into Compatible, Large, Medium, and Small fields.
- the compatibility field is a flag that indicates whether the subsequent ID is a standard or extended ID.
- The major, medium, and minor categories are entered in the large, medium, and small fields, respectively.
- The classification takes into account that information is selected on a semantic basis; that is, the semantic aspects of the information are described hierarchically. By reading the database and then selecting the necessary information, operation is possible even with different combinations of databases.
- The small classification takes a form close to the actual hardware structure (the unit system and the like are determined at this point).
- the sensor value information field includes fields for entering the maximum value, minimum value, and resolution of the corresponding sensor input.
- The sensor identification information (SensorID) field contains fields for specifying the sensor component and the site where the sensor is mounted, as well as flags that indicate compatibility with respect to the component and site.
- the classification and the sensor identification information are the decisive factors for selecting the relevant record during a database search.
- Software such as an application searches the information database of the middleware with which it is combined at execution time, obtains the required information identification information, extracts the relevant information from the information notified during operation using the information identification information as a search key, and applies the necessary processing.
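- A minimal sketch of such a record and a semantic lookup is shown below; all field types and helper names are assumptions for illustration, not the patent's actual definitions:

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// One record of the information database (InformationDB), following the
// format described above: identification, semantic classification,
// sensor value range, and sensor identification.
struct InformationRecord {
    uint32_t informationID;            // unique ID, assigned by the middleware
    bool     compatible;               // standard (true) or extended (false) ID
    std::string major, medium, minor;  // hierarchical semantic classification
    double   maxValue, minValue, resolution;  // sensor value information
    uint32_t sensorComponent;          // sensor component ID
    uint32_t sensorSite;               // site where the sensor is mounted
};

// Semantic-based search: the application asks for information by meaning
// (e.g. "touch/head/stroke") and receives the InformationID to use later
// as a search key when sensor value notifications arrive.
std::optional<uint32_t> findInformationID(
        const std::vector<InformationRecord>& db,
        const std::string& major, const std::string& medium,
        const std::string& minor) {
    for (const auto& rec : db) {
        if (rec.major == major && rec.medium == medium && rec.minor == minor)
            return rec.informationID;
    }
    return std::nullopt;  // the combined middleware does not supply this information
}
```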
- FIG. 5 and FIG. 6 show examples of record entries in the information database.
- the records with information IDs 101 and 102 actually designate the same sensor.
- the 200 and 201 records actually specify the same sensor.
- Record 201 differs from 200 in that it is an image-linked type, in which the relationship between the location measured by the PSD and the location captured by the camera is obtained.
- the application can receive notification of the corresponding sensor input information from the middleware.
- FIG. 7 schematically shows a state in which information used by the application object is registered, and a state in which the registered information is notified to the application object via the middleware object.
- a description will be given of a mechanism in which the application object registers the usage information, and a mechanism in which the application object receives the registered information.
- The object management system searches the information database using the semantic-based classification "stroked" as a search key, finds the relevant record, and returns its information identification information to the requesting application object (T2).
- If the application object is satisfied with this return value, it declares (that is, registers) to the object management system that the information will be used (T3), and notifies the middleware object of the Ready state, that is, that input of the corresponding sensor value information is OK (T4). In response to this declaration, the object management system confirms the registration with the middleware object (T5).
- The middleware object responds to the registration confirmation by activating the driver object that operates the sensor specified by the corresponding record of the information database, that is, by opening the sensor (T6).
- As a result, the middleware object can input sensor value information moment by moment (T7).
- Then, in accordance with the registered contents, the middleware object collectively notifies each application object of the input information from each sensor (T8).
- The mechanism by which the application object registers usage information in the information database in order to input desired sensor value information is as described above. Next, the mechanism by which the application object acquires sensor value information, and the details of the sensor value information, will be described.
- FIG. 8 illustrates the mechanism by which the application object requests sensor value information from the middleware object.
- The application object obtains from the information database the information identification information needed to acquire the necessary information. Using this information identification information, the application object issues a read request (Read) to the middleware object.
- In response to the Read request, the middleware object notifies the application object of the sensor value information input via the driver object.
- FIG. 9 schematically shows, on a chart, the exchange between the objects by which the application object obtains the sensor value information.
- the application object issues a Read request to the middleware object when sensor value information is needed.
- The middleware object inputs sensor values from each sensor via the driver objects, and returns to the application object the sensor value corresponding to the requested information identification information.
- Sensor values input while the application object is not in the Ready state are discarded rather than transferred to the application object.
- FIG. 10 schematically illustrates the data structure of the sensor value information.
- the sensor value information is composed of a header section and a main body section.
- The header part consists of header information (HeaderInfo) and sensor information (SensorInfo).
- In the header part, information that cannot be described in the database and that changes during operation can be added. Since multiple pieces of information are sent at one time, the corresponding header can be found by searching with the information identification information; the offset to the value information is written in the header thus found.
- The body part is composed of a series of data, and the corresponding sensor value can be extracted using the header information and the sensor information.
- FIG. 11 schematically illustrates the data structure of the header information.
- The header information consists of the header size (HeaderSize), body size (BodySize), number of sensors (NumOfSensor), time (Time), robot speed (RobotSpeed), and posture identification information (PostureID).
- the translation speed component and the rotation speed component in each of the x, y, and z axes are written as the robot speed (RobotSpeed).
- The posture is written as posture identification information (PostureID), which is defined as a common concept between the layers. The posture identification information is defined by the following data structure:
- typedef byte PostureID;
- FIG. 12 schematically shows the data structure of the sensor information (SensorInfo).
- The format of the body part is described in the sensor information (SensorInfo).
- The sensor information includes the information identification information (InformationID) and information for accessing, within the body part, the information specified by the classification.
- The information for accessing the body part consists of the offset into the body part (offset), the vertical size (vertical), the horizontal size (horizontal), the number of skips (skip), the number of steps (step), and so on.
- Limit is a flag indicating that the sensor itself is at its end point and cannot be moved further.
- Tmatrix is information indicating the state of the sensor, and more specifically, is a 4 ⁇ 4 matrix for converting sensor coordinates into reference coordinates.
- FIG. 13 shows how the sensor position is converted to the reference coordinates by Tmatrix.
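- The following is a rough sketch of the sensor value information layout described above (FIGS. 10 to 12); the exact field widths, types, and ordering are assumptions:

```cpp
#include <cstdint>

using byte = uint8_t;
typedef byte PostureID;  // posture identification, as defined above

// RobotSpeed: translation and rotation speed components per axis.
struct RobotSpeed {
    float translation[3];  // translation speed along x, y, z
    float rotation[3];     // rotation speed about x, y, z
};

// Header information (FIG. 11).
struct HeaderInfo {
    uint32_t   headerSize;   // HeaderSize
    uint32_t   bodySize;     // BodySize
    uint32_t   numOfSensor;  // NumOfSensor
    uint64_t   time;         // Time
    RobotSpeed robotSpeed;   // RobotSpeed
    PostureID  posture;      // PostureID
};

// Sensor information (FIG. 12): how to access the value in the body part.
struct SensorInfo {
    uint32_t informationID;  // InformationID of the described information
    uint32_t offset;         // offset of the value within the body part
    uint16_t vertical;       // vertical size
    uint16_t horizontal;     // horizontal size
    uint16_t skip;           // number of skips
    uint16_t step;           // number of steps
    bool     limit;          // sensor is at its end point and cannot move
    float    tmatrix[4][4];  // converts sensor coordinates to reference coordinates
};
```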
- The application interface realizes a feedback loop that notifies the middleware of the recognition result detected by the application, and notifies the application of the relationship between the recognition result and the actions that the middleware can perform.
- A feedback interface is provided for this feedback loop.
- FIG. 14 illustrates how an application object issues a command according to the command information notified by a feedback loop using a feedback interface.
- The middleware object sends sensor value information, such as image data, to the application object based on the registered usage information.
- the application object recognizes and processes a predetermined target based on the sensor value information.
- The middleware object feeds back to the application object command information (CommandInfo) relating to the possible interactions with the target.
- the application object selects a selectable command based on the command information and issues a command to the middleware object.
- The middleware object performs the specified hardware operation through the driver objects, whereby the command is embodied on the robot body.
- FIG. 15 schematically shows the data structure of the target information (TargetInfo) used in the feedback loop.
- The target information is composed of Time, which indicates the time at which the information was detected; AppID, which identifies the target to be tracked; SensorID, which identifies the sensor used; TargetPos, which displays the position in polar coordinates with reference to the sensor; and Size3D, which shows the size of the target as a rectangular parallelepiped (see FIG. 17; see FIG. 18 for the data structure of the variable Enable3D in FIG. 17).
- FIG. 19 schematically shows the data structure of the command information (CommandInfo) used in the feedback loop.
- The command information is composed of Time, which indicates the time at which the information was detected; AppID, which identifies the target to be tracked; SensorID, which identifies the sensor used; Status (shown in FIG. 20), which indicates the relationship with the target; Category, which indicates the interaction command (that is, a selectable command); and Position3D, which indicates the positional relationship between the target and the center.
- Commands such as Kick and Touch vary depending on the robot configuration; Tracking is treated as a fixed command.
- The command information shown in FIG. 19 is a type that allows freedom of selection by Category. As a modification, a type without that degree of freedom, which specifies an ActionID represented as a sequence of CommandIDs, can be mentioned.
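- A rough sketch of these two feedback-loop structures (FIGS. 15 and 19) might look as follows; the field types and the polar-coordinate layout are assumptions:

```cpp
#include <cstdint>

// Assumed layout of a polar-coordinate position (TargetPos).
struct PolarPos { float distance, azimuth, elevation; };

// Target information (FIG. 15): what the application recognized.
struct TargetInfo {
    uint64_t time;       // Time the target was detected
    uint32_t appID;      // AppID identifying the target being tracked
    uint32_t sensorID;   // SensorID of the sensor used
    PolarPos targetPos;  // TargetPos: position in polar coordinates
    float    size3D[3];  // Size3D: target size as a rectangular parallelepiped
};

// Command information (FIG. 19): what interactions the middleware offers.
struct CommandInfo {
    uint64_t time;          // Time the information was detected
    uint32_t appID;         // AppID of the tracked target
    uint32_t sensorID;      // SensorID of the sensor used
    uint32_t status;        // Status: relationship with the target (FIG. 20)
    uint32_t category;      // Category: selectable interaction command
    float    position3D[3]; // Position3D: relationship between target and center
};
```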
- The application interface has a mechanism that, when a predetermined event occurring on the robot body, that is, a system event, is detected, notifies the application object.
- the application object executes a predetermined command in response to the notification of the system event.
- FIG. 21 illustrates the processing of a system event via the application interface.
- FIG. 22 shows the data structure of the system event (SysEvent) that the middleware object uses for event notification to the application object.
- A system event is composed of Time, which indicates the event occurrence time; Category, which indicates the interaction command (that is, a selectable command); NAction, which indicates the number of actions; and ActionID, represented as a sequence of CommandIDs.
- Processing corresponding to the system event can be assigned to the array variable aid[i] that constitutes the ActionID. Examples of using aid are shown in FIG. 23 and FIG. 24.
- System events differ from sensor value information in that they do not include header information (HeaderInfo) or sensor information (SensorInfo).
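- A minimal sketch of the SysEvent structure (FIG. 22); the field types and the use of a dynamic array for aid are assumptions:

```cpp
#include <cstdint>
#include <vector>

// System event (FIG. 22): no HeaderInfo or SensorInfo, unlike sensor values.
struct SysEvent {
    uint64_t time;              // Time the event occurred
    uint32_t category;          // Category: selectable interaction command
    uint32_t nAction;           // NAction: number of actions
    std::vector<uint32_t> aid;  // ActionID as a sequence of CommandIDs (aid[i])
};
```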
- The application interface includes a mechanism by which the middleware object notifies the application object of the shutdown factor, and a mechanism by which the application object sets, for the middleware object, the resume condition, that is, the condition for the next boot.
- Shutdown factors and resume conditions are closely related. For example, if the battery temperature rises too much during discharge, the time it takes for the temperature to fall to an appropriate level depends on characteristics of the machine, such as the heat dissipation performance of the hardware.
- Therefore, the middleware object notifies the application object of the next startable time as a recommended resume condition at that point.
- The application object sets the final resume condition based on the recommended resume condition and its own circumstances, and notifies the middleware object at the time of shutdown.
- FIG. 25 illustrates an example of the process of notifying a shutdown factor and setting a resume condition.
- the middleware object notifies the application object of the PowerEvent.
- When the middleware object notifies the PowerEvent, it also notifies the next startable time as a recommended resume condition (RecommendCondition).
- The application object sets the final resume condition (ResumeCondition) based on the recommended resume condition and its own convenience, and notifies the middleware object of it.
- FIG. 26 schematically shows a data structure of a shutdown event PowerEvent that the middleware object notifies the application object.
- The PowerEvent is composed of Time, which indicates the time when the shutdown factor occurred; Category, which indicates the interaction command (that is, a selectable command); RecommendCondition, the condition recommended as a resume condition; NAction, which indicates the number of actions; and ActionID, represented as a sequence of CommandIDs.
- FIG. 27 schematically shows a data structure of the ResumeCondition which the application object transfers to the middleware object in order to set the next boot condition.
- The ResumeCondition is composed of Mask, which indicates a mask for event notification; Event, which indicates the event at the time of shutdown; and ResumeTime, which sets the next scheduled boot time.
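- A rough sketch of the PowerEvent and ResumeCondition structures (FIGS. 26 and 27); field types and widths are assumptions:

```cpp
#include <cstdint>
#include <vector>

// Shutdown event (FIG. 26), notified from middleware to application.
struct PowerEvent {
    uint64_t time;                   // Time the shutdown factor occurred
    uint32_t category;               // Category of the event
    uint64_t recommendCondition;     // RecommendCondition: recommended resume condition
    uint32_t nAction;                // NAction: number of actions
    std::vector<uint32_t> actionID;  // ActionID as a CommandID sequence
};

// Resume condition (FIG. 27), set by the application for the next boot.
struct ResumeCondition {
    uint32_t mask;        // Mask for event notification
    uint32_t event;       // Event at the time of shutdown
    uint64_t resumeTime;  // ResumeTime: next scheduled boot time
};
```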
- The application interface according to the present embodiment provides a command interface through which the application object controls the middleware object.
- The middleware object prepares a command database so that the application object can select the command to execute, on a semantic basis, from among the commands supported by the middleware.
- The application object can thus execute required actions without being aware of what the middleware object actually activates.
- FIG. 28 shows an example of the format of the command database.
- One record of the command database is composed of a command identification information (CommandID) field, a classification field, a resource field, a posture information field, a sensor identification information (SensorID) field, an infinite execution flag field, a generation identification information (generationID) field, and a degree identification information (degreeID) field.
- The command identification information (CommandID) is a unique identifier uniquely assigned to a record, and can be determined by the middleware.
- Classification fields are further subdivided into Compatible, Large, Medium, and Small fields.
- the compatibility field is a flag that indicates whether the subsequent ID is a standard or extended ID.
- The large, medium, and small fields are filled with the major, medium, and minor categories, respectively. The classification takes into account that commands are selected on a semantic basis, that is, by the meaning of the information.
- The resource field is subdivided into bit fields for each management unit: motion (head, tail, and feet), sound (speaker), and special (head LED, tail LED).
- The posture information field is subdivided into fields that specify the start posture and end posture at the time of command execution, and posture fluctuation (the posture may vary during execution even if it is the same at the start and end).
- the infinite execution flag is a flag indicating that the execution does not end.
- The generation identification information (generationID) field represents, by a bit value at the corresponding bit position, whether the command can be used by each generation, such as adult, child, or infant. For example, 011 indicates that it can be used by both children and infants.
- the degree identification information (degree ID) field is a field for entering an index value indicating the degree of appearance of the command (for example, the amount of exercise).
- The command database contains a header (not shown); if the header matches, an extension can be used by understanding the meaning of the extension.
- Software such as an application object searches the command database of the middleware object with which it is combined at run time, and can select the action it wants to perform according to its current instincts, emotions, or other predefined scenarios.
- By passing the command identification information assigned to the selected action to the middleware object and having the corresponding action executed, the intended action can be realized on the robot body.
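- As an illustration, one command database record (FIG. 28) might be sketched as follows; the bit-field layouts and types are assumptions:

```cpp
#include <cstdint>

// One record of the command database (CommandDB), following FIG. 28.
struct CommandRecord {
    uint32_t commandID;      // unique ID, assigned by the middleware
    bool     compatible;     // standard or extended ID
    uint16_t major, medium, minor;  // hierarchical semantic classification
    uint16_t resources;      // bit field: motion (head/tail/feet), sound, special (LEDs)
    uint8_t  startPosture;   // posture required at command start
    uint8_t  endPosture;     // posture at command end
    bool     postureVaries;  // posture may fluctuate during execution
    uint32_t sensorID;       // sensor identification information
    bool     infinite;       // infinite execution flag: execution does not end
    uint8_t  generationID;   // bit per generation, e.g. 0b011 = child and infant
    uint8_t  degreeID;       // degree of the command's appearance (e.g. amount of exercise)
};
```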
- FIG. 29 and FIG. 30 show examples of record entries in the command database.
- The records with command IDs 100, 101, and 102 are all commands that express "happy".
- The application object can select one of these commands, specifying which record's "happy" to use depending on the robot's condition (for example, fatigue level or generation).
- The records with command IDs 200 and 201 are both commands for instructing a kick of an object. These can be used when the application knows what motion, sound, and LED the action involves and wants to ensure that that action is performed.
- the application interface provides a command interface for an application object to control a middleware object.
- The middleware object executes commands on the robot body via agents such as driver objects, and returns a status to the application object.
- Figure 31 illustrates the mechanism for issuing commands and notifying status between the application object and the middleware object.
- Based on the command database, the application object selects the command identification information of the command that realizes an action suited to its current instincts, emotions, or other predefined scenarios.
- the application object issues a command when the middleware object is ready.
- The middleware object executes the command on the robot body via agents such as driver objects, and returns a status to the application object.
- Figure 32 shows the commands issued by the application object. The interaction between each object to execute the operation is schematically shown on the time chart.
- the application object issues a command to the middleware object.
- The middleware object locks the resources and posture so that they cannot be used by other commands, and returns the locked resources and posture to the application object as status information.
- the middleware object then notifies the application object that it is ready.
- The middleware object requests an agent such as a driver object to execute the command.
- The driver object returns input sensor information and the like.
- The middleware object then releases the lock, freeing the resources for the next command.
- A status including resource-release information and body posture information is returned to the application object.
- When an application object issues a command N times or infinitely in succession, the middleware object basically returns a status just as in the procedure for issuing a single command.
- From the returned status, the application object can grasp the state of the resources, and thus the range of commands that can be selected next. The whole sequence is sketched below.
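- The sequence of Figure 32 can be sketched as follows. The class and method names are assumptions; only the lock / ready / execute / release / status order comes from the text above.

```python
class DriverObject:
    """Stand-in for the hardware-dependent agent (hypothetical)."""
    def execute(self, command):
        # Would drive the actuators, speaker, and LEDs; here it only
        # reports the end posture and dummy sensor input.
        return {"posture": command.end_posture, "sensors": {}}

class MiddlewareObject:
    """Sketch of the command-issuing sequence of Figure 32."""
    def __init__(self, driver):
        self.driver = driver
        self.locked = 0  # bitmap of resources currently in use

    def issue(self, command, notify):
        # 1. Lock resources and posture against other commands, and
        #    report the lock to the application as status information.
        self.locked |= command.resources
        notify("locked", self.locked)
        # 2. Notify the application that the middleware is ready, then
        #    request an agent such as a driver object to execute.
        notify("ready", self.locked)
        result = self.driver.execute(command)
        # 3. Release the lock so the resources are free for the next
        #    command, and return resource-release and posture info.
        self.locked &= ~command.resources
        notify("complete", self.locked)
        return result
```

- An application object would call, for example, `MiddlewareObject(DriverObject()).issue(record, print)` and use the final notification to decide its next command.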
- Fig. 33 schematically illustrates the data structure of the command issued by the application object.
- The command consists of MW-ID for specifying the command ID, TargetID for specifying the subject identification information (SubjectID), and TargetPos for giving the position of the target in polar coordinates in the sensor coordinate system.
- TargetPos in TargetInfo is sensor-centered; here the given coordinates are those of the position at which the sensor is fixed to the body.
- MW-ID is a field for specifying the identification information (command ID) of the command to be used.
- Figure 34 schematically illustrates the TargetID data structure. If the OID contained in SubjectID is 0x0, TargetPos is valid. A sketch of this command structure follows.
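- Here is a sketch of the command structure of Figs. 33 and 34, with assumed types; the actual bit widths and the OID encoding inside SubjectID are not given in the text, so the mask below is purely illustrative.

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetPos:
    """Target position in polar coordinates in the sensor coordinate
    system; the origin is where the sensor is fixed to the body."""
    r: float      # distance
    theta: float  # elevation angle (assumed convention)
    phi: float    # azimuth angle (assumed convention)

@dataclass
class Command:
    mw_id: int                             # command ID to use
    subject_id: int                        # SubjectID (contains an OID)
    target_pos: Optional[TargetPos] = None

def target_valid(cmd: Command) -> bool:
    # TargetPos is valid when the OID in SubjectID is 0x0; the low-byte
    # mask is an assumed encoding for illustration only.
    return (cmd.subject_id & 0xFF) == 0x0

cmd = Command(mw_id=200, subject_id=0x0,
              target_pos=TargetPos(r=0.5, theta=math.pi / 6, phi=0.0))
```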
- Figure 35 schematically illustrates the data structure of the status returned by the middleware object to the application object. As shown in the figure, the status consists of MW-ID specifying the command ID, CommandStatus indicating the status of the command, ActionStatus indicating the status of the action, the posture identification information PostureID, and the resource information ResourceInfo.
- The command status CommandStatus defines three states: "Complete", "Not Support", and "Incomplete". "Incomplete" is returned if there is no command to execute or an emergency stop has occurred.
- The action status ActionStatus defines three states: "Success" (successful execution was detected), "Fail" (an execution failure was detected), and "Unknown" (no execution-judgment function is available).
- The resource information ResourceInfo describes the resources used in the middleware.
- Fig. 35 schematically shows the data structure of ResourceInfo. As shown in the figure, ResourceInfo describes motion, sound (speaker), LED, and other resources in a bitmap format. The correspondence between each bit and each resource can be freely defined.
- Fig. 36 schematically shows how the resource information (ResourceInfo) changes over time.
- The posture identification information PostureID describes the current posture. These status structures are sketched below.
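- The status and resource bitmap can be sketched like this; the enum values follow the three-state lists above, while the bit-to-resource mapping is an assumption (the text says the correspondence can be freely defined).

```python
from dataclasses import dataclass
from enum import Enum

class CommandStatus(Enum):
    COMPLETE = "Complete"
    NOT_SUPPORT = "Not Support"
    INCOMPLETE = "Incomplete"  # no command to execute, or emergency stop

class ActionStatus(Enum):
    SUCCESS = "Success"  # successful execution detected
    FAIL = "Fail"        # execution failure detected
    UNKNOWN = "Unknown"  # no execution-judgment function

# Assumed bit assignments; the mapping is left free in the text.
MOTION, SOUND, LED = 0b001, 0b010, 0b100

@dataclass
class Status:
    mw_id: int
    command_status: CommandStatus
    action_status: ActionStatus
    posture_id: int     # current posture of the body
    resource_info: int  # bitmap of resources in use by the middleware

def selectable(record, status: Status) -> bool:
    """An application can choose a record next only if none of the
    resources it needs are marked busy in ResourceInfo."""
    return record.resources & status.resource_info == 0
```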
- The gist of the present invention is not necessarily limited to products called "robots". That is, the present invention can be applied equally to products belonging to other industrial fields, such as toys, as long as they are mechanical devices that perform movements resembling human movements by using electric or magnetic action.
- The present invention has been disclosed by way of example and should not be construed as limiting. To determine the gist of the present invention, the claims described at the outset should be considered.
- As described above, according to the present invention, it is possible to provide an excellent robot control system that can control, using a software program, an articulated robot whose hardware configuration is likely to change significantly as operation units such as legs and a head are attached, detached, or replaced.
- Further, according to the present invention, it is possible to provide an excellent robot control system that controls an articulated robot using software comprising a combination of a software layer highly dependent on the hardware configuration and a software layer independent of the hardware configuration, together with a program interface between the software layers. Further, according to the present invention, it is possible to provide an excellent robot control system that can control an articulated robot while dynamically changing the combination of a hardware-dependent software layer, such as middleware, and a hardware-independent software layer, such as an application, together with a program interface between the software layers.
- Further, according to the present invention, since the program interface between the software layers is established, normal operation can always be guaranteed even if the combination of middleware and applications introduced on the robot platform is changed.
- Further, according to the present invention, by providing an interface and a database for semantic interaction between applications and middleware, each application can acquire appropriate input data via the middleware and can issue appropriate operation commands to the middleware.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Molecular Biology (AREA)
- Artificial Intelligence (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Robotics (AREA)
- Manipulator (AREA)
- Toys (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP01972718.9A EP1327503B1 (en) | 2000-10-11 | 2001-10-09 | Robot control system and robot control method |
US10/168,993 US6816753B2 (en) | 2000-10-11 | 2001-10-09 | Robot control system and robot control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000310033 | 2000-10-11 | ||
JP2000-310033 | 2000-10-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002030626A1 (fr) | 2002-04-18 |
Family
ID=18790060
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2001/008846 WO2002030626A1 (fr) | 2000-10-11 | 2001-10-09 | Systeme de commande de robot et procede de commande de robot |
Country Status (3)
Country | Link |
---|---|
US (1) | US6816753B2 (ja) |
EP (1) | EP1327503B1 (ja) |
WO (1) | WO2002030626A1 (ja) |
Families Citing this family (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002127059A (ja) * | 2000-10-20 | 2002-05-08 | Sony Corp | 行動制御装置および方法、ペットロボットおよび制御方法、ロボット制御システム、並びに記録媒体 |
US20020138246A1 (en) * | 2001-03-08 | 2002-09-26 | Czora Gregory J. | System and method for simulating conciousness |
JP3908735B2 (ja) * | 2001-10-16 | 2007-04-25 | 本田技研工業株式会社 | 歩行状態判定装置及び方法 |
JP4515701B2 (ja) * | 2002-12-13 | 2010-08-04 | 株式会社デンソー | 車両用制御プログラム、及び、車両用制御装置 |
US7464775B2 (en) | 2003-02-21 | 2008-12-16 | Lockheed Martin Corporation | Payload module for mobility assist |
US7150340B2 (en) | 2003-02-21 | 2006-12-19 | Lockheed Martin Corporation | Hub drive and method of using same |
US8839891B2 (en) | 2003-02-21 | 2014-09-23 | Lockheed Martin Corporation | Multi-mode skid steering |
US20050023052A1 (en) | 2003-02-21 | 2005-02-03 | Beck Michael S. | Vehicle having an articulated suspension and method of using same |
WO2005069890A2 (en) * | 2004-01-15 | 2005-08-04 | Mega Robot, Inc. | System and method for reconfiguring an autonomous robot |
DE102004031485B4 (de) * | 2004-06-30 | 2015-07-30 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zum Steuern des Handhabungsgeräts |
US7659906B2 (en) * | 2004-07-29 | 2010-02-09 | The United States Of America As Represented By The Secretary Of The Navy | Airborne real time image exploitation system (ARIES) |
US8000837B2 (en) | 2004-10-05 | 2011-08-16 | J&L Group International, Llc | Programmable load forming system, components thereof, and methods of use |
KR100703692B1 (ko) * | 2004-11-03 | 2007-04-05 | 삼성전자주식회사 | 공간상에 존재하는 오브젝트들을 구별하기 위한 시스템,장치 및 방법 |
US8713025B2 (en) | 2005-03-31 | 2014-04-29 | Square Halt Solutions, Limited Liability Company | Complete context search system |
WO2007098468A1 (en) * | 2006-02-21 | 2007-08-30 | University Of Florida Research Foundation Inc. | Modular platform enabling heterogeneous devices, sensors and actuators to integrate automatically into heterogeneous networks |
US7587260B2 (en) * | 2006-07-05 | 2009-09-08 | Battelle Energy Alliance, Llc | Autonomous navigation system and method |
US7974738B2 (en) * | 2006-07-05 | 2011-07-05 | Battelle Energy Alliance, Llc | Robotics virtual rail system and method |
US7620477B2 (en) * | 2006-07-05 | 2009-11-17 | Battelle Energy Alliance, Llc | Robotic intelligence kernel |
US7801644B2 (en) * | 2006-07-05 | 2010-09-21 | Battelle Energy Alliance, Llc | Generic robot architecture |
US7211980B1 (en) | 2006-07-05 | 2007-05-01 | Battelle Energy Alliance, Llc | Robotic follow system and method |
US8271132B2 (en) | 2008-03-13 | 2012-09-18 | Battelle Energy Alliance, Llc | System and method for seamless task-directed autonomy for robots |
US7668621B2 (en) * | 2006-07-05 | 2010-02-23 | The United States Of America As Represented By The United States Department Of Energy | Robotic guarded motion system and method |
US8073564B2 (en) | 2006-07-05 | 2011-12-06 | Battelle Energy Alliance, Llc | Multi-robot control interface |
US8965578B2 (en) | 2006-07-05 | 2015-02-24 | Battelle Energy Alliance, Llc | Real time explosive hazard information sensing, processing, and communication for autonomous operation |
US7584020B2 (en) * | 2006-07-05 | 2009-09-01 | Battelle Energy Alliance, Llc | Occupancy change detection system and method |
US8355818B2 (en) * | 2009-09-03 | 2013-01-15 | Battelle Energy Alliance, Llc | Robots, systems, and methods for hazard evaluation and visualization |
US8068935B2 (en) * | 2006-10-18 | 2011-11-29 | Yutaka Kanayama | Human-guided mapping method for mobile robot |
KR100772175B1 (ko) * | 2006-10-23 | 2007-11-01 | 한국전자통신연구원 | 네트워크 로봇 시스템 및 네트워크 로봇 시스템에서의 통신방법 |
US8095238B2 (en) * | 2006-11-29 | 2012-01-10 | Irobot Corporation | Robot development platform |
US8538755B2 (en) * | 2007-01-31 | 2013-09-17 | Telecom Italia S.P.A. | Customizable method and system for emotional recognition |
KR100869587B1 (ko) | 2007-02-02 | 2008-11-21 | 주식회사 유진로봇 | 로봇 미들웨어 프레임워크 시스템 |
US8086551B2 (en) | 2007-04-16 | 2011-12-27 | Blue Oak Mountain Technologies, Inc. | Electronic system with simulated sense perception and method of providing simulated sense perception |
US8078357B1 (en) * | 2007-06-06 | 2011-12-13 | Spark Integration Technologies Inc. | Application-independent and component-isolated system and system of systems framework |
US20090082879A1 (en) | 2007-09-20 | 2009-03-26 | Evolution Robotics | Transferable intelligent control device |
DE102008005124A1 (de) * | 2008-01-18 | 2009-07-23 | Kuka Roboter Gmbh | Computersystem, Steuerungsvorrichtung für eine Maschine, insbesondere für einen Industrieroboter, und Industrieroboter |
WO2009146199A2 (en) * | 2008-04-16 | 2009-12-03 | Deka Products Limited Partnership | Systems, apparatus, and methods for the management and control of remotely controlled devices |
JP4934113B2 (ja) * | 2008-08-01 | 2012-05-16 | 株式会社オートネットワーク技術研究所 | 制御装置及びコンピュータプログラム |
KR101021836B1 (ko) * | 2008-10-09 | 2011-03-17 | 한국전자통신연구원 | 동적 행위 바인딩을 통한 다수 로봇의 협업 구현 시스템 및그 구현 방법 |
US8521328B2 (en) * | 2009-12-10 | 2013-08-27 | The Boeing Company | Control system for robotic vehicles |
KR101714791B1 (ko) * | 2010-12-01 | 2017-03-09 | 한국전자통신연구원 | 원격 함수호출 방식 기반 네트워크 로봇 시스템의 서비스를 제어하기 위한 장치 및 그 방법 |
TW201245931A (en) * | 2011-05-09 | 2012-11-16 | Asustek Comp Inc | Robotic device |
US20120314020A1 (en) * | 2011-06-13 | 2012-12-13 | Honda Motor Co,, Ltd. | Move-it: monitoring, operating, visualizing, editing integration toolkit for reconfigurable physical computing |
US8386079B1 (en) * | 2011-10-28 | 2013-02-26 | Google Inc. | Systems and methods for determining semantic information associated with objects |
AU2013204965B2 (en) * | 2012-11-12 | 2016-07-28 | C2 Systems Limited | A system, method, computer program and data signal for the registration, monitoring and control of machines and devices |
JP6003942B2 (ja) * | 2014-04-24 | 2016-10-05 | トヨタ自動車株式会社 | 動作制限装置及び動作制限方法 |
EP2952299A1 (en) | 2014-06-05 | 2015-12-09 | Aldebaran Robotics | Standby mode of a humanoid robot |
US10350763B2 (en) * | 2014-07-01 | 2019-07-16 | Sharp Kabushiki Kaisha | Posture control device, robot, and posture control method |
TW201805598A (zh) * | 2016-08-04 | 2018-02-16 | 鴻海精密工業股份有限公司 | 自主移動設備及建立導航路徑的方法 |
WO2019104189A1 (en) | 2017-11-27 | 2019-05-31 | Intuition Robotics, Ltd | System and method for optimizing resource usage of a robot |
US11775814B1 (en) | 2019-07-31 | 2023-10-03 | Automation Anywhere, Inc. | Automated detection of controls in computer applications with region based detectors |
US11243803B2 (en) | 2019-04-30 | 2022-02-08 | Automation Anywhere, Inc. | Platform agnostic robotic process automation |
US11301224B1 (en) | 2019-04-30 | 2022-04-12 | Automation Anywhere, Inc. | Robotic process automation system with a command action logic independent execution environment |
US11113095B2 (en) | 2019-04-30 | 2021-09-07 | Automation Anywhere, Inc. | Robotic process automation system with separate platform, bot and command class loaders |
US11614731B2 (en) | 2019-04-30 | 2023-03-28 | Automation Anywhere, Inc. | Zero footprint robotic process automation system |
US11481304B1 (en) | 2019-12-22 | 2022-10-25 | Automation Anywhere, Inc. | User action generated process discovery |
US11348353B2 (en) | 2020-01-31 | 2022-05-31 | Automation Anywhere, Inc. | Document spatial layout feature extraction to simplify template classification |
US11182178B1 (en) | 2020-02-21 | 2021-11-23 | Automation Anywhere, Inc. | Detection of user interface controls via invariance guided sub-control learning |
US11734061B2 (en) | 2020-11-12 | 2023-08-22 | Automation Anywhere, Inc. | Automated software robot creation for robotic process automation |
EP4002031B1 (en) * | 2020-11-17 | 2024-06-26 | Robocorp Technologies, Inc. | Model and concept to automate processes across several it systems |
US11820020B2 (en) | 2021-07-29 | 2023-11-21 | Automation Anywhere, Inc. | Robotic process automation supporting hierarchical representation of recordings |
US11968182B2 (en) | 2021-07-29 | 2024-04-23 | Automation Anywhere, Inc. | Authentication of software robots with gateway proxy for access to cloud-based services |
CN114193438B (zh) * | 2021-12-15 | 2023-12-08 | 北京航星机器制造有限公司 | 基于触摸屏控制机器人的方法及装置 |
CN114226955A (zh) * | 2022-01-11 | 2022-03-25 | 武汉点金激光科技有限公司 | 一种具有安全防护功能的激光加工机器人控制系统 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6157864A (en) * | 1998-05-08 | 2000-12-05 | Rockwell Technologies, Llc | System, method and article of manufacture for displaying an animated, realtime updated control sequence chart |
WO2000066239A1 (fr) * | 1999-04-30 | 2000-11-09 | Sony Corporation | Systeme d'animal de compagnie electronique, systeme de reseau, robot et support de donnees |
US7184866B2 (en) * | 1999-07-30 | 2007-02-27 | Oshkosh Truck Corporation | Equipment service vehicle with remote monitoring |
JP2002113675A (ja) * | 2000-10-11 | 2002-04-16 | Sony Corp | ロボット制御システム並びにロボット制御用ソフトウェアの導入方法 |
- 2001
- 2001-10-09 EP EP01972718.9A patent/EP1327503B1/en not_active Expired - Lifetime
- 2001-10-09 WO PCT/JP2001/008846 patent/WO2002030626A1/ja active Application Filing
- 2001-10-09 US US10/168,993 patent/US6816753B2/en not_active Expired - Lifetime
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0855335A2 (en) * | 1997-01-23 | 1998-07-29 | Sony Corporation | Robot apparatus |
JPH1158280A (ja) * | 1997-08-22 | 1999-03-02 | Sony Corp | ロボット装置及びその制御方法 |
JP2000267852A (ja) * | 1999-01-13 | 2000-09-29 | Sony Corp | 演算処理装置、オブジェクト間通信方法及びロボット |
JP2000210886A (ja) | 1999-01-25 | 2000-08-02 | Sony Corp | ロボット装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1327503A4 |
Also Published As
Publication number | Publication date |
---|---|
EP1327503A1 (en) | 2003-07-16 |
EP1327503A4 (en) | 2006-08-30 |
US6816753B2 (en) | 2004-11-09 |
US20030114959A1 (en) | 2003-06-19 |
EP1327503B1 (en) | 2017-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2002030626A1 (fr) | Systeme de commande de robot et procede de commande de robot | |
WO2002030627A1 (fr) | Systeme de commande de robot et procede d"installation d"un logiciel de commande de robot | |
US6470235B2 (en) | Authoring system and method, and storage medium used therewith | |
EP1610221A1 (en) | Information providing device, method, and information providing system | |
US7363108B2 (en) | Robot and control method for controlling robot expressions | |
Hönig et al. | Flying multiple UAVs using ROS | |
US7219064B2 (en) | Legged robot, legged robot behavior control method, and storage medium | |
US20050080514A1 (en) | Content providing system | |
JP2001121455A (ja) | 移動ロボットのための充電システム及び充電制御方法、充電ステーション、移動ロボット及びその制御方法 | |
JP2002103258A (ja) | オーサリング・システム及びオーサリング方法、並びに記憶媒体 | |
JP2001191275A (ja) | ロボット・システム、外装及びロボット装置 | |
JP2006110707A (ja) | ロボット装置 | |
JP3558222B2 (ja) | ロボットの行動制御システム及び行動制御方法、並びにロボット装置 | |
JP2002187082A (ja) | ロボット制御システム及びロボット制御方法 | |
Aloui et al. | A new SysML model for UAV swarm modeling: UavSwarmML | |
WO1998041952A1 (fr) | Generateur d'images et support de donnees | |
JP3925140B2 (ja) | 情報提供方法及び情報提供装置、並びにコンピュータ・プログラム | |
US11833441B2 (en) | Robot | |
JP2002059384A (ja) | ロボットのための学習システム及び学習方法 | |
JP2004114285A (ja) | ロボット装置及びその行動制御方法 | |
JP2003071775A (ja) | ロボット装置 | |
JP2005321954A (ja) | ロボット装置、情報処理システム及び情報処理方法、並びにコンピュータ・プログラム | |
JP2004255529A (ja) | ロボット装置およびロボット装置の制御方法、並びにロボット装置移動制御システム | |
JP2002346958A (ja) | 脚式移動ロボットのための制御装置及び制御方法 | |
De Martini et al. | eduMorse: an open-source framework for mobile robotics education |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001972718 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10168993 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2001972718 Country of ref document: EP |