WO2022091333A1 - Program, robot operation assistance method, and robot operation assistance device - Google Patents


Publication number
WO2022091333A1
Authority
WO
WIPO (PCT)
Prior art keywords: robot, operator, information, operation information, operation support
Application number: PCT/JP2020/040789
Other languages: French (fr), Japanese (ja)
Inventor: 宏季 遠野
Original Assignee: 株式会社Plasma
Application filed by 株式会社Plasma
Priority application: PCT/JP2020/040789
Publication of WO2022091333A1


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/06 - Control stands, e.g. consoles, switchboards
    • B25J13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to a program, a robot operation support method, and a robot operation support device.
  • Patent Document 1 describes a robot arm/hand operation control system including an arm/hand operation unit worn by an operator for operating a robot arm and a robot hand, and an arm/hand control processing unit connected to the arm/hand operation unit to control the robot arm and the robot hand.
  • the robot arm / hand operation control system as described in Patent Document 1 does not necessarily have high operability when remotely controlling an industrial robot.
  • an object of the present invention is to provide a robot operation support technique capable of improving operability when performing remote control of a robot.
  • the program according to one aspect of the present invention causes a robot operation support device that supports remote control of a robot to function as an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of the robot's operator, the viewpoint being associated with the position of the robot or of an image pickup device arranged around the robot, and as a conversion unit that converts the acquired operation information into control information for controlling the operation of the robot.
  • the robot operation support method according to one aspect is executed by a robot operation support device that supports remote control of the robot, and includes a step of acquiring operation information for operating the robot based on a virtual viewpoint of the robot's operator associated with the position of the robot or of an image pickup device arranged around the robot, and a step of converting the acquired operation information into control information for controlling the operation of the robot.
  • the robot operation support device according to one aspect supports remote control of the robot, and includes an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of the robot's operator associated with the position of the robot or of an image pickup device arranged around the robot, and a conversion unit that converts the acquired operation information into control information for controlling the operation of the robot.
  • the "part" does not simply mean a physical means, but also includes the case where the function of the "part" is realized by software. Further, the function of one "part" or device may be realized by two or more physical means or devices, and the function of two or more "parts" or devices may be realized by one physical means or device.
  • FIG. 1 is a schematic diagram of the overall configuration of the robot operation support system 100 according to the first embodiment.
  • the robot operation support system 100 includes a robot device 7, an operator terminal device 5 used by an operator O who remotely operates the robot device 7, a robot terminal device 1 that controls the operation of the robot device 7, and a robot operation support server 3 that provides a program or the like for supporting remote control of the robot device 7.
  • the robot operation support system 100 supports remote control of the robot device 7 by the operator O.
  • the position of the robot device 7 (for example, the position P where the camera 9 (imaging device) is arranged in the robot device 7) is associated with the virtual viewpoint of the remote operator O.
  • the display unit 21 of the operator terminal device 5 displays, so that the operator O can visually recognize it, an image G based on image information obtained by imaging the periphery (a predetermined range) of the arrangement position P of the camera 9. Therefore, the operator O can check the surroundings of the robot device 7 as if present at the position of the remote robot device 7.
  • the operation information input to the operator terminal device 5 by the operator O, who can easily see the surroundings of the robot device 7, is converted in the robot terminal device 1 into control information for controlling the operation of the robot device 7. Therefore, the robot terminal device 1 can control various operations of the robot device 7 by providing the control information to the robot device 7.
  • the position where the camera 9 is arranged in the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7.
  • the robot operation support system 100 acquires operation information by which the operator O operates the robot device 7 based on a virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. Therefore, the operator O can easily operate the robot device 7 while checking its surroundings as if present at the position of the remote robot device 7, which improves operability when the robot device 7 is remotely controlled.
  • the operator O can input the operation information via at least one of the keyboard 23 and the mouse 25 of the operator terminal device 5. Therefore, the remote control of the robot device 7 as described above can be realized by using a general-purpose computer device. Since it is not necessary to adopt an advanced dedicated computer device, remote control of the robot device 7 can be realized easily and at low cost.
  • the robot device 7 is an industrial robot device that executes, for example, assembly work of machines and electronic devices.
  • the robot device 7 can, for example, grip and move the works W1 and W2, which are the objects of its work. When the works W1 and W2 need not be distinguished, each is simply called "work W".
  • the robot device 7 has an arm portion AU of multi-axis articulated configuration, a hand portion HU for gripping the work W, the camera 9, and a control device that controls the operations of the arm portion AU, the hand portion HU, and the camera 9.
  • the camera 9 may be configured separately from the robot device 7, and is arranged at a predetermined position of the robot device 7 when the camera 9 is used.
  • the camera 9 acquires image information obtained by imaging a predetermined range from the arrangement position P in the robot device 7.
  • the camera 9 transmits the acquired image information to the operator terminal device 5 via the robot device 7.
  • the arrangement position P of the camera 9 in the robot device 7 is arbitrary.
  • the arrangement position P of the camera 9 in the robot device 7 is, for example, near the tip (tip portion) of the arm portion AU of the robot device 7.
  • since the tip of the arm portion AU usually moves to a position close to the work W when the camera 9 images the work W, the camera 9 also moves closer to the work W. Therefore, the work W can be imaged more accurately by the camera 9, and the operator O can accurately check the work W on the display unit 21 of the operator terminal device 5.
  • the camera 9 is, for example, a camera equipped with a CCD (Charge-Coupled Device) image sensor.
  • the arrangement position P of the camera 9 in the robot device 7 is not limited to the vicinity of the tip of the arm portion AU.
  • the arrangement position P may be a predetermined position on the outer peripheral surface of the arm portion AU of the robot device 7.
  • the operator terminal device 5 includes, for example, a display unit 21 capable of displaying an image G based on image information obtained by imaging the periphery (a predetermined range) of the arrangement position P of the camera 9, and input devices that receive the operation information of the operator O, such as a keyboard 23 and a mouse 25 (pointing device).
  • when a smartphone or tablet terminal device is used, its touch-panel display serves as both the display unit and the pointing device.
  • the operation information may be input by the operator O operating a virtual keyboard, a virtual game controller, or the like displayed on the touch-panel display.
  • the operator terminal device 5 may include a game controller or a glove-type controller as a pointing device.
  • the robot terminal device 1 is a device for executing a robot operation support process for supporting the remote control of the robot device 7. Each function of the robot terminal device 1 will be described with reference to FIG. 2, which will be described later. Examples of the robot terminal device 1 include portable information communication devices such as IoT devices, smartphones, mobile phones, personal digital assistants (PDAs), tablet terminals, portable game machines, portable music players, and wearable terminals.
  • the robot device 7 in the robot operation support system 100 is configured to be able to communicate with the robot terminal device 1.
  • the robot terminal device 1 is configured to be able to communicate with the operator terminal device 5 via the robot operation support server 3. Further, the robot terminal device 1 is configured to be capable of P2P communication with the operator terminal device 5.
  • an arbitrary communication network, for example a wired or wireless communication network, can be adopted.
  • FIG. 2 is a block diagram showing the configuration of the robot terminal device according to the first embodiment.
  • the robot terminal device 1 includes an information processing unit 11 that executes the robot operation support process supporting remote control of the robot device 7 shown in FIG. 1, and a recording unit 12 that records data used in the robot operation support process and data related to its results.
  • the information processing unit 11 functionally includes a viewpoint associating unit 13, an image information acquisition unit 14, an operation information acquisition unit 15 (acquisition unit), a motion allocation unit 16, an information conversion unit 17 (conversion unit), and a motion control unit 18 (control unit).
  • each of the above units of the information processing unit 11 can be realized, for example, by using a storage area such as a memory or a hard disk, or by a processor executing a program stored in the storage area.
  • the viewpoint associating unit 13 associates the position P where the camera 9 is arranged in the robot device 7 shown in FIG. 1 with the virtual viewpoint of the operator O of the robot device 7.
  • an image G based on image information obtained by imaging the periphery of the arrangement position P of the camera 9 is displayed on the display unit 21 of the operator terminal device 5 so that the operator O can visually recognize it. Therefore, the operator O can check the surroundings of the robot device 7 as if present at the position of the remote robot device 7 (virtually, the surroundings of the position P can be checked from the arrangement position P of the camera 9).
  • the image information acquisition unit 14 acquires, from the robot device 7, image information obtained by the camera 9 imaging a predetermined range from the arrangement position P. As shown in FIG. 1, the image information acquisition unit 14 transfers the image information to the operator terminal device 5 in order to display the image G corresponding to the image information on the display unit 21 so that the operator O can visually recognize it.
  • the operation information acquisition unit 15 acquires the operation information OI by which the operator O operates the robot device 7 based on the virtual viewpoint. For example, the operator O remotely controls the robot device 7 while checking the image G displayed on the display unit 21 of the operator terminal device 5 (the image G corresponding to the image information obtained by the camera 9 imaging a predetermined range from the arrangement position P). More specifically, the operator O inputs the operation information OI for operating the robot device 7 via a pointing device (for example, the keyboard 23 and the mouse 25). The operation information acquisition unit 15 acquires, from the operator terminal device 5, the operation information OI received by the pointing device of the operator terminal device 5.
  • the motion allocation unit 16 associates operations of the keyboard 23 and the mouse 25 shown in FIG. 1 with various motions of the robot device 7. For example, the motion allocation unit 16 allocates various motions of the arm portion of the robot device 7 to operations of a plurality of keys of the keyboard 23. Further, the motion allocation unit 16 allocates at least one of the various motions of the arm portion and the hand portion of the robot device 7 to at least one of a movement operation and a click operation of the mouse 25.
  • FIG. 3 is a conceptual diagram for explaining an example of key assignment information according to the first embodiment of the present invention.
  • FIG. 4 is a conceptual diagram for explaining an example of mouse allocation information according to the first embodiment of the present invention.
  • FIGS. 5 to 7 are views showing an example of motion control of the arm portion of the robot device.
  • the arm portion AU of the robot device 7 shown in FIGS. 5 to 7 includes a plurality of link portions LU for transmitting displacement and force.
  • each link portion LU is connected in series via a plurality of joint portions JU that are rotatably or pivotably connected to each other.
  • a tip portion T is rotatably connected to the vicinity of the tip end of the arm portion AU of the robot device 7.
  • the robot device 7 is arranged and used on a predetermined surface such as a floor or a pedestal via a bottom BU.
  • the key assignment information KAI manages various motions of the arm portion AU of the robot device 7 shown in FIGS. 5 to 7 in association with operations of a plurality of keys included in the keyboard 23.
  • An example of key assignment information KAI is listed below.
  • the "W" key K1 and the "S" key K2 are associated with an operation in the front-rear direction of the arm portion AU of the robot device 7 (for example, a movement operation in the arrow A1 direction shown in FIGS. 5 and 6). For example, when the "W" key K1 is pressed, the arm portion AU moves in the forward direction. For example, when the "S" key K2 is pressed, the arm portion AU moves backward.
  • the "A" key K3 and the "D" key K4 are associated with a left-right movement of the arm portion AU of the robot device 7 (for example, a moving operation in the arrow A3 direction shown in FIGS. 5 and 7).
  • the "Q" key K6 and the "E" key K7 are associated with a rotation of the tip portion T of the robot device 7 (for example, the rotation operation in the direction of the arrow A7 about the center point CP shown in FIGS. 6 and 7).
  • the space key K8 and the control key K9 are associated with a vertical movement of the arm portion AU of the robot device 7 (for example, a moving operation in the arrow A2 direction shown in FIGS. 5 to 7).
  • the shift key K10 is associated with an increase in the speed of the moving operation of the arm portion AU of the robot device 7. For example, when the space key K8 is pressed while pressing the shift key K10, the upward movement operation of the arm portion AU of the robot device 7 is executed at a speed faster than usual.
  • the "F" key K11 is associated with a special operation of the arm portion AU of the robot device 7 (for example, an operation other than the operations described in (1) to (5) above).
  • the above assignment example is merely an example; a plurality of other keys may be assigned to other motions of the arm portion AU of the robot device 7, or a plurality of keys of the keyboard 23 may be assigned to motions of the hand portion HU of the robot device 7. For example, a specific key may be associated with a decrease in the speed of movement of the arm portion AU of the robot device 7. Further, as shown in FIGS. 6 and 7, the rotation operation of the bottom BU of the robot device 7 in the arrow A9 direction may be associated with one or more keys of the keyboard 23.
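  • the key assignments above can be sketched as a lookup table. This is a hypothetical illustration only: the key names, motion identifiers, direction tuples, and the speed multiplier are all assumptions, not data taken from the patent.

```python
# Hypothetical sketch of key assignment information KAI. Every identifier
# and value here is an illustrative assumption, not the patent's format.
KEY_ASSIGNMENT = {
    "W": ("arm_move", (+1, 0, 0)),        # forward  (arrow A1 direction)
    "S": ("arm_move", (-1, 0, 0)),        # backward
    "A": ("arm_move", (0, +1, 0)),        # left     (arrow A3 direction)
    "D": ("arm_move", (0, -1, 0)),        # right
    "Space":   ("arm_move", (0, 0, +1)),  # up       (arrow A2 direction)
    "Control": ("arm_move", (0, 0, -1)),  # down
    "Q": ("tip_rotate", +1),              # rotate tip T about center point CP
    "E": ("tip_rotate", -1),
    "F": ("special", None),               # special operation
}
SPEED_BOOST_KEY = "Shift"   # shift key K10 speeds up the moving operation
BOOST_FACTOR = 2.0          # assumed "faster than usual" multiplier

def interpret_keys(pressed, base_speed=1.0):
    """Translate the set of currently pressed keys into motion commands."""
    boost = BOOST_FACTOR if SPEED_BOOST_KEY in pressed else 1.0
    return [(motion, arg, base_speed * boost)
            for key, (motion, arg) in KEY_ASSIGNMENT.items()
            if key in pressed]
```

  • for example, pressing "W" together with the shift key yields a forward movement command at double speed, matching the speed-increase behavior described for the shift key K10.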
  • FIG. 8 is a diagram showing an example of motion control of the hand portion of the robot according to the first embodiment of the present invention.
  • FIG. 9 is a diagram showing an example of motion control of the hand portion of the robot according to the first embodiment of the present invention.
  • a hand portion HU is connected to a tip portion T near the tip of the arm portion AU of the robot device 7.
  • the hand unit HU can grip and move a work or the like, which is an object of operation such as assembly work.
  • the mouse allocation information MAI manages the various motions of at least one of the arm portion AU and the hand portion HU of the robot device 7 in association with at least one of a movement operation and a click operation of the mouse 25.
  • An example of mouse allocation information MAI is listed below.
  • a movement operation of the mouse 25 while the left-click LC is held down for a predetermined period is associated with a swing operation of the tip portion T of the arm portion AU of the robot device 7 (for example, a swinging motion in the directions of arrows A4 and A5 about the center point CP shown in FIG. 5).
  • the operation of pressing the left-click LC is associated with the operation of opening the hand portion HU shown in FIG.
  • the operation of pressing the left-click LC a plurality of times is associated with the operation of closing the hand portion HU shown in FIG.
  • the above assignment example is merely an example, and other specific mouse operations may be assigned to at least one of the various movements of the arm portion AU and the hand portion HU of the robot device 7.
  • the hand portion HU only needs to have a function of being able to hold the work W; for example, in addition to the gripping function, it may have a function of holding the work W by suction or adsorption.
  • the pressing operation of the left-click LC may be associated with the suction or adsorption of the work W.
  • the double-click operation of the left-click LC may be associated with the operation of releasing the work W from the hand portion HU.
  • the opening/closing operation and the suction or adsorption operation of the hand portion HU shown in FIGS. 8 and 9 may be controlled not by operation of the mouse 25 but by operation via the keyboard 23.
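  • the mouse allocations above can likewise be sketched as a small dispatcher; the event names and motion labels are illustrative assumptions, not names from the patent.

```python
# Hypothetical sketch of mouse allocation information MAI.
def interpret_mouse(event):
    """Map an assumed mouse event name to a robot motion label."""
    if event == "left_drag":          # moving the mouse with left-click LC held
        return "swing_tip"            # swing tip T (arrows A4/A5 about point CP)
    if event == "left_click":         # single press of left-click LC
        return "open_hand"            # open the hand portion HU
    if event == "left_multi_click":   # pressing left-click LC a plurality of times
        return "close_hand"           # close the hand portion HU
    return "no_op"                    # unassigned events do nothing
```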
  • the motion allocation unit 16 may associate operations of the keyboard 23 and the mouse 25 shown in FIG. 1 with motions that adjust the camera 9 connected to the robot device 7, such as its direction (shooting angle) and its imaging process (starting or ending imaging). According to this configuration, the operator O can remotely control the operation of the camera 9 arranged in the robot device 7.
  • a predetermined motion of the robot device 7 may be associated with a shortcut key operation using a plurality of keys of the keyboard 23. For example, by associating a predetermined posture of the robot device 7 with a specific shortcut key operation, when that shortcut key operation is executed, at least one of the arm portion AU and the hand portion HU is automatically controlled so that the robot device 7 takes the predetermined posture. According to this configuration, the operation by the operator for causing the robot device 7 to perform a predetermined motion becomes simpler.
  • the information conversion unit 17 converts the operation information OI acquired by the operation information acquisition unit 15 into control information CI for controlling the operation of the robot device 7.
  • the information conversion unit 17 refers to the key allocation information KAI and the mouse allocation information MAI recorded in the recording unit 12, and converts the operation information OI into the control information CI corresponding to the operation information OI.
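  • a minimal sketch of this conversion, with assumed table contents and record shapes (the patent does not specify the data formats of OI, CI, KAI, or MAI):

```python
# Illustrative allocation tables; contents are assumptions.
KAI = {"W": "arm_forward", "S": "arm_backward"}   # key allocation information
MAI = {"left_click": "open_hand"}                 # mouse allocation information

def convert(operation_info):
    """Convert operation information OI into control information CI."""
    # Look the input up in the table matching the device that produced it.
    table = KAI if operation_info["device"] == "keyboard" else MAI
    command = table.get(operation_info["input"])
    if command is None:
        raise ValueError("unassigned input: %s" % operation_info["input"])
    return {"command": command, "speed": operation_info.get("speed", 1.0)}
```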
  • the motion control unit 18 controls the motion of the robot device 7 based on the control information CI converted by the information conversion unit 17. For example, the motion control unit 18 transmits the control information CI to the robot device 7.
  • the control device that controls the operation in the robot device 7 receives the control information CI
  • the control device controls the operation in the robot device 7 based on the control information CI.
  • the motion control unit 18 receives, for example, from the robot device 7, information about the position of the target hand portion HU (hand) and the posture of the arm portion AU (the currently received information). The motion control unit 18 calculates the moving direction and speed of the hand portion HU by comparing the previously received information about the position of the hand portion HU and the posture of the arm portion AU with the currently received information.
  • the motion control unit 18 executes a process based on inverse kinematics using, for example, a Jacobian matrix, based on the calculated moving direction and speed of the hand portion HU.
  • the motion control unit 18 calculates the speed and the like of each joint portion JU needed to reach the next assumed posture of the robot device 7.
  • the motion control unit 18 transmits the calculation result to the robot device 7 as control information.
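  • the inverse-kinematics step described above can be sketched as a resolved-rate computation: the hand's velocity is estimated by differencing the previously and currently received hand positions, and joint speeds are obtained through the pseudo-inverse of the Jacobian. A planar two-link arm is assumed here purely for illustration; the patent does not fix the arm geometry or the numerical method.

```python
import numpy as np

def jacobian_2link(theta1, theta2, l1=1.0, l2=1.0):
    """Position Jacobian of an assumed planar two-link arm."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def joint_velocities(theta, hand_prev, hand_now, dt):
    """Joint speeds for reaching the next assumed posture (resolved-rate)."""
    # Hand velocity from the previously and currently received positions.
    v_hand = (np.asarray(hand_now) - np.asarray(hand_prev)) / dt
    J = jacobian_2link(*theta)
    # Moore-Penrose pseudo-inverse maps hand velocity to joint velocities.
    return np.linalg.pinv(J) @ v_hand
```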
  • the recording unit 12 records, for example, the image information II acquired from the camera 9, the key allocation information KAI, the mouse allocation information MAI, the operation information OI input by the operator O shown in FIG. 1, and the control information CI for controlling the operation of the robot device 7.
  • a program describing the contents of the robot operation support process executed by the robot operation support system 100 may be installed in the recording unit 12.
  • FIG. 10 is a flowchart showing an example of the robot operation support processing method according to the first embodiment.
  • the robot terminal device 1 shown in FIG. 1 acquires operation information OI for the operator O to operate the robot device 7 based on a virtual viewpoint (S1).
  • the robot terminal device 1 converts the acquired operation information OI into control information CI for controlling the operation of the robot device 7 (S2).
  • the robot terminal device 1 controls the operation of the robot device 7 based on the converted control information CI (S3).
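  • the three steps S1 to S3 of FIG. 10 can be sketched as a simple loop; the unit objects and their method names are assumptions for illustration.

```python
def support_loop(acquisition_unit, conversion_unit, control_unit, steps):
    """Run the S1 -> S2 -> S3 cycle of FIG. 10 a fixed number of times."""
    for _ in range(steps):
        oi = acquisition_unit.acquire()   # S1: acquire operation information OI
        ci = conversion_unit.convert(oi)  # S2: convert OI into control information CI
        control_unit.send(ci)             # S3: control the robot device 7 with CI
```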
  • the position P where the camera 9 is arranged in the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7.
  • the robot operation support system 100 acquires operation information by which the operator O operates the robot device 7 based on a virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. Therefore, the operator O can easily operate the robot device 7 while checking its surroundings as if present at the position of the remote robot device 7, which improves operability when the robot device 7 is remotely controlled.
  • in the first embodiment, the movement control of the arm portion AU in the front-rear and left-right directions in response to the pressing operation of the keys K1 to K4 (for example, in the directions of arrows A1 and A3 shown in FIGS. 5 and 6) changes based on the orientation of the base of the hand portion HU.
  • FIG. 11 is a schematic diagram showing an example of the overall configuration of the robot operation support system according to the second embodiment of the present invention.
  • the arrangement position P1 of the camera 9 on the ceiling C around the robot device 7 is associated with the virtual viewpoint of the remote operator O.
  • the display unit 21 of the operator terminal device 5 displays, so that the operator O can visually recognize it, the image G based on image information obtained by imaging the periphery (a predetermined range) of the arrangement position P1 of the camera 9.
  • the position where the camera 9 is arranged around the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7.
  • the robot operation support system 100 acquires operation information by which the operator O operates the robot device 7 based on a virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. Therefore, the operator O can easily operate the robot device 7 while checking its surroundings as if present in the vicinity of the remote robot device 7, which improves operability when the robot device 7 is remotely controlled.
  • the robot operation in the second embodiment when the movement control of the arm portion AU in a specific direction according to the pressing operation of the keys K1 to K4 changes based on the direction (arrangement direction) of the robot device 7 itself, the robot operation in the second embodiment.
  • the support system 100 is suitable.
  • in the second embodiment, the movement of the arm portion AU is controlled in specific directions (for example, north, south, east, and west) regardless of the orientation of the base of the hand portion HU, unlike in the first embodiment. That is, in the robot operation support system 100 according to the second embodiment, the processing content for the translation of the arm portion AU with respect to an operation input can be determined according to the arrangement position of the camera 9 around the robot device 7. Therefore, the operator O can easily operate the robot device 7.
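  • the direction determination described above can be sketched as a frame rotation: the operator's input is mapped into the robot base frame through an angle derived from the camera's placement, so a given key always produces the same world direction. The 2-D rotation and the yaw-angle convention are assumptions for illustration.

```python
import math

def input_to_base_frame(dx, dy, camera_yaw):
    """Rotate an operator-frame input (dx, dy) into the robot base frame."""
    c, s = math.cos(camera_yaw), math.sin(camera_yaw)
    return (c * dx - s * dy, s * dx + c * dy)
```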
  • FIG. 12 is a schematic diagram showing an example of the overall configuration of the robot operation support system according to the third embodiment of the present invention.
  • the robot operation support system 100 may include a plurality of cameras 9; for example, one camera 9 may be arranged at the position P in the robot device 7, and another camera 9 may be arranged at the position P1 on the ceiling C (a predetermined surface) of the room in which the robot device 7 is arranged. Further, the plurality of cameras 9 may be individually arranged at a plurality of locations in the robot device 7. For example, one camera 9 may be arranged on the arm portion AU of the robot device 7, and another camera 9 may be arranged on the hand portion HU of the robot device 7. Further, the plurality of cameras 9 may be arranged at a plurality of predetermined positions of the arm portion AU, or at a plurality of predetermined positions of the hand portion HU.
  • an image based on the image information acquired by one camera 9 and an image based on the image information acquired by another camera 9 may be displayed on the display unit 21 so as to be switchable. More specifically, while the image based on the image information acquired by one camera 9 is basically displayed on the display unit 21 as the main image, the image based on the image information acquired by the other camera 9 may be displayed on the display unit 21 as a sub-image as needed. Both images may also be displayed simultaneously on the same screen of the display unit 21.
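  • the switchable main/sub display described above can be sketched as follows; the class and method names are illustrative assumptions.

```python
class ViewSwitcher:
    """Tracks which camera's image is currently shown as the main image."""
    def __init__(self, camera_ids):
        self.cameras = list(camera_ids)
        self.main = self.cameras[0]   # one camera's image starts as the main image

    def switch_to(self, camera_id):
        """Make another camera's image the main image, if that camera exists."""
        if camera_id in self.cameras:
            self.main = camera_id
        return self.main
```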
  • FIG. 13 is a diagram showing an example of the hardware configuration of the computer according to the embodiments. With reference to FIG. 13, an example of the hardware configuration of a computer that can be used to configure the various devices in the robot operation support system 100 shown in FIGS. 1, 11, and 12, for example the robot terminal device 1, the robot operation support server 3, and the operator terminal device 5, will be described.
  • the computer 40 mainly includes a processor 41, a main recording device 42, an auxiliary recording device 43, an input / output interface 44, and a communication interface 45 as hardware resources. These are connected to each other via a bus line 46 including an address bus, a data bus, a control bus and the like. An interface circuit (not shown) may be appropriately interposed between the bus line 46 and each hardware resource.
  • the processor 41 controls the entire computer.
  • the processor 41 corresponds to, for example, the information processing unit 11 of the robot terminal device 1 shown in FIG.
  • the main recording device 42 provides a work area for the processor 41, and is a volatile memory such as an SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory).
  • the auxiliary recording device 43 is a non-volatile memory such as an HDD, SSD, or flash memory that stores software programs and the like and data. The program, data, or the like is loaded from the auxiliary recording device 43 to the main recording device 42 via the bus line 46 at an arbitrary time point.
  • the auxiliary recording device 43 corresponds to, for example, the recording unit 12 of the robot terminal device 1 shown in FIG. 2.
  • The input/output interface 44 performs one or both of presenting information and receiving input of information, and is, for example, a camera, a keyboard (for example, the keyboard 23 of the operator terminal device 5 shown in FIGS. 1, 11 and 12), a mouse (for example, the mouse 25 of the operator terminal device 5 shown in FIGS. 1, 11 and 12), a display (for example, the display unit 21 of the operator terminal device 5 shown in FIGS. 1, 11 and 12), a touch panel display, a microphone, a speaker, or the like.
  • The communication interface 45 transmits and receives various data among the devices of the robot operation support system 100 shown in FIGS. 1, 11 and 12 via a predetermined communication network.
  • The communication interface 45 and the predetermined communication network may be connected by wire or wirelessly.
  • The communication interface 45 may also acquire information related to the network, for example, information on a Wi-Fi access point, information on a base station of a communication carrier, and the like.
  • Each of the above embodiments is intended to facilitate understanding of the present invention and does not limit its interpretation.
  • The present invention may be modified or improved without departing from its spirit, and the present invention also includes equivalents thereof.
  • The present invention can form various disclosures by appropriately combining the plurality of components disclosed in the above embodiments. For example, some components may be removed from all the components shown in an embodiment. Further, components of different embodiments may be combined as appropriate.
  • In the above-described embodiments, the robot terminal device 1 has the functions of the information processing unit 11 (for example, the viewpoint association unit 13, the image information acquisition unit 14, the operation information acquisition unit 15, the operation allocation unit 16, the information conversion unit 17, and the operation control unit 18) and the recording unit 12, but the present invention is not limited to this.
  • The robot operation support server 3 (robot operation support device) shown in FIGS. 1, 11 and 12 may have at least some of the functions of the information processing unit 11 shown in FIG. 2 and at least one function of the recording unit 12.
  • For example, the robot terminal device 1 may include the operation control unit 18, and the robot operation support server 3 may include the image information acquisition unit 14, the operation information acquisition unit 15, and the information conversion unit 17.
  • The robot operation support server 3 may further include the viewpoint association unit 13 and the operation allocation unit 16.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

Provided is a robot operation assistance technique for improving the operability of a robot during remote operation. The program causes a robot operation assistance device, which assists in the remote operation of a robot, to function as: an acquisition unit that acquires operation information OI for operating the robot based on a virtual viewpoint of the robot's operator, the virtual viewpoint being associated with the position of the robot or of an imaging device arranged around the robot; and a conversion unit that converts the acquired operation information OI into control information CI for controlling the movement of the robot.

Description

Program, robot operation support method, and robot operation support device
The present invention relates to a program, a robot operation support method, and a robot operation support device.
Conventionally, industrial robots that perform work such as assembling machines and electronic devices are known.
Regarding industrial robots, Patent Document 1 describes a robot arm/hand operation control system including an arm/hand operation unit worn by an operator to operate a robot arm and a robot hand, and an arm/hand control processing unit that is connected to the arm/hand operation unit and controls the robot arm and the robot hand.
Patent Document 1: Japanese Unexamined Patent Publication No. 2005-46931
However, a robot arm/hand operation control system such as that described in Patent Document 1 does not necessarily offer high operability when an industrial robot is operated remotely.
Therefore, an object of the present invention is to provide a robot operation support technique capable of improving operability when a robot is operated remotely.
A program according to one aspect of the present invention causes a robot operation support device that supports remote operation of a robot to function as: an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of the robot's operator, the virtual viewpoint being associated with the position of the robot or of an imaging device arranged around the robot; and a conversion unit that converts the acquired operation information into control information for controlling the operation of the robot.
A robot operation support method according to one aspect of the present invention is a method executed by a robot operation support device that supports remote operation of a robot, and includes: a step of acquiring operation information for operating the robot based on a virtual viewpoint of the robot's operator, the virtual viewpoint being associated with the position of the robot or of an imaging device arranged around the robot; and a step of converting the acquired operation information into control information for operating the robot.
A robot operation support device according to one aspect of the present invention is a device that supports remote operation of a robot, and includes: an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of the robot's operator, the virtual viewpoint being associated with the position of the robot or of an imaging device arranged around the robot; and a conversion unit that converts the acquired operation information into control information for controlling the operation of the robot.
In the present invention, a "unit" does not simply mean a physical means, and includes the case where the function of the "unit" is realized by software. The function of one "unit" or device may be realized by two or more physical means or devices, and the functions of two or more "units" or devices may be realized by one physical means or device.
According to the present invention, operability when a robot is operated remotely can be improved.
  • A schematic diagram showing an example of the overall configuration of a robot operation support system according to the first embodiment of the present invention.
  • A block diagram showing an example of the configuration of a robot terminal device according to the first embodiment of the present invention.
  • A conceptual diagram for explaining an example of key assignment information according to the first embodiment of the present invention.
  • A conceptual diagram for explaining an example of mouse assignment information according to the first embodiment of the present invention.
  • A diagram showing an example of motion control of the arm portion of the robot according to the first embodiment of the present invention.
  • A diagram showing an example of motion control of the arm portion of the robot according to the first embodiment of the present invention.
  • A diagram showing an example of motion control of the arm portion of the robot according to the first embodiment of the present invention.
  • A diagram showing an example of motion control of the hand portion of the robot according to the first embodiment of the present invention.
  • A diagram showing an example of motion control of the hand portion of the robot according to the first embodiment of the present invention.
  • A flowchart showing an example of a robot operation support processing method according to the first embodiment of the present invention.
  • A schematic diagram showing an example of the overall configuration of a robot operation support system according to the second embodiment of the present invention.
  • A schematic diagram showing an example of the overall configuration of a robot operation support system according to the third embodiment of the present invention.
  • A schematic diagram showing an example of the hardware configuration of a computer according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. The following embodiments are examples for explaining the present invention and are not intended to limit the present invention to those embodiments. The present invention can be modified in various ways without departing from the gist thereof. In the drawings, the same components are denoted by the same reference numerals wherever possible, and duplicate descriptions are omitted.
<First Embodiment>
[Configuration of the robot operation support system]
With reference to FIG. 1, the configuration of a robot operation support system that supports remote operation of a robot device (robot) according to the first embodiment of the present invention will be described. FIG. 1 is a schematic diagram of the overall configuration of the robot operation support system 100 according to the first embodiment.
As shown in FIG. 1, the robot operation support system 100 includes a robot device 7, an operator terminal device 5 used by an operator O who remotely operates the robot device 7, a robot terminal device 1 that controls the operation of the robot device 7, and a robot operation support server 3 that provides a program and the like for supporting remote operation of the robot device 7.
An outline of the robot operation support system 100 in this embodiment is described below. The robot operation support system 100 supports remote operation of the robot device 7 by the operator O. Specifically, in the robot operation support system 100, the position of the robot device 7 (for example, the position P where the camera 9 (imaging device) is arranged on the robot device 7) is associated with the virtual viewpoint of the remote operator O. The robot operation support system 100 displays an image G, based on image information obtained by imaging the surroundings (a predetermined range) of the arrangement position P of the camera 9, on the display unit 21 of the operator terminal device 5 so that the operator O can view it. The operator O can therefore check the surroundings of the robot device 7 as if present at the position of the remote robot device 7. The operation information input to the operator terminal device 5 by the operator O, who can easily see the surroundings of the robot device 7, is converted by the robot terminal device 1 into control information for controlling the operation of the robot device 7. The robot terminal device 1 can thus control various operations of the robot device 7 by providing the control information to the robot device 7.
In the robot operation support system 100 of this embodiment, the position where the camera 9 is arranged on the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7. The robot operation support system 100 acquires operation information with which the operator O operates the robot device 7 based on the virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. The operator O can thus easily operate the robot device 7 while checking its surroundings, as if present at the position of the remote robot device 7. Therefore, operability during remote operation of the robot device 7 can be improved.
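The acquire-convert-control flow just described can be sketched in outline as follows. This is a minimal illustration under assumptions: the class names, field names, and the toy conversion table are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class OperationInfo:
    """Operation information OI entered from the operator's virtual viewpoint."""
    device: str  # e.g. "keyboard" or "mouse" (hypothetical field)
    action: str  # e.g. "W_pressed" (hypothetical field)


@dataclass
class ControlInfo:
    """Control information CI that drives the robot device 7."""
    target: str   # e.g. "arm" or "hand"
    command: str  # e.g. "move_forward"


# Toy conversion table; the actual mapping is held by the assignment
# information described later (FIG. 3 and FIG. 4).
CONVERSION_TABLE = {
    ("keyboard", "W_pressed"): ControlInfo("arm", "move_forward"),
    ("mouse", "left_click"): ControlInfo("hand", "open"),
}


def acquire(raw_event: dict) -> OperationInfo:
    # Acquisition step: receive the operator O's input made while viewing
    # the image G from the camera position P (the virtual viewpoint).
    return OperationInfo(raw_event["device"], raw_event["action"])


def convert(oi: OperationInfo) -> ControlInfo:
    # Conversion step: operation information OI -> control information CI.
    return CONVERSION_TABLE[(oi.device, oi.action)]
```

For example, `convert(acquire({"device": "keyboard", "action": "W_pressed"}))` yields a forward-movement command for the arm.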
As described later, the operator O can input the operation information via at least one of the keyboard 23 and the mouse 25 of the operator terminal device 5. Remote operation of the robot device 7 as described above can therefore be realized with a general-purpose computer device. Since there is no need for an advanced dedicated computer device, remote operation of the robot device 7 can be realized simply and at low cost. Each component of the robot operation support system 100 is described below.
The robot device 7 is, for example, an industrial robot device that performs work such as assembling machines and electronic devices. The robot device 7 can, for example, grip and move works W1 and W2, which are the objects of the work. When the works W1 and W2 are not distinguished from each other, they are referred to as "work W". The robot device 7 includes an arm portion AU with a multi-axis, multi-joint configuration, a hand portion HU for gripping the work W and the like, a camera 9, and a control device that controls the operations of the arm portion AU, the hand portion HU, and the camera 9. The camera 9 may be configured separately from the robot device 7 and arranged at a predetermined position on the robot device 7 when used.
The camera 9 acquires image information obtained by imaging a predetermined range from the arrangement position P on the robot device 7, and transmits the acquired image information to the operator terminal device 5 via the robot device 7. The arrangement position P of the camera 9 on the robot device 7 is arbitrary. For example, the camera 9 is arranged near the tip (tip portion) of the arm portion AU of the robot device 7. With this configuration, since the tip portion of the arm portion AU normally moves close to the work W when the camera 9 images the work W, the camera 9 also moves closer to the work W. The camera 9 can therefore image the work W more accurately, and the operator O can accurately check the work W on the display unit 21 of the operator terminal device 5. The camera 9 is, for example, a camera equipped with a CCD (Charge-Coupled Device). The arrangement position P of the camera 9 on the robot device 7 is not limited to the vicinity of the tip of the arm portion AU; the arrangement position P may be a predetermined position on the outer peripheral surface of the arm portion AU of the robot device 7.
The operator terminal device 5 includes, for example, a display unit 21, and a keyboard 23 and a mouse 25 as pointing devices that receive input of the operator O's operation information. As described later, the various operations of the robot device 7 are assigned to the keys of the keyboard 23, the click buttons of the mouse 25, and the like. The operator O can therefore easily control the various operations of the remote robot device 7 by pressing keys on the keyboard 23 and/or operating the mouse 25. As described above, the operator terminal device 5 need only include a display unit 21 capable of displaying the image G based on the image information obtained by imaging the surroundings (a predetermined range) of the arrangement position P of the camera 9, and a pointing device; it may therefore be, for example, a smartphone or a tablet terminal device. In that case, the touch-panel display of the smartphone or tablet terminal device serves as both the display unit and the pointing device, and the operation information may be input by the operator O operating a virtual keyboard, a virtual game controller, or the like displayed on the touch-panel display. The operator terminal device 5 may also include a game controller or a glove-type controller as a pointing device.
The robot terminal device 1 is a device for executing robot operation support processing that supports remote operation of the robot device 7. Each function of the robot terminal device 1 will be described with reference to FIG. 2 below. Examples of the robot terminal device 1 include portable information communication devices such as IoT devices, smartphones, mobile phones, personal digital assistants (PDAs), tablet terminals, portable game machines, portable music players, and wearable terminals. The robot device 7 in the robot operation support system 100 is configured to communicate with the robot terminal device 1. The robot terminal device 1 is configured to communicate with the operator terminal device 5 via the robot operation support server 3, and is also capable of P2P communication with the operator terminal device 5. Any communication network, for example a wired or wireless network, can be used for communication among the devices of the robot operation support system 100.
FIG. 2 is a block diagram showing the configuration of the robot terminal device according to the first embodiment. As shown in FIG. 2, the robot terminal device 1 includes, for example, an information processing unit 11 that executes robot operation support processing for supporting remote operation of the robot device 7 shown in FIG. 1, and a recording unit 12 that records data used in the robot operation support processing and data on its results. The information processing unit 11 functionally includes a viewpoint association unit 13, an image information acquisition unit 14, an operation information acquisition unit 15 (acquisition unit), an operation allocation unit 16, an information conversion unit 17 (conversion unit), and an operation control unit 18 (control unit). Each of these units of the information processing unit 11 can be realized, for example, by using a storage area such as a memory or a hard disk, or by a processor executing a program stored in the storage area.
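One processing cycle through these functional units might be wired together roughly as follows. This is an illustrative sketch only: the stub classes, all method names, and the fixed conversion table standing in for the operation allocation unit 16 are assumptions, not the patent's implementation.

```python
class StubRobot:
    """Stand-in for the robot device 7 and its camera 9 (hypothetical API)."""
    def capture_image(self):
        return "image-bytes"

    def execute(self, control_info):
        self.last_command = control_info


class StubOperatorTerminal:
    """Stand-in for the operator terminal device 5 (hypothetical API)."""
    def __init__(self, pending_input):
        self.pending_input = pending_input

    def display(self, image):
        self.shown = image  # image G shown on the display unit 21

    def poll_operation_info(self):
        return self.pending_input


class InformationProcessingUnit:
    """Units 14, 15, 17 and 18 of FIG. 2 collapsed into one cycle."""
    CONVERSION = {"W_pressed": "arm_forward"}  # toy stand-in for units 16/17

    def __init__(self, robot, terminal):
        self.robot = robot
        self.terminal = terminal

    def run_once(self):
        image = self.robot.capture_image()        # image information acquisition unit 14
        self.terminal.display(image)              # shown to the operator O
        oi = self.terminal.poll_operation_info()  # operation information acquisition unit 15
        ci = self.CONVERSION[oi]                  # information conversion unit 17
        self.robot.execute(ci)                    # operation control unit 18
        return ci
```

In a real deployment the viewpoint association (unit 13) would be performed once at setup, tying the camera position P to the operator's virtual viewpoint, before cycles like the above run.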
The viewpoint association unit 13 associates the position P where the camera 9 is arranged on the robot device 7 shown in FIG. 1 with the virtual viewpoint of the operator O of the robot device 7. With this configuration, the robot operation support system 100 shown in FIG. 1 displays the image G, based on the image information obtained by imaging the surroundings of the arrangement position P of the camera 9, on the display unit 21 of the operator terminal device 5 so that the operator O can view it. The operator O can therefore check the surroundings of the robot device 7 as if present at the position of the remote robot device 7 (virtually, the surroundings of the position P can be checked from the arrangement position P of the camera 9).
The image information acquisition unit 14 acquires, from the robot device 7, image information obtained by the camera 9 imaging a predetermined range from the arrangement position P. As shown in FIG. 1, the image information acquisition unit 14 transfers the image information to the operator terminal device 5 so that the image G corresponding to the image information is displayed on the display unit 21 of the operator terminal device 5 for the operator O to view.
The operation information acquisition unit 15 acquires operation information OI with which the operator O operates the robot device 7 based on the virtual viewpoint. For example, the operator O remotely operates the robot device 7 while checking the image G displayed on the display unit 21 of the operator terminal device 5 (the image G corresponding to the image information obtained by the camera 9 imaging a predetermined range from the arrangement position P). More specifically, the operator O inputs the operation information OI for operating the robot device 7 via a pointing device (for example, the keyboard 23 and the mouse 25). The operation information acquisition unit 15 acquires, from the operator terminal device 5, the operation information OI of the operator O received by the pointing device of the operator terminal device 5.
The operation allocation unit 16 associates operations of the keyboard 23 and the mouse 25 shown in FIG. 1 with the various operations of the robot device 7. For example, the operation allocation unit 16 assigns various operations of the arm portion of the robot device 7 to operations of the keys of the keyboard 23. The operation allocation unit 16 also assigns various operations of at least one of the arm portion and the hand portion of the robot device 7 to at least one of a movement operation and a click operation of the mouse 25.
With reference to FIGS. 3 to 9, the assignment of keyboard 23 and mouse 25 operations to the various operations of the robot device 7, and an outline of the various operation controls of the robot device 7, will be described. FIG. 3 is a conceptual diagram for explaining an example of key assignment information according to the first embodiment of the present invention. FIG. 4 is a conceptual diagram for explaining an example of mouse assignment information according to the first embodiment of the present invention. FIGS. 5 to 7 are diagrams showing an example of motion control of the arm portion of the robot device.
The arm portion AU of the robot device 7 shown in FIGS. 5 to 7 includes a plurality of link portions LU that transmit displacement and force. The link portions LU are connected in series by a plurality of joint portions JU that allow them to pivot or rotate relative to each other. A tip portion T is pivotably connected near the tip of the arm portion AU of the robot device 7. The robot device 7 is used while placed, via a bottom portion BU, on a predetermined surface such as a floor or a pedestal.
As shown in FIG. 3, the key assignment information KAI is managed by associating the various operations of the arm portion AU of the robot device 7 shown in FIGS. 5 to 7 with operations of the keys of the keyboard 23. Examples of the key assignment information KAI are listed below.
(1) The "W" key K1 and the "S" key K2 are associated with forward and backward movements of the arm portion AU of the robot device 7 (for example, movement in the direction of arrow A1 shown in FIGS. 5 and 6). For example, pressing the "W" key K1 moves the arm portion AU forward, and pressing the "S" key K2 moves the arm portion AU backward. Likewise, pressing each of the following keys controls the corresponding operation of the robot device 7.
(2) The "A" key K3 and the "D" key K4 are associated with leftward and rightward movements of the arm portion AU of the robot device 7 (for example, movement in the direction of arrow A3 shown in FIGS. 5 and 7).
(3) The "Q" key K6 and the "E" key K7 are associated with the pivoting of the tip portion T of the robot device 7 (for example, rotation in the direction of arrow A7 about the center point CP shown in FIGS. 6 and 7).
(4) The space key K8 and the control key K9 are associated with upward and downward movements of the arm portion AU of the robot device 7 (for example, movement in the direction of arrow A2 shown in FIGS. 5 to 7).
(5) The shift key K10 is associated with increasing the speed of the movement of the arm portion AU of the robot device 7. For example, pressing the space key K8 while holding down the shift key K10 causes the upward movement of the arm portion AU of the robot device 7 to be executed faster than usual.
(6) The "F" key K11 is associated with a special operation of the arm portion AU of the robot device 7 (for example, an operation other than those described in (1) to (5) above).
The above assignments are merely examples; other keys may be assigned to other operations of the arm portion AU of the robot device 7, and keys of the keyboard 23 may be assigned to operations of the hand portion HU of the robot device 7. For example, a specific key may be associated with decreasing the speed of the movement of the arm portion AU of the robot device 7. As shown in FIGS. 6 and 7, the rotation of the bottom portion BU of the robot device 7 in the direction of arrow A9 may also be associated with one or more keys of the keyboard 23.
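As a data structure, the key assignments (1) to (6) above could be encoded, for example, as a lookup table. The key names and action identifiers below are illustrative assumptions; FIG. 3 defines the pairings but not any particular data format.

```python
# Toy encoding of the key assignment information KAI (FIG. 3).
KEY_ASSIGNMENT = {
    "W": "arm_forward",      # (1) arrow A1 direction
    "S": "arm_backward",
    "A": "arm_left",         # (2) arrow A3 direction
    "D": "arm_right",
    "Q": "tip_pivot_left",   # (3) rotation about the center point CP
    "E": "tip_pivot_right",
    "Space": "arm_up",       # (4) arrow A2 direction
    "Control": "arm_down",
    "F": "special_action",   # (6) special operation
}
SPEED_MODIFIER = "Shift"     # (5) raises the movement speed


def resolve(keys_down):
    """Return (action, fast) for the set of currently pressed keys."""
    fast = SPEED_MODIFIER in keys_down
    for key in keys_down - {SPEED_MODIFIER}:
        if key in KEY_ASSIGNMENT:
            return KEY_ASSIGNMENT[key], fast
    return None, fast
```

For instance, `resolve({"Shift", "Space"})` returns `("arm_up", True)`, matching the shift-plus-space example in item (5).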
 FIG. 8 is a diagram showing an example of motion control of the hand portion of the robot according to the first embodiment of the present invention. FIG. 9 is a diagram showing another example of motion control of the hand portion of the robot according to the first embodiment of the present invention. As shown in FIGS. 8 and 9, a hand portion HU is connected to the tip portion T near the tip of the arm portion AU of the robot device 7. The hand portion HU can grip and move a workpiece or the like that is the object of an operation such as assembly work.
 As shown in FIG. 4, the mouse allocation information MAI associates various operations of at least one of the arm portion AU and the hand portion HU of the robot device 7 with at least one of a movement operation and a click operation of the mouse 25. Examples of the mouse allocation information MAI are listed below.
(1) A movement operation of the mouse 25 while the left-click LC is held down for a predetermined period (a so-called "long press") is associated with a swing operation of the tip portion T of the arm portion AU of the robot device 7 (for example, swinging in the directions of arrows A4 and A5 about the center point CP shown in FIG. 5).
(2) A press of the left-click LC is associated with the operation of opening the hand portion HU shown in FIG. 8.
(3) Pressing the left-click LC a plurality of times (for example, twice, a so-called "double click") is associated with the operation of closing the hand portion HU shown in FIG. 9.
 The above assignments are merely examples; other specific mouse operations may be assigned to various operations of at least one of the arm portion AU and the hand portion HU of the robot device 7.
 The hand portion HU only needs to be able to hold the workpiece W; in addition to the function of gripping the workpiece W, it may have, for example, a function of sucking or adsorbing the workpiece W. In that case, a press of the left-click LC may be associated with the suction or adsorption of the workpiece W, and a double-click of the left-click LC may be associated with releasing the workpiece W from the hand portion HU. The opening/closing or suction/adsorption operations of the hand portion HU shown in FIGS. 8 and 9 may be controlled via the keyboard 23 instead of the mouse 25.
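The mouse assignments above, including the gripping and suction variants of the hand portion HU, can be sketched as a small dispatch function. The event names and command strings below are illustrative placeholders for the mouse allocation information MAI, not APIs of the disclosed system.

```python
# Hypothetical sketch of the mouse allocation information MAI described above.
# Event names and hand commands are illustrative placeholders.
def mouse_to_hand_command(event, hand_type="gripper"):
    """Translate a mouse event into a command for the hand portion HU.

    hand_type selects between a gripping hand and a suction hand,
    mirroring the two variants described in the text.
    """
    if event == "left_long_press_drag":
        return "tip_swing"                                           # (1)
    if event == "left_click":
        return "open" if hand_type == "gripper" else "suction_on"    # (2)
    if event == "left_double_click":
        return "close" if hand_type == "gripper" else "release"      # (3)
    return None  # unassigned mouse operations produce no command
```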
 The motion allocation unit 16 may also associate operations of the keyboard 23 and the mouse 25 shown in FIG. 1 with operations for adjusting, for example, the orientation (shooting angle) of the camera 9 connected to the robot device 7 and its imaging processing (such as starting or ending imaging). With this configuration, the operator O can remotely control the operation of the camera 9 arranged on the robot device 7.
 Further, a predetermined operation of the robot device 7 may be associated with a shortcut key operation using a plurality of keys of the keyboard 23. For example, by associating a predetermined posture of the robot device 7 with a specific shortcut key operation, at least one of the arm portion AU and the hand portion HU is automatically controlled so that the robot device 7 assumes the predetermined posture when the specific shortcut key operation is executed. With this configuration, the operator's input for causing the robot device 7 to perform a predetermined operation becomes simpler.
 Returning to FIG. 2, the information conversion unit 17 converts the operation information OI acquired by the operation information acquisition unit 15 into control information CI for controlling the operation of the robot device 7. When the information conversion unit 17 acquires the operation information OI, it refers to the key allocation information KAI and the mouse allocation information MAI recorded in the recording unit 12 and converts the operation information OI into the corresponding control information CI.
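The table-lookup conversion performed by the information conversion unit 17 can be sketched as follows. This is a minimal illustration under stated assumptions: the dataclass fields, the table contents, and the device/event names are hypothetical, and the real KAI/MAI would be read from the recording unit 12 rather than hard-coded.

```python
# Minimal sketch of the information conversion unit 17: the operation
# information OI is looked up in the allocation tables (KAI for keyboard
# events, MAI for mouse events) and converted into control information CI.
# All field names and table entries are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class OperationInfo:    # OI: what the operator did
    device: str         # "keyboard" or "mouse"
    event: str          # e.g. "w", "left_click"


@dataclass
class ControlInfo:      # CI: what the robot device 7 should do
    target: str         # e.g. "arm", "hand"
    command: str


KAI = {"w": ("arm", "move_forward"), "space": ("arm", "move_up")}
MAI = {"left_click": ("hand", "open"), "left_double_click": ("hand", "close")}


def convert(oi: OperationInfo) -> Optional[ControlInfo]:
    """Convert OI to CI by consulting the recorded allocation tables."""
    table = KAI if oi.device == "keyboard" else MAI
    entry = table.get(oi.event)
    return ControlInfo(*entry) if entry else None
```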
 The motion control unit 18 controls the operation of the robot device 7 based on the control information CI converted by the information conversion unit 17. For example, the motion control unit 18 transmits the control information CI to the robot device 7. When the control device that controls the operation of the robot device 7 receives the control information CI, it controls the operation of the robot device 7 based on the control information CI. Specifically, for control of the arm portion AU and the hand portion HU of the robot device 7, the motion control unit 18 receives from the robot device 7, for example, information on the target position of the hand portion HU (end effector) and the posture of the arm portion AU (the currently received information).
 The motion control unit 18 calculates the movement direction and speed of the hand portion HU by comparing the previously received information on the position of the hand portion HU and the posture of the arm portion AU with the currently received information. Based on the calculated movement direction and speed of the hand portion HU, the motion control unit 18 executes processing based on inverse kinematics using, for example, the Jacobian. The motion control unit 18 calculates the velocity and the like of each joint portion JU required to reach the next expected posture of the robot device 7, and transmits the calculation result to the robot device 7 as control information.
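The velocity-level inverse kinematics step described above can be illustrated with a standard resolved-rate computation: estimate the hand velocity from two successive received positions, then map it to joint velocities through the Jacobian pseudoinverse. The two-link planar arm and its Jacobian below are illustrative assumptions, not the kinematics of the robot device 7.

```python
# Illustrative resolved-rate inverse kinematics using the Jacobian
# pseudoinverse, as performed by the motion control unit 18.
import numpy as np


def joint_velocities(jacobian, hand_velocity):
    """Joint speeds realizing the desired hand velocity (least-squares)."""
    return np.linalg.pinv(jacobian) @ hand_velocity


def planar_jacobian(q1, q2, l1=1.0, l2=1.0):
    """Jacobian of a hypothetical two-link planar arm (angles q1, q2)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])


# Hand velocity estimated from the previous and current received positions:
prev, curr, dt = np.array([1.0, 1.0]), np.array([1.0, 1.1]), 0.1
v_hand = (curr - prev) / dt                    # movement direction and speed
dq = joint_velocities(planar_jacobian(0.3, 0.5), v_hand)  # per-joint speeds
```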
 The recording unit 12 records, for example, the captured image information II acquired from the camera 9, the key allocation information KAI, the mouse allocation information MAI, the operation information OI input by the operator O shown in FIG. 1, and the control information CI for controlling the operation of the robot device 7. A program describing the contents of the robot operation support process executed by the robot operation support system 100 may be installed in the recording unit 12.
 [Operation of the robot operation support system]
 An example of the robot operation support processing method according to the first embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart showing an example of the robot operation support processing method according to the first embodiment.
 As shown in FIG. 10, the robot terminal device 1 shown in FIG. 1 acquires operation information OI for the operator O to operate the robot device 7 based on the virtual viewpoint (S1). The robot terminal device 1 converts the acquired operation information OI into control information CI for controlling the operation of the robot device 7 (S2). The robot terminal device 1 controls the operation of the robot device 7 based on the converted control information CI (S3).
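The three steps S1 to S3 of FIG. 10 amount to an acquire-convert-control pass. The sketch below is a schematic rendering only; the three callables are hypothetical placeholders for the corresponding units of the robot terminal device 1, not actual interfaces of the system.

```python
# Illustrative sketch of the S1-S3 flow in FIG. 10. All function arguments
# are hypothetical placeholders for units of the robot terminal device 1.
def support_loop(acquire_oi, convert_to_ci, send_to_robot):
    """One pass: S1 acquire OI -> S2 convert to CI -> S3 control the robot."""
    oi = acquire_oi()          # S1: operation information OI
    ci = convert_to_ci(oi)     # S2: control information CI
    if ci is not None:
        send_to_robot(ci)      # S3: control the robot device 7
    return ci
```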
 As described above, in the robot operation support system 100 according to the first embodiment, the position P at which the camera 9 is arranged on the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7. The robot operation support system 100 acquires operation information for the operator O to operate the robot device 7 based on the virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. The operator O can therefore easily operate the robot device 7 while checking its surroundings, as if the operator were at the position of the remote robot device 7. This improves operability when remotely operating the robot device 7.
 In the first embodiment, as described above with respect to the key allocation information KAI, the movement control of the arm portion AU in the front-back and left-right directions (for example, in the directions of arrows A1 and A3 shown in FIGS. 5 and 6) in response to pressing of the keys K1 to K4 changes depending on the orientation of the base of the hand portion HU. Even in such a case, the robot operation support system 100 uses the camera 9 arranged on the robot device 7 as a so-called first-person camera, so the operator O can easily operate the robot device 7.
 <Second Embodiment>
 A robot operation support system according to the second embodiment of the present invention will be described with reference to FIG. 11. The robot operation support system according to the second embodiment differs from that of the first embodiment (see FIG. 1), in which the camera 9 is arranged on the robot device 7, in that the camera 9 is arranged on a predetermined surface (for example, the ceiling C) around the robot device 7. The predetermined surface is not limited to the ceiling C and includes, for example, a wall of the room in which the robot device 7 is arranged or the top of a table. FIG. 11 is a schematic diagram showing an example of the overall configuration of the robot operation support system according to the second embodiment of the present invention.
 In the robot operation support system 100 shown in FIG. 11, the arrangement position P1 of the camera 9 on the ceiling C around the robot device 7 is associated with the virtual viewpoint of the remote operator O. The robot operation support system 100 displays, on the display unit 21 of the operator terminal device 5 so as to be visible to the operator O, an image G based on image information obtained by capturing a predetermined range around the arrangement position P1 of the camera 9.
 As described above, in the robot operation support system 100 according to the second embodiment, the position at which the camera 9 is arranged around the robot device 7 is associated with the virtual viewpoint of the operator O of the robot device 7. The robot operation support system 100 acquires operation information for the operator O to operate the robot device 7 based on the virtual viewpoint, converts the acquired operation information into control information for controlling the operation of the robot device 7, and controls the operation of the robot device 7 based on the converted control information. The operator O can therefore easily operate the robot device 7 while checking its surroundings, as if the operator were in the vicinity of the remote robot device 7. This improves operability when remotely operating the robot device 7.
 The robot operation support system 100 according to the second embodiment is suitable, for example, when the movement control of the arm portion AU in a specific direction in response to pressing of the keys K1 to K4 changes depending on the orientation (arrangement direction) of the robot device 7 itself. For example, when the robot device 7 is arranged facing a specific direction (for example, north), the arm portion AU is controlled to move in four specific directions (for example, north, south, east, and west), rather than according to the orientation of the base of the hand portion HU as in the first embodiment. That is, in the robot operation support system 100 according to the second embodiment, the processing applied to operation inputs for translating the arm portion AU can be determined according to the arrangement position of the camera 9 around the robot device 7. Since the camera 9 can thus be arranged at an arbitrary position around the robot device 7, the operator O can easily operate the robot device 7.
 <Third Embodiment>
 A robot operation support system according to the third embodiment of the present invention will be described with reference to FIG. 12. The robot operation support system according to the third embodiment differs from those of the first and second embodiments (see FIGS. 1 and 11), which include a single camera 9, in that it includes a plurality of cameras 9. The number of cameras 9 may be any number of two or more; the case of two cameras is described below as an example. FIG. 12 is a schematic diagram showing an example of the overall configuration of the robot operation support system according to the third embodiment of the present invention.
 As shown in FIG. 12, the robot operation support system 100 includes a plurality of cameras 9; for example, one camera 9 may be arranged at the position P on the robot device 7, and another camera 9 may be arranged at the position P1 on the ceiling C (predetermined surface) of the room in which the robot device 7 is arranged. The plurality of cameras 9 may also be arranged individually at a plurality of locations on the robot device 7. For example, one camera 9 may be arranged on the arm portion AU of the robot device 7, and another camera 9 on the hand portion HU of the robot device 7. Further, the plurality of cameras 9 may be arranged at a plurality of predetermined positions on the arm portion AU, or at a plurality of predetermined positions on the hand portion HU.
 For example, an image based on image information acquired by one camera 9 and an image based on image information acquired by another camera 9 may be displayed on the display unit 21 in a switchable manner. More specifically, an image based on image information acquired by one camera 9 may basically be displayed on the display unit 21 as a main image, while an image based on image information acquired by another camera 9 is displayed on the display unit 21 as a sub-image as needed. Both images may also be displayed simultaneously on the same screen of the display unit 21.
 FIG. 13 is a diagram showing an example of the hardware configuration of a computer according to an embodiment. With reference to FIG. 13, an example of the hardware configuration of a computer that can be used to configure the various devices in the robot operation support system 100 shown in FIGS. 1, 11, and 12, such as the robot terminal device 1, the robot operation support server 3, and the operator terminal device 5, will be described.
 As shown in FIG. 13, the computer 40 mainly includes, as hardware resources, a processor 41, a main recording device 42, an auxiliary recording device 43, an input/output interface 44, and a communication interface 45, which are connected to one another via a bus line 46 including an address bus, a data bus, a control bus, and the like. An interface circuit (not shown) may be interposed between the bus line 46 and each hardware resource as appropriate.
 The processor 41 controls the entire computer and corresponds to, for example, the information processing unit 11 of the robot terminal device 1 shown in FIG. 2. The main recording device 42 provides a work area for the processor 41 and is a volatile memory such as SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory). The auxiliary recording device 43 is a non-volatile memory, such as an HDD, an SSD, or a flash memory, that stores programs (software), data, and the like. The programs, data, and the like are loaded from the auxiliary recording device 43 into the main recording device 42 via the bus line 46 at arbitrary times. The auxiliary recording device 43 corresponds to, for example, the recording unit 12 of the robot terminal device 1 shown in FIG. 2.
 The input/output interface 44 performs one or both of presenting information and receiving input of information, and includes a camera, a keyboard (for example, the keyboard 23 of the operator terminal device 5 shown in FIGS. 1, 11, and 12), a mouse (for example, the mouse 25 of the operator terminal device 5 shown in FIGS. 1, 11, and 12), a display (for example, the display unit 21 of the operator terminal device 5 shown in FIGS. 1, 11, and 12), a touch-panel display, a microphone, a speaker, and the like. The communication interface 45 transmits and receives various data among the devices in the robot operation support system 100 shown in FIGS. 1, 11, and 12 via a predetermined communication network. The communication interface 45 and the predetermined communication network may be connected by wire or wirelessly. The communication interface 45 may also acquire information related to the network, for example, information on Wi-Fi access points, information on base stations of communication carriers, and the like.
 It will be apparent to those skilled in the art that, through the cooperation of the hardware resources and software exemplified above, the computer 40 can function as the desired means, execute the desired steps, and realize the desired functions.
 The above embodiments are intended to facilitate understanding of the present invention and are not to be construed as limiting the present invention. The present invention may be modified or improved without departing from its spirit, and the present invention also includes equivalents thereof. Further, various disclosures can be formed by appropriately combining the plurality of components disclosed in the above embodiments. For example, some components may be removed from all the components shown in an embodiment, and components may be appropriately combined across different embodiments.
 In the above embodiments, as shown in FIG. 2, the robot terminal device 1 (robot operation support device) includes the functions of the information processing unit 11 (for example, the viewpoint association unit 13, the image information acquisition unit 14, the operation information acquisition unit 15, the motion allocation unit 16, the information conversion unit 17, and the motion control unit 18) and the recording unit 12, but the present invention is not limited to this. The robot operation support server 3 (robot operation support device) shown in FIGS. 1, 11, and 12 may include at least some of the functions of the information processing unit 11 shown in FIG. 2 and at least one function of the recording unit 12. For example, the robot terminal device 1 may include the motion control unit 18, and the robot operation support server 3 may include the image information acquisition unit 14, the operation information acquisition unit 15, and the information conversion unit 17. The robot operation support server 3 may further include the viewpoint association unit 13 and the motion allocation unit 16.
 1... Robot terminal device, 3... Robot operation support server, 5... Operator terminal device, 7... Robot device, 9... Camera, 11... Information processing unit, 12... Recording unit, 13... Viewpoint association unit, 14... Image information acquisition unit, 15... Operation information acquisition unit, 16... Motion allocation unit, 17... Information conversion unit, 18... Motion control unit, 41... Processor, 42... Main recording device, 43... Auxiliary recording device, 44... Input/output interface, 45... Communication interface, 46... Bus line, 100... Robot operation support system

Claims (7)

  1.  A program causing a robot operation support device that supports remote operation of a robot to function as:
     an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of an operator of the robot, the virtual viewpoint being associated with a position of an imaging device arranged on the robot or in the periphery of the robot; and
     a conversion unit that converts the acquired operation information into control information for controlling an operation of the robot.
  2.  The program according to claim 1, wherein
     an operator terminal device of the operator displays, so as to be visible to the operator, an image based on image information obtained by the imaging device capturing a predetermined range from the position, and
     the operation information is input by the operator via a pointing device included in the operator terminal device.
  3.  The program according to claim 1 or 2, wherein
     the robot includes an arm portion,
     the operator terminal device of the operator includes a keyboard with which the operator can input the operation information, and
     at least some of a plurality of keys of the keyboard are associated with a plurality of operations of the arm portion.
  4.  The program according to any one of claims 1 to 3, wherein
     the robot includes a hand portion,
     the operator terminal device of the operator further includes a mouse with which the operator can input the operation information, and
     the operation information input via the mouse is associated with a plurality of operations of the hand portion.
  5.  The program according to claim 3 or 4, wherein
     the position at which the imaging device is arranged on the robot is a tip portion of the arm portion.
  6.  A robot operation support method executed by a robot operation support device that supports remote operation of a robot, the method comprising:
     acquiring operation information for operating the robot based on a virtual viewpoint of an operator of the robot, the virtual viewpoint being associated with a position of an imaging device arranged on the robot or in the periphery of the robot; and
     converting the acquired operation information into control information for operating the robot.
  7.  A robot operation support device that supports remote operation of a robot, comprising:
     an acquisition unit that acquires operation information for operating the robot based on a virtual viewpoint of an operator of the robot, the virtual viewpoint being associated with a position of an imaging device arranged on the robot or in the periphery of the robot; and
     a conversion unit that converts the acquired operation information into control information for controlling an operation of the robot.
PCT/JP2020/040789 2020-10-30 2020-10-30 Program, robot operation assistance method, and robot operation assistance device WO2022091333A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/040789 WO2022091333A1 (en) 2020-10-30 2020-10-30 Program, robot operation assistance method, and robot operation assistance device


Publications (1)

Publication Number Publication Date
WO2022091333A1 true WO2022091333A1 (en) 2022-05-05

Family

ID=81382068


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0752008A (en) * 1993-08-19 1995-02-28 Nagase Integrex:Kk Rotary grinding machine
JP2009531184A (en) * 2006-03-27 2009-09-03 コミッサリア タ レネルジー アトミーク Intelligent interface device for gripping an object by an operating robot and method of operating this device
JP2012066378A (en) * 2010-09-22 2012-04-05 Toyota Motor Engineering & Manufacturing North America Inc Human-robot interface apparatus and method for controlling robot
JP2018126851A (en) * 2017-02-10 2018-08-16 日本電信電話株式会社 Remote control communication system and relay method for the same, and program

Similar Documents

Publication Publication Date Title
US20180236661A1 (en) Teaching Apparatus And Robot System
CN105512086B (en) Information processing equipment and information processing method
JP6205067B2 (en) Pan / tilt operating device, camera system, pan / tilt operating program, and pan / tilt operating method
US20210026589A1 (en) Communication terminal, image communication system, display method, and non-transitory recording medium
JP2018015857A (en) Control device, and robot
CN111010512A (en) Display control method and electronic equipment
KR20150070199A (en) Robotic stand and systems and methods for controlling the stand during videoconference
CN103986878A (en) Remote control device and method for operating pan-tilt camera
JP2019126655A (en) Radiographic system, medical image capturing system, radiographic method, and program
CN106162150A Photographing method and mobile terminal
JP2015225400A (en) Communication system, transfer control device, communication method, and program
WO2022091333A1 (en) Program, robot operation assistance method, and robot operation assistance device
KR102163894B1 (en) Holder for mobile terminal and method for changing direction of the mobile terminal using the same
JP2016059982A (en) Image processing apparatus and robot system
JPWO2017022031A1 (en) Information terminal equipment
US20240031667A1 (en) Image Processing Method, Mobile Terminal, and Storage Medium
JP6473048B2 (en) Mobile device operation terminal, mobile device operation method, and mobile device operation program
JP6488571B2 (en) Teaching apparatus and robot system
JP2014235699A (en) Information processing apparatus, device setting system, device setting method, and program
JP6958091B2 (en) Robot system and robot control method
WO2021007792A1 (en) Photographing method, device and system, and computer readable storage medium
JP2018017610A (en) Three-dimensional measuring device, robot, robot controlling device, and robot system
JP2018034243A (en) Robot, robot control device, and robot system
JP6375810B2 (en) Image processing apparatus and robot system
US20180150231A1 (en) Data management device, data management method, and robot system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20959851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM F1205A DATED 11/08/2023)

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 20959851

Country of ref document: EP

Kind code of ref document: A1