WO2015129474A1 - Robot arm device, robot arm control method, and program - Google Patents
Robot arm device, robot arm control method, and program
- Publication number
- WO2015129474A1 (application PCT/JP2015/053876)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- arm
- joint
- control
- robot arm
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/061—Measuring instruments not otherwise provided for for measuring dimensions, e.g. length
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/066—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring torque
Definitions
- the present disclosure relates to a robot arm device, a robot arm control method, and a program.
- a balanced arm (hereinafter also referred to as a support arm) provided with an imaging device at the tip of the arm, and methods for performing various treatments such as surgery while observing an image captured by the imaging device, have been proposed.
- by using the balanced arm, the affected part can be observed stably from a desired direction, and the treatment can be performed efficiently.
- Patent Document 1 discloses a medical device support device (balanced arm) that realizes a pivoting operation of an imaging apparatus by appropriately connecting a plurality of link mechanisms and an interlocking mechanism that interlocks these link mechanisms.
- the present disclosure proposes a new and improved robot arm device, robot arm control method, and program that can further improve user convenience.
- an arm portion in which a plurality of links are connected to each other by joint portions and to which an imaging portion can be connected, the driving of the arm portion being controlled by driving each of the joint portions in cooperation with each other.
- a drive control unit configured to use relative position information of the reference position with respect to the arm unit, obtained based on the state of the arm unit and distance information between the imaging unit and the reference position.
- a state of an arm unit, which is configured by connecting a plurality of links to each other by joint units and to which an imaging unit is connected, is acquired; distance information between the imaging unit and a reference position is acquired; relative position information of the reference position with respect to the arm unit is calculated based on the state of the arm unit and the distance information; and the driving of the arm unit is controlled by driving each of the joint units in a coordinated manner based on the state of the arm unit so that the reference position is located on the optical axis of the imaging unit.
- a program causing a computer to realize: a function of acquiring the state of an arm unit that is configured by connecting a plurality of links to each other by joint units and to which an imaging unit can be connected; a function of acquiring distance information between the imaging unit and a reference position; and a function of controlling the driving of the arm unit, based on relative position information of the reference position with respect to the arm unit calculated from the state of the arm unit and the distance information, so that the reference position is located on the optical axis of the imaging unit.
- the state of the arm unit is acquired while the imaging unit is directed toward a reference position that is a predetermined point in real space, and the relative position of the reference position with respect to the arm unit is calculated based on that state and on the distance between the imaging unit and the reference position. Based on the calculated relative position, the driving of the arm unit is controlled so that the reference position is located on the optical axis of the imaging unit.
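The calculation above reduces to projecting the working distance along the camera's optical axis. A minimal sketch (the function and variable names are hypothetical; the arm state is assumed to already yield the camera position and optical-axis direction as 3-vectors):

```python
import numpy as np

def reference_point(camera_pos, optical_axis, working_distance):
    """Reference position = camera position advanced by the working
    distance along the (normalized) optical-axis direction."""
    axis = np.asarray(optical_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)  # ensure a unit direction
    return np.asarray(camera_pos, dtype=float) + working_distance * axis

# Camera at the origin looking along +z, focused 0.3 m away:
p_ref = reference_point([0.0, 0.0, 0.0], [0.0, 0.0, 2.0], 0.3)
```

In the control system described here, the camera pose would come from the acquired arm state and the working distance from the acquired distance information.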
- drive control in which the arm unit is driven so that the imaging unit always faces the reference position is thus realized by a simple operation in which the user merely points the imaging unit at the reference position. Accordingly, user convenience in setting the reference position can be improved.
- Such drive control of the arm unit is realized by whole body cooperative control in which each of the joint units is driven in cooperation based on the state of the arm unit.
- by controlling the driving of the arm part with so-called force control, which offers higher operability, the user can operate the arm part more easily, further improving convenience for the user.
- FIG. 9 is an explanatory diagram for describing a pivot operation, which is a specific example of an arm operation according to an embodiment of the present disclosure.
- the reference position in the pivot operation is also referred to as a pivot center point.
- the configuration of the control system for the robot arm device and the control method for the robot arm device for realizing such control will be described in detail.
- the pivot operation as described above can be realized, for example, by calculating control values for controlling the drive of the arm unit under the constraint condition that the reference position is located on the optical axis of the imaging device.
- a pivot operation in which the imaging device moves on a hemisphere centered on the reference position, that is, a pivot operation in which the distance between the imaging device and the reference position is kept constant.
- the arm unit can also be driven to realize various other operations, such as a fixing operation in which the position and posture of the arm unit are fixed in a predetermined state (that is, the position and posture of the imaging device are fixed).
- drive control of the robot arm device based on such constraint conditions can be realized by a control method called whole body cooperative control. Therefore, in <5. Whole body cooperative control>, the configuration of a control system and a control method for realizing whole body cooperative control will be described. There, the whole body cooperative control of the robot arm apparatus will be described from a broader viewpoint, beyond the drive control for realizing the pivot operation and the fixing operation described above.
- a robot arm device mainly used for medical purposes will be described as an example.
- the present embodiment is not limited to such an example, and can be applied to other fields such as industrial use.
- a robot apparatus is configured as a multi-link structure in which a plurality of links are connected to each other by a plurality of joints, and the drive of the whole is controlled by controlling the rotational drive at each of the joints.
- the arm portion corresponds to the multi-link structure, and the drive of the entire arm portion is controlled by the drive of each joint portion.
- position control and force control are known as control methods for the robot apparatus and each joint.
- in position control, a command value such as an angle is given to the actuator of the joint, and the driving of the joint is controlled so as to follow the command value.
- in force control, a target value of the force to be applied to the work target is given for the robot apparatus as a whole, and the driving of the joint portions (for example, the torque generated by each joint portion) is controlled so as to realize the force indicated by the target value.
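The difference between the two schemes can be illustrated at the level of a single joint. This is a schematic sketch with made-up gains, not the control law of any actual device:

```python
def position_control_torque(q, q_cmd, dq=0.0, kp=50.0, kd=5.0):
    """Position control: a stiff PD servo drives the joint angle toward
    the commanded angle, resisting any deviation ("hard control")."""
    return kp * (q_cmd - q) - kd * dq

def force_control_torque(tau_target, tau_external):
    """Force control: the joint realizes a target interaction torque and
    yields to external torque instead of fighting it ("soft control")."""
    return tau_target - tau_external

# A stiff position servo far from its command demands a large torque...
tau_pos = position_control_torque(q=0.0, q_cmd=1.0)
# ...while force control only makes up the difference to the target force.
tau_force = force_control_torque(tau_target=0.5, tau_external=0.1)
```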
- robot devices driven by position control are widely used because of ease of control and ease of system configuration.
- since position control cannot respond flexibly to external forces, it is sometimes called "hard control" and is not suitable for robot devices that perform tasks involving physical interaction with various external environments (for example, interpersonal physical interaction).
- force control, by contrast, has a more complicated system configuration but can realize "soft control" at the level of forces, and can therefore be said to be a control method particularly suitable for robot devices that perform interpersonal physical interaction.
- in the medical field, balance type arms (also referred to as support arms) carrying various medical units (tip units) have been used. For example, various imaging devices having an imaging function, such as a microscope, an endoscope, or a camera, are provided at the tip of the arm portion of the balance type arm, and various methods have been proposed in which a practitioner (user) performs treatment while observing an image of the surgical part captured by the imaging device.
- however, the balance-type arm needs to have a counterbalance weight (also referred to as a counterweight or a balancer) for balancing the forces when the arm portion is moved, which tends to make the device large.
- devices used in surgery are required to be further downsized, and it is difficult to meet such a requirement with the generally proposed balanced arms.
- in the balanced arm, only part of the drive of the arm part, for example only the two-axis drive that moves the tip unit on a plane (two-dimensionally), is electrically driven, and moving the arm part and the tip unit otherwise requires manual positioning by the practitioner and surrounding medical staff.
- in industrial applications as well, an imaging device may be provided at the tip of the arm portion of a balanced arm or robot arm device, for example to observe a product. Even in such work, when a balanced arm or a robot arm device whose drive is controlled by position control is used, there is a concern that the low operability will increase the burden on the user.
- the present inventors have conceived a robot arm device whose drive is controlled by force control in order to further improve convenience for the user and reduce the burden on the user.
- the operation of the arm unit according to the user's intuition is realized, and high operability can be obtained.
- the present inventors realized such drive control by force control by applying whole body cooperative control using generalized inverse dynamics as the control method.
- the driving of the entire arm unit is controlled by driving each joint unit of the arm unit in a coordinated manner.
- the control value for controlling the drive of each joint part can be calculated based on the purpose of exercise and the constraint condition set for the entire arm part.
- the constraint conditions are conditions, such as position, speed, and force, that limit the movement of the arm part.
- by setting a constraint condition that a reference position, which is a predetermined point in real space, is located on the optical axis of the imaging device, it is possible to calculate drive control values for each joint unit that drive the arm unit so as to satisfy the constraint condition.
- the drive of the arm unit is thereby controlled so that the imaging device always faces the surgical site, making it possible to respond to the user's request to observe the surgical site from different distances or different angles while the line of sight remains fixed on the surgical site.
- to set such a constraint condition, the relative position of the reference position with respect to the robot arm apparatus must be derived, and the robot arm apparatus needs to recognize that relative position.
- the derivation of the relative position corresponds to the process of deriving the reference position in the reference coordinates used by the robot arm device to drive the arm unit.
- the reference coordinates may be a coordinate system in an internal model of the robot arm device, for example.
- FIG. 1 is a functional block diagram illustrating a functional configuration of a robot arm control system according to an embodiment of the present disclosure.
- the robot arm control system 2 includes a robot arm device 10, a control device 20, and a display device 30.
- the control device 20 performs various calculations for driving the robot arm device 10 by whole body cooperative control, and the drive of the arm portion of the robot arm device 10 is controlled based on the calculation results.
- the arm unit of the robot arm device 10 is provided with an imaging unit 140 described later, and an image photographed by the imaging unit 140 is displayed on the display screen of the display device 30.
- the configurations of the robot arm device 10, the control device 20, and the display device 30 will be described in detail.
- the display device 30 displays various types of information on the display screen in various formats such as text and images, thereby visually notifying the user of the information.
- the display device 30 displays an image captured by the imaging unit 140 of the robot arm device 10 on a display screen.
- the display device 30 has the function and configuration of an image signal processing unit (not shown) that performs various types of image processing on the image signal acquired by the imaging unit 140, and of a display control unit (not shown) that performs control to display an image on the display screen based on the processed image signal. Since the configuration of the display device 30 may be the same as that of a general display device, a detailed description thereof is omitted here.
- the robot arm device 10 has an arm part, which is a multi-link structure composed of a plurality of joint parts and a plurality of links, and controls the position and orientation of the tip unit (the imaging unit in this embodiment) provided at the tip of the arm part by driving the arm part within its movable range.
- the robot arm device 10 has an arm unit 120.
- the arm unit 120 includes a joint unit 130 and an imaging unit 140.
- the arm part 120 is a multi-link structure composed of a plurality of joint parts 130 and a plurality of links, and its driving is controlled by controlling the drive of each joint part 130. Since the functions and configurations of the plurality of joint portions 130 included in the arm portion 120 are the same as each other, FIG. 6 illustrates the configuration of one joint portion 130 as a representative of the plurality of joint portions 130.
- the joint unit 130 rotatably connects the links in the arm unit 120, and drives the arm unit 120 by having its rotational drive controlled by the joint control unit 135 described later.
- the joint unit 130 includes a joint drive unit 131, a joint state detection unit 132, and a joint control unit 135.
- the joint control unit 135 includes various processors such as a CPU (Central Processing Unit), and controls the operation of the joint unit 130.
- the arm control unit 110 includes a drive control unit 111, and the drive of the arm unit 120 is controlled by controlling the drive of the joint unit 130 by the control from the drive control unit 111.
- the drive control unit 111 controls the rotational speed of the motor constituting the joint drive unit 131 by controlling the amount of current supplied to the joint drive unit 131 of the joint unit 130, thereby controlling the rotation angle and the generated torque of the joint unit 130.
- the drive control of the joint unit 130 by the drive control unit 111 may be performed based on the calculation result in the control device 20.
- the joint drive unit 131 is a drive mechanism such as a motor that constitutes an actuator of the joint unit 130.
- the joint unit 130 is rotationally driven.
- the drive of the joint drive unit 131 is controlled by the drive control unit 111.
- the motor constituting the joint drive unit 131 is driven by the amount of current according to a command from the drive control unit 111.
- the joint state detection unit 132 detects the state of the joint unit 130.
- the state of the joint 130 may mean the state of motion of the joint 130.
- the state of the joint unit 130 includes information such as the rotation angle, rotation angular velocity, rotation angular acceleration, and generated torque of the joint unit 130.
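The joint-state quantities listed above can be grouped into a simple record; this is only an illustrative data layout, not the device's internal representation:

```python
from dataclasses import dataclass

@dataclass
class JointState:
    """Motion state of one joint unit, as reported by the joint state
    detection unit (angles in rad, torques in N*m)."""
    rotation_angle: float
    angular_velocity: float
    angular_acceleration: float
    generated_torque: float
    external_torque: float

# Example reading for a single joint
state = JointState(rotation_angle=0.5, angular_velocity=0.0,
                   angular_acceleration=0.0, generated_torque=1.2,
                   external_torque=-0.1)
```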
- the joint state detection unit 132 is configured by various sensors such as an encoder and a torque sensor, for example, and can detect the rotation angle of the joint unit 130 as well as the generated torque and the external torque of the joint unit 130.
- the joint state detection unit 132 transmits the detected state of the joint unit 130 to the control device 20.
- the imaging unit 140 is an example of a tip unit provided at the tip of the arm unit 120, and has a function of acquiring an image to be shot.
- the imaging unit 140 is configured by various imaging devices such as a camera and a microscope.
- the imaging unit 140 is configured by a digital video camera, and can acquire an image signal representing an image to be captured.
- the imaging unit 140 transmits the acquired image signal to the display device 30.
- the imaging unit 140 is provided at the tip of the arm unit 120.
- in FIG. 1, the state in which the imaging unit 140 is provided at the distal end of the final link, via the plurality of joint units 130 and the plurality of links, is schematically represented between the joint unit 130 and the imaging unit 140. However, in this embodiment, the site where the imaging unit 140 is provided is not limited to the tip of the arm unit 120, and the imaging unit 140 may be provided at any site of the arm unit 120.
- the control device 20 includes a storage unit 220 and a control unit 230.
- the control unit 230 includes various processors such as a CPU, for example, and controls the control device 20 in an integrated manner and performs various calculations for controlling the driving of the arm unit 120 in the robot arm device 10. Specifically, the control unit 230 performs various calculations in the whole body cooperative control and the ideal joint control in order to control the driving of the arm unit 120 of the robot arm device 10. Further, the control unit 230 performs various processes for deriving a reference position when the imaging unit 140 observes the object.
- the control unit 230 includes a whole body cooperative control unit 240, an ideal joint control unit 250, and a reference position deriving unit 260.
- the whole body cooperative control unit 240 performs various calculations related to whole body cooperative control using generalized inverse dynamics.
- the ideal joint control unit 250 performs various calculations related to ideal joint control that realizes an ideal response based on a theoretical model. By controlling the drive of the robot arm device 10 based on these calculation results, the robot arm device 10 is driven by force control.
- the processing performed by the whole body cooperative control unit 240 and the ideal joint control unit 250, including the whole body cooperative control, will be described in detail again later; only an outline is briefly given here.
- the operation space is an important concept in the force control of the robot device.
- the operation space is a space for describing the relationship between the force acting on the multi-link structure and the acceleration of the multi-link structure.
- the operation space is, for example, a joint space, a Cartesian space, a momentum space or the like to which a multi-link structure belongs.
- the motion purpose represents a target value in the drive control of the multi-link structure, and is, for example, a target value such as position, speed, acceleration, force, impedance, etc. of the multi-link structure to be achieved by the drive control.
- Constraint conditions are constraints regarding the position, speed, acceleration, force, etc. of the multi-link structure, which are determined by the shape and structure of the multi-link structure, the environment around the multi-link structure, settings by the user, and the like.
- the constraint condition may be various information that restricts (restrains) the movement of the arm unit 120.
- the constraint condition includes information on generated force, priority, presence / absence of a non-driven joint, vertical reaction force, friction weight, support polygon, and the like.
- the whole-body cooperative control unit 240 can use generalized inverse dynamics to calculate control values for driving the arm unit 120 so as to achieve a predetermined motion purpose (for example, drive parameters of each joint unit 130, such as the generated torque value of the joint unit 130), taking a predetermined constraint condition into consideration.
- the control value of the arm unit 120 is calculated by the whole-body cooperative control unit 240 under the constraint that the reference position is located on the optical axis of the imaging unit 140.
- a control value of the arm unit 120 that causes the imaging unit 140 to perform a pivot operation is calculated.
- for example, when the control value of the arm unit 120 is calculated by the whole body cooperative control unit 240 under the constraint condition that the position and posture of the arm unit 120 are fixed in a predetermined state, a control value that fixes the arm unit 120 (and therefore also the imaging unit 140) is obtained. Further, for example, when the control value of the arm unit 120 is calculated by the whole-body cooperative control unit 240 with no particular constraint condition set, a control value is obtained that realizes a free operation in which the positions and postures of the arm unit 120 and the imaging unit 140 can be moved freely. The whole body cooperative control unit 240 provides information about the calculated control values to the ideal joint control unit 250.
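The three operating modes described here differ only in which constraint set is handed to the whole body cooperative control. A schematic sketch (the constraint encoding and names are hypothetical; real constraints would be expressed in the operation space):

```python
def constraints_for_mode(mode, reference_position=None, current_pose=None):
    """Return the constraint list used when the whole body cooperative
    control calculates the arm's control values."""
    if mode == "pivot":
        # Keep the reference position on the imaging unit's optical axis.
        return [("point_on_optical_axis", reference_position)]
    if mode == "fixed":
        # Freeze the arm unit (and hence the imaging unit) in its pose.
        return [("pose_fixed", current_pose)]
    if mode == "free":
        # No constraint: position and posture may be moved freely.
        return []
    raise ValueError(f"unknown mode: {mode}")

pivot_constraints = constraints_for_mode("pivot", reference_position=(0, 0, 0.3))
```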
- the ideal joint control unit 250 calculates a command value that is finally used to drive the arm unit 120 by correcting the control value calculated by the whole-body cooperative control unit 240 in consideration of the influence of disturbance.
- the command value may be a generated torque value of the joint unit 130 in consideration of the influence of disturbance.
- the ideal joint control unit 250 transmits information about the calculated command value to the robot arm device 10. Based on the command value, the drive control unit 111 drives each joint unit 130, so that the arm unit 120 is driven so as to achieve a predetermined motion purpose under a predetermined constraint condition.
- the reference position deriving unit 260 derives a reference position that is a reference point in the observation of the object by the imaging unit 140.
- the derivation of the reference position may mean the process of calculating the relative position of the reference position with respect to the arm unit 120 and thereby deriving the reference position in the reference coordinate system (for example, the coordinate system of an internal model) used by the robot arm device 10 to drive the arm unit 120.
- using the derived reference position, the constraint condition that the reference position is located on the optical axis of the imaging unit 140 is set, and under this constraint condition the whole body cooperative control unit 240 and the ideal joint control unit 250 calculate the control values and command values for driving the arm unit 120.
- the reference position deriving unit 260 includes an arm state acquisition unit 241, a distance information acquisition unit 261, and a relative position calculation unit 262.
- the arm state acquisition unit 241 acquires the state (arm state) of the arm unit 120 based on the state of the joint unit 130 detected by the joint state detection unit 132.
- the arm state may mean a state of movement of the arm unit 120.
- the arm state includes information such as the position, speed, acceleration, and force of the arm unit 120.
- the joint state detection unit 132 acquires information such as the rotation angle, the rotation angular velocity, the rotation angular acceleration, and the generated torque in each joint unit 130 as the state of the joint unit 130.
- the storage unit 220 stores various types of information processed by the control device 20, and in the present embodiment, the storage unit 220 stores various types of information (arm information) about the arm unit 120.
- the arm state acquisition unit 241 can acquire the arm information from the storage unit 220. The arm state acquisition unit 241 can therefore acquire, as the arm state, the positions (coordinates) in space of the plurality of joint units 130, the plurality of links, and the imaging unit 140 (that is, the position and orientation of the arm unit 120 and of the imaging unit 140), as well as information such as the forces acting on each joint unit 130, each link, and the imaging unit 140, based on the state of the joint units 130 and the arm information.
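Recovering joint and tip positions from the joint states plus the stored link geometry is a forward-kinematics computation. A minimal planar sketch (link lengths stand in for the stored arm information; a real arm would work in three dimensions):

```python
import math

def forward_kinematics_2d(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate joint rotations along the
    chain and return the (x, y) position of each joint and of the tip."""
    x = y = theta = 0.0
    points = [(x, y)]
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle  # orientation of the next link
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

# Two 0.5 m links with both joints at 0 rad: the tip lies at (1.0, 0.0)
pts = forward_kinematics_2d([0.0, 0.0], [0.5, 0.5])
```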
- various physical quantities included in the arm state may be expressed by a reference coordinate system used for drive control of the arm unit 120, such as a coordinate system of an internal model.
- the arm state acquisition unit 241 acquires the arm state in a state where the imaging unit 140 is directed to the reference position. Note that the operation of directing the imaging unit 140 to the reference position may be performed manually by the user while referring to the image captured by the imaging unit 140 and displayed on the display device 30, for example.
- the arm state acquisition unit 241 provides information about the acquired arm state to the relative position calculation unit 262.
- the distance information acquisition unit 261 acquires distance information about the distance between the imaging unit 140 and the reference position in a state where the imaging unit 140 is directed to the reference position.
- the distance information can be acquired based on the focal length of the imaging unit 140.
- information about the focal length when the imaging unit 140 is directed to the reference position and focused on the reference position may be transmitted from the imaging unit 140 to the distance information acquisition unit 261.
- the process of focusing on the reference position may be performed manually by the user, or may be performed by the AF function if the imaging unit 140 has an autofocus (AF) function.
- the storage unit 220 stores information about the performance of the imaging device constituting the imaging unit 140, and by referring to the storage unit 220, the distance information acquisition unit 261 can acquire the working distance (WD) corresponding to the focal length as the distance information between the imaging unit 140 and the reference position.
- the focal length can be calculated by the imaging unit 140 based on the state of the optical system in the imaging unit 140 (for example, the position of a focus adjustment lens).
- alternatively, the imaging unit 140 may transmit information about the state of the optical system at the time of focusing to the distance information acquisition unit 261, and the calculation of the focal length may be performed by the distance information acquisition unit 261.
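The stored performance information can be modeled as a calibration table mapping the focus-lens position to the working distance, with interpolation yielding the distance information. The table values below are invented for illustration:

```python
from bisect import bisect_left

# Hypothetical calibration: (focus-lens position [mm], working distance [m])
CALIBRATION = [(0.0, 0.10), (1.0, 0.20), (2.0, 0.40), (3.0, 0.80)]

def working_distance(lens_position):
    """Linearly interpolate the working distance for a lens position,
    clamping outside the calibrated range."""
    xs = [p for p, _ in CALIBRATION]
    ys = [wd for _, wd in CALIBRATION]
    if lens_position <= xs[0]:
        return ys[0]
    if lens_position >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, lens_position)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (lens_position - x0) / (x1 - x0)

wd = working_distance(1.5)  # halfway between the 0.20 m and 0.40 m entries
```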
- the distance information acquisition unit 261 provides the acquired distance information to the relative position calculation unit 262.
- the relative position calculation unit 262 calculates the relative position of the reference position with respect to the arm unit 120 based on the arm state acquired by the arm state acquisition unit 241 and the distance information acquired by the distance information acquisition unit 261. Specifically, the relative position calculation unit 262 can recognize the positions and postures of the arm unit 120 and the imaging unit 140 in the reference coordinate system from the arm state of the arm unit 120. Further, the relative position calculation unit 262 can recognize the distance between the imaging unit 140 and the reference position based on the distance information. Therefore, the relative position calculation unit 262 can calculate the relative position of the reference position with respect to the arm unit 120 from these pieces of information.
- the relative position calculation unit 262 provides information about the calculated relative position to the whole body cooperative control unit 240.
- the calculation of the relative position of the reference position with respect to the arm unit 120 means that the coordinates of the reference position in the reference coordinate system have been derived. Therefore, the reference position can be set as a constraint condition.
- using the derived reference position, the whole body cooperative control unit 240 and the ideal joint control unit 250 calculate the control values and command values for driving the arm unit 120 under the constraint condition that the reference position is located on the optical axis of the imaging unit 140.
- as a result, the drive of the arm unit 120 is controlled so that the distance and angle from the reference position can be changed freely while the imaging unit 140 always faces the reference position (that is, the pivot operation is performed).
- Alternatively, a constraint condition that a predetermined point on the optical axis of the imaging unit 140 is fixed at the reference position may further be set, in which case the driving of the arm unit 120 is controlled so that the pivot operation is performed with the distance between the imaging unit 140 and the reference position kept constant.
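- The pivot constraint can be pictured as keeping the camera on a sphere around the reference position with its optical axis passing through the center. The sketch below, with hypothetical names and a simple spherical parameterization, shows how a target camera pose satisfying the two constraint conditions (axis through the center, constant distance) could be generated.

```python
import numpy as np

def camera_pose_on_pivot(center, distance, azimuth, elevation):
    # Hypothetical sketch: place the camera on a sphere of radius `distance`
    # around the pivot center, with the optical axis pointing at the center.
    offset = distance * np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
    ])
    position = center + offset
    # The optical axis is the unit vector from the camera to the pivot center.
    optical_axis = (center - position) / np.linalg.norm(center - position)
    return position, optical_axis

# Camera 0.2 m directly above the pivot center, looking straight down:
pos, axis = camera_pose_on_pivot(np.zeros(3), 0.2, 0.0, np.pi / 2)
```

By construction the distance to the center stays constant however the azimuth and elevation change, which is exactly what the constant-distance pivot constraint requires.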
- the storage unit 220 stores various types of information processed by the control device 20.
- the storage unit 220 can store various parameters used in calculations related to whole body cooperative control and ideal joint control performed by the control unit 230.
- the storage unit 220 may store an exercise purpose and a constraint condition used in a calculation related to the whole body cooperative control by the whole body cooperative control unit 240.
- the storage unit 220 may store calculation results in calculations related to whole body cooperative control and ideal joint control by the control unit 230, numerical values calculated in the calculation process, and the like.
- the storage unit 220 may store various types of information related to the arm unit 120 used when the arm state acquisition unit 241 acquires the arm state.
- the storage unit 220 can store information such as parameters used in various processes performed by the reference position deriving unit 260 and the results of those processes. As described above, the storage unit 220 may store various parameters related to the various processes performed by the control unit 230, and the control unit 230 can perform those processes while transmitting information to and receiving information from the storage unit 220.
- the reference position for observing the object by the imaging unit 140 is derived by the reference position deriving unit 260. Then, the derived reference position is set as a constraint condition when performing whole-body cooperative control, whereby a pivot operation centered on the reference position is realized.
- the derivation of the reference position is performed by acquiring the arm state of the arm unit 120 and the distance information between the imaging unit 140 and the reference position in a state where the imaging unit 140 is directed to the reference position.
- the arm state of the arm unit 120 can be automatically acquired by the arm state acquisition unit 241 based on the state of the joint unit 130.
- the distance information can also be automatically acquired by the distance information acquisition unit 261 based on information about the focal length of the imaging unit 140, for example. Therefore, the operation performed by the user for deriving the reference position is only an operation of directing the imaging unit 140 to the reference position while referring to an image captured by the imaging unit 140 displayed on the display device 30. Therefore, for example, the derivation of the reference position and the pivot operation centered on the reference position are realized by a simpler operation than the operation described in Patent Document 1 regardless of the skill level of the operator. As a result, the workload of the operator of the robot arm device 10 at the time of surgery is reduced, so that the user's convenience can be further improved, such as shortening the operation time and reducing the operator's fatigue.
- Furthermore, such a pivot operation is performed in the robot arm device 10 whose driving is controlled by so-called force control. Accordingly, the user can operate the arm unit 120 more intuitively both when deriving the reference position and during the pivot operation, and drive control of the robot arm device 10 with higher operability and higher user convenience is realized.
- Each component of the robot arm control system 2 according to the present embodiment described above may be configured using general-purpose members or circuits, or may be configured using hardware specialized for the function of each component. Alternatively, a CPU or the like may perform all the functions of each component. Accordingly, the configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
- It is also possible to create a computer program for realizing each function of the robot arm control system 2 according to the present embodiment as described above and to install it on a personal computer or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- (Robot arm control method) Next, a robot arm control method according to an embodiment of the present disclosure will be described.
- In this method, drive control of the arm portion using the reference position is performed (for example, the imaging apparatus performs a pivot operation with the reference position as the pivot center point).
- First, the outline of the reference position deriving method will be described with reference to FIG. 2, and then the processing procedure of the robot arm control method according to the present embodiment, including the reference position deriving method, will be described in detail with reference to FIG. 3.
- FIG. 2 is an explanatory diagram for explaining the outline of the reference position deriving method according to the present embodiment.
- In FIG. 2, an imaging device (not shown) provided at the tip of the arm unit 310 of the robot arm device is directed at a reference position 360 in real space.
- the arm portion 310 is configured by connecting a plurality of links 312 by a plurality of joint portions 311.
- The arm part 310, the links 312, and the joint parts 311 correspond to the arm unit 120 and the links and joint units 130 shown in FIG. 1.
- The robot arm device can be configured by installing the arm unit 310 on a pedestal such as a base; in FIG. 2, illustration of the member corresponding to the base is omitted.
- the arm unit 310 is configured to have six degrees of freedom.
- the configuration of the arm unit 310 of the robot arm device is not limited to the illustrated example.
- The specific configuration of the arm unit 310, such as the number of joint units 311 and the manner in which the links 312 and joint units 311 are connected, may be determined as appropriate so as to realize a desired degree of freedom according to the application of the robot arm device.
- the specific configuration of the arm unit 310 may be determined so that the arm unit 310 has a degree of freedom that allows the imaging apparatus to perform a pivot operation.
- an arrow representing a reference coordinate system (XYZ coordinates) (for example, a coordinate system in the internal model) used by the robot arm device to drive the arm unit 310 is superimposed on the arm unit 310.
- the origin of the XYZ coordinates may be appropriately set at a position where the driving of the arm unit 310 can be easily described.
- the imaging device is, for example, a camera, a microscope, or the like, and the reference position 360 may be a predetermined position on an observation target (for example, a surgical site of a patient).
- In deriving the reference position, the line-of-sight direction (optical axis direction) of the imaging device is important. Therefore, the imaging device itself is not illustrated in FIG. 2, and instead the visual field region 330 of the imaging device is schematically illustrated as a plane, together with an arrow indicating the coordinates (xyz coordinates) in the visual field region 330.
- The two mutually orthogonal axes in the plane representing the visual field region 330 are the x axis and the y axis, and the z axis is taken along the line-of-sight direction of the imaging device, that is, the direction perpendicular to the plane representing the visual field region 330.
- In the illustrated example, the imaging device is connected to the tip of the arm unit 310 so that its line-of-sight direction matches the direction of the link 312 arranged at the tip of the arm unit 310 (the hand-end direction of the arm unit 310).
- the present embodiment is not limited to such an example, and the imaging apparatus may be provided in any part of the arm unit 310, and the installation position thereof is not limited.
- In order to derive the reference position, it is only necessary to know the position of the imaging device and the distance between the imaging device and the reference position; the imaging device therefore need not be at the tip of the arm part 310.
- The process of directing the imaging device to the reference position 360 is performed manually by the user, for example with reference to a display device on which an image photographed by the imaging device is displayed: the arm unit 310 is operated to adjust the position and orientation of the imaging device so that the reference position 360 comes to the approximate center of the visual field region 330.
- the operation mode of the arm unit 310 is set to a free operation mode in which the above-described free operation can be performed.
- each joint portion 311 can be freely moved in accordance with a user operation input.
- The user may move the imaging device to an appropriate position and posture either by directly applying an external force to the arm unit 310 or by operating the arm unit 310 via various input devices such as a remote controller or a controller.
- When the imaging apparatus is directed to the reference position 360, the state of each joint portion 311 in this state is detected, and the arm state is acquired based on the detected states of the joint portions 311.
- the process is performed by a configuration corresponding to the arm state acquisition unit 241 illustrated in FIG.
- the arm state includes information about the position and orientation of the arm unit 310, and the robot arm device can recognize the positions of the arm unit 310 and the imaging device in the reference coordinate system based on the arm state.
- distance information representing the distance between the imaging device and the reference position 360 is acquired in a state where the imaging device is directed to the reference position 360.
- the processing is performed by a configuration corresponding to the distance information acquisition unit 261 shown in FIG.
- the distance information may be acquired based on the focal distance when the focus of the imaging apparatus is adjusted to the reference position 360.
- the optical system in the imaging apparatus is appropriately adjusted so that the focus of the imaging apparatus is aligned with the reference position 360 by an AF (autofocus) function provided in the imaging apparatus or manually by the user.
- the focal length may be calculated based on the state of the optical system.
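- As one way to picture how a distance could follow from the state of the optical system, the thin-lens equation 1/f = 1/d_o + 1/d_i relates the lens focal length f and the lens-to-sensor distance d_i to the distance d_o of the in-focus object. This is a simplified illustration under an idealized thin-lens assumption, not the actual computation used by the device.

```python
def object_distance(focal_length_mm, image_distance_mm):
    # Idealized thin-lens model: 1/f = 1/d_o + 1/d_i, solved for the
    # object distance d_o given the focal length f and the lens-to-sensor
    # distance d_i read from the adjusted optical system.
    return 1.0 / (1.0 / focal_length_mm - 1.0 / image_distance_mm)

# A 50 mm lens focused with the sensor 62.5 mm behind the lens:
d = object_distance(50.0, 62.5)  # ~250 mm to the in-focus point
```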
- the relative position of the reference position 360 with respect to the arm unit 310 is then calculated.
- This processing is performed by a configuration corresponding to the relative position calculation unit 262 shown in FIG.
- Specifically, this process may be a process of calculating the position (coordinates) of the reference position 360 in the reference coordinate system.
- the robot arm device recognizes the position of the imaging device in the reference coordinate system based on the arm state of the arm unit 310, and recognizes the distance from the imaging device to the reference position 360 based on the distance information. From this information, the position of the reference position 360 in the reference coordinate system can be calculated.
- The coordinates of the reference position 360 in the reference coordinate system are calculated in this way, and drive control of the arm unit 310 using those coordinates, such as a pivot operation around the reference position 360, is then performed.
- FIG. 3 is a flowchart showing an example of a processing procedure of the robot arm control method according to the present embodiment.
- The flowchart shown in FIG. 3 shows a series of processes performed when the pivot operation is carried out in the robot arm apparatus according to the present embodiment; this series of processes can be executed by the robot arm control system 2 shown in FIG. 1, for example.
- each process shown in FIG. 3 will be described in association with the configuration of the robot arm control system 2 shown in FIG.
- the operation mode of the arm unit 120 is shifted to the free operation mode (step S101).
- Specifically, the whole body cooperative control unit 240 shown in FIG. 1 calculates a control value that causes the arm unit 120 to perform the free operation, and the driving of the arm unit 120 is controlled based on that control value.
- the arm unit 120 can be freely moved in accordance with a user operation input.
- the arm unit 120 is moved so that the reference position is in the center of the field of view of the imaging unit 140 (step S103).
- the reference position is, for example, a predetermined point on the patient's surgical site, and is a point that becomes the pivot center point during the pivot operation.
- the position of the arm unit 120 may be adjusted by a user's manual operation while referring to the display device 30 on which an image captured by the imaging unit 140 is displayed.
- In step S105, the operation mode of the arm unit 120 is shifted to a pivot operation mode in which the imaging unit 140 performs a pivot operation.
- In step S107, since the pivot center point must be set in order to perform the pivot operation, it is determined whether or not the pivot center point has been set.
- The case where the pivot center point has already been set corresponds to the case where the pivot operation mode was entered once, the mode then shifted to the fixed operation mode (step S119 described later), and the mode is now shifting to the pivot operation again. Accordingly, when the process reaches step S107 via step S103, it is basically determined that the pivot center point is not set, and the process proceeds to step S109.
- In step S109, the arm state in a state where the imaging unit 140 is directed to the pivot center point is acquired.
- In step S111, distance information between the imaging unit 140 and the pivot center point is acquired.
- In step S113, the relative position of the pivot center point with respect to the arm unit 120 is calculated.
- In FIG. 3, the process of acquiring the arm state is formally illustrated as step S109 in order to emphasize that the arm state is acquired for deriving the reference position. In practice, however, the arm state is constantly acquired by the arm state acquisition unit 241; it is not acquired only at the timing shown in step S109.
- the reference position is set as the pivot center point (step S115).
- the drive of the arm unit 120 is controlled so that the imaging unit 140 performs a pivot operation around the pivot center point.
- the user can freely adjust the viewpoint (distance and angle) with the line of sight (optical axis) of the imaging unit 140 directed toward the pivot center point (step S117). Thereby, the user can perform various treatments while observing the same observation target (operation part) from a plurality of viewpoints.
- the operation mode of the arm unit 120 can be shifted to the fixed operation mode (step S119).
- the driving of the arm unit 120 is controlled so as to maintain the state at that time, and the position and posture of the arm unit 120 are fixed (step S121). Since the position and orientation of the imaging unit 140 are also fixed at a predetermined position in the fixed operation mode, a captured image from a specific distance and angle is displayed on the display device 30.
- After shifting to the fixed operation mode, it is determined whether or not to change the viewpoint again (step S123). When the treatment is continued with the viewpoint fixed as it is, the fixed operation mode is maintained. On the other hand, when it is desired to change the viewpoint again, the process returns to step S105 and shifts to the pivot operation mode.
- When the fixed operation mode is shifted to the pivot operation mode, the imaging unit 140 remains directed to the pivot center point used when the pivot operation was performed earlier. Therefore, in this case it is determined in step S107 that the pivot center point has been set, the process of deriving the reference position shown in steps S109 to S115 is omitted, and the pivot operation is performed based on the pivot center point that has already been set.
- When the pivot center point itself is to be changed, the process from step S101 is performed again.
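- The mode transitions of steps S101 to S123 can be summarized as a small state machine: free operation, pivot operation (deriving the pivot center point only when it has not yet been set), and fixed operation. The sketch below is a schematic reading of the flowchart; the class and mode names are hypothetical.

```python
class ArmModeMachine:
    # Hypothetical sketch of the operation modes in the flowchart of FIG. 3.
    def __init__(self):
        self.mode = "free"            # S101: start in the free operation mode
        self.pivot_center = None      # pivot center point, initially unset

    def to_pivot(self, derive_center):
        # S105/S107: on entering the pivot mode, derive the pivot center
        # point (S109-S115) only if it has not been set yet.
        if self.pivot_center is None:
            self.pivot_center = derive_center()
        self.mode = "pivot"

    def to_fixed(self):
        # S119/S121: fix the current position and posture of the arm.
        self.mode = "fixed"

m = ArmModeMachine()
m.to_pivot(lambda: (0.0, 0.0, 0.3))   # first entry: the center is derived
m.to_fixed()
m.to_pivot(lambda: (9.9, 9.9, 9.9))   # re-entry: derivation is skipped
print(m.pivot_center)                 # -> (0.0, 0.0, 0.3)
```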
- an operation for directing the imaging unit 140 to the reference position is performed. Specifically, in this operation, the positions and postures of the arm unit 120 and the imaging unit 140 are adjusted so that the reference position is approximately at the center of the visual field region of the imaging unit 140.
- In the above description, the position of the imaging unit 140 is adjusted by the user directly applying an external force to the arm unit 120 or by moving the arm unit 120 via various input devices such as a remote controller or a controller.
- However, the present embodiment is not limited to such an example, and the positions and postures of the arm unit 120 and the imaging unit 140 may be adjusted by other methods so that the reference position comes approximately to the center of the field of view of the imaging unit 140.
- the visual field region may move based on an operation input on a captured image that is captured by the imaging unit 140 and displayed on the display device 30.
- the operation input may be, for example, processing for selecting a predetermined point on the screen.
- The selection process may be performed by operating a cursor or pointer on the screen using an input device such as a mouse, or, when the screen of the display device 30 is configured as a touch panel, by directly selecting a predetermined point with an operating body such as a finger or a stylus pen.
- The control device 20 acquires the position information of the point selected on the screen of the display device 30, and based on that position information, the arm unit 120 may be driven so that the selected point becomes the center of the field of view of the imaging unit 140.
- the visual field region may be moved by performing image analysis on a captured image captured by the imaging unit 140.
- the surgeon attaches a marker to a site to be observed (that is, a site corresponding to the reference position) in the surgical site.
- The operation of attaching the marker may be performed by attaching a predetermined mark directly to the surgical site using a surgical tool such as an electric knife, or by dyeing the surgical site with a staining solution in a predetermined color different from that of the surrounding tissue.
- The control device 20 may perform image analysis on a captured image captured by the imaging unit 140 to extract the part to which the marker is attached, and may control the driving of the arm unit 120 so that the extracted part becomes the center of the visual field region of the imaging unit 140.
- According to these methods, the surgeon can adjust the visual field region by performing an operation on the screen or on the surgical site while referring to the screen of the display device 30, which is more efficient and easier than directly operating the arm unit 120.
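- For the image-analysis variant, once the marker (or the selected point) has been located in the captured image, its pixel offset from the image center gives the error that the drive control of the arm unit 120 would reduce toward zero. A minimal sketch of that error computation, with hypothetical names:

```python
import numpy as np

def centering_error(target_px, image_size_px):
    # Pixel offset of the extracted target (marker or selected point) from
    # the image center; driving this offset toward zero brings the target
    # to the center of the field of view.
    center = np.asarray(image_size_px, dtype=float) / 2.0
    return np.asarray(target_px, dtype=float) - center

# A marker found at pixel (700, 300) in a 1280x720 image:
err = centering_error((700, 300), (1280, 720))  # offset of (60, -60) pixels
```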
- Alternatively, various distance sensors using laser light, ultrasonic waves, infrared rays, or the like may be provided at the tip of the arm unit 120, and the distance information may be acquired by measuring the distance between the imaging unit 140 and the reference position with such a distance sensor.
- When the imaging unit 140 is configured by a plurality of cameras, such as a stereo camera or a compound-eye camera, the distance information between the imaging unit 140 and the reference position may be acquired using parallax information obtained from the images captured by the plurality of cameras.
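- For the stereo-camera variant, a standard pinhole-stereo relation recovers depth from the disparity between the two captured images: depth = focal length x baseline / disparity. The sketch below illustrates that relation; the numbers and names are hypothetical, not device parameters from the disclosure.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    # Pinhole-stereo model: depth = f * B / d, with the focal length in
    # pixels, the baseline between the two camera units in meters, and the
    # disparity of the reference position in pixels.
    return focal_length_px * baseline_m / disparity_px

# 800 px focal length, 5 mm baseline, 20 px disparity:
z = depth_from_disparity(800.0, 0.005, 20.0)  # 0.2 m to the reference position
```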
- When an imaging device that captures images of the robot arm device 10 from the outside is separately provided, the distance information between the imaging unit 140 and the reference position may be acquired by measuring the distance between the external imaging device and the imaging unit 140 provided in the arm unit 120 and the distance between the external imaging device and the surgical site (reference position).
- The distance between the external imaging device and the imaging unit 140 or the surgical site may be measured using, for example, the focal length of the external imaging device, or, when the external imaging device is a stereo camera or a compound-eye camera, parallax information.
- distance information between the imaging unit 140 and the reference position may be acquired by the operator directly inputting distance information as a numerical value.
- the driving of the arm unit 120 can be controlled by the control device 20 so that the distance between the imaging unit 140 and the reference position becomes the input value.
- an appropriate method may be selected from these modified examples in consideration of the use of the robot arm device 10 and the environment of the site where the treatment is performed, measurement accuracy, cost, and the like. Further, a plurality of these methods may be used in combination. By measuring the distance by a plurality of methods, the measurement accuracy can be improved, and a pivot operation in which the pivot center point is positioned with higher accuracy can be realized.
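- One common way to combine distance estimates from several of the above methods, when each comes with an uncertainty, is inverse-variance weighting, which gives more weight to the more accurate methods. This is a generic illustration of the idea of improving measurement accuracy by combining methods; the variances below are made up for the example.

```python
def fuse_distances(measurements):
    # Inverse-variance weighted average of (distance, variance) pairs from
    # several measurement methods (focal length, distance sensor, parallax, ...).
    weights = [1.0 / var for _, var in measurements]
    total = sum(w * d for (d, _), w in zip(measurements, weights))
    return total / sum(weights)

# Three hypothetical estimates of the same distance (in meters):
fused = fuse_distances([(0.30, 1e-4), (0.32, 4e-4), (0.29, 1e-4)])
```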
- FIG. 4 is an explanatory diagram for describing an application example in which the robot arm device according to the embodiment of the present disclosure is used for medical purposes.
- FIG. 4 schematically shows a state of treatment using the robot arm device according to the present embodiment.
- FIG. 4 illustrates a state in which a doctor who is a practitioner (user) 520 performs an operation on a treatment target (patient) 540 on a treatment table 530, using surgical instruments 521 such as a scalpel, scissors, or forceps.
- the treatment is a general term for various medical treatments performed on a patient who is a treatment target 540 by a doctor who is a user 520, such as surgery and examination.
- In FIG. 4, surgery is illustrated as an example of the treatment, but the treatment using the robot arm device 510 is not limited to surgery and may be any of various other treatments, such as an examination using an endoscope.
- a robot arm device 510 is provided beside the treatment table 530.
- the robot arm device 510 includes a base portion 511 that is a base and an arm portion 512 that extends from the base portion 511.
- the arm portion 512 includes a plurality of joint portions 513a, 513b, and 513c, a plurality of links 514a and 514b connected by the joint portions 513a and 513b, and an imaging unit 515 provided at the tip of the arm portion 512.
- the arm unit 512 includes three joint units 513a to 513c and two links 514a and 514b.
- In practice, the number and shape of the joint portions 513a to 513c and the links 514a and 514b, the directions of the drive shafts of the joint portions 513a to 513c, and the like may be set as appropriate so as to realize a desired degree of freedom, in consideration of the required freedom in the positions and postures of the arm unit 512 and the imaging unit 515.
- the joint portions 513a to 513c have a function of connecting the links 514a and 514b to each other so as to be rotatable, and the drive of the arm portion 512 is controlled by driving the rotation of the joint portions 513a to 513c.
- Here, the position of each component of the robot arm device 510 means the position (coordinates) in the space defined for drive control, and the posture of each component means the direction (angle) with respect to an arbitrary axis in that space.
- In the following description, the driving (or drive control) of the arm portion 512 means that the positions and postures of the components of the arm portion 512 are changed (or that such changes are controlled) by driving (or controlling the driving of) the joint portions 513a to 513c.
- an imaging unit 515 is provided at the tip of the arm unit 512 as an example of the tip unit.
- the imaging unit 515 is a unit that acquires an image to be captured (captured image), and is, for example, a camera that can capture a moving image or a still image.
- In the robot arm device 510, the positions and postures of the arm unit 512 and the imaging unit 515 are controlled so that the imaging unit 515 provided at the distal end of the arm unit 512 images the state of the treatment site of the treatment target 540.
- the tip unit provided at the tip of the arm portion 512 is not limited to the imaging unit 515, and may be various medical instruments.
- the medical instrument include a unit having an imaging function, such as an endoscope, a microscope, and the above-described imaging unit 515, and various units used in the operation, such as various surgical instruments and inspection apparatuses.
- the robot arm apparatus 510 according to the present embodiment is a medical robot arm apparatus provided with a medical instrument.
- a stereo camera having two imaging units (camera units) may be provided at the tip of the arm unit 512, and shooting may be performed so that the imaging target is displayed as a three-dimensional image (3D image).
- the robot arm device 510 provided with an imaging unit 515 for photographing a treatment site and a camera unit such as the stereo camera as the distal unit is also referred to as a robot arm device for a video microscope.
- a display device 550 such as a monitor or a display is installed at a position facing the user 520.
- a captured image of the treatment site imaged by the imaging unit 515 is displayed on the display screen of the display device 550.
- the user 520 performs various treatments while viewing the captured image of the treatment site displayed on the display screen of the display device 550.
- In the medical field, it has thus been proposed to perform an operation while imaging the treatment site with the robot arm device 510 as described above.
- In various treatments including surgery, it is required to reduce the fatigue and burden on the user 520 and the patient 540 by performing the treatment more efficiently.
- the robot arm device 510 is considered to require the following performance, for example.
- the robot arm device 510 is required to secure a working space in the operation.
- If the arm unit 512 or the imaging unit 515 obstructs the view of the practitioner or the movement of the hands performing the treatment, the efficiency of the operation declines.
- In addition, a plurality of other doctors, nurses, and the like who perform various support operations, such as handing instruments to the user 520 and checking various vital signs of the patient 540, are generally around the user 520 and the patient 540, and other devices for these support operations are also present, so the surgical environment is complicated. Therefore, it is desirable that the robot arm device 510 be smaller.
- the robot arm device 510 is required to have high operability when moving the imaging unit 515.
- the user 520 is required to observe the same surgical site from various positions and angles while performing the treatment on the surgical site.
- To do so, it is necessary to change the angle of the imaging unit 515 with respect to the treatment site. At this time, it is more desirable that only the angle at which the image is taken changes while the imaging direction of the imaging unit 515 remains fixed on the treatment site (that is, the same site continues to be imaged).
- For example, the robot arm device 510 has been required to have operability with a higher degree of freedom, such as a pivot operation in which the imaging unit 515 moves on the surface of a cone having the treatment site as its apex, turning about the axis of the cone. The pivot operation is also called a point lock operation.
- It is also desired that the imaging unit 515 can be moved easily, for example with one hand, when performing such movement of the imaging unit 515 or the above-described pivot operation.
- Furthermore, there may be a request to move the photographing center of the image photographed by the imaging unit 515 from the site where treatment is being performed to another site (for example, the site where the next treatment will be performed). In that case, various methods of driving the arm portion 512 are required: not only the method of controlling the driving of the arm unit 512 by manual operation as described above, but also, for example, a method of controlling it by an operation input from an input unit such as a pedal.
- the robot arm device 510 is required to have high operability that meets the intuition and demands of the user 520, for example, to realize the above-described pivoting operation and easy manual movement.
- the robot arm device 510 is required to have stability in drive control of the arm unit 512.
- the stability of the arm unit 512 with respect to the drive control may be the stability of the position and posture of the tip unit when the arm unit 512 is driven.
- the stability of the arm unit 512 with respect to the drive control includes smooth movement of the tip unit and suppression of vibration (vibration suppression) when the arm unit 512 is driven.
- For example, when the robot arm device 510 is used for surgery, a usage method can be assumed in which a stereo camera having two imaging units (camera units) is provided as the tip unit, and a three-dimensional image (3D image) based on the images captured by the stereo camera is displayed on the display device 550.
- When a 3D image is displayed in this way, if the position and posture of the stereo camera are unstable, there is a possibility of inducing so-called 3D sickness in the user.
- Further, the observation range imaged by the imaging unit 515 may be enlarged to about φ15 mm.
- the present inventors examined a general existing balanced arm and a robot arm device by position control from the viewpoint of the above three performances.
- However, a general balanced arm usually has a counterbalance weight (also called a counterweight or balancer) for balancing the force when the arm is moved, provided inside the base portion or the like; it is therefore difficult to reduce the size of a balanced arm device, and it is difficult to say that the above performance requirements are satisfied.
- the present inventors have obtained knowledge that there is a demand for the above-described three performances regarding the robot arm device. However, it is considered that it is difficult to satisfy these performances with a general existing balanced arm or a robot arm device based on position control. As a result of studying a configuration that satisfies the above three performances, the present inventors have come up with a robot arm device, a robot arm control system, a robot arm control method, and a program according to the following embodiments. Hereinafter, these embodiments will be described in detail.
- In the following description, as an example, the distal end unit of the arm portion of the robot arm device is an imaging unit, and a surgical site is imaged by the imaging unit at the time of surgery as illustrated in FIG. 4.
- the present embodiment is not limited to such an example.
- the robot arm control system according to the present embodiment is applicable even when a robot arm device having another tip unit is used for other purposes.
- FIG. 5 is a schematic diagram illustrating an appearance of a robot arm device according to an embodiment of the present disclosure.
- the robot arm device 400 includes a base portion 410 and an arm portion 420.
- the base unit 410 is a base of the robot arm device 400, and the arm unit 420 is extended from the base unit 410.
- a control unit that integrally controls the robot arm device 400 may be provided in the base unit 410, and the driving of the arm unit 420 may be controlled by the control unit.
- The control unit is constituted by various signal processing circuits, such as a CPU (Central Processing Unit) and a DSP (Digital Signal Processor).
- the arm part 420 includes a plurality of joint parts 421a to 421f, a plurality of links 422a to 422c connected to each other by the joint parts 421a to 421f, and an imaging unit 423 provided at the tip of the arm part 420.
- the links 422a to 422c are rod-shaped members, one end of the link 422a is connected to the base part 410 via the joint part 421a, the other end of the link 422a is connected to one end of the link 422b via the joint part 421b, The other end of the link 422b is connected to one end of the link 422c via the joint portions 421c and 421d. Furthermore, the imaging unit 423 is connected to the tip of the arm part 420, that is, the other end of the link 422c via joint parts 421e and 421f.
- the ends of the plurality of links 422a to 422c are connected to each other by the joint portions 421a to 421f with the base portion 410 as a fulcrum, thereby forming an arm shape extending from the base portion 410.
- the imaging unit 423 is a unit that acquires an image to be captured, and is, for example, a camera that captures a moving image or a still image. By controlling the driving of the arm unit 420, the position and orientation of the imaging unit 423 are controlled. In the present embodiment, the imaging unit 423 images a partial region of the patient's body that is a treatment site, for example.
- the tip unit provided at the tip of the arm unit 420 is not limited to the imaging unit 423, and various medical instruments may be connected to the tip of the arm unit 420 as the tip unit.
- the robot arm device 400 according to the present embodiment is a medical robot arm device provided with a medical instrument.
- the robot arm device 400 will be described with the coordinate axes defined as shown in FIG.
- the vertical direction, the front-rear direction, and the left-right direction are defined according to the coordinate axes. That is, the vertical direction with respect to the base portion 410 installed on the floor is defined as the z-axis direction and the vertical direction.
- the direction perpendicular to the z-axis and extending from the base portion 410 to the arm portion 420 (that is, the direction in which the imaging unit 423 is located with respect to the base portion 410) is defined as the y-axis direction and It is defined as the front-rear direction.
- the directions orthogonal to the y-axis and z-axis are defined as the x-axis direction and the left-right direction.
- the joint portions 421a to 421f connect the links 422a to 422c so as to be rotatable.
- the joint portions 421a to 421f have actuators, and have a rotation mechanism that is driven to rotate about a predetermined rotation axis by driving the actuators.
- By controlling the rotational drive in each of the joint portions 421a to 421f, the drive of the arm portion 420, such as extending or contracting (folding) the arm portion 420, can be controlled.
- In the present embodiment, the driving of the joint portions 421a to 421f is controlled by the whole body cooperative control described later in (5-2-2. Generalized inverse dynamics) and the ideal joint control described later in (5-2-3. About ideal joint control).
- Since the joint portions 421a to 421f according to the present embodiment have actuators, the drive control of the joint portions 421a to 421f specifically means that the rotation angles and/or the generated torques (the torques generated by the joint portions 421a to 421f) are controlled.
- the robot arm device 400 has six joint portions 421a to 421f, and six degrees of freedom are realized with respect to driving of the arm portion 420.
- The joint portions 421a, 421d, and 421f have, as the rotation axis direction, the major axis direction of each of the connected links 422a to 422c and the shooting direction of the connected imaging unit 423.
- The joint portions 421b, 421c, and 421e have, as the rotation axis direction, the x-axis direction, which is the direction in which the connection angles between the mutually connected links 422a to 422c and the imaging unit 423 are changed within the y-z plane (the plane defined by the y axis and the z axis).
- the joint portions 421a, 421d, and 421f have a function of performing so-called yawing
- the joint portions 421b, 421c, and 421e have a function of performing so-called pitching.
- As described above, in the present embodiment, the robot arm device 400 realizes six degrees of freedom for driving the arm portion 420; therefore, the imaging unit 423 can be moved freely within the movable range of the arm portion 420.
- a hemisphere is illustrated as an example of the movable range of the imaging unit 423. Assuming that the center point of the hemisphere is the imaging center of the treatment site imaged by the imaging unit 423, the imaging unit 423 is moved on the spherical surface of the hemisphere while the imaging center of the imaging unit 423 is fixed to the center point of the hemisphere. By doing so, the treatment site can be imaged from various angles.
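The pivoting motion described above, moving the camera on a hemisphere while its imaging center stays fixed at the hemisphere's center point, can be sketched as follows. All numeric values (radius, angles) are hypothetical illustrations, not taken from the patent.

```python
import numpy as np

def camera_pose_on_hemisphere(center, radius, azimuth, elevation):
    """Place a camera on a hemisphere of the given radius around `center`
    (the fixed imaging point) and aim its optical axis at that point.

    azimuth/elevation are in radians; an elevation in (0, pi/2] keeps the
    camera on the upper hemisphere. Returns (position, unit view vector).
    """
    # Position on the sphere in the x/y/z frame defined above.
    offset = radius * np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
    ])
    position = np.asarray(center, dtype=float) + offset
    # Optical axis always points back at the fixed imaging center.
    view_dir = np.asarray(center, dtype=float) - position
    view_dir /= np.linalg.norm(view_dir)
    return position, view_dir

# Example: orbit the camera while the imaging center stays fixed.
center = np.array([0.0, 0.0, 0.0])
pos, view = camera_pose_on_hemisphere(center, radius=0.3, azimuth=0.5, elevation=1.0)
```

Sweeping `azimuth` and `elevation` while keeping `center` fixed reproduces the "image the treatment site from various angles" behavior described above.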
- the configuration of the joint portions 421a to 421f shown in FIG. 5 will be described in more detail with reference to FIG.
- Here, among the configurations of the joint portions 421a to 421f, the configuration of the actuator, which is the configuration mainly related to their rotational drive, will be described.
- FIG. 6 is a cross-sectional view schematically illustrating a state in which the actuators of the joint portions 421a to 421f according to an embodiment of the present disclosure are cut along a cross section passing through the rotation axis.
- the actuator is illustrated among the configurations of the joint portions 421a to 421f, but the joint portions 421a to 421f may have other configurations.
- In addition to the configuration shown in FIG. 6, the joint portions 421a to 421f have various configurations necessary for driving the arm portion 420, such as a control unit for controlling the driving of the actuator and a support member for connecting and supporting the links 422a to 422c and the imaging unit 423.
- the driving of the joint portion of the arm portion may mean the driving of the actuator in the joint portion.
- the driving of the joint portions 421a to 421f is controlled by ideal joint control described later (5-2-3. About ideal joint control). Therefore, the actuators of the joint portions 421a to 421f shown in FIG. 6 are configured to be able to drive corresponding to ideal joint control. Specifically, the actuators of the joint portions 421a to 421f are configured to be able to adjust the rotation angle and the torque associated with the rotation drive in the joint portions 421a to 421f. In addition, the actuators of the joint portions 421a to 421f are configured so as to be able to arbitrarily adjust the viscous resistance coefficient with respect to the rotational motion.
- By adjusting the viscous resistance coefficient, it is possible to realize, for example, a state in which the actuator rotates easily in response to an external force (that is, the arm portion 420 is easy to move manually) and a state in which it rotates with difficulty (that is, the arm portion 420 is difficult to move manually).
- the actuator 430 of the joint portions 421a to 421f includes a motor 424, a motor driver 425, a speed reducer 426, an encoder 427, a torque sensor 428, and a drive shaft 429.
- the encoder 427, the motor 424, the speed reducer 426, and the torque sensor 428 are connected to the drive shaft 429 in series in this order.
- the motor 424 is a prime mover in the actuator 430, and rotates the drive shaft 429 around the axis.
- the motor 424 is an electric motor such as a brushless DC motor.
- the rotation of the motor 424 is controlled by supplying a current.
- The motor driver 425 is a driver circuit (driver IC (Integrated Circuit)) that rotates the motor 424 by supplying current to it; the rotational speed of the motor 424 can be controlled by adjusting the amount of current supplied to the motor 424.
- the motor driver 425 can adjust the viscous resistance coefficient with respect to the rotational motion of the actuator 430 as described above by adjusting the amount of current supplied to the motor 424.
- The speed reducer 426 is connected to the drive shaft 429, and generates a rotational driving force (i.e., torque) having a predetermined value by reducing, at a predetermined reduction ratio, the rotational speed of the drive shaft 429 generated by the motor 424.
- As the speed reducer 426, a backlash-less, high-performance speed reducer is used.
- the speed reducer 426 may be a harmonic drive (registered trademark).
- Torque generated by the speed reducer 426 is transmitted to the subsequent stage (not shown; for example, a connected member such as the links 422a to 422c or the imaging unit 423) via the torque sensor 428 connected to the output shaft of the speed reducer 426.
- The encoder 427 is connected to the drive shaft 429 and detects the rotational speed of the drive shaft 429. Based on the relationship between the rotational speed of the drive shaft 429 detected by the encoder 427 and the reduction ratio of the speed reducer 426, information such as the rotational angles, rotational angular velocities, and rotational angular accelerations of the joint portions 421a to 421f can be obtained.
- the torque sensor 428 is connected to the output shaft of the speed reducer 426, and detects the torque generated by the speed reducer 426, that is, the torque output by the actuator 430.
- the torque output by the actuator 430 is simply referred to as generated torque.
- the number of rotations of the motor 424 can be adjusted by adjusting the amount of current supplied to the motor 424.
- The reduction ratio in the speed reducer 426 may be set as appropriate according to the application of the robot arm device 400. Therefore, the generated torque can be controlled by appropriately adjusting the rotational speed of the motor 424 in accordance with the reduction ratio of the speed reducer 426.
- As described above, information such as the rotation angle, rotational angular velocity, and rotational angular acceleration of the joint portions 421a to 421f can be obtained based on the rotational speed of the drive shaft 429 detected by the encoder 427, and the generated torque in the joint portions 421a to 421f can be detected by the torque sensor 428.
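The motor-side-to-joint-side relationships described above can be sketched as follows. The encoder resolution and reduction ratio below are assumed values for illustration only, not figures from the patent.

```python
import math

# Hypothetical sketch: recover joint-side quantities from motor-side readings.
ENCODER_COUNTS_PER_REV = 4096      # assumed encoder resolution
REDUCTION_RATIO = 100.0            # assumed reduction ratio of the speed reducer

def joint_angle_rad(encoder_counts):
    """Joint rotation angle: motor shaft angle divided by the reduction ratio."""
    motor_angle = 2.0 * math.pi * encoder_counts / ENCODER_COUNTS_PER_REV
    return motor_angle / REDUCTION_RATIO

def joint_torque_nm(motor_torque_nm, efficiency=1.0):
    """Output torque: motor torque multiplied by the reduction ratio
    (times a gear efficiency, assumed to be 1.0 here)."""
    return motor_torque_nm * REDUCTION_RATIO * efficiency
```

For example, one full joint revolution corresponds to `REDUCTION_RATIO` motor revolutions, which is why a high-resolution joint angle can be recovered from a motor-side encoder.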
- the torque sensor 428 can detect not only torque generated by the actuator 430 but also external torque applied from the outside. Therefore, by adjusting the amount of current that the motor driver 425 supplies to the motor 424 based on the external torque detected by the torque sensor 428, the viscous resistance coefficient with respect to the rotational motion as described above can be adjusted. It is possible to realize a state that is easy to rotate or a state that is difficult to rotate with respect to a force applied from the outside.
- FIG. 7A is a schematic diagram schematically showing a state where the torque sensor 428 shown in FIG. 6 is viewed from the axial direction of the drive shaft 429.
- the torque sensor 428 includes an outer ring portion 431, an inner ring portion 432, beam portions 433a to 433d, and strain detection elements 434a to 434d.
- the outer ring portion 431 and the inner ring portion 432 are arranged concentrically.
- the inner ring portion 432 is connected to the input side, that is, the output shaft from the speed reducer 426, and the outer ring portion 431 is connected to the output side, that is, a rear-stage output member (not shown).
- the four beam portions 433a to 433d are arranged between the outer ring portion 431 and the inner ring portion 432 arranged concentrically, and connect the outer ring portion 431 and the inner ring portion 432 to each other. As shown in FIG. 7A, the beam portions 433a to 433d are interposed between the outer ring portion 431 and the inner ring portion 432 so that the adjacent beam portions 433a to 433d form an angle of 90 degrees with each other.
- strain detection elements 434a to 434d are provided in two of the beam portions 433a to 433d facing each other, that is, provided at an angle of 180 degrees with each other. Based on the deformation amounts of the beam portions 433a to 433d detected by the strain detection elements 434a to 434d, the generated torque and the external torque of the actuator 430 can be detected.
- strain detection elements 434a and 434b are provided in the beam portion 433a, and strain detection elements 434c and 434d are provided in the beam portion 433c.
- the strain detection elements 434a and 434b are provided so as to sandwich the beam portion 433a, and the strain detection elements 434c and 434d are provided so as to sandwich the beam portion 433c.
- The strain detection elements 434a to 434d are, for example, strain gauges that are attached to the surfaces of the beam portions 433a and 433c and detect the geometric deformation amounts of the beam portions 433a and 433c based on changes in electrical resistance. As shown in FIG. 7A, the strain detection elements 434a to 434d are provided at four locations so that the elements 434a to 434d constitute a so-called Wheatstone bridge. Since the strain can therefore be detected by the so-called four-gauge method, it is possible to reduce the influence of interference from axes other than the axis on which strain is detected, of the eccentricity of the drive shaft 429, of temperature drift, and the like.
- the beam portions 433a to 433d serve as strain generating bodies for detecting strain.
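The cancellation property of the four-gauge (Wheatstone bridge) arrangement described above can be illustrated numerically. The gauge resistance, excitation voltage, and resistance changes below are assumed values for illustration, not parameters of the torque sensor 428.

```python
def wheatstone_bridge_output(r1, r2, r3, r4, v_excitation=5.0):
    """Output voltage of a full (four-gauge) Wheatstone bridge.

    r1..r4 are the gauge resistances. With the gauges placed so that the
    measured strain changes r1/r3 and r2/r4 in opposite directions,
    common-mode effects such as temperature drift largely cancel.
    """
    return v_excitation * (r1 / (r1 + r2) - r4 / (r3 + r4))

# Balanced bridge: no strain, no output.
v0 = wheatstone_bridge_output(350.0, 350.0, 350.0, 350.0)

# Equal and opposite resistance changes from torsional strain give a signal...
dr = 0.35
v_strain = wheatstone_bridge_output(350.0 + dr, 350.0 - dr, 350.0 + dr, 350.0 - dr)

# ...while a uniform (thermal) change in all four gauges produces none.
v_thermal = wheatstone_bridge_output(350.5, 350.5, 350.5, 350.5)
```

This is why the four-gauge method reduces the influence of temperature drift: the drift appears as a common-mode resistance change and drops out of the bridge output.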
- the types of the strain detection elements 434a to 434d according to the present embodiment are not limited to strain gauges, and other elements may be used.
- the strain detection elements 434a to 434d may be elements that detect deformation amounts of the beam portions 433a to 433d based on changes in magnetic characteristics.
- the following configuration may be applied in order to improve the detection accuracy of the torque generated by the torque sensor 428 and the external torque.
- The support moment is released by making the portions of the beam portions 433a to 433d connected to the outer ring portion 431 thinner than the other portions, so that the linearity of the detected deformation amount is improved and the influence of radial loads is reduced.
- By supporting both the outer ring portion 431 and the inner ring portion 432 on the housing via bearings, it is possible to eliminate the effects of other axial forces and moments from both the input shaft and the output shaft.
- In addition, a both-end supported bearing may be disposed at the other end of the actuator 430 shown in FIG. 6.
- the configuration of the torque sensor 428 has been described above with reference to FIG. 7A. As described above, the configuration of the torque sensor 428 shown in FIG. 7A enables highly accurate detection of the torque generated by the actuator 430 and the external torque.
- the configuration of the torque sensor 428 is not limited to the configuration shown in FIG. 7A, and may be another configuration.
- Another configuration example of the torque sensor applied to the actuator 430, different from the torque sensor 428, will be described with reference to FIG. 7B.
- FIG. 7B is a schematic diagram illustrating another configuration example of the torque sensor applied to the actuator 430 illustrated in FIG. 6.
- a torque sensor 428a according to this modification includes an outer ring portion 441, an inner ring portion 442, beam portions 443a to 443d, and strain detection elements 444a to 444d.
- 7B schematically shows a state where the torque sensor 428a is viewed from the axial direction of the drive shaft 429, as in FIG. 7A.
- The functions and configurations of the outer ring portion 441, the inner ring portion 442, the beam portions 443a to 443d, and the strain detection elements 444a to 444d are substantially the same as those of the outer ring portion 431, the inner ring portion 432, the beam portions 433a to 433d, and the strain detection elements 434a to 434d of the torque sensor 428 described with reference to FIG. 7A.
- The torque sensor 428a according to this modification differs in the configuration of the connection portions between the beam portions 443a to 443d and the outer ring portion 441. Therefore, for the torque sensor 428a shown in FIG. 7B, the configuration of these connection portions, which differs from that of the torque sensor 428 shown in FIG. 7A, will be described, and description of the overlapping configurations is omitted.
- In FIG. 7B, the connection portion between the beam portion 443b and the outer ring portion 441 is shown enlarged, together with an overall view of the torque sensor 428a.
- Although only the connection portion between the beam portion 443b and the outer ring portion 441, one of the four connection portions between the beam portions 443a to 443d and the outer ring portion 441, is illustrated in an enlarged manner, the other three connection portions, between the beam portions 443a, 443c, and 443d and the outer ring portion 441, all have the same configuration.
- the outer ring portion 441 is provided with an engagement recess, and the tip of the beam portion 443b is engaged with the engagement recess.
- gaps G1 and G2 are provided between the beam portion 443b and the outer ring portion 441.
- The gap G1 represents the gap between the beam portion 443b and the outer ring portion 441 in the direction in which the beam portion 443b extends toward the outer ring portion 441, and the gap G2 represents the gap between the two in the direction orthogonal to that direction.
- the beam portions 443a to 443d and the outer ring portion 441 are disposed separately with predetermined gaps G1 and G2. That is, in the torque sensor 428a, the outer ring portion 441 and the inner ring portion 442 are separated. Accordingly, since the inner ring portion 442 is not restricted with respect to the outer ring portion 441 and has a degree of freedom of movement, for example, even if vibration occurs when the actuator 430 is driven, a distortion component due to vibration is generated between the inner ring portion 442 and the outer ring portion 441. It can be absorbed by the gaps G1 and G2. Therefore, by applying the torque sensor 428a as the torque sensor of the actuator 430, it is possible to detect the generated torque and the external torque with higher accuracy.
- Generalized inverse dynamics is a basic calculation in the whole body cooperative control of a multi-link structure in which a plurality of links are connected by a plurality of joint portions (for example, the arm portion 420 shown in FIG. 5 in the present embodiment); it converts motion purposes regarding various dimensions in various operation spaces into torques to be generated in the plurality of joint portions, in consideration of various constraint conditions.
- the operation space is an important concept in the force control of the robot device.
- the operation space is a space for describing the relationship between the force acting on the multi-link structure and the acceleration of the multi-link structure.
- the operation space is, for example, a joint space, a Cartesian space, a momentum space or the like to which a multi-link structure belongs.
- the motion purpose represents a target value in the drive control of the multi-link structure, and is, for example, a target value such as position, speed, acceleration, force, impedance, etc. of the multi-link structure to be achieved by the drive control.
- Constraint conditions are constraints regarding the position, speed, acceleration, force, etc. of the multi-link structure, which are determined by the shape and structure of the multi-link structure, the environment around the multi-link structure, settings by the user, and the like.
- The constraint conditions include, for example, information on generated forces, priorities, the presence or absence of non-driven joints, vertical reaction forces, friction cones, support polygons, and the like.
- The computation algorithm is composed of a first-stage virtual force determination process (virtual force calculation process) and a second-stage real force conversion process (real force calculation process).
- In the virtual force calculation process, which is the first stage, the virtual force, which is a virtual force acting on the operation space and necessary to achieve each motion purpose, is determined in consideration of the priorities of the motion purposes and the maximum values of the virtual forces.
- In the actual force calculation process, which is the second stage, the virtual force obtained above is converted into actual forces, such as joint forces and external forces, while taking into account constraints regarding non-driven joints, vertical reaction forces, friction cones, support polygons, and the like.
- a vector constituted by a certain physical quantity in each joint portion of the multi-link structure is referred to as a generalized variable q (also referred to as a joint value q or a joint space q).
- The operation space x is defined by the following formula (1) using the time differential value of the generalized variable q and the Jacobian J: ẋ = Jq̇ … (1)
- q is a rotation angle in the joint portions 421a to 421f of the arm portion 420.
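Formula (1) relates the operation-space velocity to the joint velocities through the Jacobian (ẋ = Jq̇ in the standard form). This can be illustrated with a planar two-link arm; the link lengths and joint values below are assumed illustrative values, not parameters of the arm portion 420.

```python
import numpy as np

def planar_2link_jacobian(q, l1=0.3, l2=0.25):
    """Jacobian J mapping joint velocities to end-effector velocity,
    x_dot = J @ q_dot, for a planar two-link arm (assumed link lengths)."""
    q1, q2 = q
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

q = np.array([0.4, 0.7])          # joint angles (generalized variable q)
q_dot = np.array([0.1, -0.2])     # joint velocities
x_dot = planar_2link_jacobian(q) @ q_dot   # operation-space velocity, formula (1)
```

The same mapping generalizes to the six joint portions 421a to 421f, where J becomes a function of all six rotation angles.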
- The equation of motion related to the operation space x is described by the following equation (2): ẍ = Λ⁻¹f + c … (2)
- f represents a force acting on the operation space x.
- Λ⁻¹ is called the operation space inertia inverse matrix and c is called the operation space bias acceleration; they are expressed by the following equations (3) and (4), respectively.
- H is a joint space inertia matrix
- τ is the joint force corresponding to the joint value q (for example, the generated torque in the joint portions 421a to 421f)
- b is a term representing gravity, Coriolis force, and centrifugal force.
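In the standard operational-space formulation, equation (3) takes the form Λ⁻¹ = J·H⁻¹·Jᵀ. A minimal numeric sketch of that computation, with an assumed joint space inertia matrix H and Jacobian J (toy values, not from the patent):

```python
import numpy as np

def op_space_inertia_inverse(H, J):
    """Operation space inertia inverse matrix, Lambda^-1 = J H^-1 J^T
    (the standard operational-space form of equation (3)).

    H : joint space inertia matrix (n x n, symmetric positive definite)
    J : Jacobian of the operation space (m x n)
    """
    # Solve H X = J^T instead of forming H^-1 explicitly (better conditioned).
    return J @ np.linalg.solve(H, J.T)

# Toy two-joint example with an assumed inertia matrix and Jacobian.
H = np.array([[2.0, 0.3],
              [0.3, 1.0]])
J = np.array([[1.0, 0.5]])   # one-dimensional operation space
Lambda_inv = op_space_inertia_inverse(H, J)
```

For a valid H and full-rank J, the result is symmetric positive definite, which is what makes equation (2) a well-posed operational-space equation of motion.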
- the LCP can be solved using, for example, an iterative method, a pivot method, a method applying robust acceleration control, or the like.
- When the operation space inertia inverse matrix Λ⁻¹ and the bias acceleration c are calculated according to the above formulas (3) and (4) as they are, the calculation cost is high. Therefore, a method has been proposed for calculating the operation space inertia inverse matrix Λ⁻¹ at higher speed by applying forward dynamics calculation (FWD), which obtains the generalized acceleration (joint acceleration) from the generalized force (joint force τ) of the multi-link structure.
- Specifically, by using the forward dynamics calculation FWD related to the operation space, the operation space inertia inverse matrix Λ⁻¹ and the bias acceleration c can be obtained from information about the forces acting on the multi-link structure (for example, the arm portion 420 and the joint portions 421a to 421f), such as the joint space q, the joint force τ, and the gravity g.
- By applying this forward dynamics calculation related to the operation space, the operation space inertia inverse matrix Λ⁻¹ can be calculated with a computational complexity of O(N) for the number N of joint portions.
- The condition for achieving the target value of the operation space acceleration (represented by attaching a superscript bar to the second-order differential of x) with a virtual force f_vi whose absolute value is equal to or less than F_i can be expressed by the following formula (6).
- The motion purposes related to the position and velocity of the operation space x can be expressed as target values of the operation space acceleration, and are specifically expressed by the following formula (7) (the target values of the position and velocity of the operation space x are expressed by attaching a superscript bar to x and the first derivative of x).
- Using the concept of the decomposition operation space, it is also possible to set a motion purpose related to an operation space represented by a linear sum of other operation spaces (momentum, Cartesian relative coordinates, interlocking joints, etc.). In that case, it is necessary to give priorities between competing motion purposes.
- The LCP can be solved for each priority level, sequentially from the lowest priority, and the virtual force obtained by the previous LCP can be applied as a known external force in the next LCP.
- the subscript a represents a set of drive joint portions (drive joint set), and the subscript u represents a set of non-drive joint portions (non-drive joint set). That is, the upper stage of the above formula (8) represents the balance of the force of the space (non-drive joint space) by the non-drive joint part, and the lower stage represents the balance of the force of the space (drive joint space) by the drive joint part.
- J_vu and J_va are respectively the non-driven joint component and the driven joint component of the Jacobian related to the operation space on which the virtual force f_v acts.
- J_eu and J_ea are respectively the non-driven joint component and the driven joint component of the Jacobian related to the operation space on which the external force f_e acts.
- Δf_v represents the component of the virtual force f_v that cannot be realized by actual forces.
- The upper stage of the above equation (8) is indefinite; for example, f_e and Δf_v can be obtained by solving a quadratic programming problem (QP: Quadratic Programming Problem) as shown in the following equation (9).
- ε is the difference between the two sides of the upper stage of the above equation (8) and represents the equation error of equation (8).
- ξ is a concatenated vector of f_e and Δf_v and represents the variable vector.
- Q_1 and Q_2 are positive definite symmetric matrices representing the weights used in the minimization.
- the inequality constraint in the above formula (9) is used to express a constraint condition related to an external force such as a vertical reaction force, a friction cone, a maximum value of an external force, a support polygon, and the like.
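Ignoring the inequality constraints for a moment (a real implementation would hand them to a QP solver), the weighted objective of equation (9), εᵀQ_1ε + ξᵀQ_2ξ with ε linear in ξ, has a closed-form minimizer. The sketch below uses illustrative matrices and weights, not values from the patent.

```python
import numpy as np

def weighted_regularized_ls(A, b, Q1, Q2):
    """Minimize  eps^T Q1 eps + xi^T Q2 xi  with eps = A @ xi - b,
    i.e. the objective of equation (9) WITHOUT its inequality constraints.
    Closed form from the normal equations:
        xi = (A^T Q1 A + Q2)^-1 A^T Q1 b
    """
    return np.linalg.solve(A.T @ Q1 @ A + Q2, A.T @ Q1 @ b)

# Toy numbers (illustrative only): 2 equations, 3 unknowns.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0])
Q1 = np.eye(2) * 1e3   # heavy weight on the equation error eps
Q2 = np.eye(3) * 1e-3  # small regularization on the variable vector xi
xi = weighted_regularized_ls(A, b, Q1, Q2)
eps = A @ xi - b
```

Making Q_1 much larger than Q_2 drives the equation error ε toward zero while keeping ξ small, which reflects how the weights trade off satisfying equation (8) against the magnitude of the realized forces.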
- the inequality constraint relating to the rectangular support polygon is expressed as the following formula (10).
- z represents the normal direction of the contact surface
- x and y represent two orthogonal tangential directions perpendicular to z.
- (F_x, F_y, F_z) and (M_x, M_y, M_z) are the external force and external moment acting on the contact point.
- μ_t and μ_r are the friction coefficients for translation and rotation, respectively.
- (d_x, d_y) represents the size of the support polygon.
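A feasibility check of the kind expressed by formula (10) can be sketched as follows. The inequalities below are a plausible reading of the standard contact-wrench conditions (non-negative normal force, translational/rotational friction limits, center of pressure inside the rectangular support polygon), not a verbatim copy of the formula, and all numeric values are illustrative.

```python
def contact_wrench_feasible(F, M, mu_t, mu_r, d_x, d_y):
    """Check a contact wrench (F, M) against rectangular-support-polygon
    constraints of the kind in formula (10). z is the contact normal."""
    Fx, Fy, Fz = F
    Mx, My, Mz = M
    return (Fz >= 0.0                    # vertical reaction force pushes only
            and abs(Fx) <= mu_t * Fz     # translational friction cone
            and abs(Fy) <= mu_t * Fz
            and abs(Mz) <= mu_r * Fz     # rotational friction limit
            and abs(Mx) <= d_y * Fz      # center of pressure inside polygon
            and abs(My) <= d_x * Fz)

ok = contact_wrench_feasible((1.0, 2.0, 50.0), (0.5, 0.3, 0.2),
                             mu_t=0.5, mu_r=0.05, d_x=0.1, d_y=0.05)
bad = contact_wrench_feasible((30.0, 0.0, 50.0), (0.0, 0.0, 0.0),
                              mu_t=0.5, mu_r=0.05, d_x=0.1, d_y=0.05)
```

In the actual force calculation, constraints of this form enter equation (9) as the inequality constraints on the external force f_e.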
- The joint force τ_a for achieving a desired motion purpose can be obtained by sequentially performing the virtual force calculation process and the actual force calculation process described above. Conversely, by reflecting the calculated joint force τ_a in the theoretical model of the motion of the joint portions 421a to 421f, the joint portions 421a to 421f are driven so as to achieve the desired motion purpose.
- I_a is the moment of inertia (inertia) at the joint portion, τ_a is the torque generated by the joint portions 421a to 421f, τ_e is the external torque acting on the joint portions 421a to 421f from the outside, and ν_a is the viscous resistance coefficient at each of the joint portions 421a to 421f.
- The above formula (12), I_a·q̈ = τ_a + τ_e − ν_a·q̇, can be said to be a theoretical model representing the motion of the actuator 430 in the joint portions 421a to 421f.
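A minimal simulation of this theoretical model shows the joint velocity settling where the viscous term balances the applied torque. It assumes the model has the standard second-order form I_a·q̈ = τ_a + τ_e − ν_a·q̇; all parameter values are assumed for illustration.

```python
def simulate_ideal_joint(I_a, nu_a, tau_a, tau_e, q0=0.0, qd0=0.0,
                         dt=1e-3, steps=1000):
    """Forward-Euler integration of the theoretical model of equation (12):
        I_a * qdd = tau_a + tau_e - nu_a * qd
    Returns the final joint angle and angular velocity."""
    q, qd = q0, qd0
    for _ in range(steps):
        qdd = (tau_a + tau_e - nu_a * qd) / I_a
        qd += qdd * dt
        q += qd * dt
    return q, qd

# With constant torque, the joint velocity approaches the steady state
# qd = (tau_a + tau_e) / nu_a, where viscous friction balances the torque.
q, qd = simulate_ideal_joint(I_a=0.1, nu_a=0.5, tau_a=1.0, tau_e=0.0)
```

Ideal joint control, described next, aims to make the real actuator respond like this model regardless of unmodeled friction and inertia.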
- Modeling error may occur between the motion of the joint portions 421a to 421f and the theoretical model shown in the above equation (12) due to the influence of various disturbances.
- Modeling errors can be broadly classified into those caused by mass properties such as the weight, center of gravity, and inertia tensor of the multi-link structure, and those caused by friction and inertia in the joint portions 421a to 421f.
- the modeling error due to the former mass property can be reduced relatively easily during the construction of the theoretical model by increasing the accuracy of CAD (Computer Aided Design) data and applying an identification method.
- On the other hand, the modeling error due to friction and inertia in the joint portions 421a to 421f is caused by phenomena that are difficult to model, such as friction in the speed reducer 426 of the joint portions 421a to 421f, and a modeling error that cannot be ignored may remain when the model is constructed.
- An error may occur between the values of the inertia I_a and the viscous resistance coefficient ν_a in the above equation (12) and those values in the actual joint portions 421a to 421f.
- Due to the influence of such disturbances, the motion of the joint portions 421a to 421f may not respond according to the theoretical model shown in the above equation (12). Therefore, even if the actual force τ_a, which is the joint force calculated by the generalized inverse dynamics, is applied, there are cases where the motion purpose that is the control target is not achieved.
- In the present embodiment, therefore, correcting the responses of the joint portions 421a to 421f so that they perform an ideal response according to the theoretical model shown in equation (12) is considered.
- In the present embodiment, controlling the driving of the joint portions so that the joint portions 421a to 421f of the robot arm device 400 perform the ideal response expressed by the above formula (12) is called ideal joint control.
- the actuator whose drive is controlled by the ideal joint control is also referred to as a virtual actuator (VA) because an ideal response is performed.
- FIG. 8 is an explanatory diagram for explaining ideal joint control according to an embodiment of the present disclosure.
- In FIG. 8, the conceptual calculators that perform the various calculations related to ideal joint control are schematically shown as blocks.
- The actuator 610 schematically represents the mechanism of the actuator 430 shown in FIG. 6; the motor 611, the speed reducer 612, the encoder 613, and the torque sensor (Torque Sensor) 614 correspond respectively to the motor 424, the speed reducer 426, the encoder 427, and the torque sensor 428 (or the torque sensor 428a shown in FIG. 7B) shown in FIG. 6.
- That the actuator 610 responds in accordance with the theoretical model expressed by formula (12) means nothing other than that, when the right side of formula (12) is given, the rotational angular acceleration on the left side is achieved.
- the theoretical model includes an external torque term ⁇ e that acts on the actuator 610.
- the external torque ⁇ e is measured by the torque sensor 614.
- In the present embodiment, a disturbance observer 620 is applied to calculate the disturbance estimated value τ_d, which is an estimated value of the torque caused by disturbances, based on the rotation angle q of the actuator 610 measured by the encoder 613.
- a block 631 represents an arithmetic unit that performs an operation in accordance with an ideal joint model (Ideal Joint Model) of the joint portions 421a to 421f shown in the equation (12).
- The block 631 receives the generated torque τ_a, the external torque τ_e, and the rotational angular velocity (the first-order differential of the rotation angle q) as inputs, and can output the rotational angular acceleration target value (the second-order differential of the rotation angle target value q_ref) shown on the left side of equation (12).
- The generated torque τ_a calculated by the method described in (5-2-2. Generalized inverse dynamics) and the external torque τ_e measured by the torque sensor 614 are input to the block 631.
- a rotational angular velocity (first-order differential of the rotational angle q) is calculated by inputting the rotational angle q measured by the encoder 613 to a block 632 representing a computing unit that performs a differential operation.
- the rotational angular velocity calculated by the block 632 is input to the block 631, whereby the rotational angular acceleration target value is calculated by the block 631.
- the calculated rotational angular acceleration target value is input to block 633.
- a block 633 represents a calculator that calculates torque generated in the actuator 610 based on the rotational angular acceleration of the actuator 610.
- the block 633 obtains the torque target value τref by multiplying the rotational angular acceleration target value by the nominal inertia Jn of the actuator 610.
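- the computation in the blocks 631 and 633 can be sketched as follows. This is a minimal illustration assuming that Equation (12) has the form Ia·q̈ = τa + τe − νa·q̇, consistent with the assumed inertia Ia and viscous resistance coefficient νa mentioned elsewhere in this section; the exact form of Equation (12) is defined earlier in the document.

```python
# Sketch of blocks 631 and 633 of the ideal joint control diagram.
# Assumption: Equation (12) has the form Ia * ddq = tau_a + tau_e - nu_a * dq.

def ideal_joint_model(tau_a, tau_e, dq, I_a, nu_a):
    """Block 631: rotational angular acceleration target from the ideal model."""
    return (tau_a + tau_e - nu_a * dq) / I_a

def torque_target(ddq_target, J_n):
    """Block 633: multiply by the nominal inertia J_n to obtain tau_ref."""
    return J_n * ddq_target

# Example: generated torque 2.0, measured external torque 0.5, velocity 1.0.
ddq_target = ideal_joint_model(tau_a=2.0, tau_e=0.5, dq=1.0, I_a=0.1, nu_a=0.05)
tau_ref = torque_target(ddq_target, J_n=0.1)
```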
- ideally, the desired motion purpose would be achieved by causing the actuator 610 to generate the torque target value τref. In practice, however, the actual response may be affected by disturbances and the like. Accordingly, in the present embodiment, the disturbance observer 620 calculates the disturbance estimated value τd, and the torque target value τref is corrected using the disturbance estimated value τd.
- the disturbance observer 620 calculates the disturbance estimated value τd based on the torque command value τ and the rotational angular velocity calculated from the rotation angle q measured by the encoder 613.
- the torque command value τ is the torque value finally generated in the actuator 610 after the influence of the disturbance is corrected. For example, when the disturbance estimated value τd is not calculated, the torque command value τ becomes the torque target value τref.
- the disturbance observer 620 includes a block 634 and a block 635.
- Block 634 represents a calculator that calculates torque generated in the actuator 610 based on the rotational angular velocity of the actuator 610.
- the rotational angular velocity calculated by the block 632 from the rotation angle q measured by the encoder 613 is input to the block 634.
- the block 634 performs the operation represented by the transfer function Jns; that is, it obtains the rotational angular acceleration by differentiating the rotational angular velocity, and multiplies the result by the nominal inertia Jn, thereby calculating an estimated value of the torque actually acting on the actuator 610 (torque estimated value).
- the difference between the torque estimated value and the torque command value τ is taken to estimate the disturbance estimated value τd, which is the torque value due to the disturbance.
- specifically, the disturbance estimated value τd may be the difference between the torque command value τ in the previous control cycle and the torque estimated value in the current control cycle.
- the torque estimated value calculated by the block 634 is based on an actually measured value, whereas the torque command value τ calculated by the block 633 is based on the ideal theoretical model of the joint portions 421a to 421f represented by the block 631. Therefore, by taking the difference between the two, the influence of a disturbance that is not considered in the theoretical model can be estimated.
- the disturbance observer 620 is provided with a low pass filter (LPF) indicated by a block 635 in order to prevent system divergence.
- the block 635 performs the operation represented by the transfer function g/(s+g), thereby passing only the low-frequency components of the input value and stabilizing the system.
- the difference value between the torque estimated value calculated by the block 634 and the torque command value τ is input to the block 635, and its low-frequency component is calculated as the disturbance estimated value τd.
- in this way, the torque command value τ, which is the torque value to be finally generated by the actuator 610, is calculated. Then, the actuator 610 is driven based on the torque command value τ. Specifically, the torque command value τ is converted into a corresponding current value (current command value), and the current command value is applied to the motor 611, whereby the actuator 610 is driven.
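- the disturbance-correction loop described above can be sketched in discrete time as follows. The forward-difference differentiation, the discretization of the low-pass filter g/(s+g), and all numeric gains are illustrative assumptions, not values taken from this document.

```python
# Discrete-time sketch of the disturbance observer 620 (blocks 632, 634, 635)
# and of the correction of the torque target value by the disturbance estimate.

class DisturbanceObserver:
    def __init__(self, J_n, g, dt):
        self.J_n = J_n      # nominal inertia
        self.g = g          # cutoff of the low-pass filter g/(s+g)
        self.dt = dt        # control period (assumed)
        self.prev_dq = 0.0  # previous rotational angular velocity
        self.tau_d = 0.0    # filtered disturbance estimate

    def update(self, dq, tau_cmd):
        # Block 634: transfer function J_n * s -- differentiate the angular
        # velocity and multiply by the nominal inertia to estimate the torque
        # actually acting on the actuator.
        ddq = (dq - self.prev_dq) / self.dt
        self.prev_dq = dq
        tau_est = self.J_n * ddq
        # Block 635: first-order low-pass filter g/(s+g), discretized with a
        # backward-Euler step, applied to the torque difference.
        alpha = self.g * self.dt / (1.0 + self.g * self.dt)
        self.tau_d += alpha * ((tau_est - tau_cmd) - self.tau_d)
        return self.tau_d

def torque_command(tau_ref, tau_d):
    # The disturbance estimate corrects the torque target value.
    return tau_ref + tau_d
```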
- by adopting the configuration described above, the response of the actuator 610 can follow the target value even when there is a disturbance component such as friction. Further, in the drive control of the joint portions 421a to 421f, it is possible to achieve an ideal response that follows the theoretical model according to the assumed inertia Ia and the viscous resistance coefficient νa.
- the generalized inverse dynamics used in the present embodiment has been described above, and the ideal joint control according to the present embodiment has been described with reference to FIG.
- in the present embodiment, whole body cooperative control is performed in which the drive parameters of the joint portions 421a to 421f (for example, the generated torque values of the joint portions 421a to 421f) are calculated in consideration of the constraint conditions.
- the generated torque value calculated by the whole body cooperative control using the generalized inverse dynamics is corrected in consideration of the influence of disturbance.
- FIG. 9 is a functional block diagram illustrating a configuration example of a robot arm control system according to an embodiment of the present disclosure.
- the configuration related to the drive control of the arm unit of the robot arm device is mainly illustrated.
- a robot arm control system 1 includes a robot arm device 10, a control device 20, and a display device 30.
- the control device 20 performs the various calculations in the whole body cooperative control described in (5-2-2. Generalized inverse dynamics) and the ideal joint control described in (5-2-3. Ideal joint control), and controls the driving of the arm portion of the robot arm device 10 based on the calculation results. Further, the arm unit of the robot arm device 10 is provided with an imaging unit 140 described later, and an image captured by the imaging unit 140 is displayed on the display screen of the display device 30.
- the robot arm control system 1 shown in FIG. 9 corresponds to the robot arm control system 2 described with reference to FIG.
- the robot arm device 10 has an arm unit that is a multi-link structure composed of a plurality of joint units and a plurality of links, and controls the position and orientation of the tip unit provided at the tip of the arm unit by driving the arm unit within its movable range.
- the robot arm device 10 corresponds to the robot arm device 400 shown in FIG.
- the robot arm device 10 includes an arm control unit 110 and an arm unit 120.
- the arm unit 120 includes a joint unit 130 and an imaging unit 140.
- the arm control unit 110 controls the robot arm device 10 in an integrated manner and controls the driving of the arm unit 120.
- the arm control unit 110 corresponds to the control unit (not shown in FIG. 5) described with reference to FIG.
- the arm control unit 110 includes a drive control unit 111, and the drive of the arm unit 120 is controlled by controlling the drive of the joint unit 130 by the control from the drive control unit 111.
- specifically, the drive control unit 111 controls the rotation speed of the motor by controlling the amount of current supplied to the motor in the actuator of the joint unit 130, thereby controlling the rotation angle and the generated torque of the joint unit 130.
- the drive control of the arm unit 120 by the drive control unit 111 is performed based on the calculation result in the control device 20.
- each joint unit 130 may be provided with a joint control unit 135, and the driving of the joint unit 130 may be controlled by the joint control unit 135.
- the arm unit 120 is a multi-link structure composed of a plurality of joints and a plurality of links, and the driving thereof is controlled by the control from the arm control unit 110.
- the arm part 120 corresponds to the arm part 420 shown in FIG.
- the arm unit 120 includes a joint unit 130 and an imaging unit 140.
- although the arm unit 120 actually has a plurality of joint units, in FIG. 9 the structure of one joint unit 130 is illustrated as representative of the plurality of joint units.
- the joint unit 130 rotatably connects between the links in the arm unit 120, and drives the arm unit 120 by controlling the rotation drive by the control from the arm control unit 110.
- the joint portion 130 corresponds to the joint portions 421a to 421f shown in FIG.
- the joint unit 130 includes an actuator, and the configuration of the actuator is the same as the configuration illustrated in FIGS. 6, 7A, and 7B, for example.
- the joint unit 130 includes a joint drive unit 131 and a joint state detection unit 132.
- the joint drive unit 131 is the drive mechanism in the actuator of the joint unit 130; when the joint drive unit 131 is driven, the joint unit 130 is rotationally driven.
- the drive of the joint drive unit 131 is controlled by the drive control unit 111.
- the joint drive unit 131 has a configuration corresponding to the motor 424 and the motor driver 425 illustrated in FIG. 6, and driving the joint drive unit 131 corresponds to the motor driver 425 driving the motor 424 by an amount corresponding to a command from the drive control unit 111.
- the joint state detection unit 132 detects the state of the joint unit 130.
- the state of the joint 130 may mean the state of motion of the joint 130.
- the state of the joint unit 130 includes information such as the rotation angle, rotation angular velocity, rotation angular acceleration, and generated torque of the joint unit 130.
- the joint state detection unit 132 includes a rotation angle detection unit 133 that detects the rotation angle of the joint unit 130, and a torque detection unit 134 that detects the generated torque and the external torque of the joint unit 130.
- the rotation angle detection unit 133 and the torque detection unit 134 correspond to the encoder 427 of the actuator 430 shown in FIG. 6 and the torque sensors 428 and 428a shown in FIGS. 7A and 7B, respectively.
- the joint state detection unit 132 transmits the detected state of the joint unit 130 to the control device 20.
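- for illustration only, the quantities that make up the state of the joint unit 130 could be grouped in a simple container such as the following; the document does not specify any particular data format, so the field names are hypothetical.

```python
# Hypothetical container for the joint state transmitted to the control device 20.
from dataclasses import dataclass

@dataclass
class JointState:
    rotation_angle: float        # detected by the rotation angle detection unit 133
    angular_velocity: float      # first derivative of the rotation angle
    angular_acceleration: float  # second derivative of the rotation angle
    generated_torque: float      # detected by the torque detection unit 134
    external_torque: float       # also detected by the torque detection unit 134

state = JointState(0.1, 0.0, 0.0, 1.0, 0.2)
```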
- the imaging unit 140 is an example of a tip unit provided at the tip of the arm unit 120, and acquires an image to be shot.
- the imaging unit 140 corresponds to the imaging unit 423 shown in FIG.
- the imaging unit 140 is, for example, a camera that can capture an imaging target in the form of a moving image or a still image.
- the imaging unit 140 has a plurality of light receiving elements arranged two-dimensionally, and can acquire an image signal representing an image to be photographed by photoelectric conversion in the light receiving elements.
- the imaging unit 140 transmits the acquired image signal to the display device 30.
- the imaging unit 140 is actually provided at the distal end of the arm unit 120, as in the robot arm device 400 illustrated in FIG. 5, in which the imaging unit 423 is provided at the distal end of the arm unit 420.
- in FIG. 9, the state in which the imaging unit 140 is provided at the distal end of the final link via the plurality of joint units 130 and the plurality of links is schematically expressed by the connection illustrated between the joint unit 130 and the imaging unit 140.
- various medical instruments can be connected to the tip of the arm unit 120 as a tip unit.
- examples of the medical instrument include various units used for treatment, such as surgical instruments like a scalpel and forceps, and units of various inspection apparatuses such as a probe of an ultrasonic inspection apparatus.
- a unit having an imaging function such as the imaging unit 140 illustrated in FIG. 9, an endoscope, or a microscope may be included in the medical instrument.
- the robot arm apparatus 10 according to the present embodiment is a medical robot arm apparatus provided with a medical instrument.
- similarly, the robot arm control system 1 according to the present embodiment can be said to be a medical robot arm control system. The robot arm device 10 shown in FIG. 9 can also be said to be a video-microscope robot arm device provided with a unit having an imaging function as a tip unit. Further, a stereo camera having two imaging units (camera units) may be provided at the tip of the arm unit 120, and imaging may be performed so that the imaging target is displayed as a 3D image.
- the control device 20 includes an input unit 210, a storage unit 220, and a control unit 230.
- the control unit 230 controls the control device 20 in an integrated manner, and performs various calculations for controlling the driving of the arm unit 120 in the robot arm device 10. Specifically, the control unit 230 performs various calculations in the whole body cooperative control and the ideal joint control in order to control the driving of the arm unit 120 of the robot arm device 10.
- the function and configuration of the control unit 230 will be described in detail.
- since the whole body cooperative control and the ideal joint control have already been described in (5-2-2. Generalized inverse dynamics) and (5-2-3. Ideal joint control) above, detailed description thereof is omitted here.
- the control unit 230 includes a whole body cooperative control unit 240, an ideal joint control unit 250, and a reference position deriving unit 260.
- the whole body cooperative control unit 240 performs various calculations related to whole body cooperative control using generalized inverse dynamics.
- the whole body cooperative control unit 240 acquires the state of the arm unit 120 (arm state) based on the state of the joint unit 130 detected by the joint state detection unit 132. Further, the whole body cooperative control unit 240 calculates, using generalized inverse dynamics, a control value for the whole body cooperative control of the arm unit 120 in the operation space, based on the arm state and on the exercise purpose and constraint conditions of the arm unit 120.
- the operation space is a space for describing the relationship between the force acting on the arm unit 120 and the acceleration generated in the arm unit 120, for example.
- the whole body cooperative control unit 240 includes an arm state acquisition unit 241, a calculation condition setting unit 242, a virtual force calculation unit 243, and a real force calculation unit 244.
- note that, for the sake of convenience, the arm state acquisition unit 241 is also illustrated as one function included in the reference position deriving unit 260; these have similar functions.
- the arm state acquisition unit 241 acquires the state (arm state) of the arm unit 120 based on the state of the joint unit 130 detected by the joint state detection unit 132.
- the arm state may mean a state of movement of the arm unit 120.
- the arm state includes information such as the position, speed, acceleration, and force of the arm unit 120.
- the joint state detection unit 132 acquires information such as the rotation angle, the rotation angular velocity, the rotation angular acceleration, and the generated torque in each joint unit 130 as the state of the joint unit 130.
- the storage unit 220 stores various types of information processed by the control device 20, and in the present embodiment, the storage unit 220 stores various types of information (arm information) about the arm unit 120.
- the arm state acquisition unit 241 can also acquire the arm information from the storage unit 220. Therefore, based on the state of the joint unit 130 and the arm information, the arm state acquisition unit 241 can acquire as the arm state information such as the positions (coordinates) in space of the plurality of joint units 130, the plurality of links, and the imaging unit 140 (that is, the shape of the arm unit 120 and the position and orientation of the imaging unit 140), and the forces acting on each joint unit 130, each link, and the imaging unit 140.
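- the idea of deriving the arm state from the detected joint states and the stored arm information can be illustrated with a deliberately simplified planar example. The real arm unit has multiple joints in three dimensions; this two-link, two-dimensional sketch (with assumed link lengths) only shows the principle of accumulating joint rotations along the links.

```python
# Simplified 2-D forward kinematics: positions of the link ends (and thus of
# the tip unit) computed from joint rotation angles and stored link lengths.
import math

def arm_positions(joint_angles, link_lengths):
    """Return the (x, y) position of each link end, with the base at the origin."""
    x = y = 0.0
    heading = 0.0
    positions = []
    for q, length in zip(joint_angles, link_lengths):
        heading += q                      # accumulate joint rotations
        x += length * math.cos(heading)   # advance along the current link
        y += length * math.sin(heading)
        positions.append((x, y))
    return positions  # the last entry is the tip unit position

# Two links of 0.3 m and 0.2 m; joint angles +90 and -90 degrees.
tip = arm_positions([math.pi / 2, -math.pi / 2], [0.3, 0.2])[-1]
```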
- the arm state acquisition unit 241 transmits the acquired arm information to the calculation condition setting unit 242.
- the calculation condition setting unit 242 sets calculation conditions for calculation related to whole body cooperative control using generalized inverse dynamics.
- the calculation condition may be an exercise purpose and a constraint condition.
- the exercise purpose may be various types of information regarding the exercise of the arm unit 120.
- for example, the exercise purpose may be target values such as the position and orientation (coordinates), velocity, acceleration, and force of the imaging unit 140, or target values such as the positions (coordinates), velocities, accelerations, and forces of the joint units 130 and the links of the arm unit 120.
- the constraint condition may be various types of information that limits (restrains) the movement of the arm unit 120.
- for example, the constraint conditions may be the coordinates of regions into which the components of the arm unit cannot move, velocity and acceleration values that must not be reached, force values that must not be generated, and the like.
- the limitation ranges of the various physical quantities in the constraint conditions may be set because they cannot be realized structurally by the arm unit 120, or may be set as appropriate by the user.
- further, the calculation condition setting unit 242 may have a physical model of the structure of the arm unit 120 (in which, for example, the number and lengths of the links constituting the arm unit 120, the connection states of the links via the joint units 130, and the movable ranges of the joint units 130 are modeled), and may set the exercise purpose and the constraint conditions by generating a control model in which the desired motion condition and constraint condition are reflected in the physical model.
- in the present embodiment, it is possible to cause the arm unit 120 to perform a desired operation by appropriately setting the exercise purpose and the constraint conditions. For example, by setting a target value for the position of the imaging unit 140 as an exercise purpose, it is possible not only to move the imaging unit 140 to that target position but also to drive the arm unit 120 while restricting its movement according to constraint conditions, for example, so that the arm unit 120 does not enter a predetermined region in space.
- as a specific example of the exercise purpose, a pivot operation may be used in which the imaging unit 140 moves within a conical surface whose apex is the treatment site, with the imaging direction of the imaging unit 140 fixed toward the treatment site and the axis of the cone serving as the pivot axis. The pivot operation may be performed in a state where the distance between the imaging unit 140 and the point corresponding to the apex of the cone is kept constant.
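- an instantaneous target for such a pivot operation could be sketched as below. The parameterization (cone axis along +z, a sweep angle around it) and all numbers are illustrative assumptions; the document only requires that the apex distance stay constant and that the imaging direction point at the treatment site.

```python
# Sketch of an instantaneous pivot-operation target: a position on a cone whose
# apex is the treatment site, at a fixed distance, looking toward the apex.
import math

def pivot_target(apex, distance, half_angle, sweep):
    """Return (position, unit view direction toward the apex) on the cone."""
    ax, ay, az = apex
    r = distance * math.sin(half_angle)  # radius of the circle being swept
    pos = (ax + r * math.cos(sweep),
           ay + r * math.sin(sweep),
           az + distance * math.cos(half_angle))
    view = tuple((a - p) / distance for p, a in zip(pos, apex))
    return pos, view
```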
- further, the exercise purpose may be content that controls the torque generated in each joint unit 130.
- specifically, the exercise purpose may be a power assist operation in which the state of each joint unit 130 is controlled so as to cancel the gravity acting on the arm unit 120 and, further, to support movement of the arm unit 120 in the direction of a force applied from the outside. More specifically, in the power assist operation, the driving of each joint unit 130 is controlled so as to cause each joint unit 130 to generate a torque that cancels the external torque due to gravity in each joint unit 130 of the arm unit 120, whereby the position and posture of the arm unit 120 are held in a predetermined state.
- when an external torque is further applied from the outside in this state, the driving of each joint unit 130 is controlled so that a generated torque in the same direction as the applied external torque is produced in each joint unit 130.
- by performing such a power assist operation, when the user manually moves the arm unit 120, the user can move the arm unit 120 with a smaller force, which can give the user a feeling of moving the arm unit 120 under zero gravity. It is also possible to combine the above-described pivot operation and the power assist operation.
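- the per-joint torque for the power assist operation described above could be sketched as follows; the assist gain is an assumed parameter for illustration, not something specified in this document.

```python
# Sketch of the power assist operation for one joint unit: cancel the external
# torque due to gravity, and add torque in the same direction as the torque the
# user applies from the outside.

def power_assist_torque(tau_gravity, tau_user, assist_gain=0.5):
    """Generated torque for one joint unit during the power assist operation."""
    compensation = -tau_gravity      # holds the arm's position and posture
    assist = assist_gain * tau_user  # supports the user's movement
    return compensation + assist

holding = power_assist_torque(tau_gravity=1.2, tau_user=0.0)  # gravity only
assisting = power_assist_torque(tau_gravity=0.0, tau_user=2.0)
```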
- in the present embodiment, the exercise purpose may mean the operation (motion) of the arm unit 120 realized in the whole body cooperative control, or an instantaneous exercise purpose in that operation (that is, a target value for the exercise purpose). For example, in the case of the pivot operation described above, the imaging unit 140 performing the pivot operation is itself an exercise purpose, while during the pivot operation, values such as the position and velocity of the imaging unit 140 within the conical surface are set as instantaneous exercise purposes (target values for the exercise purpose). Similarly, in the case of the power assist operation described above, the power assist operation of supporting the movement of the arm unit 120 in the direction of the force applied from the outside is itself an exercise purpose, while during the power assist operation, the value of the generated torque in the same direction as the external torque applied to each joint unit 130 is set as an instantaneous exercise purpose (a target value for the exercise purpose).
- that is, the exercise purpose in the present embodiment is a concept including both the instantaneous exercise purpose (for example, target values of the position, velocity, force, and the like of each constituent member of the arm unit 120 at a certain time) and the operations of the respective constituent members of the arm unit 120 realized over time as a result of the instantaneous exercise purposes being continuously achieved. In each calculation step for the whole body cooperative control, an instantaneous exercise purpose is set, and the calculation is repeatedly performed so that the desired exercise purpose is finally achieved.
- in the present embodiment, when the exercise purpose is set, the viscous resistance coefficient in the rotational motion of each joint unit 130 may also be set as appropriate.
- the joint portion 130 according to the present embodiment is configured so that the viscous resistance coefficient in the rotational motion of the actuator 430 can be appropriately adjusted. Therefore, by setting the viscous resistance coefficient in the rotational motion of each joint portion 130 when setting the motion purpose, for example, it is possible to realize a state that is easy to rotate or a state that is difficult to rotate with respect to a force applied from the outside.
- specifically, in the power assist operation described above, setting the viscous resistance coefficient of each joint unit 130 to a small value reduces the force required for the user to move the arm unit 120 and strengthens the weightless feeling given to the user. In this way, the viscous resistance coefficient in the rotational motion of each joint unit 130 may be set as appropriate according to the content of the exercise purpose.
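- the effect of the viscous resistance coefficient can be seen in a toy balance: at a constant rotational velocity, the external torque a user must supply to keep a joint turning is the damping term ν·q̇, so a smaller coefficient means less effort. The numbers below are arbitrary illustrations.

```python
# Toy illustration: steady external torque needed to keep one joint rotating at
# angular velocity dq against the viscous term nu * dq of the joint model.

def user_torque_to_hold_velocity(nu, dq):
    """External torque balancing viscous resistance at constant velocity."""
    return nu * dq

stiff = user_torque_to_hold_velocity(nu=0.5, dq=1.0)   # harder to rotate
light = user_torque_to_hold_velocity(nu=0.05, dq=1.0)  # easier to rotate
```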
- the storage unit 220 may store parameters related to calculation conditions such as exercise purpose and constraint conditions used in calculations related to whole body cooperative control.
- the calculation condition setting unit 242 can set the constraint condition stored in the storage unit 220 as the constraint condition used for the calculation of the whole body cooperative control.
- the calculation condition setting unit 242 can set the exercise purpose by a plurality of methods.
- the calculation condition setting unit 242 may set the exercise purpose based on the arm state transmitted from the arm state acquisition unit 241.
- the arm state includes information on the position of the arm unit 120 and information on the force acting on the arm unit 120. Therefore, for example, when the user intends to move the arm unit 120 manually, the arm state acquisition unit 241 also acquires, as the arm state, information on how the user is moving the arm unit 120. Accordingly, the calculation condition setting unit 242 can set the position, velocity, force, and the like with which the user moves the arm unit 120 as instantaneous exercise purposes based on the acquired arm state. By setting the exercise purpose in this way, the driving of the arm unit 120 is controlled so as to follow and support the user's movement of the arm unit 120.
- the calculation condition setting unit 242 may set the exercise purpose based on an instruction input by the user from the input unit 210.
- the input unit 210 is an input interface through which the user inputs information, commands, and the like regarding the drive control of the robot arm device 10 to the control device 20. In the present embodiment, the exercise purpose may be set based on an operation input by the user from the input unit 210.
- specifically, the input unit 210 has operation means operated by the user, such as a lever and a pedal, and the calculation condition setting unit 242 may set the position, velocity, and the like of each constituent member of the arm unit 120 according to the operation of the lever or the pedal as an instantaneous exercise purpose.
- the calculation condition setting unit 242 may set the exercise purpose stored in the storage unit 220 as the exercise purpose used for the calculation of the whole body cooperative control.
- for example, when the exercise purpose is to stop the imaging unit 140 at a predetermined point in space, the coordinates of that point can be set in advance as the exercise purpose.
- similarly, when the exercise purpose is to move the imaging unit 140 along a predetermined trajectory in space, the coordinates of each point representing the trajectory can be set in advance as the exercise purpose.
- the exercise purpose may be stored in the storage unit 220 in advance.
- for example, in the case of the pivot operation, the exercise purpose is limited to target values such as positions and velocities within the conical surface, and in the case of the power assist operation, the exercise purpose is limited to forces as target values. When exercise purposes such as the pivot operation and the power assist operation are set in advance in this way, information on the ranges and types of the target values that can be set as instantaneous exercise purposes in those exercise purposes may be stored in the storage unit 220.
- the calculation condition setting unit 242 can set the exercise purpose including various information related to the exercise purpose.
- which of these methods the calculation condition setting unit 242 uses to set the exercise purpose may be determined as appropriate by the user according to the use of the robot arm device 10 or the like.
- the calculation condition setting unit 242 may also set the exercise purpose and the constraint condition by appropriately combining the above methods.
- for example, a priority may be set for each exercise purpose together with the constraint conditions stored in the storage unit 220, and when there are a plurality of different exercise purposes, the calculation condition setting unit 242 may set the exercise purpose according to that priority.
- the calculation condition setting unit 242 transmits the arm state and the set exercise purpose and constraint condition to the virtual force calculation unit 243.
- the virtual force calculation unit 243 calculates a virtual force in a calculation related to whole body cooperative control using generalized inverse dynamics.
- the virtual force calculation process performed by the virtual force calculation unit 243 may be, for example, the series of processes described in (5-2-2-1. Virtual force calculation process) above.
- the virtual force calculation unit 243 transmits the calculated virtual force f v to the real force calculation unit 244.
- the real force calculation unit 244 calculates the real force in a calculation related to whole body cooperative control using generalized inverse dynamics.
- the real force calculation process performed by the real force calculation unit 244 may be, for example, the series of processes described above in (5-2-2-2. Real force calculation process).
- the actual force calculation unit 244 transmits the calculated actual force (generated torque) τa to the ideal joint control unit 250.
- in the present embodiment, the generated torque τa calculated by the actual force calculation unit 244 is also referred to as a control value or a control torque value in the sense of being the control value of the joint unit 130 in the whole body cooperative control.
- the ideal joint control unit 250 performs various calculations related to ideal joint control that realizes an ideal response based on a theoretical model.
- specifically, the ideal joint control unit 250 calculates the torque command value τ that realizes an ideal response of the arm unit 120 by correcting the influence of disturbance on the generated torque τa calculated by the actual force calculation unit 244. Note that the arithmetic processing performed by the ideal joint control unit 250 corresponds to the series of processing described in (5-2-3. About ideal joint control) above.
- the ideal joint control unit 250 includes a disturbance estimation unit 251 and a command value calculation unit 252.
- the disturbance estimation unit 251 calculates the disturbance estimated value τd based on the torque command value τ and the rotational angular velocity calculated from the rotation angle q detected by the rotation angle detection unit 133.
- the torque command value ⁇ here is a command value representing the torque generated in the arm unit 120 that is finally transmitted to the robot arm device 10.
- the disturbance estimation unit 251 has a function corresponding to the disturbance observer 620 illustrated in FIG.
- the command value calculation unit 252 uses the disturbance estimated value τd calculated by the disturbance estimation unit 251 to calculate the torque command value τ, which is a command value representing the torque to be generated in the arm unit 120 and is finally transmitted to the robot arm device 10.
- specifically, the command value calculation unit 252 calculates the torque command value τ by adding the disturbance estimated value τd calculated by the disturbance estimation unit 251 to the torque target value τref calculated from the ideal model of the joint unit 130 expressed by Equation (12). For example, when the disturbance estimated value τd is not calculated, the torque command value τ becomes the torque target value τref.
- the function of the command value calculation unit 252 corresponds to functions other than the disturbance observer 620 shown in FIG.
- the ideal joint control unit 250 transmits the calculated torque command value ⁇ to the drive control unit 111 of the robot arm device 10.
- the drive control unit 111 controls the rotation speed of the motor by supplying an amount of current corresponding to the transmitted torque command value τ to the motor in the actuator of the joint unit 130, thereby controlling the rotation angle and the generated torque of the joint unit 130.
- the drive control of the arm unit 120 in the robot arm device 10 is performed continuously while work using the arm unit 120 is carried out, and the processing described above is performed repeatedly in the robot arm device 10 and the control device 20. That is, the joint state detection unit 132 of the robot arm device 10 detects the state of the joint unit 130 and transmits it to the control device 20.
- the control device 20 performs the various calculations related to the whole body cooperative control and the ideal joint control for controlling the driving of the arm unit 120 based on the state of the joint unit 130, the exercise purpose, and the constraint conditions, and transmits the torque command value τ obtained as the calculation result to the robot arm device 10.
- the driving of the arm unit 120 is controlled based on the torque command value ⁇ , and the state of the joint unit 130 during or after driving is detected again by the joint state detection unit 132.
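- one cycle of this repeated flow between the robot arm device 10 and the control device 20 can be summarized schematically. The function parameters below are hypothetical placeholders standing in for the processing units named in the text.

```python
# Schematic sketch of one control cycle: detect joint states, run whole body
# cooperative control, correct with ideal joint control, then drive the joints.

def control_cycle(detect_joint_states, whole_body_cooperative_control,
                  ideal_joint_control, drive_joints):
    joint_states = detect_joint_states()                  # joint state detection unit 132
    tau_a = whole_body_cooperative_control(joint_states)  # unit 240: generated torques
    tau_cmd = ideal_joint_control(tau_a, joint_states)    # unit 250: torque command values
    drive_joints(tau_cmd)                                 # drive control unit 111
    return tau_cmd
```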
- the reference position deriving unit 260 derives a reference position that is a reference point in the observation of the object by the imaging unit 140.
- since the function of the reference position deriving unit 260 is described in detail in &lt;2. Configuration of Robot Arm Control System&gt;, detailed description thereof is omitted here.
- Information about the reference position derived by the reference position deriving unit 260 is provided to the calculation condition setting unit 242 or stored in the storage unit 220, for example.
- for example, the calculation condition setting unit 242 uses the reference position to set a constraint condition such that the reference position becomes the pivot center point, so that the virtual force calculation unit 243 and the real force calculation unit 244 calculate control values that realize the pivot operation.
- the description of the other configurations of the control device 20 will now be continued.
- the input unit 210 is an input interface for a user to input information, commands, and the like regarding drive control of the robot arm device 10 to the control device 20.
- the driving of the arm unit 120 of the robot arm device 10 may be controlled based on the operation input from the input unit 210 by the user, and the position and posture of the imaging unit 140 may be controlled.
- the calculation condition setting unit 242 may set the exercise purpose in the whole body cooperative control based on the instruction information. As described above, the whole body cooperative control is performed using the exercise purpose based on the instruction information input by the user, thereby realizing driving of the arm unit 120 in accordance with the user's operation input.
- the input unit 210 includes operation means operated by the user such as a mouse, a keyboard, a touch panel, a button, a switch, a lever, and a pedal.
- when the input unit 210 includes a pedal, the user can control the driving of the arm unit 120 by operating the pedal with a foot. Therefore, even while the user is performing treatment on the patient's surgical site with both hands, the position and posture of the imaging unit 140, that is, the imaging position and the imaging angle of the surgical site, can be adjusted by operating the pedal with a foot.
- the storage unit 220 stores various types of information processed by the control device 20.
- the storage unit 220 can store various parameters used in calculations related to whole body cooperative control and ideal joint control performed by the control unit 230.
- the storage unit 220 may store an exercise purpose and a constraint condition used in a calculation related to the whole body cooperative control by the whole body cooperative control unit 240.
- the exercise purpose stored in the storage unit 220 may be an exercise purpose that can be set in advance, for example, the imaging unit 140 is stationary at a predetermined point in space.
- the constraint condition may be set in advance by the user and stored in the storage unit 220 in accordance with the geometric configuration of the arm unit 120, the use of the robot arm device 10, or the like.
- the storage unit 220 may store various types of information related to the arm unit 120 that are used when the arm state acquisition unit 241 acquires the arm state. Furthermore, the storage unit 220 may store the results of the calculations related to the whole body cooperative control and the ideal joint control by the control unit 230, numerical values calculated in the course of those calculations, and the like. As described above, the storage unit 220 may store various parameters related to the various processes performed by the control unit 230, and the control unit 230 can perform the various processes while transmitting and receiving information to and from the storage unit 220.
- the function and configuration of the control device 20 have been described above. Note that the control device 20 according to the present embodiment can be configured by various information processing devices (arithmetic processing devices) such as a PC (Personal Computer) or a server. Next, the function and configuration of the display device 30 will be described.
- the display device 30 displays various types of information on the display screen in various formats such as text and images, thereby visually notifying the user of the information.
- the display device 30 displays an image captured by the imaging unit 140 of the robot arm device 10 on a display screen.
- the display device 30 has the function and configuration of an image signal processing unit (not shown) that performs various types of image processing on the image signal acquired by the imaging unit 140, and of a display control unit (not shown) that performs control to display an image based on the processed image signal on the display screen.
- the display device 30 may have various functions and configurations that are generally included in the display device in addition to the functions and configurations described above.
- the display device 30 corresponds to the display device 550 shown in FIG.
- each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
- the CPU or the like may perform all functions of each component. Therefore, it is possible to appropriately change the configuration to be used according to the technical level at the time of carrying out the present embodiment.
- the arm unit 120, which is a multi-link structure in the robot arm device 10, has at least 6 degrees of freedom, and the driving of each of the plurality of joint units 130 constituting the arm unit 120 is controlled by the drive control unit 111.
- a medical instrument is provided at the tip of the arm unit 120.
- the state of the joint unit 130 is detected by the joint state detection unit 132 in the robot arm device 10. Based on the detected state of the joint unit 130, the motion purpose, and the constraint conditions, the various calculations related to the whole body cooperative control are performed, and the torque command value τ is calculated as the calculation result. Further, the driving of the arm unit 120 is controlled based on the torque command value τ.
- the drive of the arm part 120 is controlled by the whole body cooperative control using generalized inverse dynamics. Therefore, drive control of the arm unit 120 by force control is realized, and a robot arm device with higher operability for the user is realized.
- ideal joint control is applied to drive control of the arm unit 120 together with whole body cooperative control.
- disturbance components such as friction and inertia inside the joint unit 130 are estimated, and feedforward control using the estimated disturbance components is performed. Therefore, even when there are disturbance components such as friction, an ideal response can be realized for the driving of the joint unit 130. Accordingly, in the drive control of the arm unit 120, high-accuracy responsiveness and high positioning accuracy and stability, which are less affected by vibration and the like, are realized.
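The estimation and feedforward cancellation of a disturbance component described above can be sketched as follows. This is a minimal one-joint sketch assuming a viscous-friction disturbance and a simple observer update; the model, gains, and update law are illustrative assumptions, not the ideal joint control law of the disclosure itself.

```python
# One-joint sketch of disturbance estimation with feedforward cancellation.
# The friction value, nominal inertia, and observer gain are assumptions.

INERTIA = 2.0     # nominal joint inertia used by the ideal model
FRICTION = 0.6    # true viscous friction inside the joint (unknown to it)

def plant(tau, vel, dt=0.01):
    """True joint: nominal inertia plus a viscous friction disturbance."""
    acc = (tau - FRICTION * vel) / INERTIA
    return vel + acc * dt

def step(tau_ref, vel, tau_d, alpha=0.5, dt=0.01):
    """Apply the reference torque plus the disturbance estimate tau_d as
    feedforward, then update tau_d from the observed acceleration."""
    tau = tau_ref + tau_d                 # feedforward compensation
    new_vel = plant(tau, vel, dt)
    acc_obs = (new_vel - vel) / dt
    residual = tau - INERTIA * acc_obs    # torque the ideal model cannot explain
    tau_d += alpha * (residual - tau_d)
    return new_vel, tau_d

vel, tau_d = 0.0, 0.0
for _ in range(1000):                     # 10 s of control at dt = 0.01
    vel, tau_d = step(tau_ref=1.0, vel=vel, tau_d=tau_d)

# without compensation, friction would cap the velocity near
# tau_ref / FRICTION ~ 1.67; with it the joint accelerates almost ideally
print(vel > 3.0, round(tau_d / (FRICTION * vel), 2))
```

The estimate converges to the actual friction torque, so the compensated joint responds nearly as if the disturbance were absent, which is the intent of the feedforward term.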
- each of the plurality of joint portions 130 constituting the arm portion 120 has a configuration suitable for ideal joint control, for example, as shown in FIG.
- the generated torque and the viscous resistance coefficient can be controlled by the current value.
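The statement that both the generated torque and the viscous resistance coefficient can be controlled through the current value can be illustrated with an idealized actuator whose torque is proportional to the current; the torque constant KT and all numerical values below are assumptions for illustration.

```python
# Idealized sketch: both the generated torque and an apparent viscous
# resistance of a joint are set through the motor current alone.

KT = 0.08  # assumed motor torque constant [N*m/A]

def current_command(tau_des, d_des, omega):
    """Choose the current so that the actuator outputs the desired torque
    minus an artificial viscous term d_des * omega."""
    return (tau_des - d_des * omega) / KT

def generated_torque(i):
    """Idealized actuator model: torque proportional to current."""
    return KT * i

# emulate a 0.5 N*m drive torque with viscosity 0.1 N*m*s at 2.0 rad/s
i = current_command(tau_des=0.5, d_des=0.1, omega=2.0)
tau = generated_torque(i)
print(round(tau, 3))   # 0.3 = 0.5 - 0.1 * 2.0
```

Because the viscous term is synthesized in the current command rather than by a physical damper, the apparent viscosity of the joint is a controllable parameter.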
- the driving of each joint unit 130 is controlled by a current value, and is controlled while the state of the entire arm unit 120 is grasped by the whole body cooperative control.
- the robot arm device 10 can be reduced in size.
- the power assist operation is an operation of controlling the state of the joint units 130 so as to cancel the gravity acting on the arm unit 120, and of further controlling the state of the joint units 130 so as to support movement of the arm unit 120 in the direction of a force applied from the outside. Specifically, when the user manually moves the arm unit 120, it is an operation of controlling the driving of the arm unit 120 so as to support the force applied by the user. More specifically, in order to realize the power assist operation, first, the external torque in a state where no force other than gravity acts on the arm unit 120 is detected by the torque detection unit 134, and an instantaneous motion purpose is set so that a generated torque canceling the detected external torque is produced in each joint unit 130. At this stage, the position and posture of the arm unit 120 are held in a predetermined state.
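The two stages of the power assist operation (cancel the gravity component measured with no user force, then support any additional external torque) can be sketched as follows. The sign convention, assist gain, and torque values are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical sketch of the power assist operation: the external torque
# measured with no user force is treated as the gravity component and
# cancelled; any additional external torque (the user's hand) is then
# supported in its own direction.

def gravity_reference(sensed_torques):
    """Calibration: record the per-joint external torque (torque detection
    unit 134) in a state where only gravity acts on the arm."""
    return list(sensed_torques)

def assist_torques(sensed, gravity_ref, assist_gain=1.5):
    """Generated torque = gravity cancellation + amplified user torque, so
    the arm feels lighter than it is when moved by hand."""
    return [-g + assist_gain * (s - g) for s, g in zip(sensed, gravity_ref)]

gravity = gravity_reference([1.2, -0.4, 0.05])    # no user force applied
sensed = [1.2 + 0.3, -0.4 - 0.1, 0.05]            # user now pushes the arm
tau = assist_torques(sensed, gravity)
print([round(t, 2) for t in tau])   # [-0.75, 0.25, -0.05]
```

Joints where the sensed torque matches the gravity reference receive only the cancellation term, so the arm stays still until the user actually applies a force.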
- the pivot operation is a turning operation in which the tip unit provided at the tip of the arm unit 120 moves on the surface of a cone having a predetermined point in space as its apex, in a state where the direction of the tip unit is fixed toward that predetermined point, and which uses the axis of the cone as the turning axis.
- more specifically, the pivot operation is a turning operation in which the imaging unit 140 provided at the tip of the arm unit 120 moves on the surface of a cone having a predetermined point in space as its apex, in a state where the imaging direction of the imaging unit 140 is fixed toward that predetermined point, and which uses the axis of the cone as the turning axis.
- a treatment site is selected as the point corresponding to the apex of the cone in the pivoting operation.
- the turning operation may be performed in a state where the distance between the tip unit or the imaging unit 140 and the point corresponding to the apex of the cone is kept constant. Since the direction of the tip unit or the imaging direction of the imaging unit 140 is fixed toward a predetermined point in space (for example, a treatment site), the pivot operation is also called a point lock operation.
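The geometry of the point lock can be sketched as follows: any pose of the tip unit lies on a cone around the pivot axis at a constant distance from the apex, with the viewing direction pointing back at the apex. The parametrization (apex angle theta, distance d, turn angle phi) is an illustrative assumption.

```python
import math

# Geometric sketch of the pivot (point lock) operation around a
# z-aligned pivot axis through the apex (the treatment site).

def pivot_pose(apex, theta, d, phi):
    """Position on the cone and the unit viewing direction, which always
    points back at the apex (the locked point)."""
    px, py, pz = apex
    x = px + d * math.sin(theta) * math.cos(phi)
    y = py + d * math.sin(theta) * math.sin(phi)
    z = pz + d * math.cos(theta)
    direction = ((px - x) / d, (py - y) / d, (pz - z) / d)
    return (x, y, z), direction

apex = (0.0, 0.0, 0.0)   # treatment site chosen as the pivot point
pos, view = pivot_pose(apex, theta=math.radians(30.0), d=0.2, phi=1.0)

# invariant of the point lock: constant distance for every turn angle phi
dist = math.dist(pos, apex)
print(round(dist, 6))   # 0.2
```

Varying only phi turns the tip around the cone axis; varying d (with the viewing direction unchanged) corresponds to the zoom-like motion discussed below.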
- FIG. 10 is an explanatory diagram for describing a pivot operation that is a specific example of an arm operation according to an embodiment of the present disclosure.
- FIG. 11 is an explanatory diagram for explaining an exercise purpose and a constraint condition for realizing the pivot operation shown in FIG.
- the treatment site of the patient 750 is set as the apex of the cone in the pivot operation. This apex is called the pivot point Pi.
- the imaging unit 713, which is a unit corresponding to the imaging unit 140 in FIG. 9, is illustrated in the robot arm device 10 according to the present embodiment.
- the motion purpose and the constraint conditions may be set so that the imaging unit 713 moves only on the circumference of the bottom surface of the cone A, that is, so that the imaging unit 713 moves on the surface of the cone A while the distance between the imaging unit 713 and the pivot point Pi is kept constant.
- the shape of the cone A, that is, the angle θ of the apex of the cone A and the distance between the pivot point Pi and the imaging unit 713, may be set appropriately by the user.
- the distance between the pivot point P i and the imaging unit 713 is adjusted to the focal length of the optical system in the imaging unit 713.
- the cone on which the imaging unit 713 can move may itself be moved while the pivot point Pi remains fixed.
- the pivot axis of the cone A is substantially perpendicular to the treatment site
- the pivot axis of the cone B is substantially horizontal to the treatment site.
- for example, the motion purpose and the constraint conditions may be set so that the cone in which the pivot operation is performed can be rotated by about 90 degrees while the pivot point Pi is fixed, as with the cones A and B.
- in FIG. 10, an example is shown in which the motion purpose and the constraint conditions are set so that the imaging unit 713 can move only on the circumference of the bottom surface of the cone A.
- however, the pivot operation is not limited to this example. For example, the motion purpose and the constraint conditions may be set so that the distance between the pivot point Pi and the imaging unit 713 can be changed freely while the position of the pivot point Pi and the angle θ of the apexes of the cones A and B are fixed.
- in this case, by appropriately adjusting the focal length (focus) of the imaging unit 713, it becomes possible to observe the treatment site in a manner that meets the user's demands, such as observing the treatment site enlarged or reduced.
- in FIG. 11, how the arm unit 710 having the imaging unit 713 performs the pivot operation with the pivot point Pi as a base point is illustrated.
- the arm part 710 has a plurality of joint parts 711a, 711b, 711c and a plurality of links 712a, 712b, 712c, and the driving thereof is controlled by the whole body cooperative control and the ideal joint control according to this embodiment.
- the arm portion 710 and its constituent members have the same configuration as the arm portion 420 and its constituent members according to the present embodiment shown in FIG.
- consider an arm coordinate system having its origin OA at the fulcrum of the arm unit 710, and a spatial coordinate system having its origin OS at a point in space.
- the movement of the arm unit 710 is managed in the arm coordinate system.
- the arm coordinate system and the spatial coordinate system are defined so that coordinate conversion into a mutual coordinate system is possible.
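The mutual conversion between the two coordinate systems can be sketched as follows. Only a fixed translation between the two origins is assumed here for simplicity; in general the conversion between the arm coordinate system and the spatial coordinate system also includes a rotation.

```python
# Sketch of converting between the arm coordinate system (origin OA at the
# arm fulcrum) and the spatial coordinate system (origin OS). The offset
# between the origins is an assumed value for illustration.

ARM_ORIGIN_IN_SPACE = (0.5, 0.0, 1.0)   # assumed position of OA in OS

def arm_to_space(p):
    return tuple(c + o for c, o in zip(p, ARM_ORIGIN_IN_SPACE))

def space_to_arm(p):
    return tuple(c - o for c, o in zip(p, ARM_ORIGIN_IN_SPACE))

# a pivot point fixed in the arm frame, expressed in the spatial frame:
pivot_arm = (0.1, -0.2, 0.3)
pivot_space = arm_to_space(pivot_arm)
print(pivot_space)   # (0.6, -0.2, 1.3)

# the two conversions are inverses of each other:
back = space_to_arm(pivot_space)
print(all(abs(a - b) < 1e-12 for a, b in zip(back, pivot_arm)))   # True
```

This invertibility is what allows a point fixed in one system (such as the pivot point Pi in the arm coordinate system) to be constrained to a point given in the other (the imaging center Pw in the spatial coordinate system).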
- let Pw be the imaging center as viewed from the spatial coordinate system. Further, in the arm coordinate system, a position separated from the joint unit 711c, which connects the imaging unit 713 and the link 712c, by the length D of the imaging unit 713 itself and the focal length f of the imaging unit 713 is set as the pivot point Pi.
- the motion purpose and the constraint conditions are set so that the arm unit 710 is driven in a state where the pivot point Pi and the imaging center Pw coincide with each other. That is, the arm coordinate system is given a constraint that fixes the pivot point Pi in the arm coordinate system to the imaging center Pw in the spatial coordinate system.
- a constraint condition is set so that the imaging unit 713 is positioned on a conical surface having the pivot point Pi (that is, the imaging center Pw) as its apex, and the posture of the imaging unit 713 is set so that the imaging unit 713 faces the pivot point Pi.
- accordingly, the imaging unit 713 always remains facing the imaging center Pw (that is, the pivot point Pi), and the distance between the imaging unit 713 and the imaging center Pw is kept at the focal length f.
- thus, a pivot operation in a state where the distance between the imaging unit 713 and the imaging center Pw is held constant is achieved.
- note that the above-described method of setting the pivot point Pi may be changed. For example, a position separated from the joint unit 711c by the length D of the imaging unit 713 itself plus an arbitrary distance may be defined as the pivot point Pi, and the arbitrary distance may be set as a variable parameter.
- pivot operation and power assist operation may be used in combination.
- when the pivot operation and the power assist operation are used in combination, for example, when the user manually moves the imaging unit 140, the user can move the imaging unit 140 with less force, as if moving it under zero gravity, while the moving position of the imaging unit 140 is limited to the conical surface. Therefore, the operability of moving the imaging unit 140 during the pivot operation is improved.
- the power assist operation and the pivot operation have been described as specific examples of the exercise purpose according to the present embodiment.
- the exercise purpose according to the present embodiment is not limited to such an example. In the present embodiment, for example, the following exercise purposes can be realized.
- the coordinates of the imaging unit 140 may be set as the purpose of movement so that the position of the imaging unit 140 is fixed at a predetermined point.
- in this case, the motion purpose and the constraint conditions may be set so that the joint units 130 and the links are also fixed at predetermined positions and do not move, or so that the joint units 130 and the links move according to a given external force while only the position of the imaging unit 140 is fixed. In the latter case, for example, when the arm unit 120 hinders the work, control with a higher degree of freedom is realized in which the positions and postures of the other constituent members of the arm unit 120 are moved while the image captured by the imaging unit 140 remains fixed.
- the motion purpose and the constraint conditions may also be set so that an operation of immediately stopping the driving of the arm unit 120 is realized. By performing such an operation, the danger when the arm unit 120 collides with a person or an object can be reduced.
- the contact of the arm unit 120 with a person or an object may be detected, for example, by the joint state detection unit 132 as a change in the external torque applied to the joint unit 130.
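Detecting contact as a sudden change in the external torque and immediately commanding a stop can be sketched as follows; the threshold and the torque trace are assumptions for illustration.

```python
# Illustrative sketch: contact is inferred from a jump in the external
# torque on a joint, after which the drive is immediately stopped.

THRESHOLD = 0.5   # allowed step change in external torque [N*m] (assumed)

def detect_contact(external_torques, threshold=THRESHOLD):
    """Return the index at which the external torque jumps by more than
    the threshold between consecutive samples, or None if it never does."""
    for k in range(1, len(external_torques)):
        if abs(external_torques[k] - external_torques[k - 1]) > threshold:
            return k
    return None

def run_until_contact(external_torques):
    """Emit drive commands until contact is detected, then command zero
    (the 'immediately stop driving' motion purpose)."""
    k = detect_contact(external_torques)
    n = len(external_torques)
    return [0.0 if k is not None and i >= k else 1.0 for i in range(n)]

trace = [0.10, 0.12, 0.11, 0.95, 1.00, 1.02]   # jump at sample 3: contact
print(detect_contact(trace))    # 3
print(run_until_contact(trace)) # [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
```

In practice the threshold would have to be tuned so that ordinary gravity and motion torques do not trigger a false stop.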
- the purpose of exercise may be set so that the imaging unit 140 moves on a predetermined locus in space.
- for example, the coordinates of each point representing the predetermined locus may be set as the motion purpose, in which case the movable range of the imaging unit 140 is limited to the locus.
- the speed of the imaging unit 140, the time for passing through each point, and the like are set as the purpose of movement, so that the imaging unit 140 moves on the predetermined trajectory at a predetermined timing.
- Such drive control according to the motion setting is effective, for example, when the robot arm device 10 automatically repeats a predetermined work.
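A motion purpose given as a trajectory with timing, as described above, can be sketched as follows: the coordinates of each point on the locus and the time to pass each point are set, and the target position at any moment is interpolated between them. The waypoints and times are illustrative assumptions.

```python
# Sketch of a trajectory motion purpose: waypoints with passing times,
# linearly interpolated to a target position for the imaging unit.

WAYPOINTS = [  # (time [s], (x, y, z) target position), assumed values
    (0.0, (0.0, 0.0, 0.4)),
    (2.0, (0.1, 0.0, 0.4)),
    (4.0, (0.1, 0.1, 0.3)),
]

def target_at(t):
    """Linearly interpolate the target position on the locus at time t."""
    if t <= WAYPOINTS[0][0]:
        return WAYPOINTS[0][1]
    for (t0, p0), (t1, p1) in zip(WAYPOINTS, WAYPOINTS[1:]):
        if t <= t1:
            s = (t - t0) / (t1 - t0)
            return tuple(a + s * (b - a) for a, b in zip(p0, p1))
    return WAYPOINTS[-1][1]

print(tuple(round(v, 3) for v in target_at(1.0)))   # (0.05, 0.0, 0.4)
print(tuple(round(v, 3) for v in target_at(3.0)))   # (0.1, 0.05, 0.35)
```

Feeding such time-stamped targets into the control loop is one way a predetermined work motion could be repeated automatically with predetermined timing.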
- the purpose of exercise and the restraint conditions may be set so that an operation in which the arm unit 120 does not enter a predetermined area in the space is realized.
- the user performs an operation while viewing the display screen. Therefore, if the arm part 120 is located in the area between the user and the display screen, the user's field of view is blocked, which may lead to a reduction in the efficiency of the operation. Therefore, for example, by setting the area between the user and the display screen as the intrusion prohibited area of the arm unit 120, the efficiency of the surgery can be improved.
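An intrusion prohibited area such as the region between the user and the display screen can be represented, in a minimal sketch, as an axis-aligned box that no sampled point of the arm may enter; the box bounds and arm points below are assumptions for illustration.

```python
# Sketch of an intrusion prohibited area as an axis-aligned box; the
# constraint condition would forbid any arm configuration that enters it.

FORBIDDEN = ((0.4, 0.8), (-0.2, 0.2), (1.0, 1.6))  # x, y, z ranges [m]

def inside(point, box=FORBIDDEN):
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, box))

def arm_violates(arm_points, box=FORBIDDEN):
    """True if any sampled point of the links/joints enters the area."""
    return any(inside(p, box) for p in arm_points)

candidate_pose = [(0.0, 0.0, 1.2), (0.3, 0.0, 1.3), (0.6, 0.1, 1.4)]
print(arm_violates(candidate_pose))   # True: the third point is in the box
safe_pose = [(0.0, 0.0, 1.2), (0.3, 0.0, 0.9), (0.6, 0.1, 0.8)]
print(arm_violates(safe_pose))        # False
```

In the whole body cooperative control, such a check would act as a constraint condition that rules out candidate motions rather than as a post-hoc test.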
- in order to realize such an operation, it is preferable that the arm unit 120 have more than 6 degrees of freedom. This is because degrees of freedom beyond 6 can be used as redundant degrees of freedom, which makes it possible to deal with the intrusion prohibited area and the like described above while ensuring driving with 6 degrees of freedom.
- the configuration of the robot arm apparatus including the arm unit having a degree of freedom greater than 6 degrees of freedom will be described in detail with reference to FIG.
- FIG. 12 is a schematic diagram illustrating the appearance of a modified example of the robot arm device according to the embodiment of the present disclosure that has a redundant degree of freedom. FIG. 12 also shows the same coordinate axes as the directions defined in FIG.
- a robot arm device 450 includes a base portion 460 and an arm portion 470.
- the arm unit 470 includes a plurality of joint portions 471a to 471g, a plurality of links 472a to 472d connected to each other by the joint portions 471a to 471g, and an imaging unit 473 provided at the tip of the arm portion 470.
- the robot arm device 450 shown in FIG. 12 corresponds to a configuration in which the degree of freedom of the arm unit 470 is increased by one with respect to the robot arm device 400 described with reference to FIG.
- the functions and configurations of the base unit 460, the individual joint units 471a to 471g, the links 472a to 472d, and the imaging unit 473 are the same as those of the base unit 410, the joint units 421a to 421f, the links 422a to 422c, and the imaging unit 423 of the robot arm device 400 described above with reference to FIG., and thus detailed description thereof is omitted here. Below, the configuration of the arm unit 470, which is the difference from the robot arm device 400, is mainly described.
- the robot arm device 450 has seven joint portions 471a to 471g, and seven degrees of freedom are realized with respect to driving of the arm portion 470.
- one end of the link 472a is connected to the base unit 460, and the other end of the link 472a is connected to one end of the link 472b via the joint unit 471a. Further, the other end of the link 472b is connected to one end of the link 472c via the joint units 471b and 471c.
- the other end of the link 472c is connected to one end of the link 472d via joint portions 471d and 471e, and the other end of the link 472d is connected to the imaging unit 473 via joint portions 471f and 471g.
- the ends of the plurality of links 472a to 472d are connected to each other by the joint portions 471a to 471g with the base portion 460 as a fulcrum, thereby forming the arm portion 470 extending from the base portion 460.
- the joint units 471a, 471c, 471e, and 471g are provided so that the rotation axis direction is the major axis direction of each connected link 472b to 472d and the shooting direction of the connected imaging unit 473, while the joint units 471b, 471d, and 471f are provided so that the rotation axis direction is the x-axis direction, which is the direction in which the coupling angles of the mutually connected links 472c to 472d and the imaging unit 473 are changed within the y-z plane.
- that is, the joint units 471a, 471c, 471e, and 471g have a so-called yawing function, and the joint units 471b, 471d, and 471f have a so-called pitching function.
- with this configuration, the robot arm device 450 realizes 7 degrees of freedom for the driving of the arm unit 470. Therefore, the imaging unit 473 can be moved freely in space within the movable range of the arm unit 470, and the arm unit 470 has a redundant degree of freedom.
- a hemisphere is illustrated as an example of the movable range of the imaging unit 473. If the center point of the hemisphere is the imaging center of the treatment site imaged by the imaging unit 473, the imaging unit 473 can be moved on the spherical surface of the hemisphere while the imaging center of the imaging unit 473 remains fixed at the center point of the hemisphere.
- the robot arm device 450, by having one redundant degree of freedom, can further restrict the trajectory of the arm unit 470 that accompanies the movement of the imaging unit 473 on the hemisphere, and can therefore easily handle constraint conditions such as intrusion prohibited areas.
- for example, the driving of the arm unit 470 can be controlled so that the arm unit 470 does not enter the space between the monitor on which the image captured by the imaging unit 473 is displayed and the practitioner and staff, thereby preventing the arm unit 470 from obstructing their view of the monitor.
- in addition, by setting an intrusion prohibited area, it becomes possible to control the driving of the arm unit 470 so that the arm unit 470 moves while avoiding interference (contact) with the practitioner, the staff, and other peripheral devices.
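The use of the redundant degree of freedom described above can be sketched in miniature: when the arm has more joints than the task requires, joint motion in the null space of the task changes the arm shape (for example, to keep clear of an intrusion prohibited area) without moving the tip. A linear two-joint arm with the scalar task x = q1 + q2 is assumed, so the task Jacobian is J = [1, 1] and its null space is spanned by (1, -1); this toy model is an illustration, not the 7-degree-of-freedom kinematics of the arm unit 470.

```python
# Sketch of null-space motion with a redundant degree of freedom.

def tip(q):
    """Task value: tip coordinate of the toy two-joint arm."""
    return q[0] + q[1]

def null_space_step(q, preferred, gain=0.5):
    """Move toward a preferred posture, projected onto the null space of
    J = [1, 1] so that the task value stays unchanged."""
    dq = [gain * (p - qi) for qi, p in zip(q, preferred)]
    coeff = (dq[0] - dq[1]) / 2.0        # component along n = (1, -1)
    return [q[0] + coeff, q[1] - coeff]

q = [0.8, 0.2]
preferred = [0.3, 0.9]   # posture that keeps the elbow clear of an obstacle
for _ in range(20):
    q = null_space_step(q, preferred)

print(round(tip(q), 6))           # 1.0 : the tip never moved
print([round(v, 3) for v in q])   # [0.2, 0.8] : the arm shape changed
```

The same principle, with a full Jacobian, lets the imaging unit stay on its commanded path while the elbow of the arm is steered away from prohibited regions.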
- FIG. 13 is a flowchart illustrating a processing procedure of a robot arm control method according to an embodiment of the present disclosure.
- the robot arm control method according to the present embodiment is realized by the configuration of the robot arm control system 1 shown in FIG. 9 as an example. Therefore, it can be said that the robot arm control method according to the present embodiment is a medical robot arm control method.
- the functions of the components of the robot arm control system 1 shown in FIG. 9 have been described in (5-2-4. Robot arm control system), and thus detailed description thereof is omitted here.
- first, in step S801, the state of the joint unit 130 is detected by the joint state detection unit 132.
- the state of the joint portion 130 is, for example, a rotation angle, generated torque, and / or external torque in the joint portion 130.
- next, in step S803, the arm state is acquired by the arm state acquisition unit 241 based on the state of the joint unit 130 detected in step S801.
- the arm state is a state of movement of the arm unit 120, and may be, for example, the position, speed, acceleration, force acting on each component of the arm unit 120, or the like.
- next, in step S805, the exercise purpose and the constraint conditions used in the calculation of the whole body cooperative control are set by the calculation condition setting unit 242 based on the arm state acquired in step S803.
- however, the calculation condition setting unit 242 does not have to set the exercise purpose based on the arm state. For example, the calculation condition setting unit 242 may set the exercise purpose based on instruction information about the driving of the arm unit 120 input by the user from the input unit 210, or an exercise purpose stored in advance in the storage unit 220 may be used. Further, the exercise purpose may be set by appropriately combining these methods. Similarly, the calculation condition setting unit 242 may use constraint conditions stored in advance in the storage unit 220.
- next, in step S807, the calculation for the whole body cooperative control using generalized inverse dynamics is performed based on the arm state, the exercise purpose, and the constraint conditions, and the control value τa is calculated.
- the process performed in step S807 may be the series of processes in the virtual force calculation unit 243 and the real force calculation unit 244 shown in FIG. 9, that is, the series of processes described above in (5-2-2. Generalized inverse dynamics).
- next, in step S809, the disturbance estimated value τd is calculated, the calculation for the ideal joint control is performed using the disturbance estimated value τd, and the command value τ is calculated from the control value τa.
- the process performed in step S809 may be a series of processes in the ideal joint control unit 250 shown in FIG. 9, that is, a series of processes described in (5-2-3. About ideal joint control).
- finally, in step S811, the driving of the joint unit 130 is controlled by the drive control unit 111 based on the command value τ.
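The sequence of steps S801 to S811 can be sketched for a single cycle as follows. All models are one-joint stand-ins with assumed values; in particular, the PD law in step S807 merely plays the role of the generalized-inverse-dynamics calculation, and the fixed disturbance estimate in step S809 stands in for the ideal joint control computation.

```python
# Step-by-step sketch of one cycle of the control flow of FIG. 13.

def step_s801_detect():
    """S801: the joint state detection unit 132 detects the joint state."""
    return {"angle": 0.30, "velocity": 0.05, "external_torque": 0.0}

def step_s803_arm_state(joint_state):
    """S803: the arm state acquisition unit 241 derives the arm state
    (here via an assumed link length of 0.5)."""
    return {"tip_position": 0.5 * joint_state["angle"],
            "tip_velocity": 0.5 * joint_state["velocity"]}

def step_s805_conditions():
    """S805: the calculation condition setting unit 242 sets the exercise
    purpose (here: hold the tip at a target) and constraint conditions."""
    return {"target_tip": 0.10}

def step_s807_whole_body(arm_state, conditions, kp=4.0, kd=1.0):
    """S807: whole body cooperative control yields the control value tau_a
    (a PD law stands in for the generalized inverse dynamics here)."""
    err = conditions["target_tip"] - arm_state["tip_position"]
    return kp * err - kd * arm_state["tip_velocity"]

def step_s809_ideal_joint(tau_a, disturbance_estimate=0.02):
    """S809: ideal joint control adds the disturbance estimate tau_d as a
    feedforward term, giving the command value tau."""
    return tau_a + disturbance_estimate

js = step_s801_detect()
arm = step_s803_arm_state(js)
cond = step_s805_conditions()
tau_a = step_s807_whole_body(arm, cond)
tau = step_s809_ideal_joint(tau_a)
# S811: the drive control unit 111 drives the joint with command value tau
print(round(tau, 4))
```

Running this cycle repeatedly, with the detected state fed back into S801, reproduces the continuous drive control described above.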
- as described above, the arm unit 120, which is a multi-link structure in the robot arm device 10, has at least 6 degrees of freedom, and the driving of each of the plurality of joint units 130 constituting the arm unit 120 is controlled by the drive control unit 111.
- a medical instrument is provided at the tip of the arm unit 120.
- the state of the joint unit 130 is detected by the joint state detection unit 132 in the robot arm device 10. Based on the detected state of the joint unit 130, the motion purpose, and the constraint conditions, the various calculations related to the whole body cooperative control are performed, and the torque command value τ is calculated as the calculation result. Further, the driving of the arm unit 120 is controlled based on the torque command value τ.
- the drive of the arm part 120 is controlled by the whole body cooperative control using generalized inverse dynamics. Therefore, drive control of the arm unit 120 by force control is realized, and a robot arm device with higher operability for the user is realized.
- ideal joint control is applied to drive control of the arm unit 120 together with whole body cooperative control.
- disturbance components such as friction and inertia inside the joint unit 130 are estimated, and feedforward control using the estimated disturbance components is performed. Therefore, even when there are disturbance components such as friction, an ideal response can be realized for the driving of the joint unit 130. Accordingly, in the drive control of the arm unit 120, high-accuracy responsiveness and high positioning accuracy and stability, which are less affected by vibration and the like, are realized.
- each of the plurality of joint portions 130 constituting the arm portion 120 has a configuration suitable for ideal joint control, for example, as shown in FIG.
- the generated torque and the viscous resistance coefficient can be controlled by the current value.
- the driving of each joint unit 130 is controlled by a current value, and is controlled while the state of the entire arm unit 120 is grasped by the whole body cooperative control.
- the robot arm device 10 can be reduced in size.
- the arm unit 120 of the robot arm device 10 is driven by force control, so that even if the arm unit 120 interferes (contacts) with the practitioner or the staff during driving, an excessive force is not generated, and the arm unit 120 stops safely. Then, once released from the interference, the arm unit 120 moves to the desired position in accordance with the set motion purpose, and the treatment is continued.
- in this way, by using force control for the drive control of the robot arm device 10, higher safety is ensured against interference between the arm unit 120 and surrounding objects during driving.
- in the above description, the case where the tip unit of the arm unit of the robot arm device is the imaging unit and the surgical site is imaged by the imaging unit at the time of surgery, as shown in FIG. 4, has been described, but the present embodiment is not limited to such an example.
- the robot arm control system 1 according to the present embodiment is applicable even when a robot arm device having another tip unit is used for other purposes.
- for example, the tip unit may be an endoscope or a laparoscope, or may be another inspection device such as an ultrasonic examination apparatus or a gastroscope.
- in laparoscopic surgery, a laparoscope is inserted into the patient's body, and treatment is performed by inserting various surgical tools such as forceps and an electric scalpel while observing images captured by the laparoscope.
- in such a treatment method, if, for example, the practitioner could directly operate the treatment tools with his or her own hands while operating the laparoscope with a robot arm, the treatment could be performed by a single user, and more efficient treatment would become possible.
- however, with a general existing balance arm, it was difficult in terms of operability for a single user to simultaneously operate a surgical tool with his or her own hand and the laparoscope with the robot arm.
- therefore, the existing method required a plurality of staff members, and it was common for one practitioner to operate the laparoscope with the robot arm while another performed the procedure using surgical tools.
- in the robot arm device according to the present embodiment, as described above, high operability is realized by the whole body cooperative control.
- the ideal joint control realizes high-accuracy responsiveness and high stability with less influence of vibration and the like. Therefore, according to the present embodiment, the operation of the laparoscope for observation by the robot arm device and the operation of the surgical tool by one's own hand can be easily performed by one practitioner.
- the robot arm device according to the present embodiment may be used for purposes other than medical treatment.
- since high-accuracy responsiveness and high stability are realized by the ideal joint control, it is also possible to cope with work that requires high accuracy, such as processing and assembly of industrial components.
- in the above embodiment, the case where the joint units of the robot arm device have rotation mechanisms and the driving of the arm unit is controlled by controlling the rotational driving of the rotation mechanisms has been described, but the present embodiment is not limited to such an example.
- for example, the links constituting the arm unit may have a mechanism that expands and contracts in the extending direction of the link (for example, one driven by hydraulic pressure or one driven by a ball screw), so that the length of the link is variable.
- in this case, the driving of the arm unit is controlled so as to achieve the desired motion purpose by, for example, whole body cooperative control that takes into account the expansion and contraction of the links in addition to the rotation at the joint units.
- further, in the above embodiment, the case where the degree of freedom of the arm unit in the robot arm device is 6 or more has been described, but the present embodiment is not limited to such an example.
- in the present embodiment, various exercise purposes can be set according to the use of the robot arm device. Therefore, as long as the set exercise purpose can be achieved, the arm unit may have fewer than 6 degrees of freedom, and some of the plurality of joint units constituting the arm unit may be joint units having a general joint mechanism.
- the configuration of the arm unit only needs to be able to achieve the exercise purpose, and may be configured appropriately according to the use of the robot arm device.
- FIG. 14 is a functional block diagram illustrating a configuration example of the hardware configuration of the robot arm device 10 and the control device 20 according to an embodiment of the present disclosure.
- the robot arm device 10 and the control device 20 mainly include a CPU 901, a ROM 903, and a RAM 905. The robot arm device 10 and the control device 20 further include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operations in the robot arm device 10 and the control device 20 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927.
- the ROM 903 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 905 primarily stores programs used by the CPU 901, parameters that change as appropriate during execution of the programs, and the like. These are connected to each other by a host bus 907 constituted by an internal bus such as a CPU bus.
- the CPU 901 corresponds to, for example, the joint control unit 135, the arm control unit 110, and the control unit 230 illustrated in FIGS.
- the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925 are connected to the external bus 911 via an interface 913.
- the input device 915 is an operation means operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, a lever, and a pedal.
- the input device 915 may be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or an external connection device 929 such as a mobile phone or a PDA supporting the operation of the robot arm device 10 and the control device 20.
- the input device 915 includes an input control circuit that generates an input signal based on information input by a user using the above-described operation means and outputs the input signal to the CPU 901, for example.
- the user of the robot arm device 10 and the control device 20 can input various data and instruct processing operations to the robot arm device 10 and the control device 20 by operating the input device 915.
- the input device 915 corresponds to, for example, the input unit 210 illustrated in FIG.
- a motion purpose for driving the arm unit 120 may be set by an operation input by the user via the input device 915, and whole body cooperative control may be performed in accordance with the motion purpose.
- the output device 917 is a device that can notify the user of acquired information visually or audibly. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, and lamps, audio output devices such as speakers and headphones, printer devices, and the like.
- the output device 917 outputs results obtained by various processes performed by the robot arm device 10 and the control device 20, for example. Specifically, the display device displays results obtained by various processes performed by the robot arm device 10 and the control device 20 as text or images.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs the analog signal.
- various types of information related to the drive control of the arm unit 120 may be output from the output device 917 in any format.
- the movement trajectory of each component of the arm unit 120 in the drive control of the arm unit 120 may be displayed on the display screen of the output device 917 in the form of a graph.
- the display device 30 shown in FIGS. 1 and 9 may be a device having the function and configuration of the display device of the output device 917, together with a configuration such as a control unit for controlling the driving of the display device.
- the storage device 919 is a data storage device configured as an example of a storage unit of the robot arm device 10 and the control device 20.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901 and various data.
- the storage device 919 corresponds to, for example, the storage unit 220 illustrated in FIGS. 1 and 9.
- the storage device 919 can store the calculation conditions (the motion purpose and constraint conditions) for the calculations related to whole body cooperative control using generalized inverse dynamics, and the control device 20 may perform the calculations related to whole body cooperative control using these calculation conditions stored in the storage device 919.
- the drive 921 is a reader / writer for a recording medium, and is built in or externally attached to the robot arm device 10 and the control device 20.
- the drive 921 reads information recorded on a removable recording medium 927 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 905.
- the drive 921 can also write a record to a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory that is mounted.
- the removable recording medium 927 is, for example, a DVD medium, an HD-DVD medium, a Blu-ray (registered trademark) medium, or the like.
- the removable recording medium 927 may be a compact flash (registered trademark) (CF: CompactFlash), a flash memory, an SD memory card (Secure Digital memory card), or the like. Further, the removable recording medium 927 may be, for example, an IC card (Integrated Circuit card) on which a non-contact IC chip is mounted, an electronic device, or the like. In the present embodiment, various types of information related to the drive control of the arm unit 120 may be read from various types of removable recording media 927 by the drive 921 or written to various types of removable recording media 927.
- the connection port 923 is a port for directly connecting a device to the robot arm device 10 and the control device 20.
- Examples of the connection port 923 include a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, and the like.
- As another example of the connection port 923 there are an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, and the like.
- the robot arm device 10 and the control device 20 can directly acquire various data from the external connection device 929 or provide various data to the external connection device 929.
- various types of information related to the drive control of the arm unit 120 may be read from various external connection devices 929 via the connection port 923 or written to various external connection devices 929.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to a communication network (network) 931.
- the communication device 925 is, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
- the communication device 925 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or other communication devices.
- the communication network 931 connected to the communication device 925 is configured by a wired or wireless network, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- various types of information related to the drive control of the arm unit 120 may be transmitted / received to / from other external devices via the communication network 931 by the communication device 925.
- the robot arm device 10 naturally includes various configurations corresponding to the arm unit 120 illustrated in FIGS. 1 and 9.
- a computer program for realizing the functions of the robot arm device 10, the control device 20, and the display device 30 according to the present embodiment as described above can be produced and installed in a personal computer or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- (1) A robot arm device including: an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; and a drive control unit configured to control driving of the arm unit by cooperatively driving each of the joint units, wherein the drive control unit controls the driving of the arm unit, using relative position information of a reference position with respect to the arm unit based on a state of the arm unit and distance information between the imaging unit and the reference position, such that the reference position is located on an optical axis of the imaging unit.
- (2) The robot arm device according to (1), wherein the drive control unit controls the driving of the joint units based on a control value for cooperative control of the arm unit, the control value being based on the state of the arm unit acquired based on states of the plurality of joint units.
- (3) The robot arm device according to (2), wherein the drive control unit controls the driving of the joint units based on a control value for whole body cooperative control of the arm unit by generalized inverse dynamics using the state of the arm unit acquired based on the detected states of the plurality of joint units, and a motion purpose and a constraint condition of the arm unit.
- (4) The robot arm device according to (3), wherein the control value is calculated based on a virtual force applied in an operation space, which describes the relationship between forces acting on the arm unit and accelerations generated in the arm unit, in order to achieve the motion purpose, and an actual force into which the virtual force is converted based on the constraint condition in order to drive the joint units.
- (5) The robot arm device according to any one of (2) to (4), wherein the drive control unit controls the driving of the joint units based on a command value calculated by correcting the influence of a disturbance on the control value.
- (6) The robot arm device according to (5), wherein the command value is calculated by correcting the control value using a disturbance estimated value representing the influence of the disturbance on the driving of the joint unit, estimated based on the detected state of the joint unit.
- (7) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that the reference position is located on the optical axis of the imaging unit, thereby controlling the driving of the arm unit so as to perform a pivot operation with the reference position as a vertex while the imaging unit faces the reference position.
- (8) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that a predetermined point on the optical axis of the imaging unit is fixed at the reference position, thereby controlling the driving of the arm unit so as to perform a pivot operation centered on the reference position while the imaging unit faces the reference position.
- (9) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that the position and posture of the arm unit are fixed, thereby controlling the driving of the arm unit so as to perform an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point.
- (10) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units such that the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
- (11) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the arm unit so as to perform any one of: a pivot operation with the reference position as a vertex while the imaging unit faces the reference position; an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point; and an operation in which the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
- (12) The robot arm device according to any one of (1) to (11), wherein a distance between the imaging unit and the reference position is acquired based on a focal length of the imaging unit.
- (13) The robot arm device according to any one of (1) to (12), wherein the plurality of joint units each include a joint state detection unit that detects the state of the joint unit, the joint state detection unit including at least a torque detection unit that detects a torque generated in the joint unit and an external torque applied to the joint unit from the outside, and a rotation angle detection unit that detects a rotation angle of the joint unit.
- (14) The robot arm device according to any one of (2) to (11), wherein the control value is a torque generated at the joint unit.
- (15) The robot arm device according to any one of (1) to (14), further including the imaging unit.
- (16) The robot arm device according to any one of (1) to (15), wherein the imaging unit is a camera used for medical treatment.
- (18) A program for causing a processor of a computer to realize: a function of acquiring a state of an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; a function of acquiring distance information between the imaging unit and a reference position; and a function of controlling driving of the arm unit by cooperatively driving each of the joint units based on the state of the arm unit, using relative position information of the reference position with respect to the arm unit based on the state of the arm unit and the distance information, such that the reference position is located on the optical axis of the imaging unit.
Abstract
Description
1. Examination of the robot arm device
2. Configuration of the robot arm control system
3. Robot arm control method
3-1. Overview of the robot arm control method
3-2. Processing procedure of the robot arm control method
4. Modified examples
4-1. Modified example of the operation of directing the imaging unit toward the reference position
4-2. Modified example of acquiring the distance information between the imaging unit and the reference position
5. Whole body cooperative control
5-1. Examination of medical robot arm devices
5-2. An embodiment of the present disclosure
5-2-1. Appearance of the robot arm device
5-2-2. Generalized inverse dynamics
5-2-2-1. Virtual force calculation process
5-2-2-2. Actual force calculation process
5-2-3. Ideal joint control
5-2-4. Configuration of the robot arm control system
5-2-5. Specific examples of motion purposes
5-3. Processing procedure of the robot arm control method
5-4. Summary of the robot arm device related to whole body cooperative control
6. Hardware configuration
7. Supplement
First, prior to describing a preferred embodiment of the present disclosure, the background leading the present inventors to conceive of the present disclosure will be described in order to make the present disclosure clearer.
First, the configuration of a robot arm control system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a functional block diagram illustrating the functional configuration of the robot arm control system according to an embodiment of the present disclosure.
Next, a robot arm control method according to an embodiment of the present disclosure will be described. In the robot arm control method according to the present embodiment, after a reference position is derived, drive control of the arm unit using the reference position (for example, drive control of the arm unit that causes the imaging device to perform a pivot operation with the reference position as the pivot center point) is performed. In the following, an overview of the reference position derivation method according to the present embodiment will first be described with reference to FIG. 2. Next, the processing procedure of the robot arm control method according to the present embodiment, including the reference position derivation method, will be described in detail with reference to FIG. 3.
First, an overview of the reference position derivation method according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is an explanatory diagram for describing the overview of the reference position derivation method according to the present embodiment.
Next, the processing procedure of the robot arm control method according to the present embodiment will be described with reference to FIG. 3. FIG. 3 is a flowchart illustrating an example of the processing procedure of the robot arm control method according to the present embodiment. The flowchart illustrated in FIG. 3 shows a series of processes when a pivot operation is performed in the robot arm device according to the present embodiment. The series of processes illustrated in FIG. 3 can be executed, for example, by the robot arm control system 2 illustrated in FIG. 1. Here, each process illustrated in FIG. 3 will be described in association with the configuration of the robot arm control system 2 illustrated in FIG. 1.
Next, some modified examples of the robot arm control system and the robot arm control method according to the present embodiment described above will be described.
In the present embodiment, when deriving the reference position, an operation of directing the imaging unit 140 toward the reference position is performed. Specifically, in this operation, the positions and postures of the arm unit 120 and the imaging unit 140 are adjusted such that the reference position comes to approximately the center of the field of view of the imaging unit 140. In the above embodiment, the user moves the arm unit 120 by directly applying an external force to the arm unit 120 or via various input devices such as a remote controller or a controller, thereby adjusting the position of the imaging unit 140. However, the present embodiment is not limited to such an example, and the positions and postures of the arm unit 120 and the imaging unit 140 may be adjusted by other methods such that the reference position comes to approximately the center of the field of view of the imaging unit 140.
In the present embodiment, when deriving the reference position, distance information between the imaging unit 140 and the reference position is acquired. In the above embodiment, the distance information is acquired based on the focal length of the imaging unit 140. However, the present embodiment is not limited to such an example, and the distance information between the imaging unit 140 and the reference position may be acquired by other methods.
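As a minimal sketch of how such distance information can yield the reference position, assume the camera position and optical-axis direction are already known from the arm state; the reference position is then the point at the measured distance along the optical axis. The function name and vector representation below are hypothetical illustrations, not taken from this disclosure:

```python
import numpy as np

def reference_position(camera_pos, optical_axis, distance):
    """Point at `distance` along the camera's optical axis
    (hypothetical helper, not the patent's implementation)."""
    axis = np.asarray(optical_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)  # ensure a unit direction
    return np.asarray(camera_pos, dtype=float) + distance * axis

# Camera 0.1 m above the origin, looking straight down, focused 0.1 m away:
ref = reference_position([0.0, 0.0, 0.1], [0.0, 0.0, -1.0], 0.1)
print(ref)  # the reference point lands at the origin
```

Any other distance source (a range sensor, stereo triangulation) would plug into the same geometry by replacing the `distance` argument.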
In the following, the configuration of a control system for realizing the whole body cooperative control according to the present embodiment and its control method will be described. The robot arm control system 2 and the robot arm control method described above can be suitably applied to medical robot arm devices. Therefore, in the following, an embodiment of the whole body cooperative control of a robot arm device will be described by taking a medical robot arm device as an example.
First, in order to make the present disclosure clearer, the background leading the present inventors to conceive of the embodiments described below will be described.
In the following, a robot arm control system according to an embodiment of the present disclosure will be described. In the robot arm control system according to the present embodiment, the driving of a plurality of joint units provided in the robot arm device is controlled by whole body cooperative control using generalized inverse dynamics. Furthermore, ideal joint control, which realizes an ideal response to command values by correcting the influence of disturbances, is applied to the drive control of the joint units.
First, a schematic configuration of a robot arm device according to an embodiment of the present disclosure will be described with reference to FIG. 5. FIG. 5 is a schematic diagram illustrating the appearance of the robot arm device according to an embodiment of the present disclosure.
Next, an overview of the generalized inverse dynamics used for the whole body cooperative control of the robot arm device 400 in the present embodiment will be described.
A vector composed of certain physical quantities at each joint unit of a multi-link structure is called a generalized variable q (also referred to as a joint value q or a joint space q). The operation space x is defined by the following formula (1) using the time derivative of the generalized variable q and the Jacobian J.
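The relation between joint velocities and operation-space velocities, ẋ = J(q)q̇, can be illustrated with a toy planar two-link arm. The link lengths and joint values below are arbitrary assumptions for the sketch, not parameters of the disclosed device:

```python
import numpy as np

# Planar 2-link arm (link lengths l1, l2) as a toy multi-link structure.
# The operation space here is the end-effector position x, and formula (1)
# relates its velocity to the joint velocities via the Jacobian:
#   x_dot = J(q) @ q_dot
l1, l2 = 1.0, 1.0

def jacobian(q):
    q1, q2 = q
    return np.array([
        [-l1*np.sin(q1) - l2*np.sin(q1+q2), -l2*np.sin(q1+q2)],
        [ l1*np.cos(q1) + l2*np.cos(q1+q2),  l2*np.cos(q1+q2)],
    ])

q = np.array([0.0, np.pi/2])   # joint values (the generalized variable q)
q_dot = np.array([0.1, -0.2])  # joint velocities
x_dot = jacobian(q) @ q_dot    # operation-space velocity
```

The same mapping generalizes to a 6-or-more-degree-of-freedom arm; only the size of q and the rows of J change.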
In the actual force calculation process, which is the second stage of generalized inverse dynamics, the virtual force f_v obtained in the above (2-2-1. Virtual force determination process) is replaced with actual joint forces and external forces. The condition for realizing the generalized force τ_v = J_v^T f_v due to the virtual force with the generated torque τ_a produced at the joint units and the external force f_e is expressed by the following formula (8).
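In the simplest case where no usable external force exists (f_e = 0), the generalized force of the virtual force must be borne entirely by the joint torques, i.e. τ_a = J_v^T f_v. A minimal numeric sketch of that special case, with toy Jacobian and force values assumed purely for illustration:

```python
import numpy as np

# With no external force (f_e = 0), the virtual force's generalized
# force is realized by joint torques alone (a simplification of
# formula (8)):  tau_a = J_v.T @ f_v
J_v = np.array([[-1.0, -1.0],
                [ 1.0,  0.0]])   # operation-space Jacobian (toy values)
f_v = np.array([0.0, 5.0])       # virtual force in the operation space
tau_a = J_v.T @ f_v              # joint torques realizing the virtual force
```

In the general case of formula (8), part of the virtual force is instead carried by an external force f_e, and the remainder is distributed to the joint torques under the constraint conditions.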
Next, the ideal joint control according to the present embodiment will be described. The motion of each of the joint units 421a to 421f is modeled by the equation of motion of a second-order lag system given by the following formula (12).
τ_a, the actual force to be applied to each of the joint units 421a to 421f in order to realize the motion purpose, can thus be calculated. Therefore, ideally, by applying each calculated τ_a to the above formula (12), a response according to the theoretical model shown in formula (12) is realized; that is, the desired motion purpose should be achieved.
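A second-order joint model of the kind referred to here is commonly written as I·q̈ = τ_a + τ_e − ν·q̇ (inertia, commanded torque, external torque, viscous friction). The exact form of formula (12) is not reproduced on this page, so the model shape and all coefficients below are assumptions; this is a sketch of how such a model responds to a constant commanded torque, not the disclosed controller:

```python
# Assumed second-order joint model (formula (12) itself is not shown here):
#   I * q_ddot = tau_a + tau_e - nu * q_dot
# rolled out with a simple explicit-Euler integration over 1 second.
I, nu = 0.5, 0.1      # assumed inertia and viscous-friction coefficients
dt = 0.001            # integration step [s]
q, q_dot = 0.0, 0.0   # joint angle and angular velocity, at rest
for _ in range(1000):
    tau_a, tau_e = 1.0, 0.0   # constant command, no disturbance torque
    q_ddot = (tau_a + tau_e - nu * q_dot) / I
    q_dot += q_ddot * dt
    q += q_dot * dt
```

In the ideal joint control described here, a disturbance (an unmodeled τ_e, friction mismatch, etc.) would make the actual response deviate from this model, which is why a disturbance estimated value is used to correct the command value.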
Next, the configuration of the robot arm control system according to the present embodiment, in which the whole body cooperative control described in the above (5-2-2. Generalized inverse dynamics) and the ideal joint control described in the above (5-2-3. Ideal joint control) are applied to the drive control of the robot arm device, will be described.
Next, specific examples of motion purposes according to the present embodiment will be described. As described in the above (5-2-4. Configuration of the robot arm control system), in the present embodiment, various motion purposes are realized by whole body cooperative control. Here, a power assist operation and a pivot operation will be described as specific examples of motion purposes according to the present embodiment. In the following description of the specific examples of motion purposes, the reference numerals in the functional block diagram shown in FIG. 9 are used to denote the components of the robot arm control system according to the present embodiment.
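Geometrically, the pivot operation keeps the imaging unit on a sphere of fixed radius around the reference position while its optical axis always passes through that position. The following is a hypothetical sketch of that geometry (not the disclosed controller); the parameterization by a single sweep angle is an assumption made for illustration:

```python
import numpy as np

# Pivot-operation geometry: camera on a sphere of radius d around the
# reference point, optical axis always aimed at that point.
ref = np.array([0.0, 0.0, 0.0])   # reference position (pivot center)
d = 0.2                           # working distance to keep fixed [m]

def camera_pose(theta):
    """Camera position on the sphere and the axis aiming at ref."""
    pos = ref + d * np.array([np.sin(theta), 0.0, np.cos(theta)])
    axis = (ref - pos) / np.linalg.norm(ref - pos)
    return pos, axis

# Sweeping theta keeps both pivot constraints satisfied:
for theta in np.linspace(0.0, np.pi / 3, 7):
    pos, axis = camera_pose(theta)
    assert abs(np.linalg.norm(pos - ref) - d) < 1e-9  # fixed distance
    assert np.allclose(pos + d * axis, ref)           # ref on optical axis

pos0, axis0 = camera_pose(0.0)
```

In the whole body cooperative control described here, these two conditions would instead be expressed as constraint conditions on the arm state, and the joint torques realizing them would be obtained by generalized inverse dynamics.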
Next, the processing procedure of the robot arm control method according to an embodiment of the present disclosure will be described with reference to FIG. 13. FIG. 13 is a flowchart illustrating the processing procedure of the robot arm control method according to an embodiment of the present disclosure. In the following, the case where the robot arm control method according to the present embodiment is realized by the configuration of the robot arm control system 1 shown in FIG. 9 will be described as an example. Therefore, the robot arm control method according to the present embodiment can be said to be a medical robot arm control method. In the following description of the processing procedure, detailed description of the functions of each component of the robot arm control system 1 shown in FIG. 9 is omitted, since they have already been described in the above (5-2-4. Configuration of the robot arm control system).
As described above, the following effects can be obtained in the present embodiment.
Next, the hardware configuration of the robot arm device 10 and the control device 20 according to the present embodiment shown in FIGS. 1 and 9 will be described in detail with reference to FIG. 14. FIG. 14 is a functional block diagram illustrating a configuration example of the hardware configuration of the robot arm device 10 and the control device 20 according to an embodiment of the present disclosure.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure.
(1) A robot arm device including: an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; and a drive control unit configured to control driving of the arm unit by cooperatively driving each of the joint units, wherein the drive control unit controls the driving of the arm unit, using relative position information of a reference position with respect to the arm unit based on a state of the arm unit and distance information between the imaging unit and the reference position, such that the reference position is located on an optical axis of the imaging unit.
(2) The robot arm device according to (1), wherein the drive control unit controls the driving of the joint units based on a control value for cooperative control of the arm unit, the control value being based on the state of the arm unit acquired based on states of the plurality of joint units.
(3) The robot arm device according to (2), wherein the drive control unit controls the driving of the joint units based on a control value for whole body cooperative control of the arm unit by generalized inverse dynamics using the state of the arm unit acquired based on the detected states of the plurality of joint units, and a motion purpose and a constraint condition of the arm unit.
(4) The robot arm device according to (3), wherein the control value is calculated based on a virtual force applied in an operation space, which describes the relationship between forces acting on the arm unit and accelerations generated in the arm unit, in order to achieve the motion purpose, and an actual force into which the virtual force is converted based on the constraint condition in order to drive the joint units.
(5) The robot arm device according to any one of (2) to (4), wherein the drive control unit controls the driving of the joint units based on a command value calculated by correcting the influence of a disturbance on the control value.
(6) The robot arm device according to (5), wherein the command value is calculated by correcting the control value using a disturbance estimated value representing the influence of the disturbance on the driving of the joint unit, estimated based on the detected state of the joint unit.
(7) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that the reference position is located on the optical axis of the imaging unit, thereby controlling the driving of the arm unit so as to perform a pivot operation with the reference position as a vertex while the imaging unit faces the reference position.
(8) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that a predetermined point on the optical axis of the imaging unit is fixed at the reference position, thereby controlling the driving of the arm unit so as to perform a pivot operation centered on the reference position while the imaging unit faces the reference position.
(9) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units based on a constraint condition that the position and posture of the arm unit are fixed, thereby controlling the driving of the arm unit so as to perform an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point.
(10) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the joint units such that the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
(11) The robot arm device according to any one of (2) to (6), wherein the drive control unit controls the driving of the arm unit so as to perform any one of: a pivot operation with the reference position as a vertex while the imaging unit faces the reference position; an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point; and an operation in which the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
(12) The robot arm device according to any one of (1) to (11), wherein the distance between the imaging unit and the reference position is acquired based on the focal length of the imaging unit.
(13) The robot arm device according to any one of (1) to (12), wherein the plurality of joint units each include a joint state detection unit that detects the state of the joint unit, the joint state detection unit including at least a torque detection unit that detects a torque generated in the joint unit and an external torque applied to the joint unit from the outside, and a rotation angle detection unit that detects a rotation angle of the joint unit.
(14) The robot arm device according to any one of (2) to (11), wherein the control value is a torque generated at the joint unit.
(15) The robot arm device according to any one of (1) to (14), further including the imaging unit.
(16) The robot arm device according to any one of (1) to (15), wherein the imaging unit is a camera used for medical treatment.
(17) A robot arm control method including: acquiring a state of an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; acquiring distance information between the imaging unit and a reference position; and controlling driving of the arm unit by cooperatively driving each of the joint units based on the state of the arm unit, using relative position information of the reference position with respect to the arm unit based on the state of the arm unit and the distance information, such that the reference position is located on the optical axis of the imaging unit.
(18) A program for causing a processor of a computer to realize: a function of acquiring a state of an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; a function of acquiring distance information between the imaging unit and a reference position; and a function of controlling driving of the arm unit by cooperatively driving each of the joint units based on the state of the arm unit, using relative position information of the reference position with respect to the arm unit based on the state of the arm unit and the distance information, such that the reference position is located on the optical axis of the imaging unit.
10 Robot arm device
20 Control device
30 Display device
110 Arm control unit
111 Drive control unit
120 Arm unit
130 Joint unit
131 Joint drive unit
132 Rotation angle detection unit
133 Torque detection unit
135 Joint control unit
140 Imaging unit
210 Input unit
220 Storage unit
230 Control unit
240 Whole body cooperative control unit
241 Arm state acquisition unit
242 Calculation condition setting unit
243 Virtual force calculation unit
244 Actual force calculation unit
250 Ideal joint control unit
251 Disturbance estimation unit
252 Command value calculation unit
260 Reference position derivation unit
261 Distance information acquisition unit
262 Relative position calculation unit
Claims (18)
- A robot arm device including:
an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected; and
a drive control unit configured to control driving of the arm unit by cooperatively driving each of the joint units,
wherein the drive control unit controls the driving of the arm unit, using relative position information of a reference position with respect to the arm unit based on a state of the arm unit and distance information between the imaging unit and the reference position, such that the reference position is located on an optical axis of the imaging unit.
- The robot arm device according to claim 1, wherein the drive control unit controls the driving of the joint units based on a control value for cooperative control of the arm unit, the control value being based on the state of the arm unit acquired based on states of the plurality of joint units.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units based on a control value for whole body cooperative control of the arm unit by generalized inverse dynamics using the state of the arm unit acquired based on the detected states of the plurality of joint units, and a motion purpose and a constraint condition of the arm unit.
- The robot arm device according to claim 3, wherein the control value is calculated based on a virtual force applied in an operation space, which describes the relationship between forces acting on the arm unit and accelerations generated in the arm unit, in order to achieve the motion purpose, and an actual force into which the virtual force is converted based on the constraint condition in order to drive the joint units.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units based on a command value calculated by correcting the influence of a disturbance on the control value.
- The robot arm device according to claim 5, wherein the command value is calculated by correcting the control value using a disturbance estimated value representing the influence of the disturbance on the driving of the joint unit, estimated based on the detected state of the joint unit.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units based on a constraint condition that the reference position is located on the optical axis of the imaging unit, thereby controlling the driving of the arm unit so as to perform a pivot operation with the reference position as a vertex while the imaging unit faces the reference position.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units based on a constraint condition that a predetermined point on the optical axis of the imaging unit is fixed at the reference position, thereby controlling the driving of the arm unit so as to perform a pivot operation centered on the reference position while the imaging unit faces the reference position.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units based on a constraint condition that the position and posture of the arm unit are fixed, thereby controlling the driving of the arm unit so as to perform an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the joint units such that the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
- The robot arm device according to claim 2, wherein the drive control unit controls the driving of the arm unit so as to perform any one of:
a pivot operation with the reference position as a vertex while the imaging unit faces the reference position;
an operation in which the imaging unit is fixed at a predetermined position while facing a predetermined point; and
an operation in which the position and posture of the arm unit can be arbitrarily changed in accordance with a user operation input.
- The robot arm device according to claim 1, wherein the distance between the imaging unit and the reference position is acquired based on the focal length of the imaging unit.
- The robot arm device according to claim 1, wherein the plurality of joint units each include a joint state detection unit that detects the state of the joint unit,
the joint state detection unit including at least:
a torque detection unit that detects a torque generated in the joint unit and an external torque applied to the joint unit from the outside; and
a rotation angle detection unit that detects a rotation angle of the joint unit.
- The robot arm device according to claim 2, wherein the control value is a torque generated at the joint unit.
- The robot arm device according to claim 1, further including the imaging unit.
- The robot arm device according to claim 1, wherein the imaging unit is a camera used for medical treatment.
- A robot arm control method including:
acquiring a state of an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected;
acquiring distance information between the imaging unit and a reference position; and
controlling driving of the arm unit by cooperatively driving each of the joint units based on the state of the arm unit, using relative position information of the reference position with respect to the arm unit based on the state of the arm unit and the distance information, such that the reference position is located on the optical axis of the imaging unit.
- A program for causing a processor of a computer to realize:
a function of acquiring a state of an arm unit configured by connecting a plurality of links to each other by joint units, to which an imaging unit can be connected;
a function of acquiring distance information between the imaging unit and a reference position; and
a function of controlling driving of the arm unit by cooperatively driving each of the joint units based on the state of the arm unit, using relative position information of the reference position with respect to the arm unit based on the state of the arm unit and the distance information, such that the reference position is located on the optical axis of the imaging unit.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580009570.XA CN106061427B (zh) | 2014-02-28 | 2015-02-12 | 机器人臂设备、机器人臂控制方法和程序 |
US15/119,671 US10561469B2 (en) | 2014-02-28 | 2015-02-12 | Robot arm apparatus and robot arm control method |
EP15754874.4A EP3135444B1 (en) | 2014-02-28 | 2015-02-12 | Robot arm apparatus, robot arm control method, and program |
JP2016505143A JP6614130B2 (ja) | 2014-02-28 | 2015-02-12 | 手術用ロボットアーム装置、手術用ロボットアーム制御方法及びプログラム |
US16/743,104 US11633245B2 (en) | 2014-02-28 | 2020-01-15 | Robot arm apparatus and robot arm control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014038654 | 2014-02-28 | ||
JP2014-038654 | 2014-02-28 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/119,671 A-371-Of-International US10561469B2 (en) | 2014-02-28 | 2015-02-12 | Robot arm apparatus and robot arm control method |
US16/743,104 Continuation US11633245B2 (en) | 2014-02-28 | 2020-01-15 | Robot arm apparatus and robot arm control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015129474A1 true WO2015129474A1 (ja) | 2015-09-03 |
Family
ID=54008797
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/053876 WO2015129474A1 (ja) | 2014-02-28 | 2015-02-12 | ロボットアーム装置、ロボットアーム制御方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (2) | US10561469B2 (ja) |
EP (1) | EP3135444B1 (ja) |
JP (1) | JP6614130B2 (ja) |
CN (1) | CN106061427B (ja) |
WO (1) | WO2015129474A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105690402A (zh) * | 2016-04-12 | 2016-06-22 | 上海应用技术学院 | 一种服务机器人的臂部结构及其控制方法 |
WO2017082047A1 (ja) * | 2015-11-13 | 2017-05-18 | オリンパス株式会社 | 内視鏡システム |
JP2017113343A (ja) * | 2015-12-25 | 2017-06-29 | ソニー株式会社 | 医療用撮像装置及び手術ナビゲーションシステム |
JP2019165839A (ja) * | 2018-03-22 | 2019-10-03 | 株式会社デンソー | 治療装置 |
JP2020520745A (ja) * | 2017-05-25 | 2020-07-16 | コヴィディエン リミテッド パートナーシップ | 自動誘導付きロボット外科システム |
JP2021118912A (ja) * | 2016-01-11 | 2021-08-12 | カール・ツアイス・メディテック・アーゲー | トルク補償用スタンドおよび方法 |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10383765B2 (en) | 2012-04-24 | 2019-08-20 | Auris Health, Inc. | Apparatus and method for a global coordinate system for use in robotic surgery |
US10499999B2 (en) | 2014-10-09 | 2019-12-10 | Auris Health, Inc. | Systems and methods for aligning an elongate member with an access site |
EP3302335A4 (en) | 2015-06-03 | 2019-02-20 | Covidien LP | OFFSET INSTRUMENT DRIVE UNIT |
JP6704255B2 (ja) * | 2016-01-19 | 2020-06-03 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用観察装置、医療用観察システム及び画揺れ補正方法 |
US10667723B2 (en) | 2016-02-19 | 2020-06-02 | Covidien Lp | Systems and methods for video-based monitoring of vital signs |
WO2017169649A1 (ja) * | 2016-03-28 | 2017-10-05 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用観察装置、駆動制御方法、医療用観察システム及び支持アーム装置 |
EP3542973B1 (en) * | 2016-11-17 | 2021-09-22 | Fuji Corporation | Work robot and work position correction method |
US10917543B2 (en) * | 2017-04-24 | 2021-02-09 | Alcon Inc. | Stereoscopic visualization camera and integrated robotics platform |
KR102558063B1 (ko) | 2017-06-28 | 2023-07-25 | 아우리스 헬스, 인코포레이티드 | 전자기장 생성기 정렬 |
US11395703B2 (en) | 2017-06-28 | 2022-07-26 | Auris Health, Inc. | Electromagnetic distortion detection |
US10464209B2 (en) | 2017-10-05 | 2019-11-05 | Auris Health, Inc. | Robotic system with indication of boundary for robotic arm |
US10016900B1 (en) | 2017-10-10 | 2018-07-10 | Auris Health, Inc. | Surgical robotic arm admittance control |
WO2019087904A1 (ja) * | 2017-11-01 | 2019-05-09 | ソニー株式会社 | 手術アームシステム及び手術アーム制御システム |
EP3681394A1 (en) | 2017-11-13 | 2020-07-22 | Covidien LP | Systems and methods for video-based monitoring of a patient |
CN111565638B (zh) | 2018-01-08 | 2023-08-15 | 柯惠有限合伙公司 | 用于基于视频的非接触式潮气容积监测的系统和方法 |
JP7079123B2 (ja) * | 2018-03-15 | 2022-06-01 | キヤノン株式会社 | 撮像装置及びその制御方法、撮像システム |
CN108789404B (zh) * | 2018-05-25 | 2021-06-18 | 哈尔滨工程大学 | 一种基于视觉的串联机器人运动学参数标定方法 |
EP3806727A1 (en) * | 2018-06-15 | 2021-04-21 | Covidien LP | Systems and methods for video-based patient monitoring during surgery |
CN112584753A (zh) | 2018-08-09 | 2021-03-30 | 柯惠有限合伙公司 | 基于视频的患者监测系统以及用于检测和监测呼吸的相关方法 |
WO2020075423A1 (ja) * | 2018-10-10 | 2020-04-16 | ソニー株式会社 | ロボット制御装置、ロボット制御方法及びロボット制御プログラム |
CN109330544A (zh) * | 2018-11-16 | 2019-02-15 | 中国医学科学院北京协和医院 | 一种腔镜支架及腔镜系统 |
US11617520B2 (en) | 2018-12-14 | 2023-04-04 | Covidien Lp | Depth sensing visualization modes for non-contact monitoring |
US20200231082A1 (en) * | 2019-01-21 | 2020-07-23 | Kevin Arnold Morran | Remote controlled lighting apparatus |
US11315275B2 (en) | 2019-01-28 | 2022-04-26 | Covidien Lp | Edge handling methods for associated depth sensing camera devices, systems, and methods |
EP3975907A4 (en) * | 2019-06-03 | 2023-06-21 | Covidien LP | EXTERNAL TORQUE OBSERVATION AND COMPENSATION SYSTEM AND APPARATUS FOR SURGICAL ROBOTIC ARM |
EP3753520A1 (de) | 2019-06-19 | 2020-12-23 | Karl Storz SE & Co. KG | Medizinische handhabungsvorrichtung zur steuerung einer handhabungsvorrichtung |
US11625107B2 (en) * | 2019-06-27 | 2023-04-11 | Intuitive Surgical Operations, Inc. | System and method for motion mode management |
KR20220056220A (ko) | 2019-09-03 | 2022-05-04 | 아우리스 헬스, 인코포레이티드 | 전자기 왜곡 검출 및 보상 |
CN110557568B (zh) * | 2019-09-05 | 2021-04-09 | 昭世(北京)科技有限公司 | 一种基于人工智能模块的摄影设备结构及方法 |
US11484208B2 (en) | 2020-01-31 | 2022-11-01 | Covidien Lp | Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods |
CN111714210B (zh) * | 2020-06-30 | 2024-02-13 | 深圳市精锋医疗科技股份有限公司 | 手术机器人及其控制装置、控制方法 |
CN113855286B (zh) * | 2021-09-24 | 2023-01-10 | 四川锋准机器人科技有限公司 | 一种种植牙机器人导航系统及方法 |
CN116919330B (zh) * | 2023-09-19 | 2023-12-29 | 北京大学第三医院(北京大学第三临床医学院) | 一种纤维支气管镜 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001300875A (ja) * | 2000-04-19 | 2001-10-30 | Denso Corp | ロボットシステム |
JP2009028851A (ja) * | 2007-07-27 | 2009-02-12 | Nachi Fujikoshi Corp | ロボット制御装置 |
WO2009110242A1 (ja) * | 2008-03-06 | 2009-09-11 | パナソニック株式会社 | マニピュレータおよびその制御方法 |
JP2010228064A (ja) * | 2009-03-27 | 2010-10-14 | National Institute Of Advanced Industrial Science & Technology | 福祉用ロボット装置のロボットアーム操作方法、ロボットアーム操作プログラム、及び、記録媒体 |
JP2012081568A (ja) * | 2010-10-14 | 2012-04-26 | Sony Corp | ロボットの制御装置及び制御方法、並びにコンピューター・プログラム |
JP2012091280A (ja) * | 2010-10-27 | 2012-05-17 | Mitsubishi Electric Corp | 座標系校正方法及びロボットシステム |
WO2012087929A2 (en) * | 2010-12-21 | 2012-06-28 | Restoration Robotics, Inc. | Methods and systems for directing movement of a tool in hair transplantation procedures |
WO2013132501A1 (en) * | 2012-03-07 | 2013-09-12 | M.S.T. Medical Surgery Technologies Ltd. | Overall endoscopic control system |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6290703A (ja) * | 1985-10-17 | 1987-04-25 | Nissan Motor Co Ltd | ロボツト制御装置 |
JPH0544011A (ja) | 1991-08-07 | 1993-02-23 | Toyota Motor Corp | 硬質無機薄膜形成方法 |
JP2000115809A (ja) * | 1998-09-29 | 2000-04-21 | Mitsubishi Heavy Ind Ltd | 三次元表示マンマシン・システム |
US6493608B1 (en) * | 1999-04-07 | 2002-12-10 | Intuitive Surgical, Inc. | Aspects of a control system of a minimally invasive surgical apparatus |
JP3421608B2 (ja) * | 1999-04-08 | 2003-06-30 | ファナック株式会社 | 教示モデル生成装置 |
US8004229B2 (en) * | 2005-05-19 | 2011-08-23 | Intuitive Surgical Operations, Inc. | Software center and highly configurable robotic systems for surgery and other uses |
US8768516B2 (en) * | 2009-06-30 | 2014-07-01 | Intuitive Surgical Operations, Inc. | Control of medical robotic system manipulator about kinematic singularities |
JP4032410B2 (ja) * | 2001-11-09 | 2008-01-16 | ソニー株式会社 | 情報処理システムおよび情報処理方法、プログラムおよび記録媒体、並びに情報処理装置 |
EP1327421B1 (de) | 2001-12-18 | 2004-03-10 | BrainLAB AG | Projektion von Patientenbilddaten aus Durchleuchtungs- bzw. Schichtbilderfassungsverfahren auf Videobilder |
JP3766805B2 (ja) * | 2002-03-15 | 2006-04-19 | 株式会社日立製作所 | 手術支援装置 |
US6940891B2 (en) * | 2002-10-28 | 2005-09-06 | Metron Systems, Inc. | High precision optical imaging systems and related systems |
JP4064323B2 (ja) | 2003-09-08 | 2008-03-19 | Olympus Corp | Surgical microscope device |
JP2006293624A (ja) * | 2005-04-08 | 2006-10-26 | Mitsubishi Electric Corp | Multi-axis control device |
US7706683B2 (en) * | 2005-05-31 | 2010-04-27 | Brainlab Ag | Self adjusting operation lamp system |
US7907166B2 (en) * | 2005-12-30 | 2011-03-15 | Intuitive Surgical Operations, Inc. | Stereo telestration for robotic surgery |
CN201239247Y (zh) * | 2008-08-04 | 2009-05-20 | 苏州捷美医疗器械有限公司 | Illumination system for medical microscope |
JP2010082188A (ja) * | 2008-09-30 | 2010-04-15 | Olympus Corp | Surgical manipulator system |
JP5093058B2 (ja) * | 2008-11-04 | 2012-12-05 | Denso Wave Inc | Method for combining robot coordinate systems |
US9492927B2 (en) * | 2009-08-15 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose |
JP5571432B2 (ja) * | 2010-03-30 | 2014-08-13 | Karl Storz GmbH & Co. KG | Medical robot system |
US9498231B2 (en) * | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
JP5902425B2 (ja) * | 2011-09-21 | 2016-04-13 | Toshiba Corp | Robot control device, disturbance determination method, and actuator control method |
EP2863827B1 (en) * | 2012-06-21 | 2022-11-16 | Globus Medical, Inc. | Surgical robot platform |
JP6128767B2 (ja) * | 2012-07-05 | 2017-05-17 | Canon Inc | Robot control device and robot control method |
KR20230156801A (ko) * | 2012-08-03 | 2023-11-14 | Stryker Corp | Systems and methods for robotic surgery |
WO2014139023A1 (en) * | 2013-03-15 | 2014-09-18 | Synaptive Medical (Barbados) Inc. | Intelligent positioning system and methods therefore |
US9827054B2 (en) * | 2014-03-14 | 2017-11-28 | Synaptive Medical (Barbados) Inc. | Intelligent positioning system and methods therefore |
US20150039517A1 (en) | 2013-08-05 | 2015-02-05 | Mozido, Inc. | Cloud entertainment platform |
2015
- 2015-02-12 WO PCT/JP2015/053876 patent/WO2015129474A1/ja active Application Filing
- 2015-02-12 US US15/119,671 patent/US10561469B2/en active Active
- 2015-02-12 EP EP15754874.4A patent/EP3135444B1/en active Active
- 2015-02-12 CN CN201580009570.XA patent/CN106061427B/zh active Active
- 2015-02-12 JP JP2016505143A patent/JP6614130B2/ja active Active

2020
- 2020-01-15 US US16/743,104 patent/US11633245B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001300875A (ja) * | 2000-04-19 | 2001-10-30 | Denso Corp | Robot system |
JP2009028851A (ja) * | 2007-07-27 | 2009-02-12 | Nachi Fujikoshi Corp | Robot control device |
WO2009110242A1 (ja) * | 2008-03-06 | 2009-09-11 | Panasonic Corp | Manipulator and control method therefor |
JP2010228064A (ja) * | 2009-03-27 | 2010-10-14 | National Institute Of Advanced Industrial Science & Technology | Robot arm operation method for welfare robot device, robot arm operation program, and recording medium |
JP2012081568A (ja) * | 2010-10-14 | 2012-04-26 | Sony Corp | Robot control device, control method, and computer program |
JP2012091280A (ja) * | 2010-10-27 | 2012-05-17 | Mitsubishi Electric Corp | Coordinate system calibration method and robot system |
WO2012087929A2 (en) * | 2010-12-21 | 2012-06-28 | Restoration Robotics, Inc. | Methods and systems for directing movement of a tool in hair transplantation procedures |
WO2013132501A1 (en) * | 2012-03-07 | 2013-09-12 | M.S.T. Medical Surgery Technologies Ltd. | Overall endoscopic control system |
Non-Patent Citations (1)
Title |
---|
See also references of EP3135444A4 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017082047A1 (ja) * | 2015-11-13 | 2017-05-18 | Olympus Corp | Endoscope system |
JP2017113343A (ja) * | 2015-12-25 | 2017-06-29 | Sony Corp | Medical imaging device and surgical navigation system |
CN108366833A (zh) * | 2015-12-25 | 2018-08-03 | Sony Corp | Surgical information processing device and method |
CN108366833B (zh) * | 2015-12-25 | 2021-10-12 | Sony Corp | Surgical information processing device and method |
JP2021118912A (ja) * | 2016-01-11 | 2021-08-12 | Carl Zeiss Meditec AG | Stand and method for torque compensation |
JP7241799B2 (ja) | 2016-01-11 | 2023-03-17 | Carl Zeiss Meditec AG | Stand and method for torque compensation |
CN105690402A (zh) * | 2016-04-12 | 2016-06-22 | Shanghai Institute of Technology | Arm structure of a service robot and control method therefor |
JP2020520745A (ja) * | 2017-05-25 | 2020-07-16 | Covidien LP | Robotic surgical system with automated guidance |
US11839441B2 (en) | 2017-05-25 | 2023-12-12 | Covidien Lp | Robotic surgical system with automated guidance |
JP2019165839A (ja) * | 2018-03-22 | 2019-10-03 | Denso Corp | Treatment device |
JP7086381B2 (ja) | 2018-03-22 | 2022-06-20 | Sonire Therapeutics Inc | Treatment device |
Also Published As
Publication number | Publication date |
---|---|
US20200146762A1 (en) | 2020-05-14 |
CN106061427A (zh) | 2016-10-26 |
JPWO2015129474A1 (ja) | 2017-03-30 |
US20170007342A1 (en) | 2017-01-12 |
EP3135444A1 (en) | 2017-03-01 |
EP3135444A4 (en) | 2017-11-15 |
CN106061427B (zh) | 2020-10-27 |
EP3135444B1 (en) | 2022-04-13 |
US10561469B2 (en) | 2020-02-18 |
US11633245B2 (en) | 2023-04-25 |
JP6614130B2 (ja) | 2019-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6614130B2 (ja) | Surgical robot arm device, surgical robot arm control method, and program | |
JP6272885B2 (ja) | Medical robot arm device, medical robot arm control system, medical robot arm control method, and program | |
JP6555248B2 (ja) | Medical arm device, calibration method, and program | |
JP6586079B2 (ja) | Arm device and program | |
JP6773165B2 (ja) | Medical support arm device, medical support arm control method, and program | |
WO2017169096A1 (ja) | Control device for medical support arm, control method for medical support arm device, and medical system | |
WO2015133291A1 (ja) | Actuator and robot arm device | |
WO2015137040A1 (ja) | Robot arm device, robot arm control method, and program | |
WO2017169082A1 (ja) | Control device and control method | |
WO2018221035A1 (ja) | Medical support arm system, medical support arm control method, and medical support arm control device | |
WO2015137140A1 (ja) | Robot arm control device, robot arm control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15754874 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016505143 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15119671 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2015754874 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015754874 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |