WO2023166868A1 - Mobile imaging robot system and method for controlling same - Google Patents

Mobile imaging robot system and method for controlling same

Info

Publication number
WO2023166868A1
WO2023166868A1 (PCT/JP2023/001032)
Authority
WO
WIPO (PCT)
Prior art keywords
robot arm
unit
vibration
frequency
imaging
Prior art date
Application number
PCT/JP2023/001032
Other languages
French (fr)
Japanese (ja)
Inventor
陽 久保
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Publication of WO2023166868A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 Controls for manipulators
    • B25J 13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J 5/00 Manipulators mounted on wheels or on carriages

Definitions

  • the present invention relates to a mobile imaging robot system and its control method.
  • the imaging device mentioned here is a device for obtaining two-dimensional or three-dimensional images of the surrounding environment, such as a camera or a LiDAR (light detection and ranging) sensor, or a device that fuses them.
  • the mobile robot's imaging device is attached to the hand position of the robot arm on which it is mounted, rather than the mobile object itself.
  • the imaging data can be used to determine the behavior of the robot, such as collision avoidance.
  • Patent Document 1 describes a technique in which, for a robot arm mounted on a moving part, when the workpiece or the arm is displaced after the moving part moves to a predetermined place, state monitoring using camera images is used to generate a trajectory plan for the arm that compensates for the deviation.
  • When imaging is performed while the moving part is moving rather than stationary, the tip position of the mounted robot arm vibrates not only because of unevenness such as grooves and steps on the road surface on which the moving part moves, but also because of disturbances caused by the shift of the moving part's center of gravity during movement and by the impact at the time of ground contact.
  • the scope of the problem of vibration of the imaging device is not limited to the problem of measurement accuracy.
  • An object of the present invention is to provide a mobile imaging robot system that can obtain a stable field of view even during movement.
  • The mobile imaging robot system of the present invention includes: an imaging unit that photographs the surroundings and outputs imaging data; a robot arm that carries the imaging unit at its hand and adjusts the hand position; a moving unit that carries the robot arm and moves; and a robot arm control unit that controls the robot arm, based on the tracking trajectory of the position of a predetermined target in the image captured by the imaging unit, so that vibration of the hand of the robot arm caused by the shift of the center of gravity of the moving unit or by the impact at the time of ground contact does not occur.
  • According to the present invention, it is possible to stabilize the imaging field of view while the mobile imaging robot is moving.
  • FIG. 1 is a configuration diagram of the mobile imaging robot system of Example 1.
  • FIG. 2 is a configuration diagram of the robot arm control unit of Example 1.
  • FIGS. 3A and 3B are diagrams showing forms composed of a robot arm and a moving unit (a multi-legged robot and a crawler-track type, respectively).
  • FIG. 4 is a flowchart of the processing of the mobile imaging robot system of Example 1.
  • FIG. 5 is a diagram explaining the frequency analysis.
  • FIGS. 6 and 7 are diagrams explaining the creation of the vibration compensation control command.
  • FIG. 8 is a configuration diagram of the robot arm control unit of Example 2.
  • FIG. 9 is a flowchart of the processing of the mobile imaging robot system of Example 2.
  • FIGS. 10A and 10B are diagrams showing analysis results of the frequency analysis unit, and FIG. 10C is a diagram showing position information on the ground plane map.
  • FIG. 11 is a control block diagram explaining the vibration compensation operation of the robot arm 40.
  • FIG. 12 is a processing flow diagram for creating the vibration data and the ground plane data at the same time.
  • FIG. 13 is a diagram showing the time-frequency analysis result of the frequency analysis unit.
  • FIG. 1 is a configuration diagram of a mobile imaging robot system of this embodiment.
  • The mobile imaging robot system includes a robot arm 40 with an imaging unit 50 at its hand for imaging the surroundings, a moving unit 60 on which the robot arm 40 is mounted, a computer 30 for collecting and displaying data, and a robot arm control unit 10 for controlling the robot arm 40; the robot arm control unit 10, the computer 30, and the robot arm 40 are connected via a network 20.
  • the moving part 60 is equipped with the robot arm 40 and configured to be autonomously movable by the moving part driving motor 61 .
  • the robot arm 40 has a rotation angle information sensor group (not shown) and is driven by a robot arm drive motor 41 so that the hand position can be adjusted.
  • the robot arm 40 has an imaging unit 50 at its tip. Then, the imaging unit 50 performs imaging in a direction corresponding to the orientation of the hand.
  • The robot arm control unit 10 includes an input/output unit 11 for exchanging input/output data with the robot arm 40 and the computer 30, a calculation unit 12 for executing the functions of the imaging device, and a storage unit 13 that stores data entered by the user of the imaging robot and data extracted by the calculation unit 12. The configuration of the robot arm control unit 10 is described in detail with reference to FIG. 2.
  • FIG. 2 is a configuration diagram of the robot arm control unit 10 of the mobile imaging robot system.
  • the calculation unit 12 of the robot arm control unit 10 includes a target selection unit 120, a target tracking unit 121, a frequency analysis unit 122, a frequency/amplitude determination unit 123, a vibration compensation control unit 124, and a robot arm motion control unit 125.
  • the storage unit 13 includes vibration data 130 and vibration compensation control data 131 . Each component will be described in detail below.
  • the target selection unit 120 selects a predetermined stationary object as a target to be tracked from the imaging data around the imaging unit 50 when the moving unit 60 is stationary.
  • the imaging data is input by the input/output unit 11 via the network 20 .
  • The target tracking unit 121 tracks the target selected by the target selection unit 120 based on the imaging data while the moving unit 60 is moving, and acquires the change in the vertical relative position between the target and the robot arm 40 (imaging unit 50). That is, the target tracking unit 121 acquires changes in the hand position of the robot arm 40 that accompany the movement of the moving unit 60.
  • the frequency analysis unit 122 performs frequency analysis using the time-series data of the target position for each frame of the imaging data as the time-series data of the hand position of the robot arm.
  • the frequency/amplitude determination unit 123 uses the output result of the frequency analysis unit 122 to calculate the dominant frequency and amplitude of the vibration of the hand position of the robot arm 40 related to movement.
  • The vibration compensation control unit 124 generates a motion command that causes the robot arm 40 to perform an operation canceling the vibration of the hand position of the robot arm 40 (imaging unit 50) caused by the movement of the moving unit 60, based on the vibration calculated by the frequency/amplitude determination unit 123.
  • the robot arm motion control unit 125 controls rotation of the robot arm drive motor 41 that drives at least one joint of the robot arm based on the motion command generated by the vibration compensation control unit 124 .
  • The vibration data 130 is data on the shaking caused by the movement of the moving unit, as determined by the frequency/amplitude determination unit 123, and is stored in the storage unit 13. The vibration data 130 can also be referred to when creating the vibration compensation control that is expected to be required after the robot arm 40 is recalibrated, when the robot walks on unknown ground, or when the hardware or control software of the robot arm 40 is updated.
  • The vibration compensation control data 131 is vibration compensation control data of the robot arm for canceling the vibration of the robot arm hand position caused by the movement of the moving unit 60, and is stored in the storage unit 13 in association with the vibration data 130.
  • the vibration compensation control data 131 is newly generated together with new vibration data 130 when the moving speed of the moving unit 60 or the hardware of the moving unit 60 is changed.
  • The robot arm control unit 10 is implemented by an information processing device having a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD (Hard Disk Drive), an input/output I/F (Interface), a communication I/F, and a media I/F.
  • The CPU executes a program stored in the ROM or HDD, whereby the calculation unit 12 realizes the functions of the target selection unit 120, the target tracking unit 121, the frequency analysis unit 122, the frequency/amplitude determination unit 123, the vibration compensation control unit 124, the robot arm motion control unit 125, the vibration data 130, and the vibration compensation control data 131.
  • the storage unit 13 is implemented by an HDD and stores vibration data 130 and vibration compensation control data 131 .
  • the input/output unit 11 is implemented by an input/output I/F or a communication I/F, and is connected to the imaging unit 50, the robot arm 40, and the moving unit 60 via the network 20 to exchange data.
  • FIG. 3A shows an example of a form composed of a robot arm 40 having an imaging unit 50 attached to its hand and a moving unit 60 made up of a multi-legged robot.
  • FIG. 3B shows an example of a form composed of a robot arm 40 having an end effector and an imaging unit 50 attached to the hand, and a crawler track type moving unit 60 .
  • In step S101, the robot arm control unit 10 starts acquisition of imaging data of the surroundings by the imaging unit 50 (start of imaging).
  • In step S102, the target selection unit 120 selects a specific object as the target to be tracked within the range imaged while the moving unit 60 is stationary.
  • In step S103, the target selection unit 120 determines whether the target selected in step S102 is stationary. If it is stationary (YES in S103), the process proceeds to step S104; if not (NO in S103), the process returns to step S102. Whether the target is stationary is determined, for example, by whether its position remains constant over several frames of imaging data.
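As a rough illustration of the stationary check in step S103, the following Python sketch (the function name, tolerance, and data layout are assumptions, not taken from the patent) tests whether the tracked target's image position stays within a small tolerance over several consecutive frames.

```python
import numpy as np

def is_target_stationary(positions, tol_px=1.0):
    """Return True if the target position varies by at most tol_px over the given frames."""
    pts = np.asarray(positions, dtype=float)    # list of (x, y) target positions, one per frame
    spread = pts.max(axis=0) - pts.min(axis=0)  # peak-to-peak spread over the observed frames
    return bool(np.all(spread <= tol_px))

# Example: positions collected over several frames while the moving unit is stopped.
frames = [(320.2, 240.1), (320.4, 240.0), (320.1, 240.2), (320.3, 240.1)]
print(is_target_stationary(frames))  # True -> proceed to step S104
```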
  • In step S104, the target tracking unit 121 starts tracking the target in the imaging data.
  • In step S105, the robot arm control unit 10 permits the movement of the moving unit 60, and the moving unit drive motor 61 starts operating, thereby starting the movement of the moving unit 60.
  • In step S106, the target tracking unit 121 acquires the position of the target in each frame while the moving unit 60 is moving, as a tracking trajectory.
  • In step S107, the frequency analysis unit 122 performs frequency analysis on the tracking trajectory and frame rate acquired in step S106 to determine the dominant frequency of the vibration of the robot arm hand position caused by the movement of the moving unit 60.
  • The frequency analyzed by the frequency analysis unit 122 is the frequency of the disturbance caused by the shift of the center of gravity of the moving unit 60 and by the impact when the moving unit 60 touches the ground. Details are described separately with reference to FIG. 5.
  • In particular, the dominant frequency is determined by the walking cycle when the moving unit 60 is a multi-legged robot (FIG. 3A), and by the rotation frequency of the belt, or of the motor driving the belt, when the moving unit 60 is a crawler-track type (FIG. 3B). In other words, this frequency is determined by the kinematic behavior of the moving unit 60, and the corresponding vibration is unavoidable even on flat ground without steps or unevenness.
  • The frequency analysis unit 122 may use the Fourier transform for the frequency analysis, may use a time-frequency analysis method such as the wavelet transform, or may derive the frequency using peak detection or an autocorrelation function. Any direction within the plane perpendicular to the direction of travel may be selected as the vibration direction to be analyzed.
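A minimal sketch of one possible realization of the frequency analysis in step S107, assuming the tracking trajectory is a 1-D array of vertical target positions sampled at the camera frame rate; the FFT-based approach matches one of the options listed above, and all names are illustrative.

```python
import numpy as np

def dominant_frequency_and_amplitude(trajectory, frame_rate_hz):
    """Estimate the dominant vibration frequency [Hz] and peak-to-peak amplitude of a trajectory."""
    y = np.asarray(trajectory, dtype=float)
    y = y - y.mean()                          # remove the mean hand position (DC component)
    spectrum = np.fft.rfft(y)
    freqs = np.fft.rfftfreq(len(y), d=1.0 / frame_rate_hz)
    k = np.argmax(np.abs(spectrum[1:])) + 1   # strongest non-DC bin
    return freqs[k], y.max() - y.min()

# Example: a 2 Hz gait-induced vibration observed at 30 frames per second.
t = np.arange(0.0, 3.0, 1.0 / 30.0)
trajectory = 5.0 * np.sin(2 * np.pi * 2.0 * t)
print(dominant_frequency_and_amplitude(trajectory, frame_rate_hz=30.0))  # ~ (2.0, 10.0)
```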
  • In step S108, when a frequency component with at least a predetermined amplitude intensity is obtained in the frequency analysis of step S107, the frequency/amplitude determination unit 123 assumes that it is caused by the movement of the moving unit 60. Alternatively, the frequency may be compared with the periodicity information of the control signal to the moving unit 60, or of the control signal to at least one moving unit drive motor 61 that drives the moving unit 60, and attributed to the movement of the moving unit 60 if they match.
  • When the vibration is assumed to be caused by the movement of the moving unit 60, the frequency/amplitude determination unit 123 takes the dominant frequency and amplitude of the vibration of the robot arm 40 hand position obtained from the frequency analysis in step S107 as the vibration data 130.
  • the amplitude of the vibration data 130 may be obtained by directly measuring the target with a camera or LiDAR sensor that has a function of surveying position and dimensions. Alternatively, it may be indirectly estimated from the amount of displacement of the target within the field of view of the imaging unit 50 that is offset by vibrating the robot arm 40 with a specific amplitude and in the opposite phase to the obtained vibration.
  • In step S109, the frequency/amplitude determination unit 123 determines whether the assumption that the result of the frequency analysis in step S107 is caused by the movement of the moving unit 60 has held for a predetermined time. If it has (YES in S109), the vibration data 130 is confirmed as being caused by the movement of the moving unit 60 and the process proceeds to step S110; if it has not (NO in S109), the process returns to step S106.
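The attribution and persistence checks of steps S108 and S109 can be pictured as follows: the measured dominant frequency is compared with the period of the drive command of the moving unit, and the movement-induced origin is confirmed only if the match persists over several consecutive analysis windows. The function names, the tolerance, and the use of a fixed number of matches are assumptions for illustration.

```python
def matches_drive_period(measured_freq_hz, drive_period_s, rel_tol=0.1):
    """True if the measured vibration frequency matches the periodicity of the drive command."""
    expected_freq_hz = 1.0 / drive_period_s
    return abs(measured_freq_hz - expected_freq_hz) <= rel_tol * expected_freq_hz

def confirm_movement_origin(freq_samples_hz, drive_period_s, required_matches):
    """Confirm the vibration as movement-induced only if the match persists (cf. step S109)."""
    consecutive = 0
    for f in freq_samples_hz:
        consecutive = consecutive + 1 if matches_drive_period(f, drive_period_s) else 0
        if consecutive >= required_matches:
            return True
    return False

# Example: a gait command with a 0.5 s period, dominant frequency re-estimated per window.
print(confirm_movement_origin([2.02, 1.98, 2.01, 2.00], drive_period_s=0.5, required_matches=3))
```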
  • In step S110, the vibration compensation control unit 124 creates control information (a vibration compensation motion command) for the robot arm 40 so that the imaging unit 50 is moved predictively in the opposite phase to, and with the same amplitude as, the vibration data 130, thereby canceling the apparent vibration of the imaged target. Details are described separately with reference to FIG. 6.
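A sketch of the idea behind step S110: the predicted hand vibration is modeled as a sinusoid built from the vibration data 130 (frequency F, peak-to-peak amplitude A), and the commanded hand offset is its sign-inverted counterpart, i.e. opposite phase and the same amplitude. This is a simplified kinematic illustration under an assumed sinusoidal model, not the patent's actual command format.

```python
import numpy as np

def vibration_compensation_offsets(freq_hz, amplitude_pp, duration_s, control_rate_hz, phase=0.0):
    """Return time stamps and vertical hand offsets that cancel the predicted vibration."""
    t = np.arange(0.0, duration_s, 1.0 / control_rate_hz)
    predicted = amplitude_pp / 2.0 * np.sin(2 * np.pi * freq_hz * t + phase)  # predicted hand vibration
    command = -predicted   # opposite phase, same amplitude -> their sum is ideally zero
    return t, command

# Example: 2 Hz vibration with a 10 mm peak-to-peak amplitude, commanded at 100 Hz.
t, cmd = vibration_compensation_offsets(freq_hz=2.0, amplitude_pp=0.01, duration_s=1.0, control_rate_hz=100.0)
```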
  • In step S111, the vibration compensation control unit 124 stores the vibration data 130 in the storage unit 13.
  • Either step S110 or step S111 may be performed first, or may be performed simultaneously.
  • In step S112, the robot arm motion control unit 125 notifies the robot arm 40 of the control information (vibration compensation motion command) created in step S110, and the vibration compensation control of the robot arm starts.
  • In step S113, the robot arm motion control unit 125 acquires the tracking trajectory from the target tracking unit 121 and determines whether the vibration compensation control of the robot arm keeps the shaking of the target within the allowable range. If it is within the allowable range (YES in S113), the process proceeds to step S114; if not (NO in S113), the process returns to step S106.
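The check in step S113 amounts to measuring how much the target still moves in the image after compensation and comparing it against an allowable range. A rough sketch, with an arbitrarily chosen tolerance:

```python
import numpy as np

def residual_within_tolerance(compensated_trajectory, tolerance_px=2.0):
    """True if the residual peak-to-peak shaking of the target stays within the allowed range."""
    y = np.asarray(compensated_trajectory, dtype=float)
    return (y.max() - y.min()) <= tolerance_px

# Example: after compensation the target only drifts by about 1 px peak to peak.
print(residual_within_tolerance([0.3, -0.4, 0.5, -0.2, 0.1]))  # True -> proceed to step S114
```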
  • In step S114, the robot arm motion control unit 125 stores the control information (vibration compensation motion command) notified to the robot arm 40 in step S112 in the storage unit 13 as the vibration compensation control data 131, in association with the vibration data 130.
  • By linking it with the vibration data 130, the stored control data can be reused for vibration compensation of the robot arm 40 when similar vibration is later detected by the imaging unit 50.
  • FIG. 5 is a diagram showing an example of the tracking trajectory acquired by the target tracking unit 121 in step S106 of FIG.
  • In the time-series data of the target position acquired as the tracking trajectory (dotted curve), the frequency analysis unit 122 takes the time t2 - t1, from the time t1 at which the target reaches a maximum peak position to the time t2 at which it next reaches a maximum peak position, as the period T, and determines the frequency F as its reciprocal (F = 1/T) to extract the frequency information of the target's vibration. Furthermore, the difference between the maximum and minimum peak positions of the target is taken as the amplitude A.
  • the frequency analysis unit 122 may determine each of the frequency F and the amplitude A from one sample or the average value of a plurality of samples. Alternatively, the frequency analysis unit 122 may perform calculation through frequency transform processing such as Fourier transform.
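A direct transcription of the FIG. 5 procedure as a sketch: find successive maximum peaks in the trajectory, take their spacing as the period T, its reciprocal as the frequency F, and the difference between the maximum and minimum positions as the amplitude A. The simple neighbour-comparison peak picking is an assumption.

```python
import numpy as np

def peak_based_frequency_amplitude(trajectory, frame_rate_hz):
    """Estimate F = 1/T from successive maximum peaks and A from the peak-to-peak range."""
    y = np.asarray(trajectory, dtype=float)
    # A sample is treated as a maximum peak if it is at least as large as both neighbours.
    peaks = [i for i in range(1, len(y) - 1) if y[i - 1] < y[i] >= y[i + 1]]
    if len(peaks) < 2:
        return None, None
    T = (peaks[1] - peaks[0]) / frame_rate_hz   # period between times t1 and t2
    F = 1.0 / T                                 # dominant frequency
    A = y.max() - y.min()                       # maximum peak minus minimum peak
    return F, A

t = np.arange(0.0, 2.0, 1.0 / 30.0)
print(peak_based_frequency_amplitude(3.0 * np.sin(2 * np.pi * 2.5 * t), frame_rate_hz=30.0))  # ~ (2.5, 6.0)
```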
  • FIG. 6 is a diagram showing the relationship between the predicted vibration (solid line) of the hand of the robot arm 40 (imaging unit 50) and the motion (broken line) of the hand of the robot arm 40 (imaging unit 50) commanded by the vibration compensation control command.
  • the vibration compensation control unit 124 sets the vibration data 130 to the predicted vibration (solid line) of the tip of the robot arm 40 (imaging unit 50). Then, the vibration compensation control unit 124 creates a vibration compensation control command for the robot arm 40 so as to add, to the motion of the hand of the robot arm 40 , a motion with the opposite phase and the same amplitude as the vibration data 130 .
  • As shown in FIG. 7, which shows the tracking trajectory of the target after the vibration compensation control is performed, the vibration of the hand (imaging unit 50) of the robot arm 40 caused by the movement of the moving unit 60 is compensated, and the field of view of the imaging unit 50 can be stabilized.
  • In Example 1, the robot arm control unit 10 controls the robot arm 40 so that the vibration of the hand of the robot arm 40 caused by the shift of the center of gravity of the moving unit 60 and by the impact at the time of ground contact is suppressed and compensated. In Example 2, the case where vibration due to irregularities or steps on the ground surface is added to the above vibration is described.
  • The robot arm control unit 10 of Example 2 first separates the vibration due to the shift of the center of gravity of the moving unit 60 and the impact at ground contact from the vibration due to unevenness or steps on the ground surface; then, when moving over the same point, it controls the robot arm 40 so as to compensate for both vibrations and stabilize the field of view of the imaging unit 50.
  • FIG. 8 is a configuration diagram of the robot arm control unit 10 of the mobile imaging robot system of the second embodiment.
  • The robot arm control unit 10 of Example 2 is obtained by adding a position information acquisition unit 126 and a ground surface data creation unit 127 to the calculation unit 12 of the robot arm control unit 10 of Example 1 described with reference to FIG. 2, and by adding a ground plane map 132 and ground plane data 133 to the storage unit 13. Therefore, only the added position information acquisition unit 126, ground surface data creation unit 127, ground plane map 132, and ground plane data 133 are described here; the other parts are the same as in FIG. 2.
  • the position information acquisition unit 126 acquires position information of the moving unit 60 within the movement range of the moving unit 60 .
  • When a new vibration is detected by the frequency analysis unit 122, the ground surface data creation unit 127 determines that the vibration has occurred due to unevenness or a step on the contact surface of the moving unit 60, or due to an obstacle. The ground surface data creation unit 127 then associates the frequency and amplitude obtained by the frequency analysis unit 122, together with the position information of the moving unit 60 on the ground plane map 132, as the ground plane data 133, and stores them in the storage unit 13. If corresponding ground plane data 133 already exists in the storage unit 13, it is updated.
  • the ground plane map 132 is information about the shape and layout of the range in which the moving unit 60 moves.
  • The ground plane map 132 may be set in advance from the input/output unit 11 as a known map, or, if the moving unit 60 has a mapping function, the map created by the moving unit 60 may be synchronized or input to the storage unit 13 as the ground plane map 132.
  • The ground plane data 133 is the frequency and amplitude of the vibration detected when the hand of the robot arm 40 (imaging unit 50) vibrates due to unevenness or a step on the ground surface on which the moving unit 60 travels, or due to an obstacle; it is newly created or updated by the ground surface data creation unit 127.
  • the ground plane data 133 at this time will be described with reference to FIG. 10B.
  • the ground plane data 133 is linked to the ground plane map 132 and stored in the storage unit 13 .
  • The newly created or updated ground plane data 133 is stored so that vibration compensation control of the robot arm 40 can be performed, and the imaging unit 50 does not vibrate, when the moving unit 60 subsequently moves through the same place.
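One way to picture the ground plane data 133 is a table keyed by a discretized position on the ground plane map, holding the terrain-induced frequency and amplitude pairs observed there. The grid-cell discretization and the data layout below are assumptions made for illustration only.

```python
class GroundPlaneData:
    """Terrain-induced vibration entries keyed by a grid cell of the ground plane map."""

    def __init__(self, cell_size_m=0.5):
        self.cell_size_m = cell_size_m
        self.entries = {}   # (ix, iy) -> list of (frequency_hz, amplitude)

    def _cell(self, x_m, y_m):
        return (round(x_m / self.cell_size_m), round(y_m / self.cell_size_m))

    def record(self, x_m, y_m, frequency_hz, amplitude):
        """Create a new entry for this position or append to the existing one (cf. S119/S120)."""
        self.entries.setdefault(self._cell(x_m, y_m), []).append((frequency_hz, amplitude))

    def lookup(self, x_m, y_m):
        """Return known terrain vibrations at this position, for feed-forward compensation."""
        return self.entries.get(self._cell(x_m, y_m), [])

gpd = GroundPlaneData()
gpd.record(3.2, 1.1, frequency_hz=8.0, amplitude=0.004)   # e.g. a step detected at position X
print(gpd.lookup(3.1, 1.2))                               # same grid cell -> [(8.0, 0.004)]
```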
  • In step S115, the movement of the moving unit 60 is continued while the vibration compensation control of the robot arm 40 is being executed.
  • In step S116, the vibration compensation control unit 124 determines whether the vibration (in particular, the amplitude) of the hand of the robot arm 40 (imaging unit 50) falls outside the allowable range even though the vibration compensation control is being performed. If the vibration does not deviate from the allowable range (NO in S116), the process returns to step S115; if it deviates (YES in S116), the process proceeds to step S117.
  • In step S117, the position information acquisition unit 126 acquires the position information of the moving unit 60 within its movement range and stores this position information in the ground plane map 132.
  • In step S118, the ground surface data creation unit 127 determines whether ground plane data 133 already exists for the position at which the vibration was determined in step S116 to be out of the allowable range. If such data exists (YES in S118), the process proceeds to step S120; if not (NO in S118), the process proceeds to step S119.
  • In step S119, the ground surface data creation unit 127 associates the frequency and amplitude of the newly detected ground-surface-induced vibration obtained by the frequency analysis unit 122 with the position information acquired in step S117 on the ground plane map 132, stores them in the storage unit 13 as the ground plane data 133, and the process ends.
  • In step S120, the ground surface data creation unit 127 adds the frequency and amplitude of the newly detected ground-surface-induced vibration to the existing ground plane data 133 corresponding to the position information acquired in step S117, updates that data, and the process ends.
  • As described above, the robot arm control unit 10 of Example 2 controls the robot arm 40 so as to compensate for the vibration caused by the shift of the center of gravity of the moving unit 60 and by the impact at ground contact while moving, and separately detects vibrations due to irregularities, steps, or obstacles, so that ground plane data 133 for a given position can be obtained.
  • FIG. 10A is a diagram showing analysis results of the frequency analysis unit 122.
  • The frequency/amplitude determination unit 123 extracts the disturbance vibration caused by the shift of the center of gravity of the moving unit and by the impact at ground contact as a vibration with frequency F1 and amplitude A1, and takes it as the vibration data 130.
  • The vibration compensation control unit 124 creates the vibration compensation control data 131 so that the vibration of the hand of the robot arm 40 (imaging unit 50) is canceled, and controls the robot arm 40 accordingly.
  • the robot arm 40 compensates for vibration that constantly occurs as long as the moving part 60 is moving, so that the imaging part 50 can continue imaging without disturbance in the field of view.
  • FIG. 10B is a diagram showing the analysis results of the frequency analysis unit 122 when a new vibration due to unevenness or a step on the ground surface, or a collision with an obstacle, is detected.
  • The ground surface data creation unit 127 extracts the vibration due to the unevenness or step of the contact surface as ground plane data with frequency F2 and amplitude A2, and links it, as the ground plane data 133, to the position information X (see FIG. 10C) on the ground plane map 132 at which the vibration occurred, as acquired by the position information acquisition unit 126.
  • The robot arm 40 adds the vibration data 130 to its vibration compensation control to offset the vibration that occurs constantly as long as the moving unit 60 is moving, and adds the ground plane data 133 to offset the vibration that occurs temporarily at specific locations on the ground plane map 132. The vibration compensation control can thereby be adjusted so that the imaging unit 50 does not vibrate anywhere on the known map.
  • Vibration compensation control of the robot arm 40 in this embodiment is created by inverting the phases of the vibration data 130 and the ground plane data 133 and adding them together.
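A sketch of combining the two contributions as described above: a phase-inverted continuous component built from the vibration data 130 and a phase-inverted localized component built from the ground plane data 133 are summed into a single hand offset command. The sinusoidal model for the continuous part and the decaying pulse model for the terrain part are simplifying assumptions.

```python
import numpy as np

def combined_compensation(t, movement_vib, terrain_vib):
    """Sum of the inverted movement-induced and terrain-induced components.

    movement_vib: (freq_hz, amplitude_pp) of the continuous vibration (vibration data 130).
    terrain_vib:  (freq_hz, amplitude_pp, t_hit_s) of a pulse-like terrain disturbance
                  expected around time t_hit_s (ground plane data 133), modeled as a short burst.
    """
    f1, a1 = movement_vib
    f2, a2, t_hit = terrain_vib
    movement = a1 / 2.0 * np.sin(2 * np.pi * f1 * t)
    pulse = a2 / 2.0 * np.sin(2 * np.pi * f2 * (t - t_hit)) * np.exp(-20.0 * (t - t_hit) ** 2)
    return -(movement + pulse)   # invert the phase of both components and add them together

t = np.arange(0.0, 2.0, 0.01)
cmd = combined_compensation(t, movement_vib=(2.0, 0.01), terrain_vib=(8.0, 0.004, 1.0))
```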
  • the vibration data 130 and the ground plane data 133 may be created at the same time. This case will be described with reference to FIGS. 12 and 13.
  • FIG. 12 is a processing flow for creating the vibration data 130 and the ground plane data 133 at the same time in this embodiment. Steps S108-S114, step S121, and steps S117-S120 proceed in parallel.
  • In step S121, it is determined whether the result of the frequency analysis in step S107 contains a single pulse-like vibration. This is based on the fact that vibration caused by unevenness or a step on the contact surface, or by a collision with an obstacle, appears as a single-pulse vibration, unlike the periodic vibration caused by the shift of the center of gravity of the moving unit or by the impact at the time of ground contact.
  • steps S108 to S114 are the same as the processing described with reference to FIG.
  • In step S121, if the result of the frequency analysis unit 122 is not a single-pulse vibration, the ground surface data creation unit 127 terminates this parallel branch; otherwise, the processing of steps S117-S120 is performed.
  • FIG. 13 is a diagram showing the time-frequency analysis result of the frequency analysis unit 122.
  • the frequency analysis unit 122 obtains the spectrogram of FIG. 13 by short-time Fourier transform or wavelet transform.
  • The vibration data 130 due to the movement of the moving body is assumed to be constant, continuous vibration, such as the vibration at frequency F1, while the ground plane data 133 is assumed to be vibration that appears as a single pulse, such as the vibration at frequency F2.
  • In this way, the vibration data 130 and the ground plane data 133 can be acquired separately and at the same time.
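The separation shown in FIG. 13 can be approximated with a short-time Fourier transform: a frequency bin that stays active in almost every time slice is treated as the continuous, movement-induced vibration (F1), while a bin that is active only during a short burst is treated as a pulse-like, terrain-induced vibration (F2). The thresholds and the classification rule below are rough, arbitrary assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def split_continuous_and_pulse(signal, fs_hz, power_floor=1e-3, continuous_ratio=0.8):
    """Classify spectrogram bins as continuous (vibration data) or pulse-like (ground plane data)."""
    freqs, times, Sxx = spectrogram(signal, fs=fs_hz, nperseg=64, noverlap=48)
    active = Sxx > power_floor * Sxx.max()        # time-frequency cells with significant power
    presence = active.mean(axis=1)                # fraction of time slices in which each bin is active
    continuous = freqs[presence >= continuous_ratio]                 # e.g. the gait frequency F1
    pulsed = freqs[(presence > 0) & (presence < continuous_ratio)]   # e.g. a step impact around F2
    return continuous, pulsed

fs = 100.0
t = np.arange(0.0, 5.0, 1.0 / fs)
gait = 0.01 * np.sin(2 * np.pi * 2.0 * t)                             # continuous vibration near F1 = 2 Hz
bump = 0.02 * np.sin(2 * np.pi * 8.0 * t) * (np.abs(t - 2.5) < 0.15)  # short burst near F2 = 8 Hz
print(split_continuous_and_pulse(gait + bump, fs))
```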
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the above embodiments have been described in detail to facilitate understanding of the present invention, and are not necessarily limited to those having all the described configurations. Also, part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.

Abstract

This mobile imaging robot system is configured to comprise: an imaging unit (50) for imaging the surroundings and outputting imaging data; a robot arm (40) having the imaging unit mounted on the tip thereof, the robot arm (40) adjusting the position of the tip; a movement unit (60) that moves, the robot arm being mounted on the movement unit; and a robot arm control unit (10) for controlling the robot arm, on the basis of a tracking trajectory of the position of a prescribed target in an image captured by the imaging unit, so as to prevent vibration of the tip of the robot arm caused by movement of the center of gravity of the movement unit or by an impact during contact with the ground. The mobile imaging robot system is also configured such that a stable field of view can be obtained even during movement.

Description

MOBILE IMAGING ROBOT SYSTEM AND CONTROL METHOD THEREOF
 The present invention relates to a mobile imaging robot system and a control method therefor.
 There is a need for mobile robots, such as multi-legged robots, to perform tasks such as inventory management, visual inspection, and shape measurement in places that are difficult for people to enter or on vast premises. In such cases, the robot performs the work using imaging data acquired by an imaging device. The imaging device referred to here is a device for obtaining two-dimensional or three-dimensional images of the surrounding environment, such as a camera or a LiDAR (light detection and ranging) sensor, or a device that fuses them.
 In actual applications of mobile robots to inventory management, logistics, and maintenance, the robot's own viewpoint is often moved flexibly, instead of moving the imaging target, to check for scratches, foreign objects, product counts, and so on, and work with an end effector, such as pick-and-place, is performed as needed. In such cases, there are situations where the manipulator is moved in narrow spaces, such as between product display shelves or under the floor of a vehicle under maintenance, to change the viewpoint from multiple angles.
 For this reason, the imaging device of a mobile robot is attached to the hand of the mounted robot arm rather than to the mobile body itself. As a result, especially when performing visual inspection on a vast site, if accurate imaging is possible while moving, the viewpoint can be swept for an overall view, and any detected abnormality can be examined closely by moving or zooming the camera, so the work proceeds smoothly; in addition, the imaging data can be used for the robot's behavior decisions, such as collision avoidance.
 For example, Patent Document 1 describes a technique in which, for a robot arm mounted on a moving part, when the workpiece or the arm is displaced after the moving part has moved to a predetermined place, state monitoring using camera images is used to generate a trajectory plan for the arm that compensates for the deviation.
JP 2018-158391 A
 When imaging is performed while the moving part of the mobile robot is moving rather than stationary, the hand position of the mounted robot arm vibrates not only because of unevenness such as grooves and steps on the road surface on which the moving part travels, but also because of disturbances caused by the shift of the moving part's center of gravity during movement and by the impact at the time of ground contact.
 For this reason, there is a problem that the accuracy of object recognition based on the imaging data decreases. In particular, with imaging by a small MEMS LiDAR sensor, which is expected to be used for behavior decisions of mobile robots and of the robot arms mounted on them (collision avoidance, zooming the camera in and out during inspection, and so on), the accuracy of distance and angle measurement drops unless fixed-point observation is used.
 Also, when a person operates the robot from a remote location, the operator's fatigue increases if the image vibrates. Thus, the problem of vibration of the imaging device is not limited to measurement accuracy.
 The prior art described above does not consider the mobile imaging robot performing imaging while moving.
 An object of the present invention is to provide a mobile imaging robot system that can obtain a stable field of view even during movement.
 To solve the above problem, the mobile imaging robot system of the present invention includes: an imaging unit that photographs the surroundings and outputs imaging data; a robot arm that carries the imaging unit at its hand and adjusts the hand position; a moving unit that carries the robot arm and moves; and a robot arm control unit that controls the robot arm, based on the tracking trajectory of the position of a predetermined target in the image captured by the imaging unit, so that vibration of the hand of the robot arm caused by the shift of the center of gravity of the moving unit or by the impact at the time of ground contact does not occur.
 According to the present invention, it is possible to stabilize the imaging field of view while the mobile imaging robot is moving.
 FIG. 1 is a configuration diagram of the mobile imaging robot system of Example 1.
 FIG. 2 is a configuration diagram of the robot arm control unit of Example 1.
 FIGS. 3A and 3B are diagrams showing forms composed of a robot arm and a moving unit.
 FIG. 4 is a flowchart of the processing of the mobile imaging robot system of Example 1.
 FIG. 5 is a diagram explaining the frequency analysis.
 FIGS. 6 and 7 are diagrams explaining the creation of the vibration compensation control command.
 FIG. 8 is a configuration diagram of the robot arm control unit of Example 2.
 FIG. 9 is a flowchart of the processing of the mobile imaging robot system of Example 2.
 FIGS. 10A and 10B are diagrams showing analysis results of the frequency analysis unit.
 FIG. 10C is a diagram showing position information on the ground plane map.
 FIG. 11 is a control block diagram explaining the vibration compensation operation of the robot arm 40.
 FIG. 12 is a processing flow diagram for creating the vibration data and the ground plane data at the same time.
 FIG. 13 is a diagram showing the time-frequency analysis result of the frequency analysis unit.
 Hereinafter, embodiments of the mobile imaging robot system of the present invention are described in detail with reference to the drawings.
 FIG. 1 is a configuration diagram of the mobile imaging robot system of this embodiment.
 The mobile imaging robot system includes a robot arm 40 with an imaging unit 50 at its hand for imaging the surroundings, a moving unit 60 on which the robot arm 40 is mounted, a computer 30 for collecting and displaying data, and a robot arm control unit 10 for controlling the robot arm 40; the robot arm control unit 10, the computer 30, and the robot arm 40 are connected via a network 20.
 The moving unit 60 carries the robot arm 40 and is configured to be able to move autonomously by means of the moving unit drive motor 61.
 The robot arm 40 has a rotation angle information sensor group (not shown) and is driven by the robot arm drive motor 41 so that its hand position can be adjusted. The robot arm 40 has the imaging unit 50 at its hand, and the imaging unit 50 performs imaging in the direction corresponding to the orientation of the hand.
 The robot arm control unit 10 includes an input/output unit 11 for exchanging input/output data with the robot arm 40 and the computer 30, a calculation unit 12 for executing the functions of the imaging device, and a storage unit 13 that stores data entered by the user of the imaging robot and data extracted by the calculation unit 12. The configuration of the robot arm control unit 10 is described in detail with reference to FIG. 2.
 FIG. 2 is a configuration diagram of the robot arm control unit 10 of the mobile imaging robot system.
 The calculation unit 12 of the robot arm control unit 10 includes a target selection unit 120, a target tracking unit 121, a frequency analysis unit 122, a frequency/amplitude determination unit 123, a vibration compensation control unit 124, and a robot arm motion control unit 125. The storage unit 13 holds vibration data 130 and vibration compensation control data 131. Each component is described in detail below.
 The target selection unit 120 selects a predetermined stationary object as the target to be tracked from the imaging data of the surroundings of the imaging unit 50 while the moving unit 60 is stationary. The imaging data is input through the input/output unit 11 via the network 20.
 The target tracking unit 121 tracks the target selected by the target selection unit 120 based on the imaging data while the moving unit 60 is moving, and acquires the change in the vertical relative position between the target and the robot arm 40 (imaging unit 50). That is, the target tracking unit 121 acquires changes in the hand position of the robot arm 40 that accompany the movement of the moving unit 60.
 The frequency analysis unit 122 performs frequency analysis using the time-series data of the target position for each frame of the imaging data as time-series data of the hand position of the robot arm.
 The frequency/amplitude determination unit 123 uses the output of the frequency analysis unit 122 to calculate the dominant frequency and amplitude of the vibration of the hand position of the robot arm 40 associated with the movement.
 The vibration compensation control unit 124 generates a motion command that causes the robot arm 40 to perform an operation canceling the vibration of the hand position of the robot arm 40 (imaging unit 50) caused by the movement of the moving unit 60, based on the vibration calculated by the frequency/amplitude determination unit 123.
 The robot arm motion control unit 125 controls the rotation of the robot arm drive motor 41, which drives at least one joint of the robot arm, based on the motion command generated by the vibration compensation control unit 124.
 The vibration data 130 is data on the shaking caused by the movement of the moving unit, as determined by the frequency/amplitude determination unit 123, and is stored in the storage unit 13. The vibration data 130 can also be referred to when creating the vibration compensation control that is expected to be required after the robot arm 40 is recalibrated, when the robot walks on unknown ground, or when the hardware or control software of the robot arm 40 is updated.
 The vibration compensation control data 131 is vibration compensation control data of the robot arm for canceling the vibration of the robot arm hand position caused by the movement of the moving unit 60, and is stored in the storage unit 13 in association with the vibration data 130. The vibration compensation control data 131 is newly generated, together with new vibration data 130, when the moving speed of the moving unit 60 or the hardware of the moving unit 60 is changed.
 Specifically, the robot arm control unit 10 is implemented by an information processing device having a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD (Hard Disk Drive), an input/output I/F (Interface), a communication I/F, and a media I/F.
 The CPU executes a program stored in the ROM or HDD, whereby the calculation unit 12 realizes the functions of the target selection unit 120, the target tracking unit 121, the frequency analysis unit 122, the frequency/amplitude determination unit 123, the vibration compensation control unit 124, the robot arm motion control unit 125, the vibration data 130, and the vibration compensation control data 131.
 The storage unit 13 is implemented by the HDD and stores the vibration data 130 and the vibration compensation control data 131.
 The input/output unit 11 is implemented by the input/output I/F or the communication I/F, and is connected to the imaging unit 50, the robot arm 40, and the moving unit 60 via the network 20 to exchange data.
 Here, the forms of the robot arm 40, the imaging unit 50, and the moving unit 60 in the mobile imaging robot system of this embodiment are described with reference to FIGS. 3A and 3B.
 FIG. 3A shows an example configuration composed of a robot arm 40 with the imaging unit 50 attached to its hand and a moving unit 60 that is a multi-legged robot.
 FIG. 3B shows an example configuration composed of a robot arm 40 with an end effector and the imaging unit 50 attached to its hand and a crawler-track type moving unit 60.
 Next, the processing of the mobile imaging robot system of this embodiment is described with reference to the flowchart of FIG. 4.
 In step S101, the robot arm control unit 10 starts acquisition of imaging data of the surroundings by the imaging unit 50 (start of imaging).
 In step S102, the target selection unit 120 selects a specific object as the target to be tracked within the range imaged while the moving unit 60 is stationary.
 In step S103, the target selection unit 120 determines whether the target selected in step S102 is stationary. If it is stationary (YES in S103), the process proceeds to step S104; if not (NO in S103), the process returns to step S102. Whether the target is stationary is determined, for example, by whether its position remains constant over several frames of imaging data.
 In step S104, the target tracking unit 121 starts tracking the target in the imaging data.
 In step S105, the robot arm control unit 10 permits the movement of the moving unit 60, and the moving unit drive motor 61 starts operating, thereby starting the movement of the moving unit 60.
 In step S106, the target tracking unit 121 acquires the position of the target in each frame while the moving unit 60 is moving, as a tracking trajectory.
 In step S107, the frequency analysis unit 122 performs frequency analysis on the tracking trajectory and frame rate acquired in step S106 to determine the dominant frequency of the vibration of the robot arm hand position caused by the movement of the moving unit 60. The frequency analyzed by the frequency analysis unit 122 is the frequency of the disturbance caused by the shift of the center of gravity of the moving unit 60 and by the impact when the moving unit 60 touches the ground. Details are described separately with reference to FIG. 5.
 In particular, the dominant frequency is determined by the walking cycle when the moving unit 60 is a multi-legged robot (FIG. 3A), and by the rotation frequency of the belt, or of the motor driving the belt, when the moving unit 60 is a crawler-track type (FIG. 3B). In other words, this frequency is determined by the kinematic behavior of the moving unit 60, and the corresponding vibration is unavoidable even on flat ground without steps or unevenness.
 The frequency analysis unit 122 may use the Fourier transform for the frequency analysis, may use a time-frequency analysis method such as the wavelet transform, or may derive the frequency using peak detection or an autocorrelation function. Any direction within the plane perpendicular to the direction of travel may be selected as the vibration direction to be analyzed.
 In step S108, when a frequency component with at least a predetermined amplitude intensity is obtained in the frequency analysis of step S107, the frequency/amplitude determination unit 123 assumes that it is caused by the movement of the moving unit 60. Alternatively, the frequency may be compared with the periodicity information of the control signal to the moving unit 60, or of the control signal to at least one moving unit drive motor 61 that drives the moving unit 60, and attributed to the movement of the moving unit 60 if they match.
 When the vibration is assumed to be caused by the movement of the moving unit 60, the frequency/amplitude determination unit 123 takes the dominant frequency and amplitude of the vibration of the robot arm 40 hand position obtained from the frequency analysis in step S107 as the vibration data 130.
 The amplitude of the vibration data 130 may be obtained by directly measuring the target with a camera or LiDAR sensor that has a position and dimension surveying function. Alternatively, it may be estimated indirectly from the amount of target displacement within the field of view of the imaging unit 50 that is canceled out by vibrating the robot arm 40 with a specific amplitude and in the opposite phase to the obtained vibration.
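The indirect estimation mentioned above can be thought of as a small search: the arm is driven in opposite phase with a trial amplitude, the residual target displacement in the image is measured, and the trial amplitude is adjusted until the residual vanishes. The sketch below assumes a caller-supplied measurement function, a known image scale, and a simple proportional update; none of these names or values come from the patent.

```python
def estimate_amplitude_by_cancellation(measure_residual_px, px_per_m, gain=0.5, iterations=10):
    """Adjust the counter-vibration amplitude until the residual image displacement vanishes.

    measure_residual_px(a): drives the arm in opposite phase with amplitude a [m] and returns the
    remaining target displacement in pixels (signed: positive means under-compensated).
    px_per_m: assumed image scale used to convert the residual back into meters.
    """
    a = 0.0
    for _ in range(iterations):
        a += gain * measure_residual_px(a) / px_per_m   # under-compensated -> increase amplitude
    return a

# Hypothetical usage with a simulated plant whose true vibration amplitude is 8 mm.
true_amplitude = 0.008
simulated = lambda a: (true_amplitude - a) * 2000.0     # residual pixels at a 2000 px/m image scale
print(estimate_amplitude_by_cancellation(simulated, px_per_m=2000.0))   # converges toward 0.008
```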
 In step S109, the frequency/amplitude determination unit 123 determines whether the assumption that the result of the frequency analysis in step S107 is caused by the movement of the moving unit 60 has held for a predetermined time. If it has (YES in S109), the vibration data 130 is confirmed as being caused by the movement of the moving unit 60 and the process proceeds to step S110; if it has not (NO in S109), the process returns to step S106.
 In step S110, the vibration compensation control unit 124 creates control information (a vibration compensation motion command) for the robot arm 40 so that the imaging unit 50 is moved predictively in the opposite phase to, and with the same amplitude as, the vibration data 130, thereby canceling the apparent vibration of the imaged target. Details are described separately with reference to FIG. 6.
 In step S111, the vibration compensation control unit 124 stores the vibration data 130 in the storage unit 13. Either step S110 or step S111 may be performed first, or they may be performed simultaneously.
 In step S112, the robot arm motion control unit 125 notifies the robot arm 40 of the control information (vibration compensation motion command) created in step S110, and the vibration compensation control of the robot arm starts.
 In step S113, the robot arm motion control unit 125 acquires the tracking trajectory from the target tracking unit 121 and determines whether the vibration compensation control of the robot arm keeps the shaking of the target within the allowable range. If it is within the allowable range (YES in S113), the process proceeds to step S114; if not (NO in S113), the process returns to step S106.
 In step S114, the robot arm motion control unit 125 stores the control information (vibration compensation motion command) notified to the robot arm 40 in step S112 in the storage unit 13 as the vibration compensation control data 131, in association with the vibration data 130. By linking it with the vibration data 130, the stored control data can be reused for vibration compensation of the robot arm 40 when similar vibration is later detected by the imaging unit 50.
 つぎに、図5により、図4のステップS107の周波数解析について説明する。
 図5は、図4のステップS106で、ターゲットトラッキング部121が取得したトラッキング軌跡の一例を示す図である。
Next, the frequency analysis in step S107 in FIG. 4 will be described with reference to FIG.
FIG. 5 is a diagram showing an example of the tracking trajectory acquired by the target tracking unit 121 in step S106 of FIG.
 In the time-series data of the target position acquired as the tracking trajectory (the dotted curve), the frequency analysis unit 122 takes as the period T the time t2 − t1 from the time t1 at which the target reaches a maximum peak position to the time t2 at which it next reaches a maximum peak position, and determines the frequency F as its reciprocal, thereby extracting the frequency information of the target's vibration.
 Furthermore, the amplitude A is determined as the difference between the target's maximum peak position and minimum peak position.
 The frequency analysis unit 122 may determine the frequency F and the amplitude A each from a single sample or from the average of a plurality of samples. Alternatively, the frequency analysis unit 122 may calculate them through frequency transform processing such as a Fourier transform.
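 As one possible reading of this peak-based procedure, the sketch below estimates T, F and A from a uniformly sampled target position, averaging over several peaks. The use of scipy's peak picker and the uniform-sampling assumption are illustrative choices, not part of the patent.

```python
# Sketch of estimating period T, frequency F and peak-to-peak amplitude A
# from the tracked target position.
import numpy as np
from scipy.signal import find_peaks

def estimate_vibration(position, dt):
    """position: 1-D array of target positions, dt: sampling interval [s]."""
    peaks, _ = find_peaks(position)      # indices of local maxima
    troughs, _ = find_peaks(-position)   # indices of local minima
    if len(peaks) < 2 or len(troughs) < 1:
        return None                      # not enough structure to estimate
    periods = np.diff(peaks) * dt        # times between successive maxima
    T = periods.mean()                   # period averaged over several samples
    F = 1.0 / T                          # frequency as the reciprocal of the period
    A = position[peaks].mean() - position[troughs].mean()  # peak-to-trough amplitude
    return F, A
```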
 Next, the creation of the vibration compensation control command in step S110 of FIG. 4 will be described with reference to FIGS. 6 and 7.
 FIG. 6 is a diagram showing the relationship between the predicted vibration (solid line) of the hand of the robot arm 40 (imaging unit 50) and the motion (broken line) of the hand of the robot arm 40 (imaging unit 50) in response to the notified vibration compensation control command.
 The vibration compensation control unit 124 takes the vibration data 130 as the predicted vibration (solid line) of the hand of the robot arm 40 (imaging unit 50). It then creates the vibration compensation control command for the robot arm 40 so that a motion with the opposite phase and the same amplitude as the vibration data 130 is added to the motion of the hand of the robot arm 40.
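 If the predicted vibration is modelled as a sinusoid built from the frequency F and the peak-to-peak amplitude A of FIG. 5, the anti-phase offset can be sketched as follows; the sinusoidal model, the phase term and the sampling rate are assumptions made for illustration only.

```python
# Sketch of the anti-phase motion added to the hand trajectory (step S110),
# assuming a sinusoidal vibration model.
import numpy as np

def compensation_offset(t, frequency, amplitude_pp, phase=0.0):
    """Offset added to the hand motion at time(s) t so that a predicted vibration
    of frequency F and peak-to-peak amplitude A is cancelled: same amplitude,
    opposite phase."""
    predicted = 0.5 * amplitude_pp * np.sin(2.0 * np.pi * frequency * t + phase)
    return -predicted  # inverted phase, identical amplitude

# Example: sample the compensating motion for one second at an assumed 100 Hz rate.
t = np.arange(0.0, 1.0, 0.01)
offsets = compensation_offset(t, frequency=2.0, amplitude_pp=0.01)
```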
 As a result, as shown in FIG. 7, which shows the tracking trajectory of the target after the vibration compensation control is performed, the vibration of the hand of the robot arm 40 (imaging unit 50) caused by the movement of the moving unit 60 is compensated, and the field of view of the imaging unit 50 can be stabilized.
 It is not always necessary to display the target tracking result of the imaging unit 50 shown in FIG. 7; the vibration data and vibration compensation data may simply be held in the storage unit 13 or output to the computer 30 through the input/output unit 11.
 In Example 1, it was explained that the robot arm control unit 10 of the embodiment controls the robot arm 40 so as to suppress the vibration of the hand of the robot arm 40 caused by the movement of the center of gravity of the moving unit 60 or the impact at the time of contact with the ground. In Example 2, the case where vibration caused by unevenness or steps of the ground contact surface is added to the above vibration will be described.
 The robot arm control unit 10 of Example 2 first separates the vibration caused by the movement of the center of gravity of the moving unit 60 or the impact at the time of contact with the ground from the vibration caused by unevenness or steps of the ground contact surface, and then, when moving over the same point again, controls the robot arm 40 so as to compensate for both vibrations and stabilize the field of view of the imaging unit 50.
 FIG. 8 is a configuration diagram of the robot arm control unit 10 of the mobile imaging robot system of Example 2.
 The robot arm control unit 10 is configured by adding a position information acquisition unit 126 and a ground surface data creation unit 127 to the calculation unit 12 of the robot arm control unit 10 of Example 1 described with reference to FIG. 2, and by adding a ground surface map 132 and ground surface data 133 to the storage unit 13. Therefore, only the added position information acquisition unit 126, ground surface data creation unit 127, ground surface map 132 and ground surface data 133 are described here; the rest is the same as in FIG. 2 and is omitted.
 The position information acquisition unit 126 acquires position information of the moving unit 60 within the movement range of the moving unit 60.
 When the vibration compensation control unit 124 is performing vibration compensation control for the vibration caused by the movement of the center of gravity of the moving unit 60 or the impact at the time of contact with the ground, and the frequency analysis unit 122 nevertheless detects vibration of the hand of the robot arm 40 (imaging unit 50), the ground surface data creation unit 127 determines that vibration has occurred due to unevenness or steps of the ground contact surface of the moving unit 60 or due to an obstacle. The ground surface data creation unit 127 then stores the frequency and amplitude obtained by the frequency analysis unit 122 in the storage unit 13 as ground surface data 133, linked to the ground surface map 132 by the position information of the moving unit 60. If existing ground surface data 133 is present in the storage unit 13, the ground surface data 133 is updated.
 The ground surface map 132 is information on the shape and layout of the range in which the moving unit 60 moves. The ground surface map 132 may be set in advance as a known map from the input/output unit 11, or, if the moving unit 60 has a mapping function, a map created by the moving unit 60 may be synchronized to or input into the storage unit 13 as the ground surface map.
 The ground surface data 133 is the frequency and amplitude of the vibration detected when the hand of the robot arm 40 (imaging unit 50) vibrates due to unevenness or steps of the ground contact surface of the moving unit 60 or due to an obstacle, and is newly created or updated by the ground surface data creation unit 127. The ground surface data 133 in this case is described with reference to FIG. 10B.
 The ground surface data 133 is linked to the ground surface map 132 and stored in the storage unit 13. Newly created or updated ground surface data 133 is stored so that, when the moving unit 60 subsequently moves over the same place, vibration compensation control of the robot arm 40 can be performed so that the imaging unit 50 does not vibrate.
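 One way to picture the link between the ground surface data 133 and positions on the ground surface map 132 is a store keyed by a quantised map position. In the sketch below, the grid resolution and the data layout are assumptions for illustration and are not specified in the patent.

```python
# Sketch of ground surface data kept per map position (steps S119/S120).
class GroundSurfaceData:
    def __init__(self, cell_size=0.1):
        self.cell_size = cell_size   # quantisation of positions on the map [m]
        self._data = {}              # (ix, iy) -> list of (frequency, amplitude)

    def _key(self, x, y):
        return (round(x / self.cell_size), round(y / self.cell_size))

    def record(self, x, y, frequency, amplitude):
        """Create or update the entry tied to the position where the
        ground-induced vibration was detected."""
        self._data.setdefault(self._key(x, y), []).append((frequency, amplitude))

    def lookup(self, x, y):
        """Return stored (frequency, amplitude) pairs for this position, if any."""
        return self._data.get(self._key(x, y), [])
```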
 Next, the process of creating the ground surface data 133 by the robot arm control unit 10 of Example 2 will be described with reference to the flowchart of FIG. 9.
 The flowchart of FIG. 9 is obtained by adding steps S115 to S119 to the flowchart of FIG. 4. Only these added steps are described here; the other steps are the same as in FIG. 4, and their description is omitted.
 In step S115, the vibration compensation control unit 124 keeps the moving unit 60 moving while the vibration compensation control of the robot arm 40 is being executed.
 In step S116, the vibration compensation control unit 124 determines whether the vibration (in particular, the amplitude) of the hand of the robot arm 40 (imaging unit 50) deviates from the allowable range despite the vibration compensation control being performed. If the vibration does not deviate from the allowable range (NO in S116), the process returns to step S115; if it deviates from the allowable range (YES in S116), the process proceeds to step S117.
 In step S117, the position information acquisition unit 126 acquires the position information of the moving unit 60 within the movement range of the moving unit 60. The position information acquisition unit 126 then records this position information in the ground surface map 132.
 In step S118, the ground surface data creation unit 127 determines whether ground surface data 133 corresponding to the position information at which the vibration was determined in step S116 to deviate from the allowable range already exists. If it exists (YES in S118), the process proceeds to step S120; if it does not (NO in S118), the process proceeds to step S119.
 In step S119, the ground surface data creation unit 127 stores the frequency and amplitude of the ground-surface-induced vibration newly detected by the frequency analysis unit 122 in the storage unit 13 as ground surface data 133, linked to the position information obtained in step S117 on the ground surface map 132, and the process ends.
 In step S120, the ground surface data creation unit 127 appends the frequency and amplitude of the ground-surface-induced vibration newly detected by the frequency analysis unit 122 to the existing ground surface data 133 corresponding to the position information obtained in step S117, updates the data, and the process ends.
 In this way, the robot arm control unit 10 of Example 2 detects vibration caused by unevenness or steps of the ground contact surface or by obstacles from the tracking trajectory while the robot arm 40 is being controlled to compensate for the vibration caused by the movement of the center of gravity of the moving unit 60 or the impact at the time of contact with the ground, so the ground surface data 133 at a given position can be acquired separately from the other vibration.
 Next, an example of a method for creating the vibration compensation control data 131 and the ground surface data 133 in this example will be described.
 FIG. 10A is a diagram showing an analysis result of the frequency analysis unit 122.
 The frequency/amplitude determination unit 123 extracts the disturbance vibration caused by the movement of the center of gravity of the moving unit or by the impact at the time of contact with the ground as vibration data with frequency F1 and amplitude A1, and uses it as the vibration data 130. Based on the vibration data 130, the vibration compensation control unit 124 then creates the vibration compensation control data 131 for the robot arm 40 so that the vibration of the hand of the robot arm 40 (imaging unit 50) is cancelled, and controls the robot arm 40. Since the robot arm 40 thereby compensates for the vibration that occurs constantly as long as the moving unit 60 is moving, the imaging unit 50 can continue imaging without disturbance to its field of view.
 FIG. 10B is a diagram showing an analysis result of the frequency analysis unit 122 when new vibration caused by unevenness or steps of the ground contact surface or by a collision with an obstacle is detected.
 The ground surface data creation unit 127 extracts the vibration caused by the unevenness or steps of the ground contact surface as ground surface data with frequency F2 and amplitude A2, and links it to the position information X (see FIG. 10C) on the ground surface map 132 where the vibration occurred, acquired by the position information acquisition unit 126, to form the ground surface data 133.
 Next, the vibration compensation operation of the robot arm 40 of this example will be described with reference to the control block diagram of FIG. 11.
 As long as the moving unit 60 is moving, the robot arm 40 adds the vibration data 130 to the vibration compensation control in order to cancel the constantly occurring vibration, and adds the ground surface data 133 to the vibration compensation control in order to cancel the terrain-induced vibration that occurs temporarily at specific positions on the ground surface map 132. The vibration compensation control can thereby be adjusted so that the imaging unit 50 does not vibrate anywhere on the known map. In this example, the vibration compensation control of the robot arm 40 is created by inverting the phases of the vibration data 130 and the ground surface data 133 and adding them together.
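 Under the same sinusoidal model used earlier, the superposition of the two phase-inverted terms can be sketched as follows; the models, the peak-to-peak amplitude convention, and the decision of when the ground term applies are assumptions made for illustration.

```python
# Sketch of combining the two compensation terms as described for FIG. 11:
# both terms are phase-inverted and summed.
import numpy as np

def combined_compensation(t, vibration, ground=None):
    """vibration, ground: (frequency, peak_to_peak_amplitude) tuples;
    'ground' is passed only near a position recorded in the ground surface map."""
    def inverted(term):
        f, a_pp = term
        # Phase-inverted sinusoid with the same (peak-to-peak) amplitude.
        return -0.5 * a_pp * np.sin(2.0 * np.pi * f * t)
    offset = inverted(vibration)            # constant, movement-induced term
    if ground is not None:
        offset = offset + inverted(ground)  # temporary, terrain-induced term
    return offset
```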
 In this example, the vibration data 130 and the ground surface data 133 may be created at the same time. This case is described with reference to FIGS. 12 and 13.
 FIG. 12 is a processing flow for creating the vibration data 130 and the ground surface data 133 at the same time in this example. Steps S108 to S114 and steps S121 and S117 to S120 proceed in parallel.
 Step S121 determines whether the result of the frequency analysis in step S107 indicates that a single-pulse vibration has been detected. This is based on the fact that vibration caused by unevenness or steps of the ground contact surface or by a collision with an obstacle appears as a single-pulse vibration, unlike the periodic vibration caused by the movement of the center of gravity of the moving unit or by the impact at the time of contact with the ground.
 In detail, steps S108 to S114 are the same as the processing described with reference to FIG. 4.
 In step S121, the ground surface data creation unit 127 ends the parallel processing if the result of the frequency analysis unit 122 is not a single-pulse vibration, and performs the processing of steps S117 to S120 described with reference to FIG. 9 if it is a single-pulse vibration.
 FIG. 13 is a diagram showing a time-frequency analysis result of the frequency analysis unit 122.
 The frequency analysis unit 122 obtains the spectrogram of FIG. 13 by a short-time Fourier transform or a wavelet transform. Vibration that continues constantly, like the vibration at frequency F1, is taken as the vibration data 130 caused by the movement of the moving body, and vibration that appears as a single pulse, like the vibration at frequency F2, is taken as the ground surface data 133. In this way, the vibration data and the ground surface data can be separated and acquired at the same time.
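 A rough sketch of this separation with a short-time Fourier transform follows; the power threshold, window length, and the "fraction of frames" criterion for deciding what counts as a constant component versus a single pulse are assumptions, not values from the patent.

```python
# Sketch of separating a constantly present component (F1) from a single-pulse
# component (F2) on the spectrogram, as in FIG. 13.
import numpy as np
from scipy.signal import stft

def split_vibrations(position, fs, power_threshold=1e-6, persistent_fraction=0.8):
    """position: 1-D array of hand/target positions, fs: sampling rate [Hz]."""
    f, t, Z = stft(position, fs=fs, nperseg=256)
    power = np.abs(Z) ** 2
    active = power > power_threshold              # time-frequency bins with energy
    frame_fraction = active.mean(axis=1)          # how often each frequency is active
    persistent = frame_fraction >= persistent_fraction    # e.g. F1: present in most frames
    transient = (frame_fraction > 0) & ~persistent         # e.g. F2: a short burst
    return f[persistent], f[transient]
```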
 According to this example, by controlling the robot arm equipped with the imaging device using the imaging data taken by the mobile imaging robot, the shaking due to movement can be predicted and cancelled, so that the field of view of imaging during movement can be kept stable.
 The present invention is not limited to the examples described above and includes various modifications. The above examples have been described in detail to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements. It is also possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and to add the configuration of another embodiment to the configuration of one embodiment.
 10 Robot arm control unit
 11 Input/output unit
 12 Calculation unit
 120 Target selection unit
 121 Target tracking unit
 122 Frequency analysis unit
 123 Frequency/amplitude determination unit
 124 Vibration compensation control unit
 125 Robot arm motion control unit
 13 Storage unit
 130 Vibration data
 131 Vibration compensation control data
 20 Network
 30 Computer
 40 Robot arm
 41 Robot arm drive motor
 50 Imaging unit
 60 Moving unit
 61 Moving unit drive motor

Claims (9)

  1.  A mobile imaging robot system comprising:
     an imaging unit that photographs the surroundings and outputs imaging data;
     a robot arm that mounts the imaging unit on a hand and adjusts the position of the hand;
     a moving unit that carries the robot arm and moves; and
     a robot arm control unit that controls the robot arm, based on a tracking trajectory of the position of a predetermined target in a captured image of the imaging unit, so that vibration of the hand of the robot arm due to movement of the center of gravity of the moving unit or the impact at the time of contact with the ground does not occur.
  2.  The mobile imaging robot system according to claim 1, wherein
     the robot arm control unit comprises:
     a target tracking unit that acquires, from the captured image, a change in the hand position of the robot arm accompanying movement of the moving unit;
     a frequency analysis unit that performs frequency analysis on time-series data of the hand position of the robot arm;
     a frequency/amplitude determination unit that obtains, based on an analysis result of the frequency analysis unit, a dominant frequency and its amplitude relating to vibration of the hand position of the robot arm;
     a vibration compensation control unit that creates, based on the frequency and its amplitude obtained by the frequency/amplitude determination unit, an operation command for causing the robot arm to perform an operation that cancels out the vibration of the hand position of the robot arm due to movement of the moving unit; and
     a robot arm motion control unit that controls rotation of a robot arm drive motor based on the operation command.
  3.  The mobile imaging robot system according to claim 2, wherein
     the vibration compensation control unit generates the operation command when the frequency/amplitude determination unit has obtained the same dominant frequency and amplitude for a predetermined time.
  4.  The mobile imaging robot system according to claim 2, wherein
     the frequency/amplitude determination unit
     takes the walking cycle as the dominant frequency when the moving unit is of a multi-legged robot type, and
     obtains the dominant frequency based on the rotational frequency of a belt portion or the rotational frequency of a motor driving the belt portion when the moving unit is of an endless track type.
  5.  The mobile imaging robot system according to claim 2, wherein
     the robot arm control unit further comprises:
     a position information acquisition unit that acquires position information of the moving unit within a movement range of the moving unit; and
     a ground surface data creation unit that obtains, based on the analysis result of the frequency analysis unit, the frequency and amplitude of vibration of the hand position of the robot arm due to unevenness or steps of the ground contact surface of the moving unit or due to obstacles, and
     the robot arm control unit controls the robot arm so that vibration of the hand of the robot arm does not occur.
  6.  The mobile imaging robot system according to claim 5, wherein
     the ground surface data creation unit obtains the frequency and amplitude of vibration of the hand position of the robot arm due to unevenness or steps of the ground contact surface of the moving unit while the robot arm is performing the operation that cancels out the vibration of the hand position of the robot arm due to movement.
  7.  The mobile imaging robot system according to claim 5, wherein
     when a single-pulse vibration of the hand position of the robot arm due to a collision with unevenness or a step of the ground contact surface of the moving unit is detected, the ground surface data creation unit obtains the frequency and amplitude of the vibration of the hand position of the robot arm due to the unevenness or step of the ground contact surface of the moving unit in parallel with the acquisition of the frequency and amplitude of the vibration of the hand position of the robot arm due to movement of the moving unit.
  8.  A method for controlling a mobile imaging robot system having an imaging unit that photographs the surroundings and outputs imaging data, a robot arm that mounts the imaging unit on a hand and adjusts the position of the hand, and a moving unit that carries the robot arm and moves, the method comprising:
     acquiring a tracking trajectory of the position of a predetermined target in a captured image of the imaging unit;
     obtaining, based on the tracking trajectory, a dominant frequency and its amplitude relating to the frequency of vibration of the hand of the robot arm caused by movement of the center of gravity of the moving unit or by the impact at the time of contact with the ground; and
     causing the robot arm to perform an operation that cancels out the vibration of the hand of the robot arm based on the frequency and its amplitude.
  9.  The method for controlling a mobile imaging robot system according to claim 8, further comprising:
     obtaining the frequency and amplitude of vibration of the hand position of the robot arm due to unevenness or steps of the ground contact surface of the moving unit while the robot arm is performing the operation that cancels out the vibration of the hand position of the robot arm due to movement; and
     causing the robot arm to perform an operation that cancels out the vibration of the hand of the robot arm caused by the movement of the center of gravity of the moving unit or the impact at the time of contact with the ground and by the unevenness or steps of the ground contact surface of the moving unit.
PCT/JP2023/001032 2022-03-03 2023-01-16 Mobile imaging robot system and method for controlling same WO2023166868A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022032423A JP2023128220A (en) 2022-03-03 2022-03-03 Mobile imaging robot system and method for controlling same
JP2022-032423 2022-03-03

Publications (1)

Publication Number Publication Date
WO2023166868A1 true WO2023166868A1 (en) 2023-09-07

Family

ID=87883717

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/001032 WO2023166868A1 (en) 2022-03-03 2023-01-16 Mobile imaging robot system and method for controlling same

Country Status (2)

Country Link
JP (1) JP2023128220A (en)
WO (1) WO2023166868A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5989247A (en) * 1982-07-15 1984-05-23 エグル アジュール コーセプト エス・エヌ・セ Remote controller for position of support section held to movable body
JPH10210345A (en) * 1997-01-22 1998-08-07 Sony Corp Robot camera device
JP2004297670A (en) * 2003-03-28 2004-10-21 Hiroo Iwata Vibration isolating mechanism for mobile omnidirectional camera device
JP2005081447A (en) * 2003-09-04 2005-03-31 Tech Res & Dev Inst Of Japan Def Agency Traveling robot
US20060267327A1 (en) * 2005-05-30 2006-11-30 Move'n Shoot Gmbh Apparatus for attachment of a camera to a vehicle

Also Published As

Publication number Publication date
JP2023128220A (en) 2023-09-14

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23763130

Country of ref document: EP

Kind code of ref document: A1