WO2022190434A1 - Instruction description support system, instruction description support method, and instruction description support program - Google Patents


Info

Publication number
WO2022190434A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
action
unique
code
indicating
Prior art date
Application number
PCT/JP2021/034738
Other languages
English (en)
Japanese (ja)
Inventor
篤志 大城
Original Assignee
オムロン株式会社 (OMRON Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社 (OMRON Corporation)
Publication of WO2022190434A1

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 — Programme-controlled manipulators
    • B25J 9/16 — Programme controls
    • B25J 9/1656 — Programme controls characterised by programming, planning systems for manipulators

Definitions

  • the present invention relates to an instruction description support system, an instruction description support method, and an instruction description support program.
  • Patent Document 1 discloses a technique that allows a teacher to easily create a control program.
  • Patent Document 2 discloses a control device that can be easily programmed even by a user who does not have advanced programming skills.
  • Patent Document 3 discloses a method for facilitating teaching of specific movements of a robot when teaching the robot.
  • Patent Document 4 discloses an automatic teaching system that automatically teaches assembly work to an assembly work robot.
  • the purpose of the present invention is to provide a technology that can create instructions describing the motion of a robot according to the motion unique to the robot.
  • an instruction description support system according to one aspect includes: a calculation unit that calculates, using a score indicating the similarity between waveform data and eigenvectors in a time interval set in motion waveform data representing the motion of a target robot, a binary code indicating the motion in that time interval; an estimation unit that estimates a motion element from the binary code by referring to a motion element model; and a generation unit that refers to a database in which unique motions each composed of one or more motion elements are registered, determines one or more unique motions indicating the motion of the target robot, and generates a unique motion code indicating the determined one or more unique motions as an instruction code.
  • According to this configuration, the instruction code can be generated by determining unique motions from the motion waveform data representing the motion of the target robot. Because the unique motion is used as the basic unit for describing the robot's motion, instructions describing robot motion can be standardized and author-specific variation suppressed.
  • the instruction description support system may further include an extraction unit that extracts eigenvectors from the motion waveform data, and a model configuration unit that configures a motion element model by clustering a plurality of binary codes.
  • a motion element model can be configured by extracting motion elements based on the eigenvectors included in the motion waveform data.
  • the calculation unit may calculate the similarity of the waveform data in the time interval with respect to each of a plurality of eigenvectors, and determine the value of each bit constituting the binary code according to the magnitude relationship between two similarities randomly selected from the calculated similarities. According to this configuration, the feature amount for identifying a motion element can be calculated efficiently.
  • the calculation unit may shift the time interval set in the motion waveform data along the time axis. According to this configuration, corresponding motion elements can be determined sequentially for the continuous motion of the robot.
  • the instruction description support system may associate a series of motion elements, estimated using the motion element model from motion waveform data obtained while the robot performs a predetermined motion, with a verb representing that motion, thereby determining a unique motion composed of one or more motion elements included in the motion, and may register a unique motion code indicating the determined unique motion in the database.
  • According to this configuration, a unique motion composed of one or more motion elements is determined for the target robot, and the determined unique motion can be used as the basic unit for describing the robot's motion.
  • the instruction description support system may further include a motion recognition unit that determines the verb indicating a predetermined motion by motion recognition of a moving image of the robot performing that motion. According to this configuration, each motion of the robot can be labeled automatically rather than manually.
  • an instruction description support method according to another aspect includes: a step of calculating, based on a score indicating the degree of similarity between waveform data and eigenvectors in a time interval set in motion waveform data representing the motion of a target robot, a binary code indicating the motion in that time interval; a step of estimating a motion element from the binary code by referring to a motion element model; and a step of referring to a database in which unique motions each composed of one or more motion elements are registered, determining one or more unique motions indicating the motion of the target robot, and generating a unique motion code indicating the determined one or more unique motions as an instruction code.
  • an instruction description support program according to yet another aspect causes a computer to execute: a step of calculating, based on a score indicating the similarity between waveform data and eigenvectors in a time interval set in motion waveform data representing the motion of a target robot, a binary code indicating the motion in that time interval; a step of estimating a motion element from the binary code by referring to a motion element model; and a step of referring to a database in which unique motions each composed of one or more motion elements are registered, determining one or more unique motions indicating the motion of the target robot, and generating a unique motion code indicating the determined one or more unique motions as an instruction code.
  • FIG. 1 is a schematic diagram for explaining an outline of processing in the instruction description support system according to the present embodiment;
  • FIG. 2 is a schematic diagram outlining a system configuration example including the instruction description support system according to the present embodiment;
  • FIG. 3 is a schematic diagram showing a hardware configuration example of a control device according to the present embodiment;
  • FIG. 4 is a schematic diagram showing a hardware configuration example of a robot controller according to the present embodiment;
  • FIG. 5 is a schematic diagram showing a hardware configuration example of an information processing apparatus according to the present embodiment;
  • FIG. 6 is a flow chart showing the processing procedure of the overall processing executed by the instruction description support system according to the present embodiment;
  • FIG. 7 is a diagram for explaining the processing for calculating an ECT score in the instruction description support system according to the present embodiment;
  • FIG. 8 is a diagram for explaining the processing for calculating a binary code in the instruction description support system according to the present embodiment;
  • FIG. 9 is a diagram for explaining the clustering processing for constructing the motion element model in the instruction description support system according to the present embodiment;
  • FIG. 10 is a diagram for explaining the labeling processing for constructing the motion element model in the instruction description support system according to the present embodiment;
  • FIG. 11 is a diagram for explaining the similarity evaluation processing for constructing the motion element model in the instruction description support system according to the present embodiment;
  • FIG. 12 is a diagram for explaining the processing for configuring the action definition database in the instruction description support system according to the present embodiment;
  • FIG. 13 is a diagram for explaining the relationship between the action definition database and motion elements in the instruction description support system according to the present embodiment;
  • FIG. 14 is a schematic diagram showing a functional configuration example for configuring the action definition database of the instruction description support system according to the present embodiment;
  • FIG. 15 is a schematic diagram showing a functional configuration example for generating an instruction code of the instruction description support system according to the present embodiment;
  • FIG. 16 is a diagram showing an example of an instruction code generated by the instruction description support system according to the present embodiment;
  • FIG. 1 is a schematic diagram for explaining the outline of the processing in the instruction description support system 1 according to the present embodiment. An example of the process in which the instruction description support system 1 outputs an instruction code 72 defining the motion of a robot will be described with reference to FIG. 1.
  • First, the instruction description support system 1 constructs a motion element model 48 (see FIG. 10) and an action definition database 60 (step S1).
  • the motion element model 48 is configured based on the characteristics of motion elements, and can be used to identify motion elements (motion element codes) included in arbitrary motion waveform data.
  • the action definition database 60 stores unique action codes 66 assigned to unique actions 64 composed of one or more action elements.
  • one or more unique actions 64 may be classified into one or more action categories.
  • a "unique motion" is a unit that defines a motion unique to the target robot, and is composed of one or more "motion elements".
  • "Motion elements" are the individual motions obtained by decomposing a "unique motion" into more detailed unit motions.
  • Next, features of motion elements are extracted from the motion of the target robot, and unique motions corresponding to the extracted motion element features are searched for in the action definition database 60. Based on the search results, the unique motions constituting the motion of the target robot are determined (step S2). An instruction code 72 is then generated from the unique motion codes corresponding to the determined unique motions (step S3). That is, the instruction code 72 is defined as a combination of one or more unique motions 64.
  • In this way, the instruction description support system 1 defines the unique motions 64 of the robot and describes the motion to be executed by the target robot as a combination of the unique motion codes 66 indicating those unique motions 64.
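The pipeline of steps S1 to S3 can be sketched in code. The following is a minimal, hypothetical illustration: the function names, the dictionary-based database, and the greedy longest-match strategy are assumptions for exposition, not the patent's actual implementation.

```python
def estimate_elements(binary_codes, element_model):
    """Map each binary code to a motion element code via a lookup model (step S2)."""
    return [element_model[code] for code in binary_codes]

def generate_instruction_code(element_codes, action_db):
    """Match runs of motion element codes against registered unique actions and
    emit the corresponding unique action codes as the instruction code (step S3)."""
    instruction = []
    i = 0
    while i < len(element_codes):
        # greedily take the longest unique action whose element sequence matches here
        best = None
        for action_code, elements in action_db.items():
            n = len(elements)
            if element_codes[i:i + n] == elements:
                if best is None or n > len(action_db[best]):
                    best = action_code
        if best is None:
            i += 1          # no unique action matches; skip this element
        else:
            instruction.append(best)
            i += len(action_db[best])
    return instruction
```

For example, with a model mapping binary codes to elements E1/E2 and a database where "PICK" is (E1, E2) and "MOVE" is (E1), the element string E1, E2, E1 yields the instruction code PICK, MOVE.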
  • FIG. 2 is a schematic diagram outlining a system configuration example including the instruction description support system 1 according to the present embodiment.
  • instruction description support system 1 includes control device 100 and robot controller 250 connected to control device 100 via field network 10 .
  • Control device 100 may typically be realized by a PLC (programmable logic controller).
  • the robot controller 250 is in charge of controlling the robot 200. More specifically, in accordance with commands from the control device 100, the robot controller 250 functions as an interface with the robot 200, outputs commands for driving the robot 200, and acquires state values of the robot 200 and outputs them to the control device 100.
  • For the field network 10, protocols for industrial networks such as EtherCAT (registered trademark) and EtherNet/IP can be used.
  • the control device 100 is connected to the information processing device 300 and the display device 400 via the host network 20 .
  • For the host network 20, a protocol for industrial networks such as EtherNet/IP can be used.
  • the instruction description support system 1 may include a camera 4 for photographing the robot 200 and collecting moving images.
  • Note that the instruction description support system 1 does not necessarily need to include the control device 100, the robot 200, the robot controller 250, the display device 400, and the like; it suffices that some processing entity exists that can execute the instruction-code generation processing described later.
  • FIG. 3 is a schematic diagram showing a hardware configuration example of the control device 100 according to this embodiment.
  • control device 100 includes a processor 102, a main memory 104, a storage 110, a memory card interface 112, a host network controller 106, a field network controller 108, a local bus controller 116, and a USB controller 120 that provides a USB (Universal Serial Bus) interface. These components are connected via a processor bus 118.
  • the processor 102 corresponds to an arithmetic processing unit that executes control computations, and is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like. Specifically, the processor 102 reads a program stored in the storage 110, loads it into the main memory 104, and executes it, thereby implementing control computations for the controlled object.
  • the main memory 104 is composed of volatile storage devices such as DRAM (Dynamic Random Access Memory) and SRAM (Static Random Access Memory).
  • the storage 110 is configured by, for example, a non-volatile storage device such as SSD (Solid State Drive) or HDD (Hard Disk Drive).
  • the storage 110 stores a system program 1102 for realizing basic functions, an IEC program 1104 and a robot program 1106 created according to the object to be controlled, and the like.
  • the IEC program 1104 is also called a PLC program, and includes instructions necessary to implement processes other than the process of controlling the robot 200.
  • the IEC program 1104 may be written in any language defined by IEC 61131-3, a standard of the International Electrotechnical Commission (IEC).
  • the robot program 1106 includes instructions for controlling the robot 200.
  • the robot program 1106 may include instructions written in a predetermined programming language (for example, a programming language for robot control such as V+ language or a programming language for NC control such as G code).
  • the memory card interface 112 accepts a memory card 114, which is an example of a removable storage medium.
  • the memory card interface 112 is capable of reading/writing arbitrary data from/to the memory card 114 .
  • the host network controller 106 exchanges data with arbitrary information processing devices (such as the information processing device 300 and the display device 400 shown in FIG. 2) via the host network 20.
  • the field network controller 108 exchanges data with devices such as the robot controller 250 via the field network 10 .
  • the local bus controller 116 exchanges data with any functional unit 130 included in the control device 100 via the local bus 122 .
  • the USB controller 120 exchanges data with any information processing device via a USB connection.
  • FIG. 4 is a schematic diagram showing a hardware configuration example of the robot controller 250 according to this embodiment.
  • robot controller 250 includes field network controller 252 and control processing circuit 260 .
  • the field network controller 252 mainly exchanges data with the control device 100 via the field network 10 .
  • control processing circuit 260 executes arithmetic processing required to drive the robot 200 .
  • control processing circuitry 260 includes processor 262 , main memory 264 , storage 270 and interface circuitry 268 .
  • a processor 262 executes control operations for driving the robot 200 .
  • the main memory 264 is composed of, for example, a volatile memory device such as DRAM or SRAM.
  • the storage 270 is configured by, for example, a non-volatile storage device such as SSD or HDD.
  • the storage 270 stores a system program 272 for realizing control for driving the robot 200 .
  • the system program 272 includes commands for executing control operations related to the operation of the robot 200 and commands related to interfacing with the robot 200 .
  • FIG. 5 is a schematic diagram showing a hardware configuration example of the information processing apparatus 300 according to this embodiment.
  • information processing apparatus 300 includes a processor 302 such as a CPU or MPU, a main memory 304, a storage 310, a network controller 320, a USB controller 324, an input unit 326, and a display unit 328. These components are connected via a bus 308.
  • the processor 302 reads various programs stored in the storage 310 , develops them in the main memory 304 , and executes them, thereby realizing necessary processing in the information processing apparatus 300 .
  • the storage 310 is composed of, for example, an HDD or SSD.
  • the storage 310 typically stores an OS 312 and an instruction description support program 314 for realizing the processing described later. Note that the storage 310 may store necessary programs other than those shown in FIG. 5.
  • the network controller 320 exchanges data with any information processing device via any network.
  • the USB controller 324 exchanges data with any information processing device via a USB connection.
  • the input unit 326 is composed of a mouse, keyboard, touch panel, etc., and receives instructions from the user.
  • a display unit 328 includes a display, various indicators, and the like, and outputs processing results from the processor 302 and the like.
  • the information processing device 300 may have an optical drive 306 .
  • the optical drive 306 reads a program from a recording medium 307 (for example, an optical recording medium such as a DVD (Digital Versatile Disc)) that non-transitorily stores a computer-readable program, and installs it in the storage 310 or the like.
  • Various programs executed by the information processing device 300 may be installed via the computer-readable recording medium 307, or may be installed by downloading them from any server on the network.
  • Display device 400 may be implemented using a general-purpose personal computer as an example. Since the basic hardware configuration example of the display device 400 is well known, detailed description thereof will not be given here.
  • FIGS. 3 to 5 show configuration examples in which one or more processors execute programs to provide the necessary functions; alternatively, some or all of these functions may be implemented using dedicated hardware circuits (for example, an ASIC (Application Specific Integrated Circuit) or FPGA (Field-Programmable Gate Array)).
  • the necessary processing may be executed by a single information processing device 300, or by a plurality of information processing devices 300 cooperating.
  • part or all of the instruction description support processing according to the present embodiment may be executed using so-called computer resources on the cloud.
  • FIG. 6 is a flow chart showing a processing procedure of overall processing executed by the instruction description support system 1 according to the present embodiment. Each step shown in FIG. 6 is typically implemented by processor 302 of information processing apparatus 300 executing instruction description support program 314 .
  • information processing device 300 collects motion waveform data 30 and moving image 46 from a target robot (step S100).
  • the information processing apparatus 300 extracts eigenvectors from the collected operation waveform data 30 (step S102). That is, the information processing device 300 extracts eigenvectors from the motion waveform data 30 representing the motion of the target robot.
  • the information processing device 300 selects one motion waveform data 30 collected from the target robot (step S104).
  • the information processing apparatus 300 sets the window function 32 on the selected motion waveform data 30 (step S106), and calculates the ECT score for the set window function 32 based on the extracted eigenvectors and the feature amounts of the motion waveform data 30 included in the set window function 32 (step S108).
  • the ECT score is a value indicating the degree of similarity between the feature quantity of the operating waveform data 30 included in the window function 32, which is a time interval, and each eigenvector.
  • the information processing device 300 calculates a binary code from the calculated ECT score (step S110).
  • the information processing device 300 shifts the window function 32 to the next time interval (step S112).
  • the processes of steps S106 to S112 are repeated for the number of window functions 32 set on the motion waveform data 30.
  • the information processing device 300 then selects the next motion waveform data 30 collected from the target robot (step S114), and repeats the processing of steps S106 to S112. The processing of steps S104 to S114 is repeated for the number of target motion waveform data 30.
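The loop structure of steps S104 to S114 can be sketched as follows. Here `compute_binary_code` is a placeholder for the ECT-score and binarization processing of steps S108 and S110, and the width/stride parameters are assumptions for illustration.

```python
def windowed_codes(waveform, width, stride, compute_binary_code):
    """Slide a fixed-width window (the window function 32) over one waveform and
    compute one binary code per window position (steps S106 to S112)."""
    codes = []
    for start in range(0, len(waveform) - width + 1, stride):
        window = waveform[start:start + width]   # the current time interval
        codes.append(compute_binary_code(window))
    return codes
```

The outer loop over all collected waveforms (steps S104 and S114) would simply call this function once per waveform.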
  • the information processing device 300 plots the calculated binary codes in the hyperspace 38 (step S116), and determines one or more clusters 40 representing motion feature amounts formed in the hyperspace 38 (step S118). Then, the information processing apparatus 300 labels each of the determined clusters 40 with a motion element and assigns a motion element code 44 corresponding to the motion element (step S120).
  • Through the above processing, the motion element model 48 is constructed. In response to an input binary code, the motion element model 48 outputs the corresponding motion element and the motion element code 44 indicating that motion element.
  • Next, the action definition database 60 is constructed. More specifically, the information processing device 300 calculates a binary code for each window function from the motion waveform data 30 corresponding to an arbitrary robot motion (step S122), and sequentially inputs the calculated binary codes to the motion element model 48 to obtain a data string of motion element codes 44 (step S124).
  • the data string of the motion element code 44 corresponds to the motion elements of the robot arranged in chronological order.
  • the information processing device 300 performs action recognition on the moving image 46 corresponding to the robot motion for which the binary codes were calculated in step S122, and determines a string of verbs (words) corresponding to the motion elements of the robot (step S126). In this way, the information processing apparatus 300 determines the verb indicating a predetermined motion by motion recognition of the moving image 46 of the robot performing that motion. Instead of, or in addition to, the moving image 46, the path plan of the target robot may be used. Also, a moving image 46 obtained by virtually capturing the robot may be used instead of one obtained by actually capturing the robot.
  • the information processing apparatus 300 determines the action definition of a unique action 64 by combining the data string of motion element codes 44 calculated in step S124 and the verb determined in step S126 (step S128), and assigns a unique action code 66 to identify each unique action 64 (step S130). Then, the information processing device 300 registers the unique action 64 and the corresponding unique action code 66 in the action definition database 60 (step S132).
  • as needed, the information processing apparatus 300 may also register entries in the action definition database 60 in units of a work step 68 made up of a plurality of unique actions 64, or a process 70 made up of a plurality of work steps 68.
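The registration of steps S128 to S132 can be illustrated with a small sketch. The dictionary-based database and the "UA-&lt;n&gt;" code format are assumptions made here for exposition only; the patent does not specify a code format.

```python
def register_unique_action(action_db, element_codes, verb):
    """Pair a motion-element-code string with the verb obtained from action
    recognition, assign a unique action code, and register it (steps S128-S132).
    The "UA-<n>" code format is illustrative only."""
    unique_action_code = f"UA-{len(action_db) + 1}"
    action_db[unique_action_code] = {"verb": verb, "elements": list(element_codes)}
    return unique_action_code
```

Registering the element string E1, E2 with the verb "grasp" in an empty database would, under these assumptions, yield the code UA-1.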
  • The action definition database 60 is configured by the above processing. Next, using the action definition database 60, an instruction code 72 corresponding to an arbitrary robot motion is generated.
  • the information processing apparatus 300 calculates a binary code for each window function from the motion waveform data 30 corresponding to an arbitrary robot motion (step S134), and sequentially inputs the calculated binary codes to the motion element model 48. By doing so, the data string of the action element code 44 is calculated (step S136). That is, the information processing device 300 refers to the motion element model 48 and estimates motion elements from the binary code.
  • the data string of the motion element code 44 corresponds to the motion elements of the robot arranged in chronological order.
  • the information processing device 300 refers to the action definition database 60 to search for the unique action 64 that matches or is most similar to the data string of motion element codes 44 (step S138). Finally, the information processing device 300 sequentially outputs the unique action codes 66 corresponding to the unique actions 64 that match or are most similar to the data string (step S140). The set of sequentially output unique action codes 66 constitutes the instruction code 72.
  • In this way, the information processing apparatus 300 refers to the action definition database 60, in which unique actions 64 each composed of one or more motion elements are registered, determines one or more unique actions 64 indicating the motion of the target robot, and generates the unique action codes 66 indicating the determined unique actions 64 as the instruction code 72.
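The "match or most similar" search of step S138 can be sketched under the assumption that similarity is measured as a sequence-alignment ratio; the patent does not specify the similarity criterion, so Python's `difflib` is used here purely as a stand-in.

```python
import difflib

def most_similar_action(action_db, element_codes):
    """Return the unique action code whose registered element sequence is most
    similar to the observed motion-element-code string (stand-in for step S138)."""
    def score(code):
        # similarity ratio in [0, 1] between the registered and observed sequences
        return difflib.SequenceMatcher(
            None, action_db[code]["elements"], element_codes).ratio()
    return max(action_db, key=score)
```

An exact match scores 1.0, so this single function covers both the "matches" and the "most similar" cases of step S138.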
  • the information processing device 300 collects a predetermined amount of motion waveform data from the target robot.
  • the motion waveform data corresponds to time-series data of motion feature vectors collected for each predetermined sampling period.
  • a motion feature vector is a multi-dimensional vector defined by a plurality of feature amounts relating to the motion of the target robot.
  • the feature values include, for example, the TCP posture (position of each axis), joint torque, TCP velocity, TCP acceleration, TCP jerk, etc. of the target robot.
  • For an articulated robot, for example, the TCP posture (X, Y, Z, Rx, Ry, Rz), joint rotation angles (six axes), joint torques (six axes), and TCP velocity (Vx, Vy, Vz, Vrx, Vry, Vrz) can be used as a 24-dimensional feature vector.
  • a mobile robot may use a total of 24-dimensional feature values, including the current posture (X, Y, θ), drive train torque (L torque, R torque), speed, and IMU data (speed, rotational position, rotational speed).
  • Any feature quantity may be used as long as it is information that defines the motion of the target robot.
  • the information processing device 300 extracts one or more eigenvectors by performing principal component analysis on the collected motion waveform data. Any method can be used to extract the eigenvectors. The extracted eigenvectors can be considered to represent the fundamental motion characteristics of the target robot.
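Since any extraction method can be used, one straightforward realization is a plain-NumPy principal component analysis over the feature-vector time series; this sketch is one possible implementation, not the patent's prescribed one.

```python
import numpy as np

def extract_eigenvectors(waveform, k):
    """Extract k principal axes from motion waveform data of shape (T, D),
    where each of the T rows is a D-dimensional motion feature vector."""
    centered = waveform - waveform.mean(axis=0)   # remove the mean posture
    cov = np.cov(centered, rowvar=False)          # (D, D) covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]             # largest variance first
    return eigvecs[:, order[:k]].T                # (k, D) eigenvectors
```

The leading eigenvectors span the directions in feature space along which the robot's motion varies most.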
  • FIG. 7 is a diagram for explaining the process of calculating the ECT score in the instruction description support system 1 according to this embodiment.
  • the motion waveform data 30 corresponds to time-series data of motion feature vectors collected at predetermined sampling intervals.
  • the information processing apparatus 300 sequentially applies a window function 32 having a predetermined time width to the operating waveform data 30 while shifting the window function 32 along the time axis. That is, the information processing apparatus 300 shifts the window function 32, which is the time interval set in the operating waveform data 30, along the time axis.
  • the information processing apparatus 300 uses the window function 32 to extract eigenvectors that match the operating waveform data 30 from the eigenvectors extracted in advance.
  • the information processing device 300 calculates a time correlation matrix between the target data included in the window function 32 and the reference data.
  • Information processing apparatus 300 compares the time correlation matrix and the eigenvector, and calculates the eigenvalue from the time correlation matrix.
  • the normalized dot product indicates how similar two vectors are, with values ranging from -1 to 1.
  • In this way, an ECT score indicating the inner product between the correlation matrix and each eigenvector in the window function 32 can be calculated.
  • the information processing device 300 calculates a score (ECT score) for each eigenvector for each window function 32 set in the operating waveform data 30 .
  • the calculated ECT scores are output as an ECT score list 34.
  • In this way, the information processing apparatus 300 determines, for a predetermined number of eigenvectors, the eigenvector that chronologically matches the motion waveform data 30, calculates the angle between the vectors by dividing their inner product by the product of their norms, and normalizes the calculated angle to obtain the ECT score.
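As a simplified, hedged illustration of such a normalized similarity score, the following sketch uses the window's mean feature vector in place of the full time-correlation matrix (an assumption made here to keep the example short); the cosine of the angle is then mapped to [0, 1].

```python
import numpy as np

def ect_scores(window, eigenvectors):
    """window: (W, D) waveform slice; eigenvectors: (k, D).
    Returns one normalized similarity score per eigenvector.
    Simplified stand-in: the patent compares a time correlation matrix,
    whereas here the window's mean feature vector is used instead."""
    mean_vec = window.mean(axis=0)
    scores = []
    for v in eigenvectors:
        cos = float(np.dot(mean_vec, v) /
                    (np.linalg.norm(mean_vec) * np.linalg.norm(v)))
        scores.append((cos + 1.0) / 2.0)   # map [-1, 1] to [0, 1]
    return scores
```

A window moving purely along an eigenvector's direction scores 1.0 against that eigenvector and 0.5 against an orthogonal one.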
  • Processing for calculating a binary code (step S110)
  • FIG. 8 is a diagram for explaining the process of calculating the binary code 36 in the instruction description support system 1 according to this embodiment.
  • binary code 36 is calculated based on test patches.
  • a test patch is a combination of two ECT scores (ECTx and ECTy) randomly selected from the group of ECT scores included in the pre-calculated ECT score list 34.
  • For example, the first and third ECT scores in the ECT score list 34 are selected as the first combination, the first and second ECT scores as the second combination, and the second and fifth ECT scores as the third combination.
  • two ECT scores are randomly selected sequentially.
  • the two ECT scores included in each test patch are compared: "1" is assigned if the first ECT score (ECTx) is greater than or equal to the second ECT score (ECTy), and "0" is assigned if the first ECT score (ECTx) is less than the second ECT score (ECTy).
  • In other words, the information processing apparatus 300 calculates the similarity of the waveform data in the time interval corresponding to the window function 32 for each of the plurality of eigenvectors, and determines the value of each bit constituting the binary code 36 according to the magnitude relationship between two similarities randomly selected from the calculated similarities.
  • a 36-bit binary code 36 is calculated as the motion feature amount.
  • the calculated binary code 36 corresponds to a code that indicates the motion of the robot in the time interval of the window function 32.
  • In this way, the information processing apparatus 300 uses the ECT score, which indicates the similarity between the waveform data of the window function 32 (a time interval set in the motion waveform data 30) and the eigenvectors, to calculate the binary code 36 indicating the motion in that time interval.
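The test-patch binarization above can be sketched as follows. The bit width and the way pairs are drawn are illustrative assumptions; a fixed seed stands in for the requirement that the same set of comparisons be reused for every window.

```python
import random

def binary_code(ect_score_list, n_bits, seed=0):
    """Binarize an ECT score list with random test patches: each bit compares
    two randomly chosen scores (ECTx vs. ECTy) and is 1 if the first is
    greater than or equal to the second. The fixed seed keeps the same patch
    set across windows so codes remain comparable."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n_bits):
        x = rng.randrange(len(ect_score_list))
        y = rng.randrange(len(ect_score_list))
        bits.append(1 if ect_score_list[x] >= ect_score_list[y] else 0)
    return tuple(bits)
```

Because the patch set is fixed, two windows with similar ECT score profiles produce binary codes that agree in most bit positions.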
  • FIG. 9 is a diagram for explaining the clustering process for constructing the action element model 48 in the instruction description support system 1 according to this embodiment.
  • a number of binary codes 36 calculated by the above procedure are plotted in the hyperspace 38. The hyperspace 38 in which the binary codes 36 are plotted has the same number of dimensions as the number of bits forming the binary code 36 (36 dimensions in the above example).
  • the binary codes 36 plotted in the hyperspace 38 form one or more clusters 40 corresponding to motion elements, and each cluster 40 exhibits the characteristics of one motion element.
  • FIG. 10 is a diagram for explaining the labeling process for constructing the action element model 48 in the instruction description support system 1 according to this embodiment.
  • motion elements 56 such as "turn", "stop", and "go straight" are labeled onto the motion feature amounts obtained by clustering, and motion element codes 44 corresponding to the motion elements 56 are assigned.
  • the motion element 56 can be determined from the details of the motion of the robot targeted by the collected motion waveform data.
  • a motion element model 48 is configured by the above clustering processing and labeling processing. That is, the information processing device 300 configures the action element model 48 by clustering the plurality of binary codes 36 . Note that the action element model 48 does not necessarily need to have the hyperspace 38 as shown in FIG. 10, and may be implemented in the form of a known classifier.
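Once the clusters have been labeled, classifying a new binary code reduces to finding the nearest cluster in the code space. The text does not name the clustering algorithm or distance, so the Hamming-distance nearest-centroid rule and the centroid values below are illustrative assumptions:

```python
import numpy as np

# Hypothetical cluster centroids in the 36-dimensional code space,
# labeled with motion element codes (cf. "88", "232", "235", "343").
centroids = {
    "88":  np.zeros(36, dtype=np.uint8),   # e.g. labeled "stop"
    "343": np.ones(36, dtype=np.uint8),    # e.g. labeled "go straight"
}

def classify(code: np.ndarray) -> str:
    """Return the motion element code of the nearest centroid,
    using Hamming distance between binary codes."""
    return min(centroids,
               key=lambda label: int(np.count_nonzero(centroids[label] != code)))

mostly_ones = np.ones(36, dtype=np.uint8)
mostly_ones[:3] = 0
assert classify(mostly_ones) == "343"   # still closest to the all-ones centroid
```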
  • FIG. 11 is a diagram for explaining the similarity evaluation process for constructing the action element model 48 in the instruction description support system 1 according to the present embodiment.
  • a histogram 42 of clusters 40 showing action elements 56 formed by binary code 36 plotted in hyperspace 38 as shown in FIG. 9 is computed.
  • the histogram 42 shows that the more similar the directions of the motion feature vectors, the higher the similarity between the motion feature amounts.
  • the Kullback-Leibler divergence can be used to evaluate the similarity of each distribution. That is, the Kullback-Leibler divergence (KL divergence) indicates how similar the probability distribution P(i) is to the probability distribution Q(i); the closer the KL divergence is to zero, the higher the degree of similarity between the two probability distributions.
  • the KL divergence D KL can be calculated according to the formula shown in FIG. 12, and evaluating D KL makes it possible to enhance the identification performance of the motion element model 48.
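The KL divergence used in this similarity evaluation can be computed directly from the histogram bins; the small epsilon below is an implementation detail to avoid taking the logarithm of zero, not part of the formula:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)).

    Zero when the two distributions coincide; the larger the value,
    the less similar Q is to P. Inputs are normalized to sum to 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

hist_p = [0.2, 0.3, 0.5]   # histogram of one cluster
hist_q = [0.5, 0.3, 0.2]   # histogram of another cluster
assert kl_divergence(hist_p, hist_p) == 0.0
assert kl_divergence(hist_p, hist_q) > 0.0
```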
  • the algorithm applied in FIG. 11 is essentially the same as the "Bag of Visual Words" approach used in image processing and object detection.
  • FIG. 12 is a diagram for explaining the process of configuring the action definition database 60 in the instruction description support system 1 according to this embodiment.
  • the information processing device 300 collects motion waveform data for an arbitrary robot motion, and also collects moving images and the like by photographing the robot motion.
  • the information processing device 300 calculates the binary code 36 for each window function from the collected operating waveform data (binary code list 50) according to the processing procedure shown in FIGS. Then, the information processing device 300 determines to which cluster 40 in the hyperspace 38 shown in FIG. 10 the binary code 36 for each window function is classified (class classification processing 52 using the motion element model 48).
  • an action element code 44 ("88", "232", "235", "343", etc.) indicating the determined action element 56 becomes a basic element of the instruction description.
  • the information processing device 300 performs action recognition from the corresponding moving image or the like, and determines a verb (word) corresponding to a series of actions (action (word) decision processing 54 based on action recognition).
  • One or more action elements 56 constitute a unique action 64 (eg, "wipe").
  • the information processing device 300 assigns a unique action code 66 to the determined unique action 64 .
  • in this way, a series of motion elements 56 is determined, and by assigning a verb to the determined series of motion elements 56, the unique action 64 and the unique action code 66 that identifies the unique action 64 are determined.
  • the unique action code 66 thus determined is the basic unit of the instruction code 72.
  • FIG. 13 is a diagram for explaining the relationship between the action definition database 60 and the action elements 56 in the instruction description support system 1 according to this embodiment.
  • action definition database 60 includes a plurality of unique actions 64 and corresponding unique action codes 66 .
  • Each unique action 64 is associated with one or more motion elements 56 that make up the unique action 64.
  • Each action element 56 is assigned an action element code 44 .
  • a unique action code 66 may correspond to one or more action element codes 44 .
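The relations just described can be pictured as a plain mapping. Every name, code, and element below is a hypothetical placeholder standing in for entries of the action definition database 60:

```python
# One unique action 64 carries a unique action code 66 and is associated
# with the motion element codes 44 of its constituent motion elements 56.
action_definitions = {
    "wipe": {
        "unique_action_code": "W001",   # hypothetical stand-in for a code 66
        "motion_elements": [("turn", "88"), ("stop", "232")],
    },
}

entry = action_definitions["wipe"]
assert entry["unique_action_code"] == "W001"
assert [code for _, code in entry["motion_elements"]] == ["88", "232"]
```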
  • the action definition database 60 is configured by the above processing.
(c7: Generation of instruction code 72)
  • Next, the processing for generating the instruction code 72 will be described (steps S134 to S140).
  • the information processing device 300 calculates the binary code 36 for each window function from the motion waveform data 30 corresponding to the robot motion for which the instruction code 72 is to be generated, according to the processing procedure shown in FIGS. Then, the information processing apparatus 300 inputs the binary code 36 to the motion element model 48 for each calculated window function, and determines the motion element code 44 corresponding to each window function. That is, the information processing device 300 calculates a series of motion element codes 44 corresponding to the binary code 36 for each window function.
  • the information processing device 300 refers to the action definition database 60 to determine a unique action corresponding to the series of calculated action element codes 44 . That is, the information processing apparatus 300 searches for the unique motion 64 that matches or is most similar to the combination of the motion element 56 and the motion element code 44 shown in FIG.
  • the information processing device 300 outputs a combination of unique action codes 66 corresponding to one or more unique actions 64 obtained by searching as an instruction code 72 .
  • the information processing device 300 refers to the motion element model 48 and the motion definition database 60 to generate the instruction code 72 from the motion waveform data 30 representing an arbitrary robot motion.
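These steps can be sketched as a lookup that maps an observed sequence of motion element codes to the registered unique action that matches or is most similar. The database contents and the positional-overlap similarity below are illustrative assumptions, not the method actually claimed:

```python
# Hypothetical action definition database: sequences of motion element
# codes 44 mapped to unique action codes 66.
ACTION_DB = {
    ("88", "232", "235"): "W001",   # e.g. unique action "wipe"
    ("88", "343"):        "G001",   # e.g. "go straight, then stop"
}

def similarity(seq, ref):
    """Positional overlap, penalized by the length difference."""
    return sum(a == b for a, b in zip(seq, ref)) - abs(len(seq) - len(ref))

def generate_instruction_code(element_codes):
    """Return the unique action code of the best-matching entry."""
    best = max(ACTION_DB, key=lambda ref: similarity(element_codes, ref))
    return ACTION_DB[best]

assert generate_instruction_code(("88", "232", "235")) == "W001"  # exact match
assert generate_instruction_code(("88", "232", "999")) == "W001"  # most similar
```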
  • FIG. 14 is a schematic diagram showing a functional configuration example for configuring the action definition database 60 of the instruction description support system 1 according to this embodiment.
  • FIG. 15 is a schematic diagram showing a functional configuration example for generating the instruction code 72 of the instruction description support system 1 according to this embodiment.
  • FIGS. 14 and 15 are typically implemented by the processor 302 of the information processing device 300 executing the instruction description support program 314.
  • information processing apparatus 300 includes, as the functional configuration for configuring the action definition database 60, an eigenvector extraction unit 350, a window function setting unit 352, an ECT score calculation unit 354, a binary code calculation unit 356, an action element model configuration unit 358, an action element determination unit 360, an action recognition unit 362, and an action definition database configuration unit 364.
  • the eigenvector extraction unit 350 extracts eigenvectors or eigenvector candidates from the motion waveform data 30 collected from the robot. That is, the eigenvector extraction unit 350 extracts eigenvectors from the motion waveform data 30 representing the motion of the target robot.
  • the window function setting unit 352 sets the window function 32 for the motion waveform data 30 collected from the robot, and sequentially shifts the window function 32 along the time axis.
  • the ECT score calculator 354 calculates an ECT score based on the eigenvector extracted by the eigenvector extraction unit 350 and the feature amount of the operating waveform data 30 included in the window function 32 set by the window function setting unit 352.
  • the binary code calculator 356 calculates the binary code 36 from the ECT score calculated by the ECT score calculator 354. That is, the binary code calculator 356 calculates the binary code 36 representing the operation in the time interval from the ECT score, which indicates the similarity between the eigenvector and the waveform data in the time interval (window function 32) set in the operation waveform data 30.
  • the action element model constructing unit 358 constructs the action element model 48 for estimating the action element 56 by clustering a plurality of binary codes 36 . More specifically, the action element model constructing unit 358 plots the binary code 36 calculated by the binary code calculating unit 356 in the hyperspace 38 and determines the cluster 40 from the set of plotted binary codes 36 . The action element model constructing unit 358 constructs the action element model 48 by labeling the action element 56 and assigning the action element code 44 corresponding to the action element 56 to each cluster 40 .
  • the motion element determination unit 360 inputs the binary code 36 for each window function calculated by the binary code calculation unit 356 to the motion element model 48, and determines the motion element 56 corresponding to the motion waveform data 30 and the corresponding motion element code 44. Output sequentially.
  • the motion recognition unit 362 outputs verbs (words) corresponding to the motion waveform data 30 based on the moving image 46 . That is, the motion recognition unit 362 determines the verb indicating the predetermined motion by recognizing the motion image of the robot performing the predetermined motion.
  • the action definition database construction unit 364 constructs the action definition database 60 by combining the action element codes 44 output in time series from the action element determination unit 360 with the verbs output in time series from the action recognition unit 362. That is, the action definition database construction unit 364 associates a series of action elements, estimated using the action element model 48 from the action waveform data 30 obtained when the robot performs a predetermined action, with a verb indicating the predetermined action, thereby determining a unique action 64 composed of one or more action elements included in the predetermined action, and registers a unique action code 66 indicating the determined unique action 64 in the action definition database 60.
  • the action definition database 60 configured using the functional configuration as described above is used to generate the instruction code 72 in the following manner, for example.
  • information processing apparatus 300 includes, as the functional configuration for generating the instruction code 72, a window function setting unit 352, an ECT score calculation unit 354, a binary code calculation unit 356, a motion element estimation unit 370, and an instruction code generator 372.
  • the window function setting unit 352, the ECT score calculation unit 354, and the binary code calculation unit 356 are substantially the same as those shown in FIG. 14. However, in the functional configuration shown in FIG. 15, the operation waveform data 30 corresponding to the robot operation for which the instruction code 72 is to be generated is input; therefore, the calculated binary code 36 also indicates that robot operation.
  • the motion element estimation unit 370 refers to the motion element model 48 and estimates motion elements from the binary code 36 . More specifically, the motion element estimator 370 sequentially inputs the binary code 36 sequentially calculated from the motion waveform data 30 to the motion element model 48 to calculate the data string of the motion element code 44 .
  • the instruction code generator 372 refers to the action definition database 60 to search for a unique action 64 that matches or is most similar to the data string of the action element codes 44. Finally, the instruction code generator 372 sequentially outputs the unique action codes 66 corresponding to the unique actions 64 that match or are most similar to the data string of the action element codes 44. The set of unique action codes 66 that are sequentially output becomes the instruction code 72.
  • in this way, the instruction code generator 372 refers to the action definition database 60, in which unique actions 64 composed of one or more action elements are registered, determines one or more unique actions 64 indicating the motion of the target robot, and generates unique action codes 66 indicating the determined one or more unique actions 64 as the instruction code 72.
  • an instruction code 72 corresponding to an arbitrary robot motion is generated.
  • FIG. 16 is a diagram showing an example of an instruction code 72 generated by the instruction description support system 1 according to this embodiment.
  • instruction code 72 is defined by a combination of multiple unique operations 64 .
  • Each unique motion 64 is composed of one or more motion elements 56 .
  • parameters that concretize the action can be set for each motion element 56, and the user may set or change these parameters as appropriate to achieve the desired motion.
  • an instruction description support system comprising a generator (372) that generates a unique action code (66) indicating the determined one or more unique actions as an instruction code (72).
  • the calculating unit calculates a similarity of the waveform data in the time interval with respect to each of the plurality of eigenvectors, and determines the value of each bit constituting the binary code according to a magnitude relationship between two similarities randomly selected from the calculated similarities.
  • Configuration 5: The instruction description support system according to any one of configurations 1 to 4, further comprising a database configuration unit (364) that associates a series of motion elements (56), estimated using the motion element model from motion waveform data obtained when the robot performs a predetermined motion, with a verb indicating the predetermined motion, thereby determining a unique action composed of one or more motion elements included in the predetermined motion, and registers a unique action code (66) indicating the determined unique action (64) in the database (60).
  • an instruction description support method comprising steps (S138, S140) of searching a database (60) in which unique actions composed of one or more motion elements are registered, determining one or more unique actions (64) indicating the motion of the target robot, and generating a unique action code (66) indicating the determined one or more unique actions as an instruction code (72).
  • an instruction description support program for executing steps (S138, S140) of searching a database (60) in which unique actions composed of one or more motion elements are registered, determining one or more unique actions (64) indicating the motion of the target robot, and generating a unique action code (66) indicating the determined one or more unique actions as an instruction code (72).
  • 1 instruction description support system 4 camera, 10 field network, 20 upper network, 30 motion waveform data, 32 window function, 34 score list, 36 binary code, 38 hyperspace, 40 cluster, 42 histogram, 44 motion element code, 46 Moving image, 48 Action element model, 50 Binary code list, 52 Classification process, 54 Decision process, 56 Action element, 60 Action definition database, 64 Unique action, 66 Unique action code, 68 Continuous action, 70 Process, 72 Instruction code , 100 control device, 102, 262, 302 processor, 104, 264, 304 main memory, 106 upper network controller, 108, 252 field network controller, 110, 270, 310 storage, 112 memory card interface, 114 memory card, 116 local Bus controller, 118 processor bus, 120, 324 USB controller, 122 local bus, 130 function unit, 200 robot, 250 robot controller, 260 control processing circuit, 268 interface circuit, 272, 1102 system program, 300 information processing device, 306 optics Drive, 307 recording medium, 308 bus, 312 OS, 314 instruction description support program, 320 network controller, 3

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

An instruction description support system comprising: a calculation unit that calculates a binary code indicating a motion in a time interval set in motion waveform data indicating the motion of a target robot, the binary code being calculated from a score indicating the similarity between the waveform data in the time interval and an eigenvector; an estimation unit that refers to a motion element model and estimates a motion element from the binary code; and a generation unit that searches a database in which unique actions comprising one or more motion elements are registered, determines one or more unique actions indicating the motion of the target robot, and generates, as an instruction code, a unique action code indicating the determined one or more unique actions.
PCT/JP2021/034738 2021-03-12 2021-09-22 Instruction description support system, instruction description support method, and instruction description support program WO2022190434A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-040255 2021-03-12
JP2021040255A JP2022139741A (ja) Instruction description support system, instruction description support method, and instruction description support program

Publications (1)

Publication Number Publication Date
WO2022190434A1 true WO2022190434A1 (fr) 2022-09-15

Family

ID=83226233

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/034738 WO2022190434A1 (fr) Instruction description support system, instruction description support method, and instruction description support program

Country Status (2)

Country Link
JP (1) JP2022139741A (fr)
WO (1) WO2022190434A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11175132A (ja) * 1997-12-15 1999-07-02 Omron Corp ロボット、ロボットシステム、ロボットの学習方法、ロボットシステムの学習方法および記録媒体
JP2016018422A (ja) * 2014-07-09 2016-02-01 キヤノン株式会社 画像処理方法、画像処理装置、プログラム、記録媒体、生産装置、及び組立部品の製造方法
JP2018153873A (ja) * 2017-03-15 2018-10-04 株式会社オカムラ マニピュレータの制御装置、制御方法およびプログラム、ならびに作業システム
JP2020201937A (ja) * 2019-06-06 2020-12-17 株式会社日立製作所 センサデータを与える複数の装置を含むシステムを管理する方法


Also Published As

Publication number Publication date
JP2022139741A (ja) 2022-09-26

Similar Documents

Publication Publication Date Title
Xiong et al. Transferable two-stream convolutional neural network for human action recognition
Hu et al. Early action prediction by soft regression
Lan et al. A hierarchical representation for future action prediction
CN107609541B (zh) A human pose estimation method based on deformable convolutional neural networks
Kumar et al. A position and rotation invariant framework for sign language recognition (SLR) using Kinect
Wang et al. Transferring rich feature hierarchies for robust visual tracking
CN107150347B (zh) Robot perception and understanding method based on human-robot collaboration
Kappler et al. Leveraging big data for grasp planning
Kadous Machine recognition of Auslan signs using PowerGloves: Towards large-lexicon recognition of sign language
JP7146247B2 (ja) Motion recognition method and apparatus
Yin et al. Online state-based structured SVM combined with incremental PCA for robust visual tracking
Jayaraman et al. End-to-end policy learning for active visual categorization
Ridge et al. Self-supervised cross-modal online learning of basic object affordances for developmental robotic systems
CN103415825A (zh) System and method for gesture recognition
CN110069129B (zh) Determination system and determination method
Park et al. Attributed grammars for joint estimation of human attributes, part and pose
JP7031685B2 (ja) Model learning device, model learning method, and computer program
Mohd Asaari et al. Adaptive Kalman Filter Incorporated Eigenhand (AKFIE) for real-time hand tracking system
JP2015111332A (ja) Posture detection device, posture detection method, and posture detection program
CN114730407A (zh) Modeling human behavior in a work environment using neural networks
Nikpour et al. Deep reinforcement learning in human activity recognition: A survey
JP7472471B2 (ja) Estimation system, estimation device, and estimation method
JP4928193B2 (ja) Face image recognition device and face image recognition program
WO2022190434A1 (fr) Instruction description support system, instruction description support method, and instruction description support program
WO2022190435A1 (fr) Command script assistance system, command script assistance method, and command script assistance program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21930282

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21930282

Country of ref document: EP

Kind code of ref document: A1