US20220134567A1 - Robot system, robot control device, and robot control program - Google Patents


Info

Publication number
US20220134567A1
Authority
US
United States
Prior art keywords
work
sensor
robot
unit
target point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/431,485
Other languages
English (en)
Inventor
Masatoshi Ishikawa
Taku SENOO
Yuji Yamakawa
Shouren HUANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Tokyo NUC
Original Assignee
University of Tokyo NUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Tokyo NUC filed Critical University of Tokyo NUC
Assigned to THE UNIVERSITY OF TOKYO reassignment THE UNIVERSITY OF TOKYO ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HUANG, SHOUREN, ISHIKAWA, MASATOSHI, SENOO, Taku, YAMAKAWA, YUJI
Publication of US20220134567A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1633Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39011Fixed camera detects deviation end effector from reference on workpiece, object
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40488Coarse and fine motion planning combined
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40609Camera to monitor end effector as well as object to be handled

Definitions

  • The present invention relates to robots for industrial, medical, or domestic use, and in particular to a robot system, a robot control apparatus, and a robot control program for robots that must work with high accuracy.
  • The use of robots is rapidly increasing in industrial fields such as manufacturing, commerce, and agriculture, in medical fields such as surgery, nursing, and care, and even in households, for example for cleaning.
  • The objects handled by robots change frequently in accordance with diversifying needs such as custom-made production and high-mix low-volume production, so robots are required to respond quickly and flexibly. Furthermore, high-accuracy work is essential to achieve high quality.
  • Patent Application Publication No. 5622250 discloses an apparatus for executing high-accuracy machining.
  • The apparatus projects a reference pattern from a projection means onto a work to be machined, calculates displacement data by imaging the work with the projected reference pattern, corrects three-dimensional machining data based on the displacement data, and matches the machining origin of an industrial robot with the machining origin of the work.
  • Although the apparatus of Patent Application Publication No. 5622250 improves machining accuracy by projecting and imaging a reference pattern and correcting the machining data, the following problems remain. Every time the work to be machined is changed, a new reference pattern must be created, and a jig that holds the work with high positioning accuracy is required; thus the work to be machined cannot be changed easily. Moreover, since the imaging camera is fixed at a location far from the machining origin, highly accurate observation at the machining origin is impossible.
  • The present invention has been made in view of the above circumstances and provides a robot system, a robot control apparatus, and a robot control program capable of executing high-accuracy work without preparing a jig for each object, even when the objects have different shapes.
  • According to an aspect of the present invention, there is provided a robot system comprising: a robot including a first sensor configured to measure, at a first operation frequency, a displacement quantity of a coordinate position between a work point and a target point defined for each of a plurality of objects with different shapes, or a physical quantity that changes due to the displacement quantity; and a control apparatus for controlling the robot, including a coarse operation management unit configured to move the target point to the vicinity of the object at a second operation frequency, a calculation control unit configured to generate, at a third operation frequency, a control signal to correct the displacement quantity in such a manner that the target point approaches the work point, and a correction drive unit configured to execute a correction operation to align the target point with the work point based on the control signal; wherein the second operation frequency is less than or equal to 1/2 of the first and third operation frequencies.
  • According to this configuration, the displacement of the coordinate position between the work point and the target point, which is defined for each object, can be measured by the first sensor, and the position of the target point can be corrected via the correction drive unit.
  • Furthermore, the first operation frequency, which is the operation frequency of the first sensor, and the third operation frequency, which is the operation frequency of the calculation control unit, are at least twice the operation frequency of the coarse operation management unit, which enables quick positioning. In other words, even when the objects differ individually in shape, high-accuracy work can be executed smoothly without preparing a jig for each object.
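The claimed relationship among the three operation frequencies can be expressed as a simple predicate. The sketch below is illustrative only (not part of the claims), and the example rates are our own assumptions:

```python
def satisfies_claim(first_hz: float, second_hz: float, third_hz: float) -> bool:
    """True if the coarse (second) operation frequency is at most half of
    both the sensor (first) and correction-calculation (third) frequencies."""
    return second_hz <= first_hz / 2 and second_hz <= third_hz / 2

# A 500 fps camera, 100 Hz coarse management, and 500 Hz correction
# calculation satisfy the relationship; a 300 Hz coarse loop does not.
print(satisfies_claim(500, 100, 500))  # True
print(satisfies_claim(500, 300, 500))  # False
```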
  • FIG. 1 is a functional block diagram of a robot system according to embodiments of the present invention.
  • FIG. 2 is a configuration diagram of an object action unit and a first sensor of a robot according to a first embodiment.
  • FIG. 3 is a diagram showing work position image information of the robot according to the first embodiment.
  • FIG. 4 is a single work control flow diagram of the robot system according to the first embodiment.
  • FIG. 5 is a continuous work control flow diagram of the robot system according to the first embodiment.
  • FIG. 6 is a configuration diagram of an object action unit and a first sensor of a robot according to a second embodiment.
  • FIG. 7 is a continuous work control flow diagram with online correction according to a third embodiment.
  • FIG. 8 is a schematic diagram of a neural network according to a fourth embodiment.
  • FIG. 9 is a conceptual diagram of a high-level intelligent robot system using artificial intelligence according to the fourth embodiment.
  • FIG. 10 is a control flow diagram for measuring high-accuracy position information before work according to a fifth embodiment.
  • A circuit in the broad sense is realized by appropriately combining circuits, circuitry, processors, memories, and the like.
  • Such a circuit includes an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (e.g., a Simple Programmable Logic Device (SPLD), a Complex Programmable Logic Device (CPLD), or a Field Programmable Gate Array (FPGA)), and the like.
  • FIG. 1 is a diagram showing a configuration outline of the robot system 1 according to the present embodiment.
  • the robot system 1 comprises a robot 2 and a control apparatus 3 for controlling the robot 2 , which are electrically connected to each other.
  • the robot system 1 executes a predetermined work on an object OBJ (see FIG. 2 ), which is given for each work.
  • the overall form of the robot 2 is not particularly limited, but is characterized by comprising a first sensor 21 and an object action unit 22 (target point). The details of these two components will be described later. Further, other functions generally possessed by robots, such as a user interface function for specifying a work to be executed by an operator, a function for supplying the object OBJ, and a static position adjustment function, are assumed to be executed by a main body 20 in the drawings and will not be described in detail here.
  • the object action unit 22 is configured to displace the coordinate position and to execute a predetermined work on multiple types of objects OBJ with different individual shapes.
  • The method of displacing the coordinate position is not limited; any method, such as an axial sliding type or an articulated type, can be used.
  • the first sensor 21 is configured to measure a distance d, which is a displacement quantity of the coordinate position between a work point OP defined for each object OBJ and the object action unit 22 (target point), or a force or a torque, which is a physical quantity that changes due to the displacement quantity of the coordinate positions.
  • the operation frequency of the first sensor 21 is defined as a first operation frequency.
  • The method of measuring the distance d, which is the displacement quantity of the coordinate positions, or the force or torque is not limited; any method can be used, such as a camera that detects at least one of visible light, infrared light, and ultraviolet light, an ultrasonic sonar, or a torque sensor.
  • a method for measuring the distance d, which is the displacement quantity will be described.
  • FIG. 2 shows a configuration using a high-speed two-dimensional actuator 22 a as the object action unit 22 and a monocular high frame rate camera 21 a as the first sensor 21 .
  • the main body 20 is not shown.
  • the high-speed two-dimensional actuator 22 a is configured to move on the horizontal plane in each of x-axis and y-axis, and a cutting tool CT is arranged on a tip of the high-speed two-dimensional actuator 22 a as an example.
  • The cutting tool CT is arranged because the work content of the robot system is cutting; it can be replaced with a coating tool, a laser emission unit, or the like as appropriate according to the work content of the robot system.
  • the high frame rate camera 21 a which is the first sensor 21 in FIG. 2 , is capable of acquiring information within a specific visual field as an image signal.
  • the camera is arranged to capture the cutting tool CT and the work point OP on the object OBJ within the visual field.
  • For high-speed, high-accuracy positioning, a frame rate (the first operation frequency) of 100 fps or higher is preferable, and 500 fps or higher is even more preferable.
  • Although the high frame rate camera 21a can be fixed at a position overlooking the entire object OBJ, it can also always follow the work point OP, acquiring magnified image information with high accuracy, by mechanically interlocking with the object action unit 22.
  • In the latter case, a second sensor (not shown) is separately arranged for the coarse operation management unit 332, which will be described later, and both the object action unit 22 and the high frame rate camera 21a move to the vicinity of the object OBJ based on a measurement result of the second sensor.
  • The high frame rate camera 21a measures the displacement quantity as two-dimensional coordinate information, and the correction drive unit 333, described later, executes a two-dimensional correction operation.
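As a concrete illustration of the two-dimensional measurement described above, the displacement quantity (distance d) between the work point OP and the target point TP can be computed from their image coordinates. This is a minimal sketch; the pixel coordinates are hypothetical:

```python
import math

def displacement_2d(op_xy, tp_xy):
    """Return the displacement vector (dx, dy) from the tool tip TP to the
    work point OP, and its magnitude d (the displacement quantity)."""
    dx = op_xy[0] - tp_xy[0]
    dy = op_xy[1] - tp_xy[1]
    return (dx, dy), math.hypot(dx, dy)

# Example: OP detected at pixel (120, 80), TP at (117, 76).
vec, d = displacement_2d((120.0, 80.0), (117.0, 76.0))
print(vec, d)  # (3.0, 4.0) 5.0
```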
  • the control apparatus 3 comprises a communication unit 31 , a storage unit 32 , and a controller 33 , and these components are electrically connected via a communication bus 30 inside the control apparatus 3 .
  • the communication unit 31 exchanges information with the robot 2 .
  • Wired communication means such as USB, IEEE 1394, Thunderbolt, and wired LAN network communication are preferable, but wireless LAN network communication, mobile communication such as 5G/LTE/3G, Bluetooth (registered trademark) communication, or the like may be included as necessary.
  • The communication means illustrated above are only examples, and a dedicated communication standard may be adopted as well. It is even more preferable to implement these as a set of a plurality of the aforementioned communication means.
  • Although the communication unit 31 is shown connected separately to the first sensor 21 and the main body 20 of the robot 2, the physical connection may be consolidated into one and logically distributed within the robot 2.
  • The storage unit 32 is a volatile or non-volatile storage medium that stores various information.
  • The storage unit 32 can be implemented as a storage device such as a solid state drive (SSD), as a memory such as a random access memory (RAM) that temporarily stores information needed for program operation (arguments, arrays, etc.), or as any combination thereof.
  • the storage unit 32 stores various parameters regarding different work types and work contents, information regarding shapes and materials of different objects OBJ, and past work position information during continuous work.
  • the storage unit 32 stores various programs or the like regarding the control apparatus 3 that are executed by the controller 33 .
  • the storage unit 32 stores a program that executes coarse operation management of the object action unit 22 defined for each object OBJ, calculates the displacement of the coordinate position between the work point OP defined for each object OBJ and the object action unit 22 based on the information input from the first sensor 21 , and calculates and instructs a correction operation to the object action unit 22 to make the object action unit 22 approach the work point OP.
  • the controller 33 processes and controls overall operations regarding the control apparatus 3 .
  • the controller 33 is, for example, an unshown central processing unit (CPU).
  • The controller 33 realizes various functions related to the control apparatus 3 by reading out predetermined programs stored in the storage unit 32. Specifically, the controller 33 realizes the functions of calculating the coordinate position displacement information between the work point OP defined for each object OBJ and the current object action unit 22, based on the information given in advance for each object OBJ and the information from the first sensor 21 and other sensors; managing the coarse operation of the object action unit 22 and the first sensor 21; and executing a high-accuracy correction operation of the object action unit 22.
  • That is, information processing by software is concretely realized by hardware (the controller 33) and executed as the calculation control unit 331, the coarse operation management unit 332, and the correction drive unit 333.
  • Although the controller 33 is shown as a single unit in FIG. 1, it is not limited to this; a plurality of controllers 33 may be provided, one for each function, or in any combination thereof.
  • the calculation control unit 331 , the coarse operation management unit 332 , and the correction drive unit 333 will be described in detail.
  • the calculation control unit 331 is one in which information processing by software (stored in the storage unit 32 ) is specifically realized by hardware (controller 33 ).
  • the calculation control unit 331 executes operations to identify spatial coordinates of the work point OP and the object action unit 22 based on the information acquired from the first sensor 21 via the communication unit 31 and the parameters given in advance for each object OBJ.
  • the frequency of the calculation is the first operation frequency, which is the operation frequency of the first sensor 21 .
  • the parameters include the shape or length of the cutting tool CT, the thickness of the object OBJ, or the brightness threshold for image recognition of the mark preliminarily attached to the work point OP, or the like.
  • a control signal to correct the position is generated based on the acquired displacement information between the spatial coordinates of the work point OP and the object action unit 22 .
  • the control signal is utilized by the correction drive unit 333 alone or by both the coarse operation management unit 332 and the correction drive unit 333 as described below.
  • The calculation frequency at which the control signal is generated is defined as a third operation frequency. The third operation frequency may be the same as the first operation frequency, but it need not be. By keeping the first and third operation frequencies high, the robot system 1 as a whole can execute high-speed work.
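One possible form of the control signal generated by the calculation control unit 331 is a proportional correction clipped to the actuator's allowable range. The patent does not specify a control law; this is our own illustrative sketch, and the gain and limit values are assumptions:

```python
def correction_signal(displacement, gain=0.8, limit=1.0):
    """Map a (dx, dy) displacement to an actuator command, clipped to the
    actuator's allowable correction range. Gain and limit are illustrative."""
    return tuple(max(-limit, min(limit, gain * c)) for c in displacement)

# The y-axis command is clipped to the allowable range:
print(correction_signal((0.5, -2.0)))  # (0.4, -1.0)
```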
  • the calculation control unit 331 executes a calculation to identify the spatial coordinates of the work point OP and the object action unit 22 based on the information obtained from the second sensor via the communication unit 31 and the parameters given in advance for each object OBJ.
  • the spatial coordinates calculated based on the information from the second sensor are not necessarily more accurate than the spatial coordinates calculated from the first sensor 21 , and the update frequency (operation frequency) is also not as high as the first operation frequency, which is the operation frequency of the first sensor.
  • the spatial coordinate position information calculated from the second sensor is utilized by the coarse operation management unit 332 .
  • the coarse operation management unit 332 is one in which information processing by software (stored in the storage unit 32 ) is specifically realized by hardware (controller 33 ).
  • the coarse operation management unit 332 manages a coarse operation of the object action unit 22 alone or coarse operations of both the object action unit 22 and the first sensor 21 .
  • the coarse operation means that the object action unit 22 alone or both the object action unit 22 and the first sensor 21 are brought close to the work point OP defined for each object OBJ.
  • The vicinity of the work point may be determined by utilizing the information defined in the software stored in the storage unit 32, by utilizing the spatial coordinate position information calculated by the calculation control unit 331 based on the information from the first sensor 21, or by utilizing the spatial coordinate position information calculated by the calculation control unit 331 based on the information from the second sensor; any combination thereof may also be utilized.
  • the operation frequency at which the coarse operation management unit 332 adjusts the position of the object action unit 22 is defined as a second operation frequency.
  • The second operation frequency is 1/2 or less of the first operation frequency, which is the operation frequency of the first sensor, and of the third operation frequency, which is the operation frequency of the calculation control unit 331.
  • The correction drive unit 333 is one in which information processing by software (stored in the storage unit 32) is concretely realized by hardware (the controller 33). Based on the position correction signal provided by the calculation control unit 331, the correction drive unit 333 executes position correction of the object action unit 22 so as to align the action point of the object action unit 22 with the work point defined for each object OBJ. In this case, highly accurate coordinate position alignment within the range of the spatial resolution of the first sensor 21 and the object action unit 22 becomes possible.
  • FIG. 3 shows image information obtained by capturing a part of the object OBJ with the high frame rate camera 21 a in the configuration illustrated in FIG. 2 .
  • FIG. 4 shows a control flow during single work, and FIG. 5 shows a control flow during continuous work.
  • the single work control flow is a control flow when the robot system 1 executes single work on the object OBJ. See FIG. 4 for the control flow diagram.
  • The object OBJ is arranged in the workable area of the robot. The positioning accuracy at this time need only be sufficient for the subsequent processing: the work point OP (the designated point of the work) on the object OBJ and the action point TP (target point) of the object action unit 22 (the tip of the cutting tool CT) must lie within the visual field of the first sensor 21 (high frame rate camera 21a) and enter the allowable correction operation range of the object action unit 22 (high-speed two-dimensional actuator 22a). It is not necessary to prepare a high-accuracy jig for holding the object OBJ solely for positioning.
  • the coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP on the object OBJ.
  • For the work point OP, the coordinate position information of the work point OP for each object OBJ, stored in the storage unit 32 in advance, may be input to the coarse operation management unit 332.
  • Alternatively, a method may be utilized in which the coarse operation management unit 332 uses coordinate information obtained by inputting image information acquired from a general camera into the calculation control unit 331.
  • FIG. 3 shows a time point when the coarse operation management in step S 2 is ended.
  • the left side of FIG. 3 is an overall view of the object OBJ, and the right side of FIG. 3 is an image data IM taken by the high frame rate camera 21 a .
  • A positional displacement of distance d occurs between the work point OP and the tip position TP of the cutting tool CT.
  • the image information captured by the high frame rate camera 21 a (first sensor 21 ) is input to the calculation control unit 331 via the communication unit 31 , and the calculation control unit 331 calculates the coordinate position displacement quantity information.
  • the coordinate position displacement information obtained in step S 3 is transmitted to the correction drive unit 333 .
  • the correction drive unit 333 executes coordinate position correction movement control for the high-speed two-dimensional actuator 22 a (object action unit 22 ) in such a manner that the tip position TP of the cutting tool CT approaches the work point OP.
  • the work point OP and the tip position TP of the cutting tool CT can be brought close to each other with high-accuracy within the range of spatial resolution of the high frame rate camera 21 a (first sensor 21 ) and the high-speed two-dimensional actuator 22 a (object action unit 22 ).
  • Robot 2 executes a work on the object OBJ.
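The single work control flow of steps S1 to S5 can be sketched as follows. This is an illustrative sketch with stand-in functions for the robot hardware; the convergence loop and tolerance are our assumptions, not the patent's:

```python
def single_work(arrange, coarse_move, measure, correct, do_work,
                tolerance=0.1):
    """Run one work cycle: the stand-in callables model the hardware."""
    arrange()             # S1: arrange the object OBJ in the workable area
    coarse_move()         # S2: coarse operation management (unit 332)
    d = measure()         # S3: first sensor measures displacement d
    while d > tolerance:  # S4: correction drive (unit 333) until aligned
        d = correct(d)
    do_work()             # S5: execute the work on the object OBJ
    return d

# Toy usage: each correction halves the residual displacement.
residual = single_work(lambda: None, lambda: None,
                       lambda: 3.2, lambda d: d / 2, lambda: None)
print(residual <= 0.1)  # True
```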
  • the continuous work control flow is a control flow when the robot system 1 executes continuous work on the object OBJ. See FIG. 5 for the control flow diagram.
  • the object OBJ is arranged in a workable area of the robot 2 .
  • the explanation will be given with FIG. 3 .
  • the left side of FIG. 3 shows an example of image information during continuous supplementary operation.
  • the dotted line on the left side of FIG. 3 is a linear continuous work designated position RT 1 .
  • the work start position on the continuous work designated position RT 1 is set as the continuous work start point ST, and the object OBJ is arranged in the workable area of the robot with respect to the continuous work start point ST.
  • The positioning accuracy at this time need only be sufficient for the subsequent processing: as in the single work, the continuous work start point ST on the object OBJ and the action point TP (target point, the tip of the cutting tool CT) of the object action unit 22 (high-speed two-dimensional actuator 22a) must lie within the visual field of the first sensor 21 (high frame rate camera 21a) and enter the allowable correction operation range of the object action unit 22 (high-speed two-dimensional actuator 22a). Also as in the single work, it is not necessary to prepare a high-accuracy jig for holding the object OBJ solely for positioning.
  • the coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP updated for each work, starting from the continuous work start point ST on the continuous work designated position RT 1 on the object OBJ and moving in the direction of the continuous work end point EN for each work.
  • For the work point OP, the coordinate position information of the work point OP may be input to the coarse operation management unit 332, where the coordinate position information is stored in the storage unit 32 in advance and is updated for each work along the continuous work designated position RT1 for each object OBJ.
  • Alternatively, as in the single work, a method may be utilized in which the coarse operation management unit 332 uses coordinate information obtained by inputting image information acquired from a general camera into the calculation control unit 331.
  • The continuous work designated position RT1 can be explicitly indicated by the operator, for example by applying a mark; alternatively, when the object OBJ consists of multiple parts, a boundary line between them that can be defined as the continuous work designated position RT1 can be recognized by image processing.
  • a control trajectory RT 2 on the left side of FIG. 3 indicates a trajectory where the coarse operation management unit 332 has executed a control.
  • the distance of the control trajectory RT 2 of the coarse operation management unit 332 with respect to the continuous work designated position RT 1 is within the visual field of the first sensor 21 (high frame rate camera 21 a ) and within a correction operation range of the object action unit 22 .
  • FIG. 3 shows a time point when the coarse operation management in step S 2 is ended.
  • the right side of FIG. 3 shows the image data IM captured by the high frame rate camera 21 a .
  • A positional displacement (distance d) occurs between the work point OP and the tip position TP of the cutting tool CT.
  • the image information captured by the high frame rate camera 21 a (first sensor 21 ) is input to the calculation control unit 331 via the communication unit 31 , and the calculation control unit 331 calculates the coordinate position displacement quantity information.
  • the coordinate position displacement information obtained in step S 3 is transmitted to the correction drive unit 333 .
  • the correction drive unit 333 executes coordinate position correction movement control for the high-speed two-dimensional actuator 22 a (object action unit 22 ) in such a manner that the tip position TP of the cutting tool CT approaches the work point OP.
  • the work point OP and the tip position TP of the cutting tool CT can be brought close to each other with high-accuracy within the range of spatial resolution of the high frame rate camera 21 a (first sensor 21 ) and the high-speed two-dimensional actuator 22 a (object action unit 22 ), which is the same as for the single work.
  • Robot 2 executes a work on the object OBJ, which is the same as for the single work.
  • This step determines whether the continuous work has been finished, which can be done by checking whether all the work at the continuous work designated position RT1 stored in advance in the storage unit 32 for each object OBJ is finished. Alternatively, if a general camera is used as the second sensor, the stage of the continuous work may be determined, for example, by detecting that the end point of the marked work instruction line has been reached. If the continuous work is not finished, the process returns to step S2 to continue the work.
  • the robot 2 can be controlled with high-accuracy without preparing a jig for holding the object OBJ even if the shape of each object OBJ is different.
  • The loop is exited when the series of continuous work is finished.
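The continuous work control flow can be sketched in the same style as the single work flow. The function names and stand-ins below are our own illustrative assumptions:

```python
def continuous_work(rt1_points, measure, correct, do_work):
    """Advance along the designated position RT1 from start point ST toward
    end point EN, with a fine correction at each updated work point."""
    done = []
    for op in rt1_points:  # S2: coarse move to the next work point on RT1
        d = measure(op)    # S3: measure the displacement at this work point
        correct(d)         # S4: correction drive aligns TP with OP
        done.append(do_work(op))  # S5: execute the work
    return done            # S6: the loop ends when all of RT1 is finished

# Toy usage: three work points along a straight line.
print(continuous_work([(0, 0), (1, 0), (2, 0)],
                      lambda op: 0.0, lambda d: None, lambda op: op))
# [(0, 0), (1, 0), (2, 0)]
```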
  • FIG. 6 shows a configuration diagram of an embodiment of a three-dimensional position displacement correction operation.
  • the main body 20 is not shown in FIG. 6 .
  • a high-speed three-dimensional actuator 22 b is configured to move on a three-dimensional coordinate in each of x-axis, y-axis, and z-axis, and a cutting tool CT is arranged on a tip of the high-speed three-dimensional actuator 22 b as an example.
  • The cutting tool CT is arranged because the work content of the robot system is cutting; it can be replaced with a coating tool, a laser emission unit, or the like as appropriate according to the work content of the robot system 1.
  • the first sensor 21 two high frame rate cameras 21 a and 21 b are arranged. If image information of the object OBJ is acquired from different angles by using two or more optical cameras, the three-dimensional coordinate of the work point OP on the object OBJ can be clarified by calculation in the calculation control unit 331 . Even in three-dimensional measurement, the requirements for each of the high frame rate cameras 21 a and 21 b are the same as for the two-dimensional high frame rate camera 21 a described in sections 1 and 2, and for the purpose of high-speed and high-accuracy positioning, a high frame rate (imaging rate) of 100 fps or higher is preferable, and 500 fps or higher is even more preferable. Specific examples are omitted.
  • the high frame rate cameras 21 a and 21 b can be fixed in a position to overlook the entire object OBJ, the high frame rate cameras 21 a and 21 b can always follow the work point OP to acquire magnified image information with high-accuracy by mechanically interlocking with the object action unit 22 (target point), which is the same as in the case of a two-dimensional correction. It should be noted that the high frame rate cameras 21 a and 21 b measure the displacement quantity as three-dimensional coordinate information, and the correction drive unit 333 executes a three-dimensional correction operation.
  • the robot system 1 with three-dimensional coordinate position displacement correction can be realized by preparing a first sensor 21 capable of executing spatial three-dimensional coordinate measurement and an object action unit 22 capable of executing three-dimensional correction movement.
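As an illustration of how the calculation control unit 331 could recover a three-dimensional coordinate of the work point OP from two cameras viewing it from different angles, the following sketch triangulates depth from an idealized, rectified stereo pair; the focal length, baseline, and principal point values are assumptions, not parameters of the disclosed system.

```python
def triangulate(u_left, u_right, v, focal_px=800.0, baseline=0.1,
                cx=640.0, cy=480.0):
    """Depth from disparity for an idealized, rectified stereo pair.
    u_left/u_right: horizontal pixel coordinates of the work point OP in
    the two camera images; v: shared vertical pixel coordinate.
    Returns (x, y, z) in the units of the baseline (here metres)."""
    disparity = u_left - u_right            # pixel shift between the images
    if disparity <= 0:
        raise ValueError("work point must lie in front of both cameras")
    z = focal_px * baseline / disparity     # depth along the optical axis
    x = (u_left - cx) * z / focal_px        # lateral offset from the axis
    y = (v - cy) * z / focal_px             # vertical offset from the axis
    return x, y, z
```

A real system would use calibrated, possibly non-rectified camera geometry, but the principle, converting a pixel displacement seen by two cameras into a spatial coordinate, is the same.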
  • the control flow described in section 2 can be applied as it is.
  • the continuous work designated position RT1 used by the coarse operation management unit 332 is either stored in the storage unit 32 in advance, or obtained from information from the second sensor (such as a general camera).
  • A configuration diagram of the object action unit 22 (target point) and the first sensor 21 can be found in FIG. 2, the image information in FIG. 3, and the control flow diagram in FIG. 7.
  • the object OBJ is placed in the workable area of the robot 2 .
  • the description thereof is omitted since this step is the same as for the continuous work in section 2.2.
  • the coarse operation management unit 332 moves the object action unit 22 to the vicinity of the work point OP, which is updated for each work, starting from the continuous work start point ST on the continuous work designated position RT1 on the object OBJ and moving toward the continuous work end point EN.
  • the movement information at this time may be updated using the work point OP information identified by the first sensor 21 , as described later in step S 8 .
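As a sketch of this coarse movement, the designated position RT1 can be discretized into a sequence of coarse targets from ST toward EN; a straight two-dimensional segment is assumed here purely for illustration (the designated position need not be straight).

```python
def coarse_path(st, en, n_steps):
    """Interpolate n_steps + 1 coarse target positions from the continuous
    work start point ST toward the end point EN. A 2-D straight segment is
    an illustrative assumption; RT1 may follow any marked curve."""
    return [(st[0] + (en[0] - st[0]) * i / n_steps,
             st[1] + (en[1] - st[1]) * i / n_steps)
            for i in range(n_steps + 1)]
```

The coarse operation management unit would visit such targets in order, while the fine correction of the later steps absorbs the residual positioning error at each one.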
  • As in section 2.2, the continuous work designated position RT1 can be explicitly indicated by the operator, for example by applying a mark; alternatively, if a boundary line can be defined as the continuous work designated position RT1 (for example, where the object OBJ comprises multiple objects), the boundary line can be image-recognized and utilized.
  • the coordinate position displacement measurement and the correction work for each work within the continuous operation is the same as for the single work in section 2.1 and for the continuous work in section 2.2, thus the description thereof is omitted.
  • Information on where the work point OP identified by the high frame rate camera 21 a (first sensor 21 ) exists in the image data IM in FIG. 3 is used in steps S 7 and S 8 described later.
  • The coordinate position displacement information obtained in step S3 is transmitted to the correction drive unit 333, and the coordinate position correction movement control applied to the object action unit 22 is the same as for the single work in section 2.1 and the continuous work in section 2.2; the description thereof is therefore omitted.
  • The robot 2 executes the work on the object OBJ, in the same manner as for the single work in section 2.1 and the continuous work in section 2.2.
  • This step determines whether all the steps of the continuous work stored in the storage unit 32 in advance have been finished. If the continuous work is not finished, the process returns to step S 7 to continue the work.
  • This step determines whether to update the movement information used by the coarse operation management unit 332, by checking whether the work point OP identified by the high frame rate camera 21a (first sensor 21) in step S3 is located within the allowable range of the image data IM. Specifically, for instance, if the current work point OP is estimated to be located near the center of the image data IM, and the displacement distance d between the next work point and the tip position TP of the cutting tool CT (object action unit 22) is small enough for the correction drive unit to process, then the process returns to step S2 to continue the work at the current position within the allowable range. If the allowable range is determined to be exceeded, the process proceeds to step S8. It is also possible to set the allowable range, which is a threshold value, to 0 and always proceed to step S8.
  • the movement information used by the coarse operation management unit 332 is updated based on the work point OP information identified by the high frame rate camera 21a (first sensor 21). Specifically, for instance, if the work point OP is displaced upward from the center of the image data IM, the work point OP can be brought back toward the center by moving the robot 2 upward.
  • the calculation control unit 331 executes such calculation, updates the movement information used by the coarse operation management unit 332 , and returns to step S 2 to continue the continuous work.
  • the calculation can also be executed taking into account the actual movement distance information measured in the object action unit 22 .
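The decision in steps S7 and S8 can be sketched as a simple threshold test on the pixel position of the work point OP; the image size and the allowable fraction of it are illustrative assumptions, not values from the disclosure.

```python
def update_coarse_motion(op_px, image_size=(640, 480), allowable_frac=0.25):
    """Steps S7/S8 sketch: if the work point OP lies within the allowable
    range around the image centre, keep working (return None); otherwise
    return the pixel displacement so that the coarse operation can move
    the robot to bring OP back toward the centre."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    dx, dy = op_px[0] - cx, op_px[1] - cy          # displacement from centre
    if abs(dx) <= allowable_frac * image_size[0] and \
       abs(dy) <= allowable_frac * image_size[1]:
        return None                                 # S7: within range, continue
    return (dx, dy)                                 # S8: update coarse movement
```

Setting `allowable_frac` to 0 reproduces the variant mentioned above in which the process always proceeds to step S8.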
  • By adding machine learning, which is being actively researched in the field of artificial intelligence (AI), to the robot system 1 according to the present embodiment, accurate and efficient product processing can be expected.
  • the robot system 1 is particularly suitable when the object OBJ is custom-made production or high-mix low-volume production.
  • Although objects in custom-made or high-mix low-volume production naturally have various specific shapes and dimensions, many of their attributes, such as application, material, shape, and dimensions, are common to conventional goods. Therefore, the attributes of the objects to be machined can be machine-learned by the robot system 1 in such a manner that the objects are machined more accurately and efficiently during the current machining or in future machining.
  • FIG. 8 shows a schematic diagram of a neural network.
  • An input signal defined by various parameters is input to a first layer L 1 .
  • the input signal here is an attribute of the object to be machined (for example, information including use, material, shape, dimension, process of machining, or the like). Further, past machining data in which these attributes are known is accumulated as prior learning data. It is especially preferable to upload the data to a cloud server to share the learning data.
  • The input signals are output from computation nodes N_11 to N_13 of the first layer L1 to computation nodes N_21 to N_25 of a second layer L2. At this time, the values output from the computation nodes N_11 to N_13 are multiplied by a weight w set between each pair of computation nodes N before being input to the computation nodes N_21 to N_25.
  • Computation nodes N_21 to N_25 add up the input values from the computation nodes N_11 to N_13 and input these sums (or the sums plus a predetermined bias value) to a predetermined activation function. The output value of the activation function is then transmitted to the next node, the computation node N_31. At this time, the output values are multiplied by the weights w set between the computation nodes N_21 to N_25 and the computation node N_31 before being input to the computation node N_31. The computation node N_31 adds up the input values and outputs the total as an output signal.
  • Alternatively, the computation node N_31 may add the input values together, input the total plus a bias value to the activation function, and output the result as the output signal.
  • As a result, machining plan data for the object OBJ to be machined are optimized and output.
  • Such machining plan data is utilized, for example, for a determination of the coarse operation by the coarse operation management unit 332 .
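The forward pass described above (three input nodes in layer L1, five hidden computation nodes in layer L2 with an activation function, and one output node N_31) can be written compactly as follows; the sigmoid activation and all weight values are illustrative assumptions, and a real system would learn the weights from the accumulated machining data.

```python
import math

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a FIG. 8 style network: weighted sums into the hidden
    computation nodes, a sigmoid activation, then a weighted sum into the
    single output node N_31. w_hidden holds one weight vector per hidden
    node; all values here are illustrative, not learned parameters."""
    sigmoid = lambda s: 1.0 / (1.0 + math.exp(-s))
    hidden = [sigmoid(sum(xi * wi for xi, wi in zip(x, w)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    return sum(h * w for h, w in zip(hidden, w_out)) + b_out
```

With three inputs, five hidden nodes, and one output, `w_hidden` is a 5×3 weight table and `w_out` a vector of five output weights, matching the node counts of FIG. 8.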
  • FIG. 9 shows a conceptual diagram of a high-level intelligent robot system utilizing artificial intelligence (AI).
  • the proposed method of the present embodiment can achieve high speed (SPEED), high absolute accuracy (ABSOLUTE ACCURACY), and high flexibility (adaptability) (FLEXIBILITY) compared with conventional teaching-playback methods and existing model-based feedback control methods.
  • the robot system 1 can evolve into middle-level or high-level intelligence and can be used for task management in Industry 4.0.
  • the object action unit 22 can be removed from the robot 2 in FIG. 1, and the first sensor 21 can be used to identify high-accuracy position information of the target point.
  • the control flow in the case where the actual work is continuous is shown in FIG. 10 .
  • A configuration diagram of the object action unit 22 and the first sensor 21 can be found in FIG. 2, and the image information in FIG. 3.
  • the object OBJ is arranged in the robot workable area with the object action unit 22 removed from the robot 2 .
  • the continuous work start point ST on the continuous work designated position RT 1 on the left side of FIG. 3 at this time is set to be within the visual field of the first sensor 21 (high frame rate camera 21 a ).
  • Since the object action unit 22 is removed, its action point TP (the tip of the cutting tool CT) does not exist, and there is no need to pay attention thereto.
  • RT 1 in FIG. 3 is used as the target point for the actual work.
  • the coarse operation management unit 332 moves the first sensor 21 from the vicinity of the continuous work start point ST on the continuous work designated position RT1 on the object OBJ toward the vicinity of the continuous work end point EN.
  • the continuous work designated position RT 1 may utilize the information stored in the storage unit 32 in advance.
  • a method may be utilized in which the coarse operation management unit 332 uses the coordinate information obtained as a calculation result of inputting the image information acquired from a general camera into the calculation control unit 331 .
  • As in the continuous work control flow described in section 2, the continuous work designated position RT1 can be explicitly indicated by the operator, for example by applying a mark; alternatively, if a boundary line can be defined as the continuous work designated position RT1 (for example, where the object OBJ comprises multiple objects), the boundary line can be image-recognized and utilized.
  • High-accuracy position information of the target point is obtained from the image data IM captured by the high frame rate camera 21 a (first sensor 21 ).
  • the high-accuracy position information of the target point is obtained by inputting the image data IM to the calculation control unit 331 via the communication unit 31 , calculating the information of the coordinate position displacement quantity from the center of the image data with the calculation control unit 331 , and combining with the movement quantity by the coarse operation management unit 332 .
  • the high-accuracy position information calculated in step S 3 is stored in the storage unit 32 .
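The combination performed in step S3 can be sketched as a coordinate sum: the coarse movement quantity gives the camera position in robot coordinates, and the pixel displacement of the target point from the image centre, scaled by a calibration factor, refines it. The calibration factor and all numeric values below are assumptions for illustration.

```python
def absolute_position(coarse_xy_mm, offset_px, mm_per_px=0.05):
    """Step S3 sketch: high-accuracy target position = coarse movement
    quantity (mm, robot coordinates) + displacement of the target point
    from the image centre (pixels) scaled by an assumed camera calibration
    factor mm_per_px."""
    return (coarse_xy_mm[0] + offset_px[0] * mm_per_px,
            coarse_xy_mm[1] + offset_px[1] * mm_per_px)

# one such position would be stored per designated point (step S4)
stored_position = absolute_position((100.0, 50.0), (12, -8))
```

The stored positions can then be replayed open-loop in the work phase, which is why no feedback control is needed during the actual work.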
  • This step determines whether measurement of all the continuous work designated positions has been finished. If the measurement is finished, the process proceeds to step S6. If not, the process returns to step S2 to continue the measurement.
  • the cutting tool CT (object action unit 22 ) is attached to the robot 2 to execute the work. At this time, the continuous work is executed while moving the tip position TP of the cutting tool CT based on the high-accuracy position information stored in the storage unit 32 . Since the high-accuracy position information is stored, there is no need to execute feedback control during the work.
  • As described above, the robot system 1 capable of executing high-accuracy work without preparing a jig corresponding to the object OBJ can be implemented.
  • the robot system 1 comprising: a robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity, at a first operation frequency; and a control apparatus 3 for controlling the robot 2, including a coarse operation management unit 332 configured to move the object action unit 22 to the vicinity of the object OBJ at a second operation frequency, a calculation control unit 331 configured to generate a control signal to correct the displacement quantity at a third operation frequency in such a manner that the object action unit 22 approaches the work point OP, and a correction drive unit 333 configured to execute a correction operation to align the object action unit 22 with the work point OP based on the control signal; wherein the second operation frequency is less than or equal to 1/2 of the first and third operation frequencies.
  • the control apparatus 3 of the robot 2 capable of executing high-accuracy work without preparing a jig corresponding to the object OBJ can be implemented.
  • the control apparatus 3 of the robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity, at a first operation frequency, comprising: a coarse operation management unit 332 configured to move the target point (the object action unit 22) to the vicinity of the object OBJ at a second operation frequency, a calculation control unit 331 configured to generate a control signal to correct the displacement quantity at a third operation frequency in such a manner that the object action unit 22 approaches the work point OP, and a correction drive unit 333 configured to execute a correction operation to align the target point (the object action unit 22) with the work point OP based on the control signal; wherein the second operation frequency is less than or equal to 1/2 of the first and third operation frequencies.
  • The software for implementing, as hardware, the control apparatus 3 of the robot 2 or the robot system 1 capable of executing high-accuracy work without preparing a jig corresponding to the object OBJ can be provided as a program.
  • The program may be provided as a non-transitory computer-readable medium, as a program downloadable from an external server, or as so-called "cloud computing," in which an external computer runs the program and each function is executed on a client terminal.
  • the control program of the robot 2 including a first sensor 21 configured to measure a distance d, which is a displacement quantity of a coordinate position between a work point OP and a target point defined for each object OBJ, or a physical quantity that changes due to the displacement quantity, at a first operation frequency, configured to allow a computer to execute: a coarse operation management function that moves the target point (the object action unit 22) to the vicinity of the object OBJ at a second operation frequency, a calculation control function that generates a control signal to correct the displacement quantity at a third operation frequency in such a manner that the target point (the object action unit 22) approaches the work point OP, and a correction drive function that executes a correction operation to align the target point (the object action unit 22) with the work point OP based on the control signal; wherein the second operation frequency is less than or equal to 1/2 of the first and third operation frequencies.
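The frequency relation recited in these summaries can be stated as a simple predicate; the concrete hertz values in the comment are illustrative only, not values from the disclosure.

```python
def frequencies_valid(second_hz, first_hz, third_hz):
    """True if the second operation frequency (coarse operation) is at most
    half of both the first (sensor measurement) and the third (correction
    control) operation frequencies, as recited above."""
    return second_hz <= first_hz / 2 and second_hz <= third_hz / 2

# e.g. a 1000 fps sensor with a 1 kHz correction loop admits a coarse
# operation frequency of up to 500 Hz
```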

US17/431,485 2019-02-25 2020-02-25 Robot system, robot control device, and robot control program Abandoned US20220134567A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019031790 2019-02-25
JP2019-031790 2019-02-25
PCT/JP2020/007310 WO2020175425A1 (ja) 2019-02-25 2020-02-25 Robot system, robot control device, and robot control program

Publications (1)

Publication Number Publication Date
US20220134567A1 true US20220134567A1 (en) 2022-05-05

Family

ID=72239038

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/431,485 Abandoned US20220134567A1 (en) 2019-02-25 2020-02-25 Robot system, robot control device, and robot control program

Country Status (4)

Country Link
US (1) US20220134567A1 (ja)
JP (1) JP7228290B2 (ja)
CN (1) CN113439013B (ja)
WO (1) WO2020175425A1 (ja)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4821206A (en) * 1984-11-27 1989-04-11 Photo Acoustic Technology, Inc. Ultrasonic apparatus for positioning a robot hand
US20150100066A1 (en) * 2013-10-04 2015-04-09 KB Medical SA Apparatus, systems, and methods for precise guidance of surgical tools
US20150241203A1 (en) * 2012-09-11 2015-08-27 Hexagon Technology Center Gmbh Coordinate measuring machine
US20150343641A1 (en) * 2014-06-02 2015-12-03 Seiko Epson Corporation Robot, control method of robot, and control device of robot
WO2016003077A1 (en) * 2014-07-01 2016-01-07 Samsung Electronics Co., Ltd. Cleaning robot and controlling method thereof
US20160075028A1 (en) * 2014-09-15 2016-03-17 The Boeing Company Methods and systems of repairing a structure
US20170014193A1 (en) * 2015-07-15 2017-01-19 NDR Medical Technology Pte. Ltd. System and method for aligning an elongated tool to an occluded target
US20170066130A1 (en) * 2015-09-09 2017-03-09 Carbon Robotics, Inc. Robotic arm system and object avoidance methods
JP2017087325A (ja) * 2015-11-06 2017-05-25 Canon Inc. Robot control device, robot control method, robot control system, and computer program
US10059003B1 (en) * 2016-01-28 2018-08-28 X Development Llc Multi-resolution localization system
US20190176348A1 (en) * 2017-12-12 2019-06-13 X Development Llc Sensorized Robotic Gripping Device
US20210034032A1 (en) * 2018-01-29 2021-02-04 Shaper Tools, Inc. Systems, methods and apparatus for guided tools with multiple positioning systems

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997024206A1 (fr) * 1995-12-27 1997-07-10 Fanuc Ltd Composite sensor robot system
JP5383756B2 (ja) * 2011-08-17 2014-01-08 Fanuc Corporation Robot with learning control function
JP5561260B2 (ja) * 2011-09-15 2014-07-30 Yaskawa Electric Corporation Robot system and imaging method
JP2013078825A (ja) * 2011-10-04 2013-05-02 Yaskawa Electric Corp Robot apparatus, robot system, and method of manufacturing workpiece
US10099365B2 (en) * 2012-08-02 2018-10-16 Fuji Corporation Work machine provided with articulated robot and electric component mounting machine
CN104802174B (zh) * 2013-10-10 2016-09-07 Seiko Epson Corporation Robot control system, robot, program, and robot control method
JP2015089575A (ja) * 2013-11-05 2015-05-11 Seiko Epson Corporation Robot, control device, robot system, and control method
JP5622250B1 (ja) 2013-11-08 2014-11-12 Startechno Co., Ltd. Workpiece machining apparatus with calibration function
JP2015147256A (ja) * 2014-02-04 2015-08-20 Seiko Epson Corporation Robot, robot system, control device, and control method
JP6042860B2 (ja) * 2014-12-02 2016-12-14 Fanuc Corporation Article transfer apparatus and article transfer method for transferring articles using robot
JP6267157B2 (ja) * 2015-05-29 2018-01-24 Fanuc Corporation Production system including robot with position correction function
WO2018043525A1 (ja) 2016-09-02 2018-03-08 Kurashiki Boseki Kabushiki Kaisha Robot system, robot system control device, and robot system control method
JP6469069B2 (ja) * 2016-12-13 2019-02-13 Fanuc Corporation Robot control device having function to facilitate learning, and robot control method
JP6697510B2 (ja) * 2017-07-12 2020-05-20 Fanuc Corporation Robot system
JP2018158439A (ja) * 2018-03-15 2018-10-11 Kabushiki Kaisha Toshiba Object handling device, control device, and calibration method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210146552A1 (en) * 2019-11-20 2021-05-20 Samsung Electronics Co., Ltd. Mobile robot device and method for controlling mobile robot device
US11609575B2 (en) * 2019-11-20 2023-03-21 Samsung Electronics Co., Ltd. Mobile robot device and method for controlling mobile robot device
KR102715964B1 (ko) 2023-12-28 2024-10-11 Magenta Robotics Co., Ltd. Coating method for a coating robot having an end effector, and coating system using the same

Also Published As

Publication number Publication date
CN113439013B (zh) 2024-05-14
CN113439013A (zh) 2021-09-24
JPWO2020175425A1 (ja) 2021-11-18
JP7228290B2 (ja) 2023-02-24
WO2020175425A1 (ja) 2020-09-03

Similar Documents

Publication Publication Date Title
US11498214B2 (en) Teaching device, teaching method, and robot system
US10112301B2 (en) Automatic calibration method for robot systems using a vision sensor
CN111152229B (zh) 3d机械视觉的机械手引导方法和装置
DE102018213985B4 (de) Robotersystem
JP2020128009A (ja) ロボットを制御する方法
WO2024027647A1 (zh) 机器人控制方法、系统和计算机程序产品
JP2018528084A (ja) ロボットシステムの自動較正方法
CN104249195A (zh) 具备视觉传感器和力传感器的毛刺去除装置
D’Avella et al. ROS-Industrial based robotic cell for Industry 4.0: Eye-in-hand stereo camera and visual servoing for flexible, fast, and accurate picking and hooking in the production line
García-Díaz et al. OpenLMD, an open source middleware and toolkit for laser-based additive manufacturing of large metal parts
DE102019212452A1 (de) Interferenzvermeidungsvorrichtung und Robotersystem
US20220134567A1 (en) Robot system, robot control device, and robot control program
JP2018202542A (ja) 計測装置、システム、制御方法及び物品の製造方法
WO2022107684A1 (ja) パラメータを調整する装置、ロボットシステム、方法、及びコンピュータプログラム
US11478932B2 (en) Handling assembly comprising a handling device for carrying out at least one work step, method, and computer program
US11559888B2 (en) Annotation device
US20210178584A1 (en) Measuring device
Haag et al. Chain of refined perception in self-optimizing assembly of micro-optical systems
JP7059968B2 (ja) 制御装置および位置合わせ装置
CN113253445A (zh) 一种0回程差扫描平台控制系统及方法
WO2014091897A1 (ja) ロボット制御システム
Protic et al. Development of a novel control approach for collaborative robotics in i4 intelligent flexible assembling cells
WO2024154604A1 (ja) 作業システム及び溶接システム
Wang et al. Deep Dynamic Layout Optimization of Photogrammetry Camera Position Based on Digital Twin
Schmitt et al. Single camera-based synchronisation within a concept of robotic assembly in motion

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE UNIVERSITY OF TOKYO, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ISHIKAWA, MASATOSHI;SENOO, TAKU;YAMAKAWA, YUJI;AND OTHERS;REEL/FRAME:057198/0827

Effective date: 20210716

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION