WO2021181035A1 - Method for automatically performing an operation on an object with a tool carried by a polyarticulated system - Google Patents
Method for automatically performing an operation on an object with a tool carried by a polyarticulated system
- Publication number
- WO2021181035A1 WO2021181035A1 PCT/FR2021/050380 FR2021050380W WO2021181035A1 WO 2021181035 A1 WO2021181035 A1 WO 2021181035A1 FR 2021050380 W FR2021050380 W FR 2021050380W WO 2021181035 A1 WO2021181035 A1 WO 2021181035A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tool
- polyarticulated system
- polyarticulated
- working
- sensor
- Prior art date
Links
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0096—Programme-controlled manipulators co-operating with a working support, e.g. work-table
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1605—Simulation of manipulator lay-out, design, modelling of manipulator
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40442—Voxel map, 3-D grid map
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40475—In presence of moving obstacles, dynamic environment
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02E—REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
- Y02E30/00—Energy generation of nuclear origin
- Y02E30/30—Nuclear fission reactors
Definitions
- the present invention relates to the technical field of performing any operation, for example cutting, welding, marking, stripping, painting, surfacing, positioning a sensor or any type of analysis tool, etc., on an object present in a working environment, the positions and geometries of the object and of the working environment possibly being evolving and/or insufficiently defined to be able to carry out said operations.
- the invention applies in particular to the dismantling sector, in which measuring, cutting and gripping operations are carried out on objects possibly presenting risks associated with radioactivity, without this being limiting.
- the invention relates more precisely to a method and an installation for automatically performing an operation on an object with a tool carried by a polyarticulated system movable in a working environment, the positions and geometries of the object and of the working environment being evolving and/or insufficiently defined to be able to perform said operation in automatic mode.
- the operation can be of any nature whatsoever, in any type of environment, with any type of tool, and carried out on any type of object.
- when the operations cannot be carried out in contact by an operator, the tool (measuring, cutting, gripping) must then be carried by a polyarticulated system to perform the operation.
- the operator controls the system remotely using a control reference, in indirect vision using cameras and screens.
- the drawbacks of such a system are, on the one hand, its high cost and, on the other hand, the imprecision of its control in terms of speed, force and positioning, leading to a long implementation time and premature wear of the consumables of the tools (blades, discs).
- the invention aims to remedy the drawbacks of the prior art by providing a method and an installation for performing an operation on an object in a working environment, making it possible to ensure optimum conditions of safety, precision and speed.
- the method comprises at least:
- "3D sensor" is understood to mean any means for digitizing an object or part of an object in the form of a three-dimensional image, and includes in particular 3D cameras, laser scanners, video means, webcams, etc.
- the anti-collision parameters are for example threshold distances from which safety actions are carried out automatically, such as the total stopping of any movement or the reduction of the speed of movement of the polyarticulated system and of the tool, for example down to a threshold speed which then itself constitutes an anti-collision parameter.
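The threshold-distance behaviour described above can be sketched as a simple speed governor. This is an illustrative sketch, not the patent's implementation; the distance values, the linear ramp and all names (`STOP_DISTANCE`, `regulated_speed`, etc.) are assumptions.

```python
# Anti-collision parameters (assumed values for illustration):
STOP_DISTANCE = 0.05      # m: total stop of any movement below this distance
SLOWDOWN_DISTANCE = 0.30  # m: begin reducing speed below this distance
THRESHOLD_SPEED = 0.02    # m/s: minimum commanded speed in the slow-down zone

def regulated_speed(nominal_speed: float, obstacle_distance: float) -> float:
    """Return the allowed speed given the distance to the nearest obstacle."""
    if obstacle_distance <= STOP_DISTANCE:
        return 0.0  # safety action: total stop of any movement
    if obstacle_distance < SLOWDOWN_DISTANCE:
        # linear ramp from THRESHOLD_SPEED at the stop distance
        # up to nominal_speed at the slow-down distance
        t = (obstacle_distance - STOP_DISTANCE) / (SLOWDOWN_DISTANCE - STOP_DISTANCE)
        return THRESHOLD_SPEED + t * (nominal_speed - THRESHOLD_SPEED)
    return nominal_speed
```

In practice such a check would run in the real-time control loop, with the obstacle distance taken from the working image (CAD models plus point cloud).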
- the invention makes it possible to carry out operations in automatic and remote mode in an environment that is initially uncertain or unknown or inaccessible while guaranteeing their correct execution.
- the operator views the working environment and the object in a cloud of points on the display, and defines a tool movement path to perform the desired operation, for example a cutting operation.
- the invention is also advantageous in that the method makes it possible to carry out a simulation, in particular in CAD, of the corresponding displacement of the polyarticulated system and of its tool in order to ensure that the displacement is achievable without collision.
- the simulation of the displacement confirms its feasibility
- the actual displacement of the polyarticulated system carrying the tool can be executed to perform the operation as such on the object.
- the method is thus carried out in at least three steps, namely: a step A of securing the environment and the object by digitization in the form of an overall point cloud, a step B of trajectory definition and simulation, and an execution step C.
- the method also comprises a step A', between step A and step B, consisting in controlling the movement of the polyarticulated system to capture detailed images of areas of the object with a 3D sensor carried by the polyarticulated system, in the form of a point cloud denser and more precise than the overall point cloud, and in integrating the captured images of the corresponding areas of the object into the working image, replacing the corresponding part of the overall point cloud.
- This additional step makes it possible to provide an improved working image, with increased precision in order to define the different trajectories of the operation to be carried out. Moreover, this additional step also makes it possible to capture images of the areas not visible by the sensor used for step A, thanks to the displacement of the polyarticulated system, and thus to complete the working image.
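The substitution described above, detail points replacing the corresponding region of the overall cloud, can be sketched as follows. This is an illustrative Python/NumPy sketch under assumptions of my own: clouds are `(N, 3)` arrays already expressed in the same frame, and the rescanned zone is modelled as a simple spherical region of interest.

```python
import numpy as np

def integrate_detail_scan(overall_cloud, detail_cloud, center, radius):
    """Replace the part of the overall point cloud lying inside a spherical
    region of interest with the denser, more precise detail scan.
    All clouds are (N, 3) arrays in the same working frame."""
    # keep the overall points outside the region of interest...
    dist = np.linalg.norm(overall_cloud - center, axis=1)
    kept = overall_cloud[dist > radius]
    # ...and substitute the detail scan for the region itself
    return np.vstack([kept, detail_cloud])
```

A real implementation would use the actual footprint of the detail scan rather than a sphere, but the replace-then-merge logic is the same.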
- the 3D sensor carried by the polyarticulated system can be permanently attached to it, or else be stored nearby.
- the method comprises a step A", between step A and step A', consisting in automatically moving the polyarticulated system to grip and connect to the 3D sensor stored nearby.
- the method comprises a step C', between step B and step C, consisting in automatically moving the polyarticulated system in order, optionally, to disconnect and put down a 3D sensor, and to connect to the tool stored nearby.
- This characteristic makes it possible to have a plurality of tools available stored nearby, intended to be connected to the polyarticulated system.
- the method also comprises, prior to step B, a step B' consisting in selecting, in a database of tools and for the simulation of step B, the tool to perform the operation.
- the method is also remarkable in that, during step C, the speed of movement of the polyarticulated system carrying the tool in direct contact with the object may, as needed, be regulated in real time according to a direct or indirect measurement of the force undergone by the tool.
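A minimal sketch of such force-based feed regulation is given below. The proportional law, the target force and all names are assumptions for illustration; the patent only states that the speed is regulated in real time from a direct or indirect force measurement.

```python
def speed_from_force(nominal_speed: float, measured_force: float,
                     target_force: float, gain: float = 0.01) -> float:
    """Reduce the tool feed speed when the measured force on the tool
    exceeds the target force; never command a negative speed."""
    excess = measured_force - target_force
    if excess <= 0:
        return nominal_speed  # force within budget: keep nominal feed
    return max(0.0, nominal_speed - gain * excess)
```

This limits premature wear of consumables (blades, discs): the harder the tool is loaded, the slower it is fed.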
- the displacement speed of the polyarticulated system is automatically reduced when the 3D sensor or a part of the polyarticulated system approaches the object or a part of the working environment, and movement is automatically stopped when the 3D sensor or a part of the polyarticulated system reaches the safety distance from the object or from a part of the working environment, according to the anti-collision parameters defined in step A.
- the definition of the tool path on the part of the working image representing the object according to step B consists in positioning on the working image either at least one starting point - end point pair, or at least one predefined geometric figure chosen from a library composed for example of lines, planes, masks, etc.
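For the starting point - end point case, one simple way to turn the pair into a tool path is to discretize the segment into evenly spaced waypoints. This is a hypothetical sketch, the patent does not specify how the pair is converted into a trajectory; the step size and names are assumptions.

```python
import numpy as np

def linear_path(start, end, step: float = 0.01):
    """Discretize the segment between a start point and an end point
    into evenly spaced 3-D waypoints for the tool."""
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    # number of waypoints so that spacing does not exceed `step`
    n = max(2, int(np.ceil(np.linalg.norm(end - start) / step)) + 1)
    return np.linspace(start, end, n)
```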
- the invention also relates to an installation for implementing the method described above, characterized in that it comprises a polyarticulated system and at least one 3D sensor, connected to a computer processing system and to a display, designed to: represent on the display an overall point cloud of the object and of all or part of the working environment from images captured by at least one 3D sensor; and represent a working image resulting from the fusion of the overall point cloud, a pre-existing CAD model of the polyarticulated system and a possible pre-existing CAD model of all or part of the environment as built. The computer processing system is also designed to make it possible to define a trajectory of the tool on the part of the working image representing the object, to perform a simulation of the corresponding displacement of the polyarticulated system and of the tool in the working image to ensure that the operation is feasible and, if so, to perform the actual displacement of the polyarticulated system carrying the tool along the defined path to perform the operation as such on the object.
- the installation comprises a means for regulating, during step C and in real time, the speed of movement of the polyarticulated system carrying a tool in direct contact with the object, as a function of a direct or indirect measurement of the force undergone by the tool.
- Figure 1 is a schematic perspective representation of the installation according to the invention, illustrating the polyarticulated system carrying the tool, in its working environment and to perform an operation on an object.
- FIG. 2 is a schematic representation similar to that of FIG. 1, the polyarticulated system carrying a 3D camera.
- FIG. 3 illustrates in perspective the representation on the display of the fusion, in a working image, of an overall point cloud of the object and of a part of the working environment, obtained from images captured by a 3D camera on the one hand, and of the CAD model of the polyarticulated system on the other hand.
- FIG. 4 is a view similar to that of FIG. 3, the polyarticulated system carrying a 3D camera to capture detail images of the object, which are integrated into the working image, replacing the corresponding part of the overall point cloud.
- Figure 5 is a view similar to that of Figure 4, a tool path being defined on the part of the working image representing the object.
- FIG. 6 is a schematic representation similar to that of FIG. 5 illustrating the simulation of the displacement of the polyarticulated system and of the tool in the working image.
- FIG. 7 illustrates in detail the positioning of a plane in intersection with the part of the working image representing an object.
- FIG. 8 is a representation similar to that of FIG. 7, the trajectory resulting from the intersection between the positioned plane and the part of the working image representing the object being automatically calculated, making it possible to automatically calculate the displacement trajectories of the polyarticulated system.
- FIG. 9 illustrates the phase of simulation of the displacement of the polyarticulated system.
- FIG. 10 represents a simplified flowchart of the method according to the invention.
- the invention relates to a method and an installation (1) for automatically performing an operation on an object (2) positioned in a working environment (3).
- the invention is not limited to a particular operation, and may relate to an operation of measuring, cutting, gripping, welding, writing, marking, stripping, painting, surfacing, positioning of a sensor or any type of analysis tool, etc.
- the operation as such is carried out by means of a tool (4) carried by a polyarticulated system (5) movable in the working environment (3).
- the object (2) on which the operation according to the method of the invention is to be carried out may be of any nature, such as for example a radioactive object or any other object to be dismantled, or else an object to be repaired, welded, etc.
- the working environment (3) linked to the object (2) can be of any kind, such as for example a hazardous or confined working environment, inaccessible to an operator or in which an operator operates with difficulty, such as a radioactive environment, or even work at height.
- the installation (1) comprises a polyarticulated system (5), in particular in the form of a robotic arm movable in all directions and in the working environment (3).
- the installation (1) comprises at least one 3D sensor, for example a 3D camera (6A, 6B, 6C), intended to capture an image of the object (2) and of all or part of the working environment (3) in order to digitize it and to represent, by means of a known computer processing system and a display, a three-dimensional representation of an overall point cloud (7) of the object (2) and of all or part of the working environment (3).
- the installation (1) comprises three 3D cameras (6A, 6B, 6C) positioned in a fixed manner on an arch (8) around and above the polyarticulated system (5).
- the installation (1) implements a method comprising at least one step A consisting in capturing, in the form of an overall point cloud (7), an image of the object (2) and of all or part of the working environment (3) with the 3D camera(s) (6A, 6B, 6C) attached to the arch (8), in merging this overall point cloud (7) with a pre-existing CAD model of the polyarticulated system (5) and a pre-existing CAD model of the environment as built, such as that of the arch (8), and in representing the resulting working image (17) on a display (see FIG. 3).
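The fusion in step A can be sketched as registering CAD geometry into the frame of the captured cloud and merging the two, keeping track of each point's origin. This is an illustrative sketch only: it assumes the CAD models are available as sampled points and that the rigid transform between the CAD frame and the scan frame is known; all function names are assumptions.

```python
import numpy as np

def to_working_frame(points, rotation, translation):
    """Express 3-D points (e.g. vertices sampled from a pre-existing CAD
    model) in the frame of the overall point cloud, via a known rigid
    transform (3x3 rotation matrix, 3-vector translation)."""
    return np.asarray(points, float) @ np.asarray(rotation, float).T \
        + np.asarray(translation, float)

def build_working_image(overall_cloud, cad_points, rotation, translation):
    """Merge the captured overall point cloud with CAD geometry registered
    into the same frame; tag each point with its source for display."""
    cad_in_frame = to_working_frame(cad_points, rotation, translation)
    merged = np.vstack([overall_cloud, cad_in_frame])
    source = np.array(["scan"] * len(overall_cloud)
                      + ["cad"] * len(cad_in_frame))
    return merged, source
```

The source tag matters downstream: anti-collision checks can then be run between the CAD-modelled polyarticulated system and the scanned object.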
- the installation (1) preferably comprises a plurality of tools (4) of different types, and a 3D sensor, such as a 3D camera (16), stored nearby, for example in a dedicated bin.
- This 3D camera (16) can be of the same type as the 3D cameras (6A, 6B, 6C) or be different therefrom and, in the latter case, it will advantageously be more precise than the 3D cameras (6A, 6B, 6C).
- the method according to the invention advantageously comprises a step A", after step A, consisting in automatically, and therefore securely, moving the polyarticulated system (5) in order to grip and connect to the 3D camera (16) stored nearby.
- the method comprises a step A' (see figure 2) consisting in controlling the movement of the polyarticulated system (5) to capture detail images of different zones of the object (2) with the 3D camera (16), more precise than the 3D cameras (6A, 6B, 6C), in the form of a point cloud denser and more precise than the overall point cloud (7), and in integrating into the working image (17) (see FIG. 4) the captured images (9) of the corresponding areas of the object (2), replacing the corresponding part of the overall point cloud (7).
- the movement of the polyarticulated system (5) to capture the different images is controlled automatically or by an operator remote from the working environment (3), for example by means of a control lever, which makes it possible to capture images of inaccessible areas, or for example by directly selecting specific areas in the overall point cloud (7), said selection causing the automatic displacement of the polyarticulated system.
- the operator selects in practice an area for which he wishes to improve the precision of modeling of the object.
- the software calculates the positions of the polyarticulated system (5) necessary to capture the images with the best points of view.
- the displacement of the polyarticulated system (5) takes place without collision between the polyarticulated system (5), the 3D camera (16), the object (2) and the elements of the working environment (3) since everything is modeled through the working image (17), either in CAD, or by the overall point cloud (7), or by the images (9) of details.
- during step A', the displacement speed of the polyarticulated system (5) is automatically reduced when the tool (4) or a part of the polyarticulated system (5), whose CAD models are known, approaches the object (2) or an element of the working environment (3), and the movement is stopped when the tool (4) or a part of the polyarticulated system (5) reaches the safety distance from an obstacle.
- the method comprises a step B' consisting in selecting, in a database of tools and for the simulation, the tool (4) to perform the operation.
- the method comprises a step B consisting in defining a path (10) of the tool (4) on the part of the working image (17) representing the object (2), and in performing a simulation of the corresponding displacement of the polyarticulated system (5) and of the tool (4) in the working image (17) to ensure that the operation is feasible, in particular in terms of orientation, accessibility and absence of collision (see figure 6).
- the operator can position, on the working image (17) shown on the display, a starting point - ending point pair or a geometric figure chosen from a library made up of lines, planes, masks, etc., and the computer processing system is designed to automatically calculate the trajectories (10) on the object (2). If necessary, the processing system allows manual adjustment of the trajectory (10), or allows the operator to directly trace the trajectories (10) on the representation of the object (2).
- a plane (11) has been positioned on the representation of the surface of the object (2) to be cut, and the computer processing system has defined the trajectory (10) by the intersection between the plane (11) and said surface.
- This technique is also illustrated in Figure 7, where the positioning of a plane (11) can be seen, and in Figure 8, where the trajectories (10) calculated and plotted from the intersection between the plane (11) and the representation of the object (2) can be seen.
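The plane-intersection technique can be sketched on raw points as follows: keep the digitized surface points lying within a tolerance band of the positioned plane, then order them along the cut direction. This is a simplified sketch of the idea, not the patent's algorithm (a real system would intersect the plane with a reconstructed mesh); the tolerance and the choice of the principal in-plane direction as cut axis are assumptions.

```python
import numpy as np

def plane_intersection_path(surface_points, plane_point, plane_normal,
                            tol: float = 0.005):
    """Extract the digitized surface points lying within `tol` of the
    positioned plane, ordered along the dominant in-plane direction
    so they form a usable cutting trajectory."""
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    # signed distance of every surface point to the plane
    d = (np.asarray(surface_points, float) - plane_point) @ n
    band = np.asarray(surface_points, float)[np.abs(d) <= tol]
    if len(band) == 0:
        return band
    # order the band points along their principal direction (assumed cut axis)
    centered = band - band.mean(axis=0)
    axis = np.linalg.svd(centered, full_matrices=False)[2][0]
    return band[np.argsort(centered @ axis)]
```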
- after having calculated the trajectories (10), the computer processing system makes it possible to simulate the corresponding displacement of the polyarticulated system (5) and of the tool (4) in the working image (17) to ensure that the operation is possible, depending of course on the path (10) tested, on the tool (4) and on the accessibility and displacement possibilities of the polyarticulated system (5).
- the displacement test steps are for example illustrated in Figures 6 and 9. If the simulation shows that the displacement is possible, that is to say that it is feasible in terms of orientation and accessibility and that it does not cause any collision between the various elements of the installation (1) and of the working environment (3), the actual displacement can be carried out. Step B is carried out as many times as necessary to end up with an executable movement.
- the method then advantageously comprises a step C' consisting in automatically moving the polyarticulated system (5) to disconnect and put down the 3D camera (16), for example in the dedicated bin, and to connect to the previously selected tool (4), also stored nearby.
- the method comprises a step C consisting, if the simulation has shown that the operation is feasible, in carrying out the actual displacement of the polyarticulated system (5) carrying the tool (4) according to the trajectory (10) defined and validated to perform the operation as such on the object (2).
- advantageously, the next cutting path (10) is defined, in masked time, during the previous cutting operation, or several trajectories (10) are simulated and recorded to be then performed one after the other, in an order that can be changed by the operator.
- the invention is also particularly advantageous in an evolving environment, in which new elements or potential obstacles appear as objects are added or space is freed up, for example as a result of cutting the object (2); steps A to C of the method of the invention are then repeated in order to have the most up-to-date images and simulations possible for continuing operations on the object (2), in a facilitated manner and without risk of collision.
- the display illustrates, in CAD mode, the installation (1), the object (2) and the various elements of the environment.
- the polyarticulated system (5) is shown in interactive colors.
- the polyarticulated system (5) is for example represented in green; as one of its parts approaches an obstacle, the color of the part concerned changes to orange when the polyarticulated system (5) and/or the tool (4) enters the collision-risk zone defined by the collision management parameters, then to red when movement is stopped because the polyarticulated system (5) and/or the tool (4) has reached the threshold distance defined in those parameters.
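The green/orange/red display logic reduces to mapping the distance between a part and the nearest obstacle to a color, using the same two distances as the collision management parameters. A minimal sketch, with assumed parameter names:

```python
def collision_color(distance: float, threshold_distance: float,
                    risk_distance: float) -> str:
    """Map the distance from a part of the polyarticulated system (or the
    tool) to the nearest obstacle onto a display color."""
    if distance <= threshold_distance:
        return "red"     # threshold distance reached: movement stopped
    if distance <= risk_distance:
        return "orange"  # inside the collision-risk zone: speed reduced
    return "green"       # nominal: no restriction
```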
- the invention provides a method and an installation (1) making it possible to perform, in an automatic and secure manner, an operation on an object (2) present in a working environment (3), the positions and geometries of the object and of the working environment being evolving and/or insufficiently defined to be able to carry out the operation.
- the process adapts to any type of geometry or nature of the object (2).
- the first step of securing the environment and the object (2) by digitizing in the form of an overall point cloud is carried out with 3D cameras, which allows adaptation to any type of position and geometry.
- Processing is fast, i.e. 500,000 measurement points in one second.
- the level of detail obtained from the environment is high and the information is continuous. This allows realistic rendering and modeling in real time, and the operator can easily check visually whether the reconstruction of the point cloud (7) is correct.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Numerical Control (AREA)
- Manipulator (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180029322.7A CN115916476A (zh) | 2020-03-13 | 2021-03-05 | 利用由多关节系统承载的工具对物体自动执行操作的方法 |
KR1020227034950A KR20220154716A (ko) | 2020-03-13 | 2021-03-05 | 다관절 시스템에 의해 운반되는 도구를 사용하여 물체에 대한 작업을 자동으로 수행하는 방법 |
US17/906,142 US20230106854A1 (en) | 2020-03-13 | 2021-03-05 | Method for automatically performing an operation on an object with a tool carried by a polyarticulated system |
EP21714659.6A EP4117867A1 (fr) | 2020-03-13 | 2021-03-05 | Procédé de réalisation automatique d'une opération sur un objet avec un outil porté par un système polyarticulé |
JP2022555706A JP2023530209A (ja) | 2020-03-13 | 2021-03-05 | 多関節システムによって保持されたツールを用いて対象物に対する操作を自動で行うための方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FRFR2002515 | 2020-03-13 | ||
FR2002515A FR3108183B1 (fr) | 2020-03-13 | 2020-03-13 | Procédé de réalisation automatique d’une opération sur un objet avec un outil porté par un système polyarticulé |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021181035A1 true WO2021181035A1 (fr) | 2021-09-16 |
Family
ID=72266362
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2021/050380 WO2021181035A1 (fr) | 2020-03-13 | 2021-03-05 | Procédé de réalisation automatique d'une opération sur un objet avec un outil porté par un système polyarticulé |
Country Status (7)
Country | Link |
---|---|
US (1) | US20230106854A1 (fr) |
EP (1) | EP4117867A1 (fr) |
JP (1) | JP2023530209A (fr) |
KR (1) | KR20220154716A (fr) |
CN (1) | CN115916476A (fr) |
FR (1) | FR3108183B1 (fr) |
WO (1) | WO2021181035A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113848803B (zh) * | 2021-10-14 | 2023-09-12 | 成都永峰科技有限公司 | 一种深腔曲面加工刀路生成方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2896441A1 (fr) * | 2006-01-23 | 2007-07-27 | Jerome Grosbois | Procede et systeme permettant la prehension automatisee de piece(s) |
US9669543B1 (en) * | 2015-12-11 | 2017-06-06 | Amazon Technologies, Inc. | Validation of robotic item grasping |
-
2020
- 2020-03-13 FR FR2002515A patent/FR3108183B1/fr active Active
-
2021
- 2021-03-05 US US17/906,142 patent/US20230106854A1/en active Pending
- 2021-03-05 JP JP2022555706A patent/JP2023530209A/ja active Pending
- 2021-03-05 WO PCT/FR2021/050380 patent/WO2021181035A1/fr active Application Filing
- 2021-03-05 EP EP21714659.6A patent/EP4117867A1/fr active Pending
- 2021-03-05 CN CN202180029322.7A patent/CN115916476A/zh active Pending
- 2021-03-05 KR KR1020227034950A patent/KR20220154716A/ko active Search and Examination
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2896441A1 (fr) * | 2006-01-23 | 2007-07-27 | Jerome Grosbois | Procede et systeme permettant la prehension automatisee de piece(s) |
US9669543B1 (en) * | 2015-12-11 | 2017-06-06 | Amazon Technologies, Inc. | Validation of robotic item grasping |
Also Published As
Publication number | Publication date |
---|---|
JP2023530209A (ja) | 2023-07-14 |
FR3108183B1 (fr) | 2022-02-25 |
EP4117867A1 (fr) | 2023-01-18 |
US20230106854A1 (en) | 2023-04-06 |
FR3108183A1 (fr) | 2021-09-17 |
CN115916476A (zh) | 2023-04-04 |
KR20220154716A (ko) | 2022-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2674317C (fr) | Procede et systeme permettant la prehension automatisee de piece(s) | |
CA2002513C (fr) | Methode et dispositif de detection automatique du profil d'une surface et en vue d'effectuer un travail | |
US20190080446A1 (en) | System and method for automated defect detection | |
EP2242621A2 (fr) | Procede pour l'apprentissage d'un robot ou similaire et dispositif pour la mise en oeuvre de ce procede | |
EP4182126B1 (fr) | Contrôle automatique d'un système d'ébavurage de pièces | |
CA2923490C (fr) | Procede de generation d'un programme d'usinage interpretable par un controleur physique d'une machine-outil a commande numerique | |
EP3523504B1 (fr) | Dispositif et procédé de prise et de pose automatisée d'un voussoir pour former un revêtement d'un tunnel | |
EP0034967B1 (fr) | Procédé automatique et auto-adaptatif de soudage par fusion et dispositif pour la mise en oeuvre de ce procédé | |
WO2007017597A2 (fr) | Procede et dispositif pour determiner la pose d'un moyen de capture video dans le repere de numerisation d'au moins un objet virtuel en trois dimensions modelisant au moins un objet reel | |
EP4117867A1 (fr) | Procédé de réalisation automatique d'une opération sur un objet avec un outil porté par un système polyarticulé | |
EP0724149A1 (fr) | Procédé de contrÔle non destructif d'une surface, en particulier en milieu hostile | |
CA3138919A1 (fr) | Dispositif et procede pour le controle d'une piece en cours de fabrication | |
EP1671192B1 (fr) | Procede d' etalonnage d' une machine de percage de verres ophtalmiques, dispositif pour la mise en oeuvre d' un tel procede, et appareil d' usinage de verres ophtalmiques equipe d' un tel dispositif | |
WO2022040819A2 (fr) | Surveillance mise en oeuvre par ordinateur d'une opération de soudage | |
EP3092533B1 (fr) | Systemes d`usinage comportant une machine d`usinage et des procedes de commande | |
FR2741438A1 (fr) | Dispositif et procede de controle dimensionnel d'un cordon de matiere depose sur un support | |
FR3096599A1 (fr) | Collaboration d'un robot et d'un opérateur en vue de modifier une pièce | |
FR2720026A1 (fr) | Procédé de génération d'une trajectoire d'outil sur une surface d'une pièce. | |
FR3123680A1 (fr) | Procédé de saisie automatique d’un voussoir de revêtement d'un tunnel | |
WO2023214063A1 (fr) | Procédé et système de réparation d'un objet | |
WO2019020924A1 (fr) | Procédé de contrôle d'une surface | |
FR2562685A1 (fr) | Procede et installation pour l'execution repetee d'une operation ou suite d'operations enregistree | |
EP0453433A2 (fr) | Procédé pour le contrôle automatique de la qualité d'un composant de carrosserie automobile | |
WO2015177283A1 (fr) | Procédé de personnalisation d'un objet personnalisable pour un système client/serveur; support d'enregistrement d'informations et système client/serveur associés | |
FR3028615A1 (fr) | Procede d’inspection d’un produit tel qu’un composant d’une nacelle de turboreacteur |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21714659 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022555706 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20227034950 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021714659 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2021714659 Country of ref document: EP Effective date: 20221013 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |