WO2022043979A1 - Program, method, and system - Google Patents

Program, method, and system

Info

Publication number
WO2022043979A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
abnormality
dimensional position
program
processor
Prior art date
Application number
PCT/IB2021/059489
Other languages
English (en)
Japanese (ja)
Inventor
菅谷俊二
Original Assignee
株式会社オプティム
Priority date
Filing date
Publication date
Application filed by 株式会社オプティム
Publication of WO2022043979A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/245 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B 11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces

Definitions

  • This disclosure relates to programs, methods, and systems.
  • The manufacturing process at a manufacturing site includes various steps, such as a step of assembling an object and a step of painting the body of an object.
  • In the painting process, for example, painting of an object proceeds through a plurality of procedures using a robot arm or the like. When the painting is completed, a final confirmation of the object as a manufactured article is performed, and it is confirmed that the manufactured article has no abnormality. If an abnormality such as discoloration, cracks, adhering dust, or uneven coating is found during this visual inspection, repair is performed to eliminate the abnormality. For example, when uneven coating is found, the area with uneven coating is polished with a polisher so that the area is painted uniformly.
  • Patent Document 1 discloses an inspection method and apparatus for optically detecting coating unevenness, but does not describe detecting anything other than coating unevenness.
  • The purpose of this disclosure is to reduce the burden on workers in visual inspection.
  • One aspect of the present disclosure is a program executed by a computer having a processor and a memory.
  • The program causes the processor to execute: a step of acquiring the three-dimensional shape of an object based on sensing data obtained by sensing the object using electromagnetic waves; a step of detecting an abnormality of the object based on the sensing data; a step of specifying the three-dimensional position of the detected abnormality; a step of moving a robot arm equipped with a device capable of eliminating the abnormality so that the device reaches the specified three-dimensional position; and a step of operating the device when the device reaches the three-dimensional position.
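  • As a concrete illustration of these steps, the following is a minimal, self-contained Python sketch. It is illustrative only: the names (Abnormality, sense, detect_abnormalities, RobotArm) are hypothetical stand-ins for the sensing, detection, and robot control described above, the stubs return fixed values, and it is not the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Abnormality:
    kind: str                                  # e.g. "uneven coating", "crack"
    position: Tuple[float, float, float]       # (x, y, z) on the object

def sense(module_id: int) -> bytes:
    """Acquire sensing data with electromagnetic waves (stubbed)."""
    return b"sensing-data"

def detect_abnormalities(data: List[bytes]) -> List[Abnormality]:
    """Detect abnormalities from sensing data (stubbed model call)."""
    return [Abnormality("uneven coating", (0.42, 1.10, 0.87))]

class RobotArm:
    """Hypothetical robot arm carrying a device such as an electric polisher."""
    def move_to(self, position: Tuple[float, float, float]) -> None:
        print(f"moving device to {position}")
    def operate_device(self) -> None:
        print("operating device (e.g. polishing)")

def inspect_and_repair() -> None:
    data = [sense(i) for i in range(2)]        # sense the object
    arm = RobotArm()
    for abn in detect_abnormalities(data):     # detect abnormalities
        arm.move_to(abn.position)              # reach the 3-D position
        arm.operate_device()                   # operate the device there

inspect_and_repair()
```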
  • According to this aspect, the burden on the worker can be reduced.
  • The system 1 is, for example, a system that inspects the painted surface of a painted object and, when an abnormality occurs on the painted surface, performs work to eliminate the abnormality.
  • The system 1 is used, for example, in an automobile production line.
  • FIG. 1 is a diagram showing an overall configuration of the system 1 according to the present embodiment.
  • The system 1 shown in FIG. 1 inspects the painted surface of the object 100.
  • Although the illustration of the object 100 is simplified, it is a workpiece such as an outer panel of an automobile body that has finished the painting process and has not yet been visually inspected.
  • The object 100 is not limited to the outer panel of an automobile body; various workpieces subject to visual inspection are assumed.
  • The system 1 includes a sensing unit 10, an industrial robot 20, a control device 30, and a display device 40.
  • The sensing unit 10, the industrial robot 20, and the control device 30 are connected to the network 50 so as to be able to communicate with each other.
  • The connections of the sensing unit 10, the industrial robot 20, and the control device 30 to the network 50 may be wired or wireless. The sensing unit 10 and the industrial robot 20 may also be connected to the control device 30 without going through the network 50.
  • The control device 30 is connected to the display device 40 by wire or wirelessly.
  • The control device 30 and the display device 40 may also be connected via the network 50.
  • The sensing unit 10 senses the object 100 using electromagnetic waves.
  • The sensing unit 10 includes a sensing module (sensing means) 11 for sensing and an actuator (not shown) for controlling the position of the sensing module 11.
  • The sensing module 11 senses the object 100 using electromagnetic waves in a preset band.
  • As the sensing module 11, any of the following devices, each realized using electromagnetic waves in a preset band, is assumed:
  • Visible light camera
  • Infrared camera
  • Ultrasonic camera
  • Ultrasonic sensor
  • RGB-D camera
  • LiDAR (Light Detection and Ranging)
  • A plurality of sensing modules 11 are arranged around the object 100 so that the three-dimensional positions of abnormalities that may occur on the painted surface of the object 100 can be acquired.
  • The sensing modules 11 are arranged at positions from which a given surface of the object 100 can be sensed simultaneously from different directions.
  • In the example of FIG. 1, two sensing modules 11 are arranged around the object 100, but the number of sensing modules 11 arranged around the object 100 may be three or more.
  • The sensing unit 10 transmits the sensing data acquired by the sensing modules 11 to the control device 30.
  • As the actuator of the sensing unit 10, for example, a Cartesian robot capable of moving the sensing module 11 along the X-Y axes is used.
  • The behavior of the Cartesian robot is controlled by the control device 30.
  • The actuator may instead be, for example, a single-axis robot capable of moving the sensing module 11 along one axis.
  • The industrial robot 20 is, for example, a vertical articulated robot.
  • The industrial robot 20 includes, for example, an articulated robot arm 21 and an end effector 22 mounted on the tip of the robot arm 21.
  • The industrial robot 20 is communicatively connected to the control device 30.
  • The robot arm 21 is driven according to control from the control device 30. Specifically, each joint of the robot arm 21 is provided with a motor, and the arm is moved by operating the joints with the force of the motors.
  • The end effector 22 is a device that performs a predetermined process. In the present embodiment, it is a device for eliminating an abnormality that has occurred on the painted surface of the object 100.
  • The end effector 22 is moved to a desired position by the robot arm 21.
  • Examples of the end effector 22 include an electric polisher that removes, by polishing, coating unevenness (an abnormality) arising on the outer panel after painting.
  • The end effector 22 is not limited to the electric polisher, as long as it is a device that can eliminate an abnormality arising on the painted surface of the object 100.
  • The control device 30 controls the operation of each device in the system 1.
  • The display device 40 presents information processed by the control device 30 to the user.
  • FIG. 2 is a block diagram showing the configuration of the control device 30.
  • The control device 30 shown in FIG. 2 includes a processor 31, a storage device 32, a communication interface 33, and an input/output interface 34.
  • The storage device 32 is realized by a non-volatile storage circuit, such as an HDD (hard disk drive) or SSD (solid state drive), that stores various information.
  • The storage device 32 stores various programs executed by the processor 31, data processed by the programs, and the like. For example, the storage device 32 stores the three-dimensional position information described later.
  • The storage device 32 also stores, for example, a plurality of trained models generated by machine learning.
  • The first trained model is, for example, a model for detecting an abnormality on the painted surface of the object 100.
  • The second trained model is, for example, a model for controlling the operation of the industrial robot 20.
  • The first trained model and the second trained model may be stored in advance at the time of shipment of the control device 30, or may be installed after shipment via the network 50 or via a storage medium.
  • The first trained model and the second trained model are obtained by having a machine learning model perform machine learning according to a model learning program, based on training data.
  • The first trained model is trained to output abnormalities on the painted surface with respect to input sensing data.
  • As the training data, for example, a plurality of sensing data about objects are used as input data, and judgments about the abnormalities contained in that input data are used as correct output data.
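  • As a rough illustration only: the disclosure specifies neither a model architecture nor a framework, so the following scikit-learn sketch, with synthetic feature vectors and labels standing in for the sensing data and abnormality judgments, is purely an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-ins: feature vectors extracted from sensing data
# (e.g. local color/texture statistics) and abnormality labels.
rng = np.random.default_rng(0)
X_train = rng.random((200, 16))            # 200 patches, 16 features each
y_train = rng.integers(0, 2, 200)          # 1 = abnormal, 0 = normal

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Inference: sensing-data patches in, abnormality judgments out.
X_new = rng.random((5, 16))
print(model.predict(X_new))                # e.g. [0 1 0 0 1]
```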
  • The second trained model is trained to output control parameters of the industrial robot 20 with respect to input three-dimensional position information.
  • As the training data, for example, position information about an object having a predetermined three-dimensional shape is used as input data, and control parameters of the robot arm 21 by which the end effector 22 can reach the input position are used as correct output data.
  • The parameters by which the end effector 22 can reach the input three-dimensional position are set in consideration of the shape of the robot arm 21 and the three-dimensional shape of the object 100. That is, the parameters drive the robot arm 21 so that, in the process of the end effector 22 reaching the three-dimensional position, the robot arm 21 and the object 100 do not come into contact with each other.
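  • The following regression sketch illustrates the idea of the second trained model under the same caveat: the data are synthetic, a 6-axis arm is assumed, and no collision checking is performed at inference time (in the disclosure, collision avoidance is built into the correct output data, not into the model call).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic training pairs: target 3-D positions in, arm parameters out.
# In the disclosure the correct outputs are parameters already verified
# to avoid contact between the robot arm 21 and the object 100.
rng = np.random.default_rng(0)
positions = rng.random((500, 3))           # (x, y, z) targets
joint_params = rng.random((500, 6))        # assumed 6-axis joint values

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                     random_state=0)
model.fit(positions, joint_params)

# Given an abnormality's 3-D position, obtain drive parameters for the arm.
target = np.array([[0.42, 1.10, 0.87]])
print(model.predict(target))               # -> 6 joint values
```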
  • The communication interface 33 is realized by, for example, a circuit connected to the network 50.
  • The communication interface 33 is realized by a circuit that connects to a communication device such as a wireless base station compatible with communication standards such as 5G, 4G, and LTE (Long Term Evolution), or a wireless LAN router that supports a wireless LAN (Local Area Network) standard such as IEEE (Institute of Electrical and Electronics Engineers) 802.11.
  • When the control device 30 communicates directly with the sensing unit 10 or the industrial robot 20, the communication interface 33 is realized by a circuit capable of communicating with the sensing unit 10 or the industrial robot 20.
  • The input/output interface 34 is an interface for connecting to an input device that receives input from a user and to the display device 40.
  • The input device is realized by, for example, a touch panel, a touch pad, a pointing device such as a mouse, a keyboard, or the like.
  • The processor 31 functions as the center of the control device 30.
  • The processor 31 is hardware for executing an instruction set described in a program, and is composed of an arithmetic unit, registers, peripheral circuits, and the like.
  • The processor 31 realizes various functions corresponding to a program by reading and executing the program stored in the storage device 32.
  • The processor 31 realizes the functions of the first control unit 31A, the acquisition unit 31B, the detection unit 31C, the position specifying unit 31D, and the second control unit 31E by executing the program stored in the storage device 32.
  • The first control unit 31A controls the sensing unit 10. For example, the first control unit 31A transmits a drive signal to the actuator of the sensing unit 10 so as to move the sensing module 11 to an appropriate position.
  • The first control unit 31A also transmits, for example, a drive signal to the sensing module 11 to start sensing by the sensing module 11.
  • The first control unit 31A causes the sensing module 11 to sense the object 100 and acquires the sensing data.
  • In the example of FIG. 1, the first control unit 31A drives the two sensing modules 11 at substantially the same time to sense the painted surface of the object 100 from different directions.
  • The first control unit 31A may cause the sensing module 11 to sense while driving the actuator. That is, the first control unit 31A may control the sensing unit 10 so as to acquire the sensing data as a moving image. Alternatively, the first control unit 31A may stop sensing while driving the actuator and stop the actuator while sensing. That is, the first control unit 31A may control the sensing unit 10 so as to acquire the sensing data as a plurality of still images.
  • The acquisition unit 31B acquires the three-dimensional shape of the object 100 based on the sensing data. Specifically, the acquisition unit 31B calculates the three-dimensional shape based on the sensing data obtained by sensing the object 100 from a plurality of directions, the positions of the plurality of sensing modules 11 that acquired the sensing data, and the position of the object 100. In the example shown in FIG. 1, since two sensing modules 11 are arranged, sensing data from a plurality of directions can be acquired without moving the sensing modules 11 or the object 100. The acquisition unit 31B stores the acquired information about the three-dimensional shape in the storage device 32.
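  • The disclosure does not state the reconstruction algorithm. As one common way to obtain 3-D points from two sensing modules with known positions, here is a minimal linear (DLT) triangulation sketch with made-up projection matrices:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen by two calibrated cameras.
    P1, P2: 3x4 projection matrices; x1, x2: (u, v) pixel observations."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                    # homogeneous -> Euclidean

# Made-up projection matrices for two sensing modules at known positions.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

point = np.array([0.3, 0.2, 2.0, 1.0])     # ground-truth 3-D point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]    # its two projections
x2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, x1, x2))         # ~ [0.3, 0.2, 2.0]
```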
  • The detection unit 31C detects an abnormality occurring on the painted surface of the object 100 based on the sensing data.
  • An abnormality means, for example, discoloration, cracks, adhering dust, uneven coating, and the like.
  • Hereinafter, a portion where an abnormality occurs on the painted surface of the object 100 is referred to as an abnormal portion.
  • The detection unit 31C detects an abnormality by using, for example, the first trained model.
  • The detection unit 31C inputs the sensing data acquired by the sensing unit 10 to the first trained model and obtains as output the abnormal portion included in the sensing data.
  • The position specifying unit 31D specifies the position of the detected abnormal portion in three-dimensional space. Specifically, for example, the position specifying unit 31D specifies the three-dimensional position of the abnormal portion in the object 100 by comparing the detected abnormal portion with the three-dimensional shape of the object 100.
  • The three-dimensional position of the abnormal portion is represented by, for example, coordinates in a predetermined coordinate system with respect to a predetermined position of the object 100.
  • The predetermined coordinate system is, for example, an orthogonal coordinate system or a polar coordinate system.
  • The three-dimensional position of the abnormal portion may also be specified, for example, with respect to a predetermined position in the factory, a predetermined position of the industrial robot 20, or the like.
  • The position specifying unit 31D stores information about the specified three-dimensional position in the storage device 32.
  • The second control unit 31E controls the industrial robot 20. For example, the second control unit 31E transmits a drive signal to the servomotors of the robot arm 21 so that the end effector 22 reaches a predetermined position. The second control unit 31E also transmits, for example, a drive signal to the end effector 22 to start processing by the end effector 22.
  • The second control unit 31E sets the parameters to be input to the robot arm 21 in order to make the end effector 22 reach the three-dimensional position of the abnormal portion specified by the position specifying unit 31D.
  • The second control unit 31E inputs the three-dimensional position information of the abnormal portion to the second trained model and obtains as output the control parameters of the industrial robot 20.
  • The second control unit 31E transmits the parameters output from the second trained model to the robot arm 21.
  • The second control unit 31E operates the end effector 22 when the end effector 22 reaches the specified three-dimensional position.
  • For example, the electric polisher rotates while in contact with the abnormal portion, so that the area with coating unevenness is polished and the coating unevenness is eliminated.
  • FIG. 3 is a diagram showing the data structure of the three-dimensional position information of abnormal portions. Note that FIG. 3 is an example and does not exclude data not shown.
  • Each record of the position information DB includes an item "abnormality ID", an item "object ID", an item "date", an item "position information", and the like.
  • The item "abnormality ID" stores an ID for identifying the abnormality that has occurred.
  • The item "object ID" stores an ID for identifying the object 100 in which the abnormality has occurred.
  • The item "date" stores the date when the abnormality occurred.
  • The item "position information" stores the three-dimensional position of the abnormality that has occurred.
  • FIG. 3 shows an example in which three-dimensional positions in a Cartesian coordinate system are stored, but the position information stored in the position information DB is not limited to the Cartesian coordinate system.
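  • A minimal sketch of such a record store, assuming an SQLite table whose columns mirror the items of FIG. 3 (column names and types are assumptions; the disclosure does not specify a storage format):

```python
import sqlite3

# Minimal sketch of the position information DB of FIG. 3; field names
# follow the items described above, column types are assumptions.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE position_info (
        abnormality_id TEXT PRIMARY KEY,   -- item "abnormality ID"
        object_id      TEXT,               -- item "object ID"
        date           TEXT,               -- item "date"
        x REAL, y REAL, z REAL             -- item "position information"
    )
""")
con.execute(
    "INSERT INTO position_info VALUES (?, ?, ?, ?, ?, ?)",
    ("A0001", "OBJ100", "2021-10-15", 0.42, 1.10, 0.87),
)
for row in con.execute("SELECT * FROM position_info"):
    print(row)    # ('A0001', 'OBJ100', '2021-10-15', 0.42, 1.1, 0.87)
```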
  • FIG. 4 is a diagram for explaining an example of a flow in which an abnormality on the painted surface of the object 100 is detected by the system 1 and the detected abnormality is eliminated.
  • The first control unit 31A of the control device 30 controls the sensing unit 10 (step S301).
  • The first control unit 31A transmits drive signals to the actuator of the sensing unit 10 and to the sensing module 11.
  • The actuator of the sensing unit 10 moves the sensing module 11 to an appropriate position according to the drive signal from the control device 30.
  • The sensing module 11 starts sensing according to the drive signal from the sensing unit 10 (step S101).
  • The two sensing modules 11 shown in FIG. 1 sense the object 100 from different directions.
  • The sensing unit 10 transmits the sensing data acquired by the sensing of the sensing modules 11 to the control device 30 (step S102).
  • Upon receiving the sensing data from the sensing unit 10, the acquisition unit 31B of the control device 30 acquires the three-dimensional shape of the object 100 (step S302). Specifically, the acquisition unit 31B calculates the three-dimensional shape of the object 100 based on the sensing data acquired by the sensing modules 11, the positions at which the sensing modules 11 acquired the sensing data, and the position of the object 100. The acquisition unit 31B stores the acquired information about the three-dimensional shape in the storage device 32.
  • The detection unit 31C detects an abnormality on the painted surface of the object 100 (step S303). Specifically, the detection unit 31C inputs the sensing data acquired by the sensing modules 11 into the first trained model and obtains as output the abnormal portion included in the sensing data.
  • The position specifying unit 31D specifies the position of the abnormal portion (step S304). For example, the position specifying unit 31D specifies the three-dimensional position of the abnormal portion in the object 100 by comparing it with the three-dimensional shape of the object 100. At this time, the position specifying unit 31D represents, for example, the three-dimensional position of the abnormal portion by coordinates in an orthogonal coordinate system with respect to a predetermined position of the object 100.
  • The second control unit 31E controls the industrial robot 20 (step S305). For example, the second control unit 31E inputs the three-dimensional position information of the abnormal portion to the second trained model and obtains as output the control parameters for making the end effector 22 of the industrial robot 20 reach the abnormal portion. The second control unit 31E transmits a drive signal specifying these control parameters to the robot arm 21 of the industrial robot 20. The second control unit 31E also transmits a drive signal for starting processing to the end effector 22.
  • The robot arm 21 makes the end effector 22 reach the position according to the drive signal from the control device 30 (step S201).
  • The end effector 22 starts processing at the reached position according to the drive signal from the control device 30 (step S202).
  • FIG. 5 is a schematic diagram showing an example of an abnormal portion on the painted surface of the object 100.
  • FIG. 6 is a schematic diagram for explaining the operation of the industrial robot 20.
  • The position specifying unit 31D specifies the three-dimensional position of the abnormal portion B.
  • The second control unit 31E calculates the control parameters of the robot arm 21 based on the specified three-dimensional position, and controls the robot arm 21 based on the calculated control parameters, so that the end effector 22 reaches the abnormal portion B as shown in FIG. 6.
  • The second control unit 31E then operates the electric polisher, which is the end effector 22, to polish the abnormal portion B.
  • The coating unevenness is eliminated by the polishing with the electric polisher, which is the end effector 22.
  • When the end effector 22 has finished processing the abnormal portion B, the control device 30 performs the processing of steps S301 and S303 shown in FIG. 4 to confirm whether the abnormality on the painted surface of the object 100 has been eliminated. If the abnormality has not been eliminated, the control device 30 performs the processing of steps S304 and S305 shown in FIG. 4 to eliminate the abnormality. The processing of steps S301 and S303 to S305 may be repeated, for example, until appropriate repair work is confirmed, as sketched below.
  • The control device 30 may perform the acquisition of the three-dimensional shape of the object 100 in step S302 and the detection of the abnormality in the object 100 in step S303 in separate flows.
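  • A minimal sketch of this re-inspection loop. The helpers sense_all, detect, and repair are hypothetical stand-ins (stubbed with random output so the loop exercises both branches), and the bound max_rounds is an assumption; the disclosure only says the steps may be repeated until repair is confirmed.

```python
import random

def sense_all():                    # steps S301/S101/S102 (stub)
    return ["sensing-data"]

def detect(data):                   # step S303 (stub: 0 or 1 findings)
    return ["uneven coating"] if random.random() < 0.5 else []

def repair(abnormality):            # steps S304/S305 (stub)
    print(f"repairing: {abnormality}")

def repair_until_clean(max_rounds: int = 5) -> bool:
    """Repeat inspect (S301/S303) and repair (S304/S305) until clean."""
    for _ in range(max_rounds):
        found = detect(sense_all())
        if not found:
            return True             # repair confirmed
        for abn in found:
            repair(abn)
    return False                    # still abnormal after max_rounds

print("clean:", repair_until_clean())
```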
  • As described above, the control device 30 causes the sensing module 11 to sense the appearance of the object 100 and acquires the three-dimensional shape of the object 100. The control device 30 also detects an abnormality based on the sensing data and specifies the three-dimensional position of the detected abnormality. Then, the control device 30 repairs the abnormality that has occurred in the appearance of the object 100 by making the end effector 22 reach the position where the abnormality occurred and driving the end effector 22. As a result, the control device 30 can automatically detect an abnormality in the object 100 and automatically repair the detected abnormality.
  • The acquisition unit 31B acquires the three-dimensional shape of the object 100 based on the sensing data obtained by sensing the object 100 from a plurality of directions.
  • The control device 30 can therefore accurately acquire the three-dimensional shape of the object 100.
  • The detection unit 31C detects abnormalities by using a trained model trained to take sensing data of an object as input and output the presence or absence of an abnormality. This makes it possible to detect abnormalities with high accuracy, regardless of the worker.
  • The second control unit 31E moves the robot arm 21 by using a trained model trained to take the three-dimensional position of an abnormality as input and output control parameters for bringing the end effector 22 to that three-dimensional position. This makes it possible to bring the end effector 22 to the position where the abnormality occurred with high accuracy, regardless of the worker.
  • FIG. 7 is a diagram showing the configuration of the system 1 according to Modification 1. Note that FIG. 7 omits the illustration of the control device 30, the display device 40, and the network 50 of the system 1.
  • The system 1 shown in FIG. 7 includes one sensing unit 10. The object 100 is placed on, for example, a rotatable stage 60.
  • The first control unit 31A controls the sensing unit 10 and the stage 60. Specifically, for example, the first control unit 31A controls the actuator of the sensing unit 10, or rotates the stage 60, so as to move the sensing module 11 relative to the object 100.
  • The first control unit 31A causes the sensing module 11 to sense the painted surface of the object 100 before and after moving the sensing module 11 relative to the object 100. At this time, in order to acquire the three-dimensional shape of the object 100, the first control unit 31A sets a predetermined angular interval with respect to the object 100 between the first sensing and the second sensing. As a result, sensing data from different directions regarding the painted surface of the object 100 are acquired.
  • The acquisition unit 31B calculates the three-dimensional shape of the object 100 based on the sensing data obtained by sensing the object 100 from a plurality of directions, the position of the sensing module 11 with respect to the object 100 when the sensing data was acquired, and the position of the object 100.
  • The control device 30 in Modification 1 carries out the operations of steps S301 to S305 in the same manner as the operations shown in FIG. 4. However, in step S301, the first control unit 31A may rotate the stage 60 instead of controlling the actuator of the sensing unit 10. As a result, in step S101, sensing data from two directions are acquired.
  • Thus, the control device 30 uses one sensing module 11 and moves the sensing module 11 relative to the object 100 to acquire sensing data from different directions on the painted surface of the object 100. The control device 30 then detects an abnormality on the painted surface of the object 100 based on the acquired sensing data and eliminates the detected abnormality using the industrial robot 20. As a result, even when there is only one sensing module 11, it is possible to accurately detect an abnormality of the object 100 and eliminate the abnormality.
  • FIG. 8 is a diagram showing the configuration of the system 1 according to Modification 2. Note that FIG. 8 omits the illustration of the control device 30, the display device 40, and the network 50 of the system 1.
  • The system 1 shown in FIG. 8 includes an illumination 70, which is a light source.
  • The illumination 70 generates light having a predetermined wavelength.
  • The light emitted from the illumination 70 is, for example, white light containing light of all wavelengths.
  • The illumination 70 generates light at an intensity that is unlikely to cause diffuse reflection of the light reflected on the surface of the object 100.
  • The light emitted by the illumination 70 irradiates the object 100.
  • The sensing module 11 senses light of the wavelength emitted from the illumination 70.
  • The sensing module 11 senses, for example, visible light.
  • The control device 30 in Modification 2 carries out the operations of steps S301 to S305 in the same manner as the operations shown in FIG. 4.
  • Thus, the illumination 70 irradiates the object 100 with light having a predetermined wavelength, and the sensing module 11 senses the reflected light from the object 100. The control device 30 then detects an abnormality on the painted surface of the object 100 based on the acquired sensing data and eliminates the detected abnormality using the industrial robot 20. This makes it possible to accurately detect abnormalities, such as slight coating unevenness, whose differences are difficult to confirm by sensing in natural light.
  • The illumination 70 may be moved relative to the object 100 when irradiating the light.
  • In this case, the sensing module 11 may or may not be moved relative to the object 100.
  • The illumination 70 may also switch the wavelength of the irradiated light when irradiating the object 100 with light.
  • In this case, the sensing unit 10 may include a plurality of sensing modules 11 corresponding to the wavelengths of the light.
  • FIG. 9 is a diagram showing the configuration of the system 1 according to Modification 3.
  • The system 1 shown in FIG. 9 includes a terminal device 80.
  • The terminal device 80 is realized by a portable computer such as a head-mounted display, a smartphone, or a tablet terminal.
  • The terminal device 80 has a function of augmenting the perceptible real environment by superimposing a desired image on a real image or landscape.
  • The terminal device 80 as a head-mounted display is generally referred to as AR (Augmented Reality) glasses.
  • The terminal device 80 has a communication function and a display function of superimposing the information acquired by the communication function on a real image or landscape.
  • The control device 30 transmits information about the three-dimensional position of the abnormal portion of the object 100 stored in the storage device 32 to the terminal device 80 via the network 50. That is, the control device 30 transmits information regarding the three-dimensional position of the abnormality specified by the processing of steps S301 to S304 shown in FIG. 4 to the terminal device 80.
  • The terminal device 80 determines, based on the acquired information on the three-dimensional position, whether an abnormal portion is included in the viewing direction through the terminal device 80.
  • If so, the terminal device 80 superimposes information indicating that an abnormality has occurred at the position corresponding to the abnormal portion on its display. Even in a situation where the abnormal portion cannot be seen directly, that is, even when the abnormal portion is hidden behind a structure, information indicating the occurrence of the abnormality is superimposed if the direction toward it is captured.
  • The information indicating that an abnormality has occurred includes information indicating the position of the abnormal portion, information indicating the date and time when the abnormality was detected, information indicating the degree of the abnormality, information indicating the line where the abnormality occurred, and information identifying the object in which the abnormality occurred.
  • In the case of AR glasses, the terminal device 80 determines whether an abnormal portion is included in the viewing direction through the lens.
  • If so, the terminal device 80 superimposes information indicating that an abnormality has occurred at the position corresponding to the abnormal portion on the lens.
  • When the user of the terminal device 80 views the object 100 through the terminal device 80, an image indicating the abnormality can thus be seen, as augmented reality, at the portion of the object 100 where the abnormality occurred.
  • As described above, the control device 30 causes the terminal device 80 to display the information regarding the three-dimensional position of the specified abnormal portion as augmented reality.
  • This allows the user to immediately grasp the position of the abnormal portion in the object 100 and to efficiently confirm the degree of the abnormality, plan the repair work, estimate man-hours, and so on.
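  • How the terminal device 80 decides whether an abnormal portion lies in the viewing direction is not specified in the disclosure; a simple field-of-view test like the following sketch (symmetric-cone camera model, made-up positions) is one plausible reading:

```python
import numpy as np

def in_view(point, cam_pos, cam_dir, fov_deg=60.0):
    """Return True if a 3-D point lies within the terminal's field of view.
    Assumes a simple symmetric-cone model; an assumption, not the
    disclosed method."""
    to_point = np.asarray(point) - np.asarray(cam_pos)
    to_point = to_point / np.linalg.norm(to_point)
    cam_dir = np.asarray(cam_dir) / np.linalg.norm(cam_dir)
    angle = np.degrees(np.arccos(np.clip(to_point @ cam_dir, -1.0, 1.0)))
    return angle <= fov_deg / 2

# Abnormality position received from the control device 30 (example values).
abnormal_pos = (0.42, 1.10, 0.87)
if in_view(abnormal_pos, cam_pos=(0.0, 0.0, 0.0), cam_dir=(0.3, 0.7, 0.6)):
    print("overlay abnormality marker at", abnormal_pos)
```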
  • FIG. 10 is a diagram showing the configuration of the system 1 according to Modification 4.
  • The system 1 shown in FIG. 10 includes two sensing units 10a, in each of which the sensing module 11 is arranged at the tip of a robot arm 12.
  • The number of sensing units 10a included in the system 1 may also be one, or three or more.
  • The first control unit 31A controls the sensing units 10a. Specifically, for example, the first control unit 31A controls the servomotors of the robot arm 12 so as to move the sensing module 11 relative to the object 100. More specifically, the first control unit 31A controls the robot arm 12 so that the sensing module 11 captures the motion of the object 100 as a regular motion, for example, constant-velocity linear motion, constant-acceleration motion, or the like.
  • The stage 61 on which the object 100 is arranged may transport the object 100 in a predetermined direction while repeating stopping and moving.
  • Even when the stage 61 repeats stopping and moving, the first control unit 31A controls the robot arm 12 and moves the sensing module 11 so that the sensing module 11 can capture the motion of the object 100 as a regular motion.
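  • A minimal sketch of this compensation in one dimension. The intermittent stage profile and the constant relative velocity V_REL are illustrative assumptions; the point is only that the module position is chosen so the object's position relative to the module advances linearly in time:

```python
# The stage 61 stops and moves intermittently; the robot arm 12 moves the
# sensing module 11 so that the object's motion relative to the module
# stays at constant velocity. All numbers below are made up.

def stage_position(t: float) -> float:
    """Intermittent transport: move 1.0 unit/s for 1 s, then pause 1 s."""
    cycles, phase = divmod(t, 2.0)
    return cycles * 1.0 + min(phase, 1.0)

V_REL = 0.5    # desired constant relative velocity of object past the sensor

def module_position(t: float) -> float:
    # Place the module so that (stage - module) advances linearly in time.
    return stage_position(t) - V_REL * t

for t in [0.0, 0.5, 1.0, 1.5, 2.0]:
    rel = stage_position(t) - module_position(t)
    print(f"t={t:.1f}s module at {module_position(t):+.2f}, relative {rel:.2f}")
```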
  • The control device 30 in Modification 4 carries out the operations of steps S301 to S304 in the same manner as the operations shown in FIG. 4.
  • Thus, the control device 30 causes the sensing module 11 attached to the robot arm 12 to sense the painted surface of the object 100. The control device 30 then detects an abnormality on the painted surface of the object 100 based on the acquired sensing data.
  • With this configuration, the position of the sensing module 11 can be precisely controlled. As a result, the accuracy of detecting an abnormality of the object 100 is improved.
  • Furthermore, the movement of the robot arm 12 is controlled so that, as seen from the sensing module 11, the movement of the object 100 can be regarded as a regular movement, for example, constant-velocity linear motion, constant-acceleration motion, or the like. As a result, an abnormality on the object 100 also moves regularly, so that erroneous detection can be suppressed and the abnormality detection accuracy can be improved.
  • In the embodiment above, the control device 30 includes the detection unit 31C, and the detection unit 31C detects an abnormality of the object 100 based on the sensing data.
  • However, the abnormality of the object 100 may instead be detected by the sensing unit 10.
  • In this case, the sensing unit 10 stores a trained model for detecting an abnormality on the painted surface of the object 100.
  • The trained model is trained to output abnormalities on the painted surface with respect to input sensing data.
  • As the training data, for example, a plurality of sensing data about objects are used as input data, and judgments about the abnormalities contained in that input data are used as correct output data.
  • When sensing data is acquired by the sensing module 11, the sensing unit 10 inputs the acquired sensing data to the trained model and obtains as output the judgment of the abnormality included in the sensing data. The sensing unit 10 transmits the sensing data and the judgment of the abnormality to the control device 30.
  • As the sensing module 11, a plurality of types of devices realized using electromagnetic waves in different bands may also be adopted.
  • In this case, the detection unit 31C of the control device 30 detects the abnormality of the object 100 based on the sensing data obtained using the electromagnetic waves of the respective bands of the sensing modules 11. This makes it possible to improve the accuracy of detecting abnormalities.
  • (Appendix 1) A program executed by a computer having a processor 31 and a memory.
  • In the program according to any one of (Appendix 1) to (Appendix 6), an abnormality is detected by using a trained model trained to take sensing data of an object as input and output the presence or absence of an abnormality.
  • (Appendix 14) A method performed by a computer having a processor and a memory, the method causing the processor to execute: a step (S302) of acquiring the three-dimensional shape of an object based on sensing data obtained by sensing the object using electromagnetic waves; a step (S303) of detecting an abnormality of the object based on the sensing data; a step (S304) of specifying the three-dimensional position of the detected abnormality; a step (S305) of moving a robot arm equipped with a device capable of eliminating the abnormality so that the device reaches the specified three-dimensional position; and a step (S202) of operating the device when the device reaches the three-dimensional position.
  • (Appendix 16) A method performed by a computer having a processor and a memory, the method causing the processor to execute a step of moving a sensing means relative to an object so that the motion of the object is regarded as a regular motion, and causing the sensing means to sense the object using electromagnetic waves.
  • A system comprising, among other means, a means for specifying the three-dimensional position of the detected abnormality.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

[Problem] To reduce the burden on workers during visual inspection. [Solution] A program executed by a computer comprising a processor and a memory. The program causes the processor to execute: a step of acquiring the three-dimensional shape of an object based on sensing data obtained by sensing the object using electromagnetic waves; a step of detecting an abnormality of the object based on the sensing data; a step of specifying the three-dimensional position of a detected abnormality; a step of moving a robot arm equipped with a device capable of eliminating the abnormality so that the device reaches the specified three-dimensional position; and a step of operating the device when the device reaches the three-dimensional position.
PCT/IB2021/059489 2020-08-25 2021-10-15 Program, method, and system WO2022043979A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-141403 2020-08-25
JP2020141403A JP2022037326A (ja) 2020-08-25 2020-08-25 Program, method, and system

Publications (1)

Publication Number Publication Date
WO2022043979A1 (fr)

Family

ID=80354772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/059489 WO2022043979A1 (fr) 2020-08-25 2021-10-15 Programme, procédé et système

Country Status (2)

Country Link
JP (2) JP2022037326A (fr)
WO (1) WO2022043979A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024034210A1 (fr) * 2022-08-09 2024-02-15 株式会社日立製作所 Measurement system and method, and component regeneration system and method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0666732A (ja) * 1992-08-19 1994-03-11 Mitsubishi Motors Corp Paint inspection and finishing method
JPH0829143A (ja) * 1994-07-13 1996-02-02 Mazda Motor Corp Surface condition inspection method and apparatus therefor
JP2002508071A (ja) * 1997-07-18 2002-03-12 アウディ アーゲー Method for automatically recognizing surface defects on body shells and apparatus for carrying out the method
JP2011163823A (ja) * 2010-02-05 2011-08-25 Aisin Seiki Co Ltd Object shape evaluation device
JP2017116404A (ja) * 2015-12-24 2017-06-29 ダイハツ工業株式会社 Shape recognition device and shape recognition method
JP2018044812A (ja) * 2016-09-13 2018-03-22 株式会社Vrc 3D scanner

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06241737A (ja) * 1992-12-25 1994-09-02 Toyota Central Res & Dev Lab Inc Cross-sectional area and volume measuring device
JP3678916B2 (ja) * 1998-06-09 2005-08-03 株式会社ミツトヨ Non-contact three-dimensional measuring method
JP2000009659A (ja) * 1998-06-19 2000-01-14 Fuji Photo Film Co Ltd Surface inspection method and apparatus
JP5323320B2 (ja) * 2006-07-19 2013-10-23 有限会社シマテック Surface inspection device
JP2013234951A (ja) * 2012-05-10 2013-11-21 Mitsutoyo Corp Three-dimensional measuring device
JP6217343B2 (ja) * 2013-11-26 2017-10-25 セントラル硝子株式会社 Curved plate shape inspection device
JP6520451B2 (ja) * 2015-06-19 2019-05-29 株式会社デンソー Appearance imaging device and appearance imaging method
JP6830386B2 (ja) * 2017-03-27 2021-02-17 株式会社ミツトヨ Measuring head
JP6845072B2 (ja) * 2017-04-21 2021-03-17 ファナック株式会社 Maintenance support device and maintenance support system for factory equipment
JP7228951B2 (ja) * 2017-12-05 2023-02-27 あいおいニッセイ同和損害保険株式会社 Loss price evaluation system
IT201800006253A1 (it) * 2018-06-12 2019-12-12 Method and system for locating points on a complex surface in space


Also Published As

Publication number Publication date
JP2022037856A (ja) 2022-03-09
JP2022037326A (ja) 2022-03-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860732

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21860732

Country of ref document: EP

Kind code of ref document: A1