WO2024034210A1 - Système et procédé de mesure, et système et procédé de régénération de composant - Google Patents


Info

Publication number
WO2024034210A1
Authority
WO
WIPO (PCT)
Prior art keywords
coating
measurement
component
area
image
Prior art date
Application number
PCT/JP2023/018231
Other languages
English (en)
Japanese (ja)
Inventor
好文 關口
雅徳 宮城
秀憲 町屋
恵理 高橋
俊介 森
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 (Hitachi, Ltd.)
Publication of WO2024034210A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring length, width, or thickness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/30 Administration of product recycling or disposal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • The present disclosure relates to a measurement system and method, and a parts remanufacturing system and method.
  • One type of recycling is a business in which products that have malfunctioned or have deteriorated over time are collected by manufacturers, and the parts included in the collected products are recycled and reused. To regenerate a component, it is necessary to identify the damaged area that caused the malfunction or the damaged area due to aging deterioration, and to repair the damaged area according to the degree of damage.
  • Although not related to identifying and repairing damaged areas of parts included in products, Patent Document 1, for example, discloses a technique for automatically inspecting corroded parts on the floor of a tank. According to the summary and FIG. 3 of Patent Document 1, the distance from the measuring section 10 to a corroded part C and the floor surface 4 around it is automatically measured (step S7). The depth of each corroded part C can be calculated as the difference in distance from the surrounding area, and when a corroded area C is detected as being deeper than its surroundings, its position relative to the main body 5 can be determined as the position of the measuring section 10 at the time of measurement.
  • Patent Document 2 discloses a technique for thermal-spray repair of concave damaged parts that occur on the oven wall of a coke oven carbonization chamber.
  • The present disclosure has been made in view of the above problems, and its purpose is to provide a measurement system and method that can determine whether a film needs to be removed and measure the shape of a part even when the surface of the part is covered with a film, as well as a parts remanufacturing system and method.
  • The measurement system of the present disclosure is, for example, a measurement system that measures the shape of a component whose surface is at least partially covered with a film. It includes an imaging unit that acquires image data by imaging the component; an image analysis unit that determines the state of the coating based on the image data, diagnoses whether the coating needs to be removed, and determines a measurement area; and a measurement unit that measures the shape of the component in the measurement area. When it is diagnosed that removal of the coating is necessary, at least a part of the area from which the coating has been removed is included in the measurement area.
  • The parts remanufacturing system of the present disclosure includes, for example, the measurement system; a repair diagnosis section that determines whether the part can be repaired, and the repair method, based on the measurement result of the shape of the part by the measurement system and pre-recorded design data of the part; and a repair section that repairs the component using the repair method when the repair diagnosis section determines that the component can be repaired.
  • The measurement method of the present disclosure is, for example, a measurement method that measures the shape of a component whose surface is at least partially covered with a film, and includes acquiring image data by imaging the component.
  • The component recycling method of the present disclosure includes, for example, after the measurement method, a repair diagnosis step of determining whether the component can be repaired, and the repair method, based on the measurement result of the shape of the component and pre-recorded design data of the component; and a repair step of repairing the component using the repair method when it is determined that the component can be repaired.
  • a measurement system and method and a parts recycling system and method, which can determine whether or not the film needs to be removed and measure the shape of the part even if the surface of the part is covered with a film. can.
  • FIG. 1 is a diagram showing the entire process of parts recycling. FIG. 2 is a flowchart of the entire process of parts recycling.
  • FIG. 3 is a block diagram of a parts recycling system. FIG. 4 is a diagram showing an example of a coating according to Embodiment 1.
  • FIG. 5 is a diagram illustrating an example of use of a parts recycling system according to a second embodiment.
  • FIG. 6 is a diagram illustrating an imaging process according to Embodiment 3.
  • FIG. 7 is a diagram illustrating an imaging process according to Embodiment 4.
  • FIG. 1 is a diagram showing the entire process of parts recycling. Specifically, FIG. 1 shows an example of the entire process from collecting malfunctioning products or products that have deteriorated over time to reusing parts included in the collected products.
  • The processes shown in FIG. 1 include a collection process 10 in which products are collected from users; a disassembly/cleaning process 20 in which the collected products are disassembled, the parts are taken out, and cleaned; a parts measurement process 30 in which the deterioration of the shape and surface condition of the parts is measured;
  • a repairability diagnosis step 40 in which, based on the measured results, the repairability of the component is diagnosed and, if repairable, a repair method is determined; a component repair step 50 in which repair is performed based on the determined repair method;
  • a parts inspection process 60 for inspecting whether the parts have been repaired as expected
  • an assembly process 70 for assembling a product using the parts that passed the inspection
  • a product inspection process 80 for inspecting the assembled product.
  • The process shown in FIG. 1 is an example, and the process is not limited thereto.
  • The present disclosure mainly relates to the parts measurement process 30, the repairability diagnosis process 40, and the parts repair process 50 among these processes.
  • Products that use recycled parts include automobiles, railways, aircraft, ships, industrial equipment such as motors, power infrastructure such as turbines, and construction machinery such as bulldozers. These products are composed of many large parts, and the total energy required for repair is often lower if the parts are repaired than by replacing them with new parts each time they deteriorate. Furthermore, by recycling resources, the environmental burden can be reduced. As the transition to a recycling-based society begins in earnest in the future, it will be important to improve the efficiency of the parts recycling process and recycle more parts in a shorter time.
  • A component measurement step 30 is performed prior to the repairability diagnosis step 40, in which repairability is diagnosed and a repair method is determined.
  • In this disclosure, a change in the shape of a component from its design value is referred to as damage, and the extent of the change in shape is defined as the degree of damage. Further, the area where damage is estimated to have occurred is called a damaged area.
  • FIG. 2 is a flowchart of the entire process of parts recycling.
  • FIG. 3 is a block diagram of the parts recycling system.
  • The component measurement step 30 shown in FIG. 2 is a measurement method for measuring the shape of a component whose surface is at least partially covered with a film, and includes an imaging step 31, an image analysis step 32, a film removal preparation step 33, a coating removal step 34, and a measurement step 35.
  • The film removal preparation process 33 and the film removal process 34 may be combined into one process.
  • The component repair process 50 includes a cleaning necessity diagnosis process 51, a cleaning process 52, a repair method diagnosis process 53, and a repair process 54.
  • The parts recycling system 1 shown in FIG. 3 includes an imaging section 2, an image analysis section 3, a surface processing section 4, a measurement section 5, a repair diagnosis section 6, and a repair section 7. These are connected to the network 100 via interfaces I/F (2a to 7a).
  • The measurement system according to this embodiment is the subsystem of the parts recycling system 1 that includes the imaging section 2, the image analysis section 3, the surface processing section 4, and the measurement section 5.
  • Each block (2 to 7) has in common an interface I/F (2a to 7a) for connecting to the network, a processor (CPU) (2b to 7b) for executing programs in that block, a memory (2c to 7c) for storing the programs to be executed, and a storage unit (2d to 7d) for storing the programs and data used by the processor.
  • These functions are usually implemented with multiple servers and computer terminals, but a single server may also be used, and some or all of the functions may be realized in a cloud environment (physically configured with one or more servers); various physical configurations are possible. This disclosure is not limited to any particular physical configuration implementing these functions. Further, some of the servers and computer terminals can be realized by a smartphone, a tablet, a personal computer (PC), a notebook PC, or the like.
  • The memories (2c to 7c) include a ROM (Read Only Memory), which is a nonvolatile storage element, and a RAM (Random Access Memory), which is a volatile storage element.
  • The ROM stores programs that do not change.
  • When an application is started, the RAM temporarily reads and stores the programs and data corresponding to that application from the storage device.
  • The storage units (2d to 7d) are large-capacity, non-volatile storage devices such as hard disks, flash memories such as SSDs (Solid State Drives), and optical storage devices.
  • Each storage unit stores the data and programs necessary for executing the programs in its block.
  • The data and programs required to execute the programs in each block are stored in each storage unit via a network, removable media (optical disks, flash memory, etc.), wireless communication, or the like; each block (including personal computers, etc.) has an interface that connects these to its storage unit. A mouse and keyboard interface may also be provided to give installation instructions and the like.
  • The component measurement process 30, the repairability diagnosis process 40, and the component repair process 50 shown in FIG. 2 will now be explained in detail using the block diagram shown in FIG. 3.
  • The imaging section 2 shown in FIG. 3 images the component and acquires image data.
  • As the imaging unit 2, for example, a camera mounted on a smartphone or a tablet can be used.
  • The captured image data is transmitted to the image analysis section 3 via the network 100.
  • Below, a smartphone will be described as an example of the imaging unit 2.
  • A smartphone is a computer that has imaging and communication functions, and its functions can be extended by installing applications (software).
  • Imaging conditions such as the direction in which the component is imaged, the distance to the component, and the illumination conditions are adjusted.
  • The imaging unit 2 may also acquire three-dimensional shape information using a LiDAR (Light Detection And Ranging) function, which has been installed in smartphones in recent years.
  • The imaging unit 2 may image the component under a plurality of imaging conditions. For example, there may be a set of images taken from the front and from an oblique direction, a set of images taken from multiple angles, or a set of images taken with frontal and with oblique illumination.
  • FIG. 4(b) is a diagram showing an example of a component whose surface is covered with a film with unevenness.
  • The coating region 150b on the surface of the component 150 may be imaged using a plurality of cameras 2e, under a plurality of illumination conditions, from a plurality of angles, or the like.
  • In this way, the image analysis unit 3 can more accurately determine the state of the coating, such as the unevenness of the coating region 150b, and can thereby improve the accuracy of estimating the damaged area 150c and of diagnosing whether the film needs to be removed.
  • The image analysis unit 3 estimates the coating area covered by the coating based on the image data and determines the state of the coating. Examples of the state of the coating include its color, thickness, unevenness (surface roughness), and area. The unit then diagnoses whether the film needs to be removed.
  • The image analysis unit 3 diagnoses whether the film needs to be removed and determines the measurement area. For example, the image analysis unit 3 estimates the damaged area and degree of damage based on the determined state of the coating, and determines the measurement area based on the degree of damage. This makes it possible to measure only the areas with the greatest degree of damage, shortening the working time compared to measuring the entire part.
  • The method of determining the measurement area is not limited to this. For example, if the degree of damage in an area not covered by the film is large, that area may be included in the measurement area, or the entire surface of the component may be set as the measurement area regardless of the degree of damage.
  • If the coating covers the measurement area and interferes with measurement, the image analysis unit 3 diagnoses that the coating needs to be removed, and the process proceeds to the coating removal preparation step 33. If the coating does not cover the measurement area, or if it does not interfere with measurement even where it covers the measurement area as shown in FIG. 4(c), the image analysis unit 3 diagnoses that the coating does not need to be removed, and the process proceeds to the measurement step 35. Further, the image analysis section 3 may diagnose whether the film needs to be removed based on a plurality of image data acquired by the imaging section 2 under a plurality of different imaging conditions.
  • The storage unit 3d of the image analysis unit 3 shown in FIG. 3 stores a surface analysis program that executes the image analysis step 32; when activation of the surface analysis program is instructed, the image analysis unit 3 reads the image data and the image analysis step 32 is started.
  • The image analysis unit 3 analyzes the surface condition of the component based on the image data, and estimates the coating area using the color, the brightness (brightness differences within the image, contrast), the difference between a mirror surface and a rough surface (difference in surface roughness, etc.), and the scattering properties of the surface. For example, all areas whose color or brightness differs from the background (the areas of the image data that are not the part) may be extracted, and an area whose color or brightness differs from the background by more than a certain threshold may be detected as a coating area.
  • For example, if the image data has RGB values of 256 gradations, an area where all of the R, G, and B values are below a certain value may be set as a coating area.
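The RGB thresholding described above can be sketched as follows; the threshold value and the toy image are hypothetical, and a real implementation would work on the captured image data:

```python
import numpy as np

def estimate_coating_mask(rgb: np.ndarray, threshold: int = 100) -> np.ndarray:
    """Return a boolean mask of pixels whose R, G, and B values are all
    below `threshold` (hypothetical value), treating dark regions as coating.

    rgb: H x W x 3 array of 8-bit (256-gradation) values.
    """
    return np.all(rgb < threshold, axis=-1)

# Toy image: a bright background/part region and a dark patch standing in
# for the coating region.
img = np.full((4, 4, 3), 200, dtype=np.uint8)
img[1:3, 1:3] = 40  # 2 x 2 dark patch
mask = estimate_coating_mask(img, threshold=100)
print(mask.sum())  # 4 pixels flagged as coating
```

In practice the threshold would be tuned per material and lighting condition, and separate thresholds per channel are equally possible.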
  • The image analysis unit 3 may also analyze image data captured from two directions, or from an oblique angle such that the shadow of a three-dimensional object appears, to determine the rough thickness and unevenness (surface roughness) of the coating.
  • FIG. 4(a) is a diagram showing an example of a component whose surface is covered with a flat film.
  • In FIG. 4(a), the component 150 is an iron component, and the coating in the coating region 150b is a paint film applied flatly to the surface of the component 150.
  • If the coating in the coating region 150b is in a clean state with no peeling or unevenness as shown in FIG. 4(a), the image analysis unit 3 can estimate that there is no area with a large degree of damage underneath, and may therefore determine that removal of the coating in the coating region 150b is unnecessary.
  • Likewise, if the coating in the coating region 150b is uniform and the shape of the component can be measured through the coating, it may be determined that removal of the coating is unnecessary. However, even when the coating in the coating region 150b is clean or uniform, if it is necessary to measure the component 150 with the coating removed, judging from the material data of the component 150 and of the coating, for example when the entire part is to be used as the measurement area or compared with a normal area, the image analysis unit 3 may diagnose that the coating in the coating region 150b needs to be removed.
  • FIG. 4(c) shows a case where there is a film such as a thin oxide film instead of a paint film.
  • If the film is sufficiently thin, as with a thin oxide film, interference fringes 160 may appear in the image data as shown in FIG. 4(c).
  • For example, the image analysis unit 3 may include AI (Artificial Intelligence) that has machine-learned the correlation between the interference fringes 160 and the coating state; using this, the image analysis unit 3 can determine from the image data that the film is a thin oxide film.
  • In this case, the image analysis section 3 may diagnose that removal of the film is unnecessary and proceed to the measurement step 35. Note that if the type of oxide film is known from the material of the component, the thickness of the oxide film may be determined accurately based on the principle of ellipsometry.
  • A worker may diagnose whether the film needs to be removed, but the judgment can also be made by AI that has machine-learned the correlation between image data and the state of the film. This removes the dependence on the person making the decision and allows diagnosis to always be made at the same level.
  • Next, the damaged area, where the shape of the part has changed, and the degree of damage are estimated.
  • First, estimation of the damaged area and degree of damage of the part underneath the coating will be explained.
  • Note that the method of estimating the damaged area and degree of damage in the area without coating is not limited to the one described here.
  • For this estimation, the parameters used to estimate the coating area and the parameters obtained by estimating the coating area are used.
  • Information regarding the coating, such as the parameters used to estimate the coating area, the parameters obtained by estimating the coating area, and image data of the coating, will be referred to as coating data.
  • Parameters used to estimate the coating area include color, brightness (brightness differences within the image, contrast), the difference between a mirror surface and a rough surface (difference in surface roughness, etc.), surface scattering, and the like.
  • Parameters obtained by estimating the coating area include the shape of the coating area, its area, the maximum thickness of the coating, and the like.
  • In one estimation method, threshold values are set for coating data such as the color and contrast of the coating region and its area.
  • These threshold values are determined in advance from the correlation between coating data actually measured or calculated in advance and the degree of damage. For example, if, in all of a number of measured samples whose degree of damage is large enough to require measurement, the R value of RGB in the coating area exceeds 128 gradations, then the R-value threshold may be set to 128.
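The threshold determination just described can be sketched as follows; the sample R values are hypothetical stand-ins for measured coating data:

```python
# Hypothetical R values (0-255) of coating areas in samples whose damage
# was judged large enough to require measurement.
damaged_r_values = [131, 140, 128, 155, 133]
# R values of samples judged not to require measurement.
undamaged_r_values = [90, 105, 117, 99]

# Choose the threshold so that every measurement-requiring sample meets or
# exceeds it: here, the minimum R value over the damaged samples.
threshold = min(damaged_r_values)
print(threshold)  # 128

# A new coating area is then flagged for measurement when its R value
# reaches the threshold.
flag = lambda r: r >= threshold
print(flag(130), flag(100))  # True False
```

With more samples, the same idea generalizes to several features (contrast, area, etc.) with one threshold per feature.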
  • The degree of damage may be estimated over the entire surface of the component, and an area where the degree of damage exceeds a predetermined threshold may be determined as a damaged area.
  • Alternatively, the degree of damage may be estimated only for an estimated damaged area. In this case, since the area for evaluating the degree of damage is limited to the damaged area, work efficiency can be improved.
  • The estimation method described above, which sets a threshold for the coating data, defines the color of the coating as a feature quantity, obtains a threshold for that feature quantity in advance, and estimates the degree of damage from the magnitude relationship with the threshold.
  • This estimation may be replaced by a prediction model based on machine learning. That is, the image analysis unit 3 may estimate the degree of change in the shape of the part from the image data as the degree of damage, using a prediction model built in advance from the correlation between the coating data and changes in the shape of the part underneath the coating.
  • In this case, the quantity related to the degree of damage is expressed as a numerical value such as a probability, rather than as the binary classification of whether measurement is necessary described later, so that detailed diagnosis is possible and diagnostic accuracy improves.
  • A prediction model based on machine learning is obtained by training a predetermined machine learning algorithm on the correlation between coating data and the degree of damage.
  • Image data of the coating may also be used for model training.
  • For example, a prediction model can be obtained by training on the correlation between image data of the coating and the degree of damage. In this case, both the damaged-area estimation and the damage-degree estimation can be automated at once.
  • Alternatively, a prediction model may be constructed by classifying coating image data determined in advance to require measurement and images that do not require measurement, and then learning from each class. In this case as well, both the damaged-area and damage-degree estimation can be automated at once.
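As a minimal sketch of such a prediction model: the features, labels, and the deliberately simple nearest-centroid classifier below are all hypothetical stand-ins for the unspecified machine-learning algorithm and real coating data:

```python
import numpy as np

# Synthetic coating-data features per sample: [mean R value, contrast].
features = np.array([
    [140.0, 0.8], [150.0, 0.9], [135.0, 0.7],   # damage requiring measurement
    [ 90.0, 0.2], [100.0, 0.3], [ 95.0, 0.25],  # no measurement required
])
labels = np.array([1, 1, 1, 0, 0, 0])  # 1 = measurement required

# "Training": store the mean feature vector (centroid) of each class.
centroids = {c: features[labels == c].mean(axis=0) for c in (0, 1)}

def predict(x):
    """Classify a coating-data vector by its nearest class centroid."""
    x = np.asarray(x, dtype=float)
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(predict([145.0, 0.85]))  # 1: estimated to need measurement
print(predict([92.0, 0.2]))    # 0
```

A production system would replace this with a model trained on real coating images and measured damage degrees, outputting a probability rather than a hard class.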
  • The image analysis unit 3 may have a function of receiving input of sets of damage-degree data and coating data in order to construct or update the prediction model or prediction method used for estimating the degree of damage.
  • The damaged area may exist not only in the area of the component surface covered by the film, but also in the area not covered by the film, or in both.
  • The image analysis unit 3 can check whether there is clearly irreparable damage in areas without coating. If such an area is diagnosed as clearly unrepairable, for example because the damage is large, the process proceeds to the discontinuation step 200, and the product is reused as raw material (material recycling) or, if severely damaged, discarded.
  • In areas without coating as well, the damaged area and degree of damage are estimated from the surface condition of the component based on image data.
  • The correlation between the image data and the degree of damage may be obtained in advance for areas without coating, and the damaged area and degree of damage may then be estimated from that correlation.
  • The degree of damage correlates with the brightness in the image data, the difference in brightness from the surroundings (contrast), the difference between a mirror surface and a rough surface (difference in surface roughness, difference in surface scattering), and the like.
  • The image analysis unit 3 classifies each damaged area according to whether or not it is a highly damaged area. Alternatively, a plurality of ranks may be provided depending on the degree of damage, and the estimated degree of damage may be classified into a rank. Examples include a rank in which no damage is considered to have occurred, a rank in which damage has occurred but does not require repair, a rank in which repair is required, and a rank in which repair is not possible.
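The rank classification above could be sketched as follows; the normalized damage scale and the rank boundaries are hypothetical, chosen only to illustrate the mapping:

```python
# Hypothetical rank boundaries on a normalized damage degree in [0, 1].
RANKS = [
    (0.1, "no damage"),
    (0.4, "damage, no repair required"),
    (0.8, "repair required"),
    (1.0, "repair not possible"),
]

def classify_damage(degree: float) -> str:
    """Map an estimated damage degree to the first rank whose upper
    boundary it does not exceed."""
    for upper, label in RANKS:
        if degree <= upper:
            return label
    return RANKS[-1][1]

print(classify_damage(0.05))  # no damage
print(classify_damage(0.55))  # repair required
```

The boundaries would in practice be calibrated against the same measured-sample correlations used for the thresholds above.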
  • The image analysis unit 3 determines the measurement area based on the estimated degree of damage. For example, the measurement area may be determined so as to include the area estimated to have the greatest degree of damage among the estimated damaged areas. Alternatively, when the degree of damage is classified into ranks, the measurement area may be determined so as to include the regions classified into a specific rank. Accurately estimating and measuring areas with a large degree of damage improves the accuracy of determining whether repair is possible; in that determination, it is equally important to confirm that no area with a large degree of damage has been missed. The measurement area may be determined automatically by the image analysis unit 3, but if the degree of damage is quantified, it may instead be determined by the operator based on the numerical value. The condition of the damaged area may also be checked or decided by the operator.
  • If the coating interferes with measurement, the image analysis unit 3 diagnoses that the coating needs to be removed, and the process proceeds to the coating removal preparation step 33. If the coating does not interfere with measurement, the image analysis unit 3 diagnoses that removal is unnecessary, and the process proceeds to the measurement step 35. Whether the film needs to be removed may also be diagnosed based on the degree of damage: for example, a region estimated to have a small degree of damage may be diagnosed as not requiring removal, while if the coating impedes measurement in a region estimated to have a large degree of damage, the image analysis unit 3 diagnoses that it needs to be removed.
  • The degree of damage of the part underneath the coating may also be estimated based on the measurement results. Note that when the measurement area is determined without reference to the degree of damage, estimation of the degree of damage and the damaged area may be omitted.
  • Next, a film removal step 34 is performed in which the film is removed, using the film region diagnosed as requiring removal as the film removal region.
  • In the film removal preparation step 33, the processing conditions for the surface processing section 4 when removing the film are determined based on at least the state of the film.
  • The film removal preparation step 33 may be performed by the imaging section 2, the image analysis section 3, the processor of the surface processing section 4, or the like, or by several of them in cooperation.
  • The conditions for processing the surface are determined from the information obtained from the image data, that is, information such as the state of the coating, the coating data, the damaged area, the degree of damage, and the measurement area. The information required for the processing conditions depends on the processing equipment.
  • When the processing device 4e of the surface processing section 4 shown in FIG. 3 is a polishing/grinding machine such as a grinder, the thickness of the coating to be removed is estimated from the degree of damage in the measurement area in the coating removal preparation step 33, and the processing device 4e polishes and grinds the film to the estimated thickness to remove it. Here, the film thickness and depth are estimated by the image analysis section 3 and transferred to the surface processing section 4.
  • However, the configuration is not limited to this; the estimation may be performed by the processor of the surface processing section 4, the measuring section 5, or the repair diagnosis section 6, or by another processor not mentioned here. The image analysis section 3 and the repair diagnosis section 6 may also share a common processor; various physical configurations are possible.
  • The surface processing section 4 may perform the processing automatically based on the transferred values, or the processing may be performed by an operator based on the estimated coating thickness. Note that the parameters to be input to operate the processing device are stored in one of the storage sections of the image analysis section 3, the surface processing section 4, or the repair diagnosis section 6.
  • When the processing device 4e is a laser, the processing conditions are laser irradiation conditions, which are determined also based on the material information of the component; the laser irradiation conditions therefore need to be determined in the film removal preparation step 33.
  • The most important parameters among these are the quantities related to the energy density [J/m^2]: energy [J], peak power [W], peak power density [W/m^2], pulse width [s], average power [W], wavelength [nm], and so on.
  • The energy density is changed by changing any of these parameters.
  • The coating cannot be removed unless the energy density exceeds a certain level; on the other hand, if it exceeds a certain level, not only the coating but also the part itself will be damaged. The energy density therefore needs to be set to an appropriate value depending on the state of the coating and the material of the part.
  • The relationship between the part material and the maximum allowable energy density may be determined in advance from experiments as part-material information.
  • The lower limit of the energy density (or of a quantity related to the energy density) may likewise be determined through experiments, or may be determined using a database stored in a storage unit such as that of the image analysis unit 3 or the surface processing unit 4. Once the lower limit of the energy density is determined, it is sufficient to set the energy density slightly larger than the lower limit.
  • Alternatively, the energy density may be set to half of the maximum energy density determined from the material information of the part.
  • for metals, the power density that causes damage is often sufficiently higher than that needed to remove a corroded film such as rust, which is one type of coating. Therefore, when the material is metal, the margin for setting the power density is large, and the energy density can be chosen from a wide range as described above.
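The relationships above (energy density from pulse energy and spot area, a lower removal threshold, and an upper damage limit from the part material) can be expressed as a minimal sketch. All function names, the "half of the maximum" rule, and the numeric values are illustrative assumptions, not values from this document.

```python
# Sketch of the energy-density bounds described above. Names and numbers
# are illustrative assumptions, not values specified in the patent.

def energy_density(pulse_energy_j: float, spot_area_m2: float) -> float:
    """Energy density [J/m^2] = pulse energy [J] / irradiated spot area [m^2]."""
    return pulse_energy_j / spot_area_m2

def choose_energy_density(lower_j_m2: float, upper_j_m2: float) -> float:
    """Pick a setpoint between the coating-removal threshold and the
    material-damage limit, e.g. half of the maximum as suggested in the text."""
    if lower_j_m2 >= upper_j_m2:
        raise ValueError("no safe processing window for this material/coating")
    candidate = upper_j_m2 / 2           # half of the material's damage limit
    return max(candidate, lower_j_m2)    # but never below the removal threshold

fluence = energy_density(2e-3, 1e-8)     # 2 mJ into a 100 um x 100 um spot
print(fluence)                           # 200000.0 [J/m^2]
print(choose_energy_density(5e4, 5e5))   # 250000.0 [J/m^2]
```

In this sketch, changing any one of pulse energy or spot area changes the energy density, mirroring the statement that the energy density is adjusted by changing any of the related parameters.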
  • if the laser irradiation conditions can be determined based on the material information of the component, or on information analyzed from the image data in the image analysis section 3, they can be determined each time the system is used. This eliminates the need for the operator to consider and decide on the conditions, and enables automated coating removal using a laser.
  • it is desirable that these laser conditions be displayed on a monitor that controls the processing device 4e, or on a smartphone or the like running a film-removal app, or that they be recorded in a file or the like. This makes it possible to suppress machining errors and to check the machining conditions at a later date.
  • the surface processing section 4 removes the coating according to the coating removal area and processing conditions.
  • All processes may be automatic processes that do not involve humans, but if humans are involved, for example, an app or the like can be used to support film removal to reduce work errors and improve work efficiency.
  • an app that supports coating removal is installed on a smartphone or computer, and the app supports coating removal by showing the area to be removed along with the captured image, and also showing the estimated degree of damage.
  • the screen of the smartphone becomes the display section. This improves the efficiency of human film removal.
  • if the processing conditions and the like are also indicated, it is possible to prevent parts from being accidentally damaged when removing the coating.
  • the application may be operated not only on the smartphone but also on the CPU 4b of the surface processing section 4, or on another CPU; it is sufficient if it can operate when the film is being removed.
  • the measurement unit 5 measures the shape of the component in the measurement area. Note that if a coating existing in the measurement area obstructs measurement, it is diagnosed in the image analysis step 32 that the coating needs to be removed; in that case, the coating present in the measurement area is removed in the coating removal step 34. Therefore, if it is diagnosed that the coating needs to be removed, the measurement area in the measurement step 35 includes at least a portion of the area from which the coating has been removed. In addition, when the image analysis step 32 diagnoses that a region estimated to have a large degree of damage requires film removal, that region is included in the measurement area; since its coating is removed in the coating removal step 34, the measurement area in the measurement step 35 includes the region, estimated to have a large degree of damage, from which the coating was removed.
  • an operator may use calipers or the like, or three-dimensional measurement may be performed using a laser displacement meter, a laser profile measuring device, etc.
  • three-dimensional measurement is preferable to improve the accuracy of determining whether or not it can be repaired.
  • three-dimensional measurement is desirable because the resulting shape data contains a large amount of information, which improves the accuracy of the repair-method diagnosis.
  • when the coating is only partially removed, it is sufficient to measure the area where the coating was removed. Furthermore, a region estimated to have a large degree of damage may be set as a first measurement region, and a region estimated to have a small degree of damage may be set as a second measurement region. In this case, it is desirable that the degree of damage in the second measurement area is so small that the area can be considered normal, with no damage.
  • the measurement unit measures the relative shape of the component in the first measurement region by measuring the shape of the component in each of the first measurement region and the second measurement region. Such relative measurements are highly accurate.
  • since the degree of damage in the second measurement area is small, absolute dimensions can be measured there with high precision. If the coating is removed in the second measurement area as well, the relative shape of the part in the first measurement area can be measured with higher precision; as a result, the absolute shape of the component in the first measurement area can also be measured with higher precision. Further, in the case of a very thin coating as shown in FIG. 4C, for example, the second measurement area may be measured without removing the coating, since the coating does not interfere with measurement. Alternatively, an area where the degree of damage is small and where there is no coating to begin with may be used as the second measurement area.
  • the work time can be shortened by removing and measuring only the portion of the film estimated to be severely damaged, rather than removing the entire film. Furthermore, by partially removing the coating and measuring the relative shape between the removed area and a less damaged area, it is possible both to reduce work time and to improve measurement accuracy.
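The relative measurement between the first (damaged) and second (near-normal) measurement regions can be sketched as referencing the damaged region's profile to the mean level of the reference region, cancelling offsets common to both. The function name and the height values are hypothetical illustrations, not data from this document.

```python
# Sketch: relative measurement between a damaged region and a low-damage
# reference region, as described above. Heights are hypothetical sample data.

def relative_profile(damaged_heights, reference_heights):
    """Express the damaged region's profile relative to the reference
    region's mean level, cancelling common offsets (fixturing, drift)."""
    ref_level = sum(reference_heights) / len(reference_heights)
    return [h - ref_level for h in damaged_heights]

reference = [10.00, 10.01, 9.99]   # near-nominal surface heights [mm]
damaged   = [9.70, 9.65, 9.72]     # corroded surface heights [mm]
print(relative_profile(damaged, reference))  # negative values = material loss
```

Because both regions are measured in the same setup, errors common to the two regions drop out of the difference, which is why such relative measurements are described as highly accurate.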
  • the repair diagnosis section 6 uses the shape data (measurement result of the part shape) obtained in the measurement step 35 and pre-recorded design data of the part, such as 3D CAD data, to determine whether the part can be repaired. Specifically, it inspects whether the shape of the part has changed significantly from the design data, and based on the inspection results determines whether the part can be repaired.
  • the storage unit 6d stores various data necessary for diagnosis. Furthermore, the storage section 6d of the repair diagnosis section 6 may share a part of the storage section 3d of the image analysis section, and the data used for image analysis may be shared.
  • 3D CAD, design drawings, etc. do not necessarily need to be stored in the storage unit 6d, and may be read into the memory 6c from another server via the network 100.
  • the damaged area can be evaluated not only by shape differences but also by performing material component analysis using laser-induced breakdown spectroscopy (LIBS) to confirm that the components fall within the material specifications.
  • the repair diagnosis section 6 diagnoses whether or not repair is possible based on these inspections. If the part can be repaired, the process proceeds to a parts repair step 50. If repair is not possible, the process proceeds to a discontinuation step 200, where the product is reused as a raw material (material recycling) or, if the damage is severe, it is discarded.
  • the component repair process 50 shown in FIG. 2 will be explained.
  • the repair diagnosis section 6 diagnoses the necessity of cleaning, and if necessary, proceeds to a cleaning step 52.
  • cleaning refers to the surface processing section 4 removing all of the coating left on the surface of the part, in cases where it was determined in the image analysis step 32 that the film did not need to be removed, or where the film covering the surface of the component was only partially removed in the film removal step 34.
  • the diagnosis may be made based on data that has already been acquired, such as the image data described above, coating data obtained through image analysis, the coating area, the damaged area, the degree of damage, the measured surface shape, and the inspection data used for repair diagnosis.
  • the reprocessing conditions used by the surface processing section 4 when removing the film are determined based on at least the state of the film.
  • the reprocessing conditions may be the same laser irradiation conditions as the processing conditions.
  • the type of processing device and processing conditions (reprocessing conditions) for removing the film in the cleaning step 52 may be the same as the type of processing device and processing conditions used in the film removal step 34 before measurement.
  • the repair method diagnosis step 53 shown in FIG. 2 will be explained. Similar to the repairability diagnosis step 40, the repair diagnosis unit 6 determines a method for repairing the component based on the measurement result of the shape of the component and pre-recorded design data of the component. Repair methods include thermal spraying, overlaying, cutting, remelting, and heat treatment. Sometimes one process is performed and sometimes several; the selection of processes, the setting of process conditions, the order of processes, and so on are determined in the repair method diagnosis step 53. The repair method may be diagnosed based on data already obtained, such as the image data mentioned above, coating data obtained through image analysis, the coating area, the damaged area, the degree of damage, the measured surface shape, and the inspection data used for repair diagnosis. Particularly when performing repairs using a laser, utilizing the image data and optical characteristics obtained from it, such as color and brightness, makes it easier to set the laser irradiation conditions. Once the repair method is determined, the process proceeds to repair step 54.
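One way the selection among processes such as overlaying and cutting could work is a simple mapping from the measured-versus-design deviation, e.g. missing material suggests build-up followed by machining, while excess material suggests machining only. This mapping, the function name, and the tolerance are purely illustrative assumptions; the document does not specify the actual diagnosis logic.

```python
# Sketch: choosing candidate repair processes from the deviation between the
# measured shape and the design data. The mapping and tolerance below are
# illustrative assumptions, not the patent's actual diagnosis logic.

def diagnose_repair(deviation_mm: float, tolerance_mm: float = 0.05) -> list:
    """Return an ordered list of candidate repair processes."""
    if abs(deviation_mm) <= tolerance_mm:
        return []                          # within tolerance: no repair needed
    if deviation_mm < 0:                   # material missing (corrosion, wear)
        return ["overlaying", "cutting"]   # build up, then machine to shape
    return ["cutting"]                     # excess material: machine it away

print(diagnose_repair(-0.4))   # ['overlaying', 'cutting']
print(diagnose_repair(0.02))   # []
```

A real diagnosis would also set process conditions and ordering, and could draw on the image data and optical characteristics mentioned above.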
  • the repair section 7 repairs the component using the repair method determined earlier. Specifically, the repair section 7 repairs the component using a repair device 7e.
  • although the cleaning necessity diagnosis step 51 and the repair method diagnosis step 53 are separated in FIG. 2, the present invention is not limited thereto; the process may proceed to a repair that includes cleaning.
  • the order from the cleaning necessity diagnosis step 51 to the repair step 54 has some latitude, and various orders are possible.
  • This component inspection is an inspection to determine whether the repaired component has the characteristics and dimensions assumed at the time of diagnosis in the repair method diagnosis step 53, and whether there is no problem in assembling it as part of the product.
  • if the inspection finds a problem, the process returns to the repair method diagnosis step 53 and the repair is redone. In some cases, the process may return to the cleaning necessity diagnosis step 51.
  • an assembly step 70 is performed to assemble the parts, and a product inspection step 80 checks whether there are any problems with the product. If there are no problems in the product inspection step 80, the process proceeds to a shipping step 90. Note that if the product cannot be shipped after the product inspection step 80, it is reassembled, with consideration given to using new parts.
  • FIG. 5 is a diagram showing an example of use of the parts recycling system according to the second embodiment. A case where there is rust corrosion as a coating on iron parts will be explained using FIG. 5. Note that descriptions of the same configurations as in Embodiment 1 will be omitted.
  • FIG. 5(a) is a diagram showing an example of the imaging process 31.
  • the iron part 150 has a normal region 150a in which the part is not covered with rust and a coating region 150b in which the part is covered with rust. Note that the degree of damage to the normal region 150a is assumed to be small.
  • An example in which the imaging unit 2 is a smartphone and the imaging step 31 is performed using an imaging application will be described. Note that in this embodiment, it is assumed that the screen of the smartphone functions as a display unit.
  • the worker starts an imaging application on a smartphone. Following the instructions of the app displayed on the smartphone screen, the worker images the component 150 using the smartphone's camera function. If appropriate imaging has been performed, the app displays OK on the smartphone screen, saves the image data of the image 101, and transmits it from the wireless interface 2a to the image analysis unit 3 via the network 100. If the imaging is inappropriate, the app instructs the worker to retake the image.
  • FIG. 5(b) shows an example in which the damaged area 150c estimated by the image analysis unit 3 is displayed in the image 101.
  • the display section displays the damaged area on the image captured by the imaging section 2, as shown in FIG. 5(b).
  • FIG. 5B shows an example in which there are three damaged regions 150c, and the degree of damage to the component due to rust corrosion is numerically indicated next to each damaged region 150c.
  • the operator may set the measurement area based on the numerical value of the degree of damage, or a threshold value for the numerical value of the degree of damage may be set in advance, and an area where the numerical value of the degree of damage is larger than the threshold value may be set as the measurement area.
  • the measurement area may be displayed after automatically determining the measurement area without displaying the damage area, or both the damage area and the measurement area may be displayed.
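The threshold-based determination of measurement areas described above can be sketched as a simple filter over the estimated damage degrees. The region labels, degree values, and threshold are illustrative (the 1.2 and 1.0 values echo the example in FIG. 5(b)); the function name is an assumption.

```python
# Sketch: selecting measurement areas from estimated damage degrees using a
# preset threshold, as described above. Labels and values are illustrative.

def select_measurement_areas(damage_by_region: dict, threshold: float) -> list:
    """Regions whose damage degree exceeds the threshold become measurement
    areas (and candidates for coating removal)."""
    return [name for name, degree in damage_by_region.items() if degree > threshold]

estimated = {"150c-1": 1.2, "150c-2": 1.0, "150c-3": 0.3}
print(select_measurement_areas(estimated, threshold=0.5))  # ['150c-1', '150c-2']
```

Alternatively, as the text notes, the operator may pick areas manually from the displayed numerical values instead of applying a fixed threshold.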
  • the process may also proceed to the film removal preparation step 33 or the measurement step 35 without showing such a display.
  • displaying the damaged area and the measurement area is preferable, because the operator can visually confirm the relevant part of the component; the machine and the person then double-check the area, which prevents film removal or measurement at the wrong location.
  • the image analysis unit 3 diagnoses that at least a portion of the component 150 does not require removal of the coating, and also diagnoses that removal of the coating is necessary for the areas estimated to have a large degree of damage.
  • a case will be described in which the image analysis unit 3 determines that the film needs to be removed only from the parts of the damaged areas 150c whose damage degrees are displayed as 1.2 and 1.0 in FIG. 5(b), and that removal of the coating is unnecessary for the other coating regions 150b.
  • the image analysis unit 3 sets the two damaged areas 150c diagnosed as requiring coating removal as the measurement areas 150d.
  • FIG. 5(c) is a diagram showing an example of the state of the film removal step 34.
  • the processing device 4e according to the second embodiment is a laser removal device.
  • the coating in the measurement region 150d, which is a region diagnosed as requiring coating removal, is removed by the processing device 4e.
  • based on the material information indicating that the part is made of iron, the output of the laser beam 4eRay irradiated by the processing device 4e is set sufficiently lower than the power that would cause ablation and damage to iron.
  • the output of the laser beam 4eRay is adjusted based on the state of the coating estimated from the color of the rust.
  • FIG. 5(d) is a diagram showing an example of the measurement process 35.
  • the measuring device 5e according to the second embodiment is a laser shape measuring device that performs three-dimensional measurement by irradiating the surface of a component with laser light 5eRay.
  • by measuring the normal region 150a, the dimensional difference from the design value can be obtained with high accuracy.
  • by measuring the measurement region 150d within the coating region 150b together with the normal region 150a, the relative difference between the simultaneously measured regions can be seen with high precision. This has the effect that the degree of damage can be estimated with high accuracy.
  • FIG. 5(e) is a diagram showing an example of the cleaning process 52.
  • the surface processing section 4 removes the coating that the image analysis section 3 diagnosed as not requiring removal, once the repair diagnosis section 6 has determined that the component can be repaired.
  • the coating region 150b present on the surface of the component 150 in FIG. 5(d) is removed.
  • the coating region 150b is removed under the same laser irradiation conditions using the laser removal device 4e used in the coating removal step 34 shown in FIG. 5C.
  • FIG. 5(f) is a diagram showing an example of the repair process 54.
  • the repair device 7e of the repair section 7 repairs parts by welding, overlaying, etc. using laser light 7eRay.
  • the conditions for the laser beam 7eRay can be determined, for example, based on the image 101 and the results obtained therefrom. Alternatively, the determination may be made based on machine learning data on the cloud.
  • FIG. 6 is a diagram illustrating an imaging process according to the third embodiment, which supports capturing images under appropriate conditions in order to accurately determine the state of the coating, estimate the degree of damage, and so on.
  • the imaging unit 2 has a function of supporting imaging so that an image with which the state of the coating can be easily determined is captured under appropriate imaging conditions.
  • the imaging unit 2 is a computer, a smartphone, etc.
  • a function to support imaging may be provided as an application. That is, the measurement system or the parts recycling system may include a computer or a smartphone installed with an application that supports imaging.
  • a specific example will be described using FIG. 6; note that descriptions of configurations similar to those in Embodiment 1 or 2 will be omitted.
  • the imaging unit 2 supports imaging in an interactive (bidirectional) manner.
  • for example, it is conceivable that the imaging unit 2 sends the image data obtained by imaging to the image analysis unit 3 and that, when the image analysis unit 3 determines that the image data is not good, an instruction is issued to perform re-imaging under appropriate conditions.
  • the imaging unit 2 has a display unit and instructions are given through the display on the display unit, but the invention is not limited to this, and instructions may be given by voice or the like.
  • the quality of the image data is determined based on whether the information contained in the image data is sufficient for determining the state of the coating, estimating the degree of damage, estimating the damaged area, etc. For example, if the thickness of the film cannot be determined from the image data or the color of the film cannot be determined because the lighting environment is too dark, it is determined that the image data is not good.
  • the display unit displays the image captured by the imaging unit 2, and also displays instructions regarding the quality of the image data and re-imaging.
  • Examples of display of quality of image data include display of graded evaluation such as Excellent, Good, and Poor.
  • instructions regarding re-imaging may simply prompt the user to retake the image, such as "Please take the image again," may give concrete directions such as "Please enlarge the image" or "Please take the image from multiple directions," or may show advice on what information is missing.
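A graded evaluation of the kind described (Excellent, Good, Poor, plus a re-imaging instruction when the lighting is too dark to judge the film) can be sketched with a single brightness criterion. The criterion, thresholds, and message wording are illustrative assumptions; a real system would examine far more than mean brightness.

```python
# Sketch: graded image-quality evaluation of the kind the display might show.
# The brightness-only criterion and all thresholds are illustrative assumptions.

def grade_image(pixels) -> str:
    """Grade an image from its pixel brightness values (0-255)."""
    mean = sum(pixels) / len(pixels)
    if mean < 40 or mean > 230:
        # Too dark or blown out: the film state cannot be determined.
        return "Poor: please adjust the lighting and take the image again"
    if 80 <= mean <= 180:
        return "Excellent"
    return "Good"

print(grade_image([20] * 100))    # too dark -> re-imaging instruction
print(grade_image([120] * 100))   # Excellent
```

Returning an instruction string alongside the grade mirrors the display behavior described above, where the quality evaluation and the re-imaging advice are shown together.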
  • when the imaging unit 2 supports imaging in this way, image data with sufficient information is acquired.
  • the accuracy of determining the state of the coating based on the acquired image data improves.
  • the accuracy of estimating the degree of damage and measurement area based on the state of the coating is improved.
  • when image data from which the state of the film cannot be determined is obtained, the image is retaken; this suppresses unnecessary film removal or repair based on such image data and improves work efficiency.
  • FIG. 6 is a diagram illustrating the imaging step 31 according to the third embodiment.
  • in Embodiment 3, a specific example will be described using FIG. 6 in which the parts recycling system and a person interact to image a part through an application installed in the imaging unit 2.
  • FIG. 6(a) is a diagram showing an example of the imaging process 31.
  • FIG. 6A shows a state in which a component 150 having a normal region 150a and a coating region 150b is imaged by the camera 2e of the imaging section 2.
  • FIG. 6(b) is a diagram showing an example of a display on the display unit.
  • the imaging unit 2, which is a smartphone, has a display unit, but the present invention is not limited to this.
  • the display unit displays an image 170a captured by the imaging unit 2, together with text 170b asking whether to transfer the image 170a to the image analysis unit 3 and a button 170c for responding to the text 170b.
  • if Yes on the button 170c is pressed, the image data of the captured image 170a is transferred to the image analysis section 3; if No is pressed, the image is captured again. As described above, the operator is asked to confirm the transfer before the image data is sent to the image analysis section 3, but this confirmation may be omitted.
  • FIG. 6(c) shows the state after the image 170a has been transferred in FIG. 6(b): the image analysis unit 3 has determined that the information included in the image data is insufficient to estimate the damaged area, and characters 170d and an arrow 170e are displayed on the display section to instruct the operator to image the component from a different direction.
  • FIG. 6(d) shows the camera 2e of the imaging unit 2 capturing an image of the component 150 from the direction of the arrow 170e shown in FIG. 6(c).
  • FIG. 6(e) is a diagram showing an example of a display on the display unit.
  • An image 170f shown in FIG. 6(e) is an image captured by the camera 2e of the imaging unit 2.
  • the imaging unit 2 transfers the image data of the image 170f to the image analysis unit 3, and the image analysis unit 3 estimates the damaged area 150c and the degree of damage in each damaged area 150c.
  • the display section displays the damaged areas 150c estimated by the image analysis section 3 on the image 170f, and also displays the degree of damage of each damaged area 150c in numerical values.
  • characters 170g are displayed on the display asking whether the image has been captured accurately and whether the worker has visually confirmed any other damaged areas, so that the worker can check whether there is any problem with the image data.
  • a button 170h for responding to the characters 170g is displayed. If "Yes" on the button 170h is pressed, imaging is performed again; if "No" is pressed, the process proceeds to the diagnosis of whether the film needs to be removed and to the determination of the measurement area.
  • FIG. 6(f) shows one of the damaged areas 170i being enlarged and imaged according to instructions on the display unit.
  • the display section displays, in real time, the image seen by the camera 2e of the imaging section 2 rather than an already captured image. Furthermore, the position to be imaged is shown on a sub-screen 170h at the upper left of the display section. Note that, as an example, the sub-screen 170h shows, as a dotted circle, the position to be enlarged and imaged on the image previously captured by the imaging unit 2.
  • characters 170j are displayed to indicate whether the image is correct. The operator checks the characters 170j and, if there is no problem, presses the image capture button 170k to capture the image.
  • although enlarged imaging is not necessarily required, it may be performed by default in order to improve the accuracy of diagnosing whether the coating needs to be removed.
  • enlarged imaging may be performed when the image analysis unit 3 determines that the diagnostic probability (diagnosis reliability) of whether or not the film needs to be removed is low.
  • the diagnostic probability of whether or not the film needs to be removed may be displayed (not shown), and the operator may make a decision based on the displayed diagnostic probability and perform enlarged imaging.
  • enlarged imaging makes it easier to determine the state of the film and increases the accuracy of the diagnosis of whether the film needs to be removed.
  • the imaging direction is instructed by the display section of the imaging section 2, but the invention is not limited to this, and the lighting direction or lighting environment may be instructed.
  • by creating a flowchart including the operations shown in FIGS. 6(a) to 6(f) and operating the measurement system according to the created flowchart, full automation including re-imaging is also possible.
  • a flowchart may be created from some or all of the components, such as imaging, image confirmation, imaging from another direction, estimation of the damaged area and degree of damage, and enlarged imaging of the damaged area; where appropriate, a branch may be added so that, in some cases, the enlarged imaging is skipped.
  • smart glasses may also be used.
  • smart glasses are glasses-shaped devices equipped with a camera and an image display section; when worn, they display text, icons, images, and the like using AR (Augmented Reality) technology as if these were right in front of the wearer's eyes.
  • a worker can intuitively adjust the imaging position, which has the effect of improving work efficiency.
  • workers can work with both hands free and can work while checking their surroundings, which also improves safety.
  • FIG. 7 is a diagram illustrating an imaging process according to the fourth embodiment.
  • the imaging unit 2 images the reference and the component simultaneously.
  • the reference represents a standard such as color or brightness, and can be exemplified by a color sample. Examples of the reference include cards on which R (red), G (green), B (blue), white, and black are printed.
  • an imaging step 31 for simultaneously imaging a reference and a component will be specifically described using FIG. 7.
  • in FIG. 7, reference numeral 180 denotes the reference. Note that descriptions of the same configurations as those in Embodiments 1 to 3 will be omitted.
  • FIG. 7(a) is a diagram showing an example of the imaging step 31 in which the reference 180 is placed on the surface of the component 150 and an image is taken.
  • although the location of the reference 180 is not limited to this, placing the reference 180 near the coating region 150b, which is the target of the diagnosis of whether the coating needs to be removed, allows the illumination environment of the diagnosis target to be corrected with high accuracy.
  • a method of correcting an image captured simultaneously of the reference 180 and the component 150 will be described.
  • first, a color conversion matrix is calculated that maps the reference colors in the image in which the reference 180 and the component 150 were captured simultaneously onto the standard reference colors (vectors (R, G, B, I) of RGB values and brightness I set as the standard in advance).
  • correction is performed by converting the entire captured image using the calculated color conversion matrix.
  • when building the correlation data as well, the reference 180 is placed and the part is imaged. The correction is then performed in the same manner as before, and the correlation between the image data of the corrected image and the degree of damage is obtained. This makes it possible to obtain a correlation between image data and the degree of damage while suppressing the effects of differences in lighting environments, in particular differences in the color of the illumination light.
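The correction described above (fit a color conversion matrix from the reference patches captured with the part, then apply it to the whole image) can be sketched with an ordinary least-squares fit. The patch values are hypothetical, and the document does not specify how the matrix is computed; least squares is one common choice.

```python
# Sketch: computing a 3x3 color conversion matrix from reference patches
# captured together with the part, then applying it to the whole image.
# Patch values are hypothetical; the solver choice is an assumption.
import numpy as np

# Standard colors of the reference patches (R, G, B), e.g. the card in FIG. 7(d)
standard = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]], float)
# The same patches as they appear in the captured image (tinted by the lighting)
captured = np.array([[200, 10, 10], [10, 210, 12], [8, 12, 190], [210, 220, 205]], float)

# Least-squares solve for M such that captured @ M ~= standard
M, *_ = np.linalg.lstsq(captured, standard, rcond=None)

def correct(image_rgb: np.ndarray) -> np.ndarray:
    """Apply the color conversion to every pixel of an H x W x 3 image."""
    return np.clip(image_rgb.astype(float) @ M, 0, 255)

print(np.round(captured @ M))  # the corrected patches, close to the standard colors
```

Converting the entire captured image with the same matrix, as the text describes, is then just `correct(image)` applied once per image.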
  • in subsequent imaging as well, the reference and the component are imaged at the same time, and the correction is performed based on the comparison between the standard reference colors (for example, R, G, B, I) and the imaged reference colors.
  • instead of RGB values (R, G, B), other color indicators such as chromaticity or wavelength dispersion may also be used.
  • FIG. 7(b) is a diagram showing an example of the reference 180, which is white paper with high reflectance.
  • a standard diffuse reflection plate or the like, which scatters and reflects uniformly in all directions, may also be used. A material with high reflectance over a wide wavelength band reflects the color of the environmental light as it is, so differences in the environmental light are reflected in detail.
  • FIG. 7(c) is a diagram showing an example of the reference 180, and there is a black portion 180bl within the white portion 180w.
  • using the reference 180 shown in FIG. 7(c) makes it easy to extract changes in contrast. Also, if there are marks such as black dots at the four corners, the reference position can easily be found within the image; the marks that make the reference position easy to extract need not be limited to the four corners.
  • FIG. 7D is a diagram showing an example of the reference 180, in which a white part 180w is arranged in the center, and around it are a black part 180bl, a red part 180r, a green part 180g, and a blue part 180b.

Landscapes

  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Quality & Reliability (AREA)
  • Sustainable Development (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a measurement system with which, even when the surface of a component is covered with a coating, it is possible to determine whether or not the coating needs to be removed and to measure the shape of the component. This measurement system measures the shape of a component at least part of whose surface is covered with a coating, and comprises: an imaging unit that acquires image data by capturing an image of the component; an image analysis unit that, based on the image data, determines the state of the coating, diagnoses whether or not the coating needs to be removed, and also decides a measurement area; and a measurement unit that measures the shape of the component in the measurement area. When it has been diagnosed that the coating needs to be removed, at least part of an area obtained by removing the coating is included in the measurement area.
PCT/JP2023/018231 2022-08-09 2023-05-16 Système et procédé de mesure, et système et procédé de régénération de composant WO2024034210A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022126911A JP2024024234A (ja) 2022-08-09 2022-08-09 計測システム及び方法並びに部品再生システム及び方法
JP2022-126911 2022-08-09

Publications (1)

Publication Number Publication Date
WO2024034210A1 true WO2024034210A1 (fr) 2024-02-15

Family

ID=89851504

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/018231 WO2024034210A1 (fr) 2022-08-09 2023-05-16 Measurement system and method, and component regeneration system and method

Country Status (2)

Country Link
JP (1) JP2024024234A (fr)
WO (1) WO2024034210A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018185890A1 * 2017-04-05 2018-10-11 株式会社ニコン Coating assistance device, coating device, coating work assistance method, coated-article production method, and coating assistance program
WO2022043979A1 * 2020-08-25 2022-03-03 株式会社オプティム Program, method, and system

Also Published As

Publication number Publication date
JP2024024234A (ja) 2024-02-22

Similar Documents

Publication Publication Date Title
CA3062051C (fr) System and method for fluorescent liquid penetrant inspection
CN113125458B (zh) Inspection and evaluation method and system for the coating state of steel structures
CA3061262C (fr) System and method for fluorescent liquid penetrant inspection
US9719774B2 (en) Method for detecting cracks in an aircraft or gas turbine component
WO2016035147A1 (fr) Measurement processing device, measurement processing method, measurement processing program, and structure production method
JP7514259B2 (ja) System and method for determining whether a camera component is damaged
CN116563282B (zh) Machine-vision-based drilling tool detection method and system
TWI564556B (zh) Scratch detection method and device
EP2605213B1 (fr) Image processing method and system for inspecting an object
JP7053366B2 (ja) Inspection device and inspection method
CN114719749B (zh) Machine-vision-based method and system for detecting metal surface cracks and measuring their true dimensions
CN117260076A (zh) System and method for automated welding
EP3852059A1 (fr) System and method for evaluating the health status of an asset
WO2024034210A1 (fr) Measurement system and method, and component regeneration system and method
CN115867403A (zh) Method and device for additive manufacturing of workpieces
CN118556257A (zh) Method for identifying and characterizing, by means of artificial intelligence, surface defects on objects subjected to fatigue testing and cracks on brake discs
JP2023554337A (ja) Image classification method and optical inspection method for objects
JP2018059883A (ja) Surface inspection device and surface inspection method
JP2022029155A (ja) Evaluation system, evaluation method, and evaluation program
JP2006329898A (ja) Surface distortion measurement method and measurement device
CN117934453B (зh) Method and system for diagnosing foreign-matter defects in mobile-phone screen backlights
WO2024101186A1 (fr) Substrate inspection method, substrate inspection device, and substrate inspection program
Ivaschenko et al. Intelligent Machine Vision Implementation for Production Quality Control
Moe Win: Development of Mesoscopic Imaging System for Surface Inspection

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23852195

Country of ref document: EP

Kind code of ref document: A1