WO2025027539A1 - Surface modification systems and methods - Google Patents

Surface modification systems and methods

Info

Publication number
WO2025027539A1
Authority
WO
WIPO (PCT)
Prior art keywords
defect
indication
image
vehicle
repair
Prior art date
Application number
PCT/IB2024/057411
Other languages
French (fr)
Inventor
John W. Henderson
Michael E. O'brien
Kaitlin M. HAAS
Matthew D. Moore
Andrew W. LONG
Esther S. Jeong
Sophia S. Liu
Glenn E. Casner
Christopher M. Brown
Miaoding DAI
Lori A. Sjolund
Victoria F. GRANGER
Benjamin W. WATSON
Original Assignee
3M Innovative Properties Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3M Innovative Properties Company
Publication of WO2025027539A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D65/00Designing, manufacturing, e.g. assembling, facilitating disassembly, or structurally modifying motor vehicles or trailers, not otherwise provided for
    • B62D65/005Inspection and final control devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection

Definitions

  • a defect tracking system that includes an image capturing device configured to capture an image of a surface.
  • the system also includes a defect indicia receiver that is configured to receive a defect indication.
  • the defect indication indicates a defect on the surface.
  • the system also includes a defect mapper configured to map the defect to the surface.
  • the system also includes a user interface generator configured to generate a defect map to show the detected defect.
  • the system also includes a display component configured to display the defect map.
  • FIG. 1 is a schematic of a robotic surface modification system in which embodiments of the present invention are useful.
  • FIG. 2 illustrates a process for repairing a vehicle in which embodiments herein may be useful.
  • FIGS. 4A-4D illustrate systems for real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein.
  • FIGS. 5A-5D illustrate systems for accurately applying body filler to a vehicle in accordance with embodiments herein.
  • FIG. 6 illustrates a method for determining a repair operation in accordance with embodiments herein.
  • FIG. 7 illustrates an example user interface for a repair technician in accordance with embodiments herein.
  • FIG. 8 illustrates a digital defect tracking system implemented with a projection system in accordance with embodiments herein.
  • FIG. 9 illustrates a defect tracking system in accordance with embodiments herein.
  • a repair needs to be done on potential replacement parts themselves, e.g. due to damage during transport, rust removal, etc. While replacement parts can often be returned, the time taken to return and receive a new replacement part may be outside a timeframe for a given repair. Therefore, a repair shop may deem it worthwhile to repair a replacement part instead of returning it.
  • Systems and methods herein increase efficiency in the repair process by making it easier to locate and track defects and generate repair parameters based on contextual information, e.g. abrasive articles on hand, defect details, etc.
  • the term “defect” refers to an area on a worksurface containing an imperfection requiring removal or repair. Defects may include any item that interrupts the visual aesthetic. For example, many vehicles have specular, or reflective, surfaces that may appear shiny or metallic after painting is completed. A “defect” can include debris trapped within one or more of the various paint layers on the work surface. Defects can also include smudges in the paint, excess paint including smears or dripping, as well as dents or scratches.
  • FIG. 2 illustrates a process for repairing a vehicle in which embodiments herein may be useful.
  • Process 200 may be completed by a repair technician as part of a vehicle intake, for example, or by a repair technician during a repair operation.
  • the quality check area includes a tent, for example a tent with a gridline (or other repetitive pattern) and consistent lighting for feature detection or curvature.
  • Lighting may be adjustable, or positioned to amplify the appearance of dust, or may cycle through different colors to make buffing compound or masking tape stand out from the surroundings.
  • a quality check area is separate from a repair station for a vehicle, such that a repair technician, once a repair is done, moves the vehicle into the quality control station. While imaging is done and analyzed, the technician may be working on a second vehicle that has been moved into the repair station. When the imaging is finished, the technician may receive an indication of areas that need to be re-repaired.
  • a digital option for marking defects would provide a map of defects needing repair, without any markings on the vehicle that can be removed.
  • a digital defect tracking system may also assist in tracking which areas of a vehicle have been inspected for defects, as well as which defects have been repaired and which still need repairing.
  • a digital defect tracking system is implemented using augmented reality, e.g. such that a technician can, in real-time, mark defects they see.
  • the technician may use an image capturing device, for example built into glasses or a face shield such that the technician can see the vehicle surface and indicate a defect on the surface.
  • Such a system may allow for a technician to indicate the defect location in real-time in a digital record. It may then be possible to indicate repair operations done on the indicated defect, when and by whom.
  • the 3D coordinate is generated using a fiducial marker and multiple cameras to triangulate the position.
  • a 3D model of a vehicle is provided - e.g. from a manufacturer or other source - and the digital defect tracking system indicates the defect by comparing the defect indication to the 3D model.
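The fiducial-and-multiple-camera triangulation described above can be sketched with a standard two-view linear (DLT) triangulation. The calibrated projection matrices and the helper below are illustrative assumptions for this sketch, not details stated in the application:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one fiducial point seen by two cameras.

    P1, P2: 3x4 calibrated projection matrices; uv1, uv2: (u, v) pixel
    coordinates of the same marker in each view. Returns the 3D point,
    which could then be expressed relative to the vehicle's 3D model.
    """
    # Each view contributes two linear constraints: u*(P[2]·X) - P[0]·X = 0, etc.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With more than two cameras, the same row-wise stacking extends naturally, which improves robustness to pixel noise.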
  • a vehicle is only damaged on one side (e.g. only driver’s or passenger’s). The non-damaged side may also be used to evaluate the repair needs of the damaged side.
  • FIGS. 3A-3D illustrate real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein.
  • Such a system may increase efficiency by allowing for repair technicians to keep track of which areas have and have not been inspected, where and what type of defects need to be repaired, and what defects have been repaired already.
  • FIG. 3A illustrates a schematic of a vehicle 300 with several defects.
  • a user of a digital defect tracking system may be able to, in real-time, record the location of defects as they are noted.
  • a repair technician may first notice defect 302 which may, for example, be a nib. The technician might then move toward the driver’s side door and notice defect 304 which may be, for example, a paint smear. The technician may then notice a third defect 306.
  • FIGS. 3C and 3D illustrate example user interfaces that a user of a digital defect tracking system may interact with.
  • FIG. 3C illustrates a number of defects 340 that have been detected by a technician and entered into digital defect tracking system. Also indicated is an uninspected zone 350. Defect detection and tracking systems in accordance with embodiments herein may be able to track zones of a vehicle that have, and have not, already been inspected in addition to defects detected and repaired.
  • FIG. 3D illustrates another user interface that may be presented later in a vehicle repair.
  • FIG. 3D illustrates vehicle 300 with several repaired defects 360, and one yet-to-be repaired defect 370.
  • Area 350 from FIG. 3C is no longer indicated as not inspected. Inspected area 352 is now indicated as having no defects.
  • a repair technician may indicate a defect location using any suitable manner. For example, a user may point to a location on a vehicle and make an audio or haptic indication that an augmented reality system can pick up, in some embodiments.
  • a technician may capture an image, using a suitable image capture device, and indicate a defect location on the image, for example, using a touchscreen.
  • a technician may place a marker on a vehicle which is then captured by an image capturing device. An image analyzer may then associate that indicated defect position with a 3D coordinate relative to the vehicle.
  • FIGS. 4A-4D illustrate systems for real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein.
  • FIGS. 3A-3D illustrate some user interfaces that a user may interact with, generated by a digital defect tracking system.
  • FIGS. 4A-4B illustrate some examples of devices in which embodiments of a digital defect tracking system may either reside or be accessed through.
  • FIG. 4A illustrates a schematic 400 of a repair technician 410 wearing augmented reality enabled glasses 420. Glasses 420 may include, or interact with, a camera or other image capturing device.
  • An augmented reality system, such as glasses 420, either includes or has access to an augmented reality generator, implemented using the one or more processors, that integrates images captured by the image capturing device with the map generated by the registration system, such that information can be overlaid over the images from the image capturing device.
  • the augmented reality generator provides the combined image presented to a user on a display, e.g. projected onto glasses 420 or provided through another display.
  • a user has a field of view 422.
  • previously marked defects 440 may be presented to user 410 using glasses 420.
  • augmented reality devices are designed to operate wirelessly, such that data can be stored and / or processed remotely, which may reduce power consumption requirements and allow for a device to operate longer between charges.
  • some systems herein may include at least some local storage or analysis components.
  • a helmet such as face shield 450, may include additional space for a power source sufficient to manage local data and / or analysis.
  • a system such as those illustrated in FIGS. 3A-3D and / or FIGS. 4A-4D reduces the difficulty in relocating defects for a repair.
  • the defect information - location, type, etc. - may be stored in the digital record.
  • a digital defect tracking system such as those described herein can also aid in confirming that an entire vehicle surface is checked for defects, and that all located defects are repaired.
  • relevant information may include the amount of body filler needed for a repair, which correlates to the amount paid for defect repair.
  • a volume deviation from normal could be measured. This would help to provide consistent quantification.
  • a volume deviation may be measured in any suitable way, using a suitable dent measuring tool, such as the Dent Viewer MD from Collision EdgeTM Innovative Repair Systems, or the Dentstick, available from Dentstick. For example, a size and / or curvature and / or depth of the dent may be measured.
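Given a measured depth map of a dent, a volume deviation could be quantified with a simple Riemann sum over the scanned grid. This sketch, including its positive-depth convention and units, is an illustrative assumption rather than a method stated in the application:

```python
import numpy as np

def dent_volume_mm3(depth_map_mm, pixel_area_mm2):
    """Approximate the volume deviation of a dent from a depth map.

    depth_map_mm: 2D array of deviations from the nominal surface
    (positive = below nominal); deviations above nominal are ignored.
    pixel_area_mm2: surface area each grid cell covers.
    The volume is a Riemann sum: depth per cell times the cell area.
    """
    depths = np.clip(depth_map_mm, 0.0, None)  # keep only the depression
    return float(depths.sum() * pixel_area_mm2)
```

A consistent volume number like this could feed directly into the body-filler quantity estimate mentioned above.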
  • FIGS. 5A-5D illustrate systems for accurately applying body filler to a vehicle in accordance with embodiments herein.
  • Some defects require application of a filler material or patch material to recreate an original curvature.
  • Applying filler material is a blend of art and science.
  • Filler material is applied in a wet form, and then sanded down after drying. Before filler is applied, a surface is sanded down to bare metal (e.g. all paint is removed).
  • the filler material is a two-part material, mixed to a required ratio, and applied with tools - spatulas, spreaders or other tools - often by hand. When dry, body filler material is very hard. The goal of a repair technician is to apply enough filler material so that the target contour is achieved, without being excessive, which will require more time and material to sand down.
  • FIG. 5A illustrates an example process of applying body filler.
  • Illustration 500 illustrates a first step where a surface 504, for example dented in a collision, needs to be restored to a target contour 502.
  • Illustration 510 illustrates an intermediate point in the repair process where material has been applied such that a contour 506 is formed. Contour 506, ideally, is applied such that additional material is not needed, e.g. contour 506 should not be less than target contour 502. After a material removal step, the target contour 508 is achieved, as indicated in illustration 520.
  • FIG. 5B illustrates a vehicle 530 where a defect area 532 has been indicated, for example using a digital defect tracking system.
  • the digital defect tracking system may retrieve a target contour, for example from a database containing a 3D model of vehicle 530 in some embodiments.
  • the contour of a similar area on the opposite side of the car can be imaged. Based on that image, a mirror of the contour can be generated that can serve as the target contour.
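Generating a target contour from the undamaged side amounts to reflecting the scanned points across the vehicle's longitudinal midplane. The coordinate convention below (x as the lateral axis) is an assumption for illustration:

```python
import numpy as np

def mirror_contour(points, plane_x=0.0):
    """Mirror 3D contour points across the vehicle's longitudinal midplane.

    points: (N, 3) array in vehicle coordinates where x is the lateral axis;
    plane_x: x position of the midplane. The scanned contour of the
    undamaged side, mirrored this way, can serve as a target contour.
    """
    mirrored = np.asarray(points, dtype=float).copy()
    mirrored[:, 0] = 2.0 * plane_x - mirrored[:, 0]
    return mirrored
```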
  • FIG. 5C illustrates an image 540 that may be captured during a repair.
  • a technician 542 is applying filler 546 using an applicator 544.
  • the filler material is not completely smooth along the surface when applied.
  • systems herein can, based on the periodic image and suitable image analysis discussed below, provide feedback and / or coaching for a technician in real-time. This can be very helpful while body filler is applied to reduce the amount of time needed to sand the hardened filler material.
  • Image analysis techniques may also be used to provide information not readily detectable by a human technician, such as smoothness of the surface in different areas, transition areas between materials and heat capacities, as well as key geometries - e.g. sharper radiused body lines.
  • using a projection system or an augmented reality system, it is also possible to provide color-coded or other indications of areas of higher and lower spots in accordance with embodiments herein.
  • while FIGS. 5A-5C discuss the scenario where a human operator is applying body filler, it is also expressly noted that robotic repair units face similar issues.
  • a repair robot may have additional sensors for comparing curvature (e.g. cameras, lasers, analysis of deflectometry results, etc.), but robotic systems also have their own limitations.
  • Real-time feedback about where a current contour is relative to a target contour may be helpful in programming a robotic repair unit to sand applied body filler to, or close to, the target contour.
  • the reference contour geometry can be fed, along with real-time information about a current contour, to a sander programmed to achieve a desired material removal rate, a desired amount of material removed or a desired volume of material removed.
  • the sander may stop periodically for additional contour checks and reprogramming. For example, a robot may not know the specifications of an abrasive product (e.g. whether a correct product is installed and whether it is new or worn) and, therefore, not have accurate information about how much material is being removed during a sanding operation.
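Because a worn or incorrect abrasive changes the material removal rate, any time estimate fed to the sander needs a calibration term. A minimal sketch, assuming a shop-calibrated nominal removal rate and a wear factor (both hypothetical parameters, not values from the application):

```python
def sanding_time_s(excess_volume_mm3, nominal_rate_mm3_per_s, wear_factor=1.0):
    """Estimate sanding time to remove a volume of hardened filler.

    nominal_rate_mm3_per_s: removal rate of a fresh abrasive article;
    wear_factor in (0, 1]: fraction of the nominal rate a worn article
    still achieves. Both would need per-product calibration in practice.
    """
    if not 0.0 < wear_factor <= 1.0:
        raise ValueError("wear_factor must be in (0, 1]")
    return excess_volume_mm3 / (nominal_rate_mm3_per_s * wear_factor)
```

The periodic contour checks described above would then correct this estimate as the actual removal rate becomes observable.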
  • topography capturing device 552 and a single light source 560 are illustrated in schematic 550.
  • multiple of either, or both, may be suitable in other embodiments.
  • image capturing device 552 may have a movement mechanism (not shown) such that it can move relative to a vehicle having contour 556.
  • photogrammetry is used to obtain a topography of a vehicle.
  • weld parameters may be specified by a manufacturer as a total number along a joint, or spacing between them. A weld can be imaged to determine whether the technician met the required number and spacing.
  • FIG. 6 illustrates a method for determining a repair operation in accordance with embodiments herein.
  • Method 600 may be implemented using a digital defect tracking system, such as embodiments described herein, or using another suitable system.
  • a target surface for a vehicle is retrieved.
  • for some defects the target surface is straightforward - e.g. removal of a small dent or a nib, or a smear-free surface. In other cases, a target surface may be more difficult to estimate.
  • Retrieving a target surface includes scanning an opposite side of a vehicle to obtain a mirror contour, as indicated in block 602.
  • retrieving a target surface includes retrieving a CAD model, either provided from a manufacturer or otherwise generated, as indicated in block 604.
  • Retrieving a target surface may, in another embodiment, include accessing a database of topographies and curvature samples, as indicated in block 606. Other suitable methods for retrieving a target surface are also envisioned, as indicated in block 608.
  • an initial surface of a vehicle is captured, for example, a topography 612 of a current surface may be obtained using images from an image capturing device. For example, images may be stitched together using photogrammetry techniques, or the images may be mapped to a CAD model. Additionally, using captured images and a fiducial, it may be possible to determine depth. Other techniques for capturing an initial surface may also be used, as indicated in block 614. For example, depth information may be obtained using laser sensors. Based on the difference between the target and initial surface, a repair plan may be generated. For example, material may need to be added to raise a current surface, or material may need to be removed to reduce the current surface.
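The add-material-versus-remove-material decision from the difference between the target and initial surfaces can be sketched as a per-cell comparison; the tolerance value here is a placeholder assumption:

```python
import numpy as np

def repair_plan(current, target, tol_mm=0.05):
    """Classify each grid cell of a surface scan against the target contour.

    Returns an array of "add" (surface below target: apply filler),
    "remove" (surface above target: sand), or "ok" within tol_mm.
    """
    diff = np.asarray(current) - np.asarray(target)
    plan = np.full(diff.shape, "ok", dtype=object)
    plan[diff < -tol_mm] = "add"
    plan[diff > tol_mm] = "remove"
    return plan
```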
  • a current in-process surface topography is obtained.
  • the in-process topography may be obtained using any suitable techniques including photogrammetry, fiducials, laser sensors or another suitable option.
  • the current in-process surface topography is compared to the target surface to provide a measure of progress for a current repair.
  • a repair status is generated.
  • the generated status includes a percentage completion, an estimated sanding time remaining or body filler needed, or another indication.
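A percentage-completion status could be derived by comparing how much of the initial deviation from the target surface remains; this metric is one plausible formulation, not one specified in the application:

```python
import numpy as np

def percent_complete(initial, current, target):
    """Progress of a repair as the fraction of initial deviation removed.

    initial, current, target: height maps over the same grid. Progress is
    1 - (remaining deviation / initial deviation), clamped to [0, 100].
    """
    initial_dev = np.abs(np.asarray(initial) - np.asarray(target)).sum()
    current_dev = np.abs(np.asarray(current) - np.asarray(target)).sum()
    if initial_dev == 0:
        return 100.0  # nothing to repair
    pct = 100.0 * (1.0 - current_dev / initial_dev)
    return float(np.clip(pct, 0.0, 100.0))
```

The same deviation totals could also drive the estimated-sanding-time or filler-needed indications mentioned above.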
  • a next step is determined for the repair.
  • the next step may be determined based on a repair status - e.g. if complete, move to sanding or, if not complete, add more material.
  • a next step may be to move to a different grit size or to continue at a current grit size, for example.
  • Systems and methods herein may be able to generate instructions for a next step based on available contextual information.
  • Of particular concern is the need to prevent burn through due to generated heat from too long a dwell time, worn abrasive articles, etc. Conditions relevant to an abrasive article are particularly important to know.
  • a state of an abrasive article, or article wear 682 may be obtained by scanning or imaging an abrasive article selected for use. Systems herein may also receive information about available abrasive article product families, grit sizes, polishes and other materials.
  • Information about selected or available tools for a sanding operation may also be retrieved, as indicated in block 684, such as tool make, model, settings like air pressure and rotational speed, interface materials like rubber or foam durometer, and user conditions such as down force, angle to surface, etc.
  • Other information relevant to the amount of force and / or friction the tool can apply, or that may affect a material removal rate, may also be retrieved.
  • Information about abrasives and tools may be obtained from sensors, as indicated in block 686, for example images of a worksite may identify a tool used by a technician.
  • An image analysis system may recognize an abrasive article or tool from an image, either by shape or color or by detecting a barcode or other identifier.
  • a barcode scanner or RFID sensor may be able to detect a tool or abrasive article based on an RFID tag or barcode.
  • Information about an abrasive article and / or tool may be entered manually in some embodiments, as indicated in block 688.
  • user interfaces are illustrated herein that a technician may use during a repair process. It may be possible to indicate, for a current repair, the abrasive article and / or tool used or planned to be used. Other sources and types of abrading specifications are also envisioned, as indicated in block 689.
  • a set of abrading parameters is generated for an abrasive operation.
  • the set of abrasive parameters is provided through a user interface as a recommendation for a next step. Especially for newer repair technicians, it may be helpful to have a recommended tool, article, angle and / or dwell time.
  • the set of abrasive parameters may include a selected abrasive article type and grit, a sanding tool and / or an attack angle or force to apply. Additionally, a dwell time may be generated based on a heat capacity, calculated based on known body filler composition and thickness.
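A dwell-time bound based on heat capacity could be sketched with a lumped-capacitance model: if all sanding heat went into the filler patch, the time to raise its temperature by a given margin is m·c·ΔT / P. This is a deliberately simplified assumption (it ignores conduction losses, and all parameter values are hypothetical):

```python
def max_dwell_time_s(mass_g, specific_heat_j_per_gk, delta_t_k, heat_input_w):
    """Upper bound on dwell time before a filler patch overheats.

    Lumped-capacitance sketch: all sanding heat goes into the patch, so
    the time to raise it by delta_t_k kelvin is m * c * dT / P.
    Real guidance would need conduction losses and a measured heat input.
    """
    return mass_g * specific_heat_j_per_gk * delta_t_k / heat_input_w
```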
  • the generated abrading parameters may be stored either in a local datastore or a datastore accessible wirelessly or through a cloud-based network.
  • a set of parameters is provided to a repair technician for a next step, for example provided through a user interface.
  • the parameters may include abrading specifications 632 if the next step is a sanding or polishing step.
  • the parameters may include information about material to be applied, e.g. body filler, weld material, etc., as indicated in block 634. Other information may also be provided, as indicated in block 636.
  • the parameters may be provided as an overlay to an augmented reality screen in some embodiments, as indicated in block 674.
  • the parameters are provided to a user interface for presentation on a display or projector, as indicated in block 672.
  • the parameters are provided as an audible alert, as indicated in block 676.
  • other communication methods are possible, as indicated in block 678.
  • FIG. 7 illustrates an example user interface for a repair technician in accordance with embodiments herein. While the user interface 700 is illustrated as being presented on a device with a display, such as a tablet or a mobile phone, it is expressly contemplated that such information can be presented on a display in a worksite, projected onto a surface, or otherwise presented to one or more users.
  • a digital defect tracking system may provide functionality throughout the entirety of a vehicle repair, as illustrated in FIG. 7.
  • Vehicle information 720 may be presented for vehicle 710, along with other information relevant to detected defects. For example, a vehicle may be presented with highlighted defects - e.g. remaining defects 750 still needing repairs. A user may be able to select a defect using feature 702 and view or edit repair details for either completed repairs 740 or remaining repairs 750. A user may also be able to mark a repair as complete using feature 706.
  • Technician information 760 may be presented.
  • a digital defect tracking system may track which technicians complete which repair steps, for example by requiring a log in, using biometrics to identify a user, or through manual entry of information.
  • Systems and methods herein may track additional details about a repair, such as important timing information 730. For example, based on an insurance estimate, a target repair time for completed defects 1-3 may have been 3 hours and 12 minutes. Thus far, only 2 hours and 45 minutes were used for those defects, and a repair is currently under target by 27 minutes. It may be useful to track the time taken for repairs, in addition to whether repairs were satisfactory or not, both for quality control and training purposes.
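The under/over-target bookkeeping in the example above is simple signed arithmetic on durations; a sketch using Python's datetime module, with the figures from the text:

```python
from datetime import timedelta

def time_variance(target, used):
    """Signed variance of actual repair time vs. the estimate target:
    positive means the repair is under target (ahead of schedule)."""
    return target - used

# The example from the text: 3 h 12 min estimated, 2 h 45 min used so far.
target = timedelta(hours=3, minutes=12)
used = timedelta(hours=2, minutes=45)
```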
  • Augmented reality interfaces may not be readily available in all repair shops or work environments. It may instead be more economical to use a single display that may store information from multiple sources. Additionally, as many technicians have mobile phones, it may be possible for technicians to use a mobile phone application to interact with a digital defect tracking system. It may also be possible to use a mobile phone or tablet as an augmented reality enabled device.
  • Using technician information 760 and repair times 730, it is possible to track many technician-related statistics. In some embodiments, information like time taken for previous repairs, touch time statistics, etc., may be tracked.
  • information collected using a digital defect tracking system can also be presented to a supervisor or shop administrator in a useful format. For example, instead of a single vehicle 710, a user interface 700 may allow a supervisor to select from six different vehicles currently undergoing repairs. Instead of a list of completed and remaining repairs, a supervisor may be able to see a percent complete, a current repair being conducted, and / or time statistics 730 for multiple vehicles at once. Such information may be useful to quickly identify where different vehicles are in a repair workflow and how a current technician is doing with their repair work.
  • a projection system may also provide an opportunity to project information onto surfaces that are otherwise difficult to visually annotate.
  • FIG. 8 illustrates a defect tracking system in accordance with embodiments herein.
  • Embodiments herein are directed toward a system for delivering visual alerts and information to a technician via a projection system, as well as a computer vision system to detect and recognize the technician’s identity and / or recognize where the technician is in the environment and where a suitable projection surface might be that is in view of the technician.
  • a defect tracking system includes a projector, e.g. DLP, LCoS, LCD, LED, Laser, or the like.
  • a defect tracking system is configured to detect, identify, and localize an object (e.g., product, tool, car, person, etc.). Detection may be done using a computer vision system such as a camera and an edge computing device in some embodiments.
  • detection includes an image capturing device with a communication component that sends images to a cloud-based computer vision system for computations.
  • Systems herein may include one or more traditional cameras, webcameras, rgb + depth cameras, or another suitable image capture device.
  • the one or more image capture devices are aligned and calibrated to the one or more projectors.
  • FIG. 8 illustrates one example implementation of a digital defect tracking system implemented with a projection system in accordance with embodiments herein.
  • System 800 illustrates a projector 802 with a field of view 804, and a camera 806.
  • embodiments herein may utilize multiple projectors 802 and / or cameras 806.
  • real-time images can be provided to a defect tracking system 810, which can conduct image analysis, which can then provide information for projection by projector 802.
  • Projector 802 is illustrated as projecting information onto vehicle 840; however, information may be presented on any suitable surface. Illustrated in FIG. 8 are two defect locations 832 overlaid on vehicle 840, along with descriptions 834. While only defect indicia and type are illustrated in FIG. 8, it is expressly contemplated that a projector with sufficient resolution could display more information such as repair plan, status, equipment and materials used, etc.
  • Camera 806 may also detect movement or audio from technicians or others in a worksite. Voice or gesture recognition can be used to modify the information projected, as well as using computer vision to provide context aware information based on an identification of a current task before being performed. Similarly, alerts or other notifications may be provided audibly in some embodiments herein.
  • a vision system may, as discussed herein, use a registration system to either directly classify objects in the scene or to identify and read optical tags/barcodes (either visible or IR retroreflective). Classifying objects may include the registration system using a classical feature-based detector or using suitable machine learning techniques.
  • Embodiments herein allow for a defect tracking system to locate a needed item in the scene.
  • camera 806 may locate wrench 803 and a controller could actuate light source 824 to highlight the area, for example by providing a floodlight-type effect surrounding the item. This would reduce unnecessary search time or frustration by a worker in misplacing or tracking a large quantity of specific items needed to complete a task.
  • Defect tracking system 810 may also control operation of, or communicate with a controller for light sources 822 and 824.
  • Light sources 822 and 824 may be capable of attenuated lighting.
  • light sources 824 may provide gradations of a floodlight effect based on an order of operation or based on a specific task. For example, all tasks needed for a first step may be highlighted similarly.
  • projector 802 may provide an indication of what is needed, for example text, an image or a location (e.g. from an inventory or asset management system) of the needed item.
  • Defect tracking system 810 may be triggered to assess a scene for missing items, to display defect locations and / or any other information based on a variety of modalities, for example a direct voice request or gesture.
  • For example, a direct voice request such as “where’s my wrench” triggers the camera 806 to search for wrench 803, or a raised hand causes projector 802 to add, change or remove projected information 832, 834.
  • a defect tracking system including a projection system may provide real-time feedback and alerts to a worker based on requested data or information on their assigned task. This system would reduce the installation and upkeep burden on a shop for pseudo-augmented reality, where classical AR systems may either be too cumbersome for a worker to wear during operation or would be subjected to environments where they are likely to be damaged.
  • an informational display could be projected onto the vehicle body or on a surface near to the vehicle to highlight details on the current vehicle - for example a VIN, a make, a model, a year, etc. Additional details that can be projected include a current position in the repair process, the scheduled timeline for upcoming stages, current touch time statistics, user statistics for the different tasks that were performed (how long different tasks took to perform/etc.) or other similar information.
  • a technician needs to know the repair order code (RO) of the vehicle being repaired so that the product / quantity / cost are properly associated with the repair order.
  • the repair order may be stored or accessed through a body shop management software, such as 3M RepairStack™ from 3M Company of St. Paul, MN USA. Using the RO, a technician may more easily add line items to an invoice.
  • a benefit of using systems herein may be that a planned repair process may be generated and provided to a technician so that the technician notes the items that may be needed.
  • a body shop management system may prompt a technician based on a repair step (e.g. an amount of body filler for a ‘fill dent’ step, an estimated time needed to ‘seam seal a weld line’, a replacement windshield for a ‘replace windshield’ step, etc.). Additional context around a repair is helpful especially for determining quantities of material such as body filler, weld fillets, etc.
  • system 810 projects details into the physical domain, and can highlight regions of interest on the vehicle that need to be addressed as part of a repair task.
  • an instructional block showing the stepwise procedure for the task could be projected either directly on the vehicle or on a nearby surface (e.g. a floor, table, etc.).
  • camera 806 can detect a technician interacting with information projected by the projector and, in response, change what is displayed. For example, by touching a portion of a surface containing an item on a menu, or an interface such as those presented herein, the defect tracking system may facilitate the technician manipulating an interface to, for example, navigate a series of menus. Alternatively, the camera may localize the hands of an individual in physical space and relate them to where the image is displayed so as to identify a desired action when displaying this overlay. Similarly, it may be possible for a technician to walk through the menus via voice commands. Further, as sensor calibration systems become more complicated, the projection system could highlight the exact positions relative to the vehicle to place necessary markers, light sources, or sensors for calibration.
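Detecting a technician touching a projected menu item can be treated as hit-testing the localized fingertip position against rectangular regions of the projected interface. A minimal sketch under that assumption; the region names and projector-space coordinates are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class MenuRegion:
    """A rectangular menu item in projector coordinates (units are arbitrary here)."""
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # Inclusive bounds check against this region's rectangle
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def hit_test(regions, px, py):
    """Return the name of the touched menu item, or None if the touch missed the menu."""
    for region in regions:
        if region.contains(px, py):
            return region.name
    return None
```

On a hit, the system could redraw the projection with the next menu level; a miss leaves the display unchanged.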
  • Digital defect tracking system 900 may be accessed using a user interface on a screen of a device such as a mobile phone, tablet or other computing device, or through an augmented reality enabled device, such that a user views an AR-overlay over the physical world, or through a projection system such that information is presented onto a worksurface.
  • the information presented may be static or interactive such that a user can customize or select what they want to view.
  • Digital defect tracking system 900 includes or communicates with an imaging system 910.
  • Imaging system 910 may include devices for both capturing and displaying information.
  • imaging system may include one or more image capture devices 902 (e.g. cameras, video cameras, etc.) to capture images of a worksite including defect information, user information and / or environmental information.
  • Imaging system 910 may also include one or more projection devices 904.
  • a projection device 904 includes a traditional projector that projects an image onto a surface as well as a display that provides information through a graphical user interface.
  • projection device 904 in some embodiments, is a projector that can project onto a worksurface - onto a vehicle needing repair, onto a surface near the vehicle, etc.
  • a projector 904 may project information in an interactive manner, e.g. such that image capturing device 902 captures information about a user’s movement or speech and communicates that information to a surface analysis system 920, which then provides new information to a graphical user interface generator 972 for presentation using projector 904.
  • projection by projection device 904 may include presentation on an augmented reality-enabled device. For example, information may be projected onto a transparent screen such as a pair of glasses or a face shield.
  • Imaging system 910 may include one or more light sources 906, each of which may have different light settings such as intensity or color. Imaging system 910 may also include one or more movement mechanisms 908 that are responsible for moving any of image capture devices 902, projectors 904 and / or light sources 906. Movement mechanism 908 may include movement in any of an X-Y-Z coordinate direction. For example, any of image capture devices 902, projectors 904, or light sources 906 may be able to rotate or swivel about a mount and / or move in three-dimensional space along a rail system or more freely by attachment to a moveable robotic unit.
  • Controller 912 may control operation of imaging system components 902-908, for example a capture rate of image capture device 902, a light intensity or color of light source 906, or a projection resolution of projector 904. Controller 912 may also control the one or more movement mechanisms 908 such that components 902-906 are positioned and oriented correctly.
  • Digital defect tracking system 900 is illustrated as encompassing imaging system 910, surface analysis system 920, graphical user interface generator 972 and database 980. However, it is expressly contemplated that, in some embodiments, at least some of these components are remote from one another.
  • database 980 may be accessed using a wireless or cloud-based network protocol.
  • digital defect tracking system 900 may include one or more processors remote from imaging system 910 that perform functions of surface analysis system 920.
  • GUI generator 972 may use processing circuitry or processing power housed with or separately from surface analysis system 920 or imaging system 910.
  • Communication component 914 in some embodiments herein, is responsible for facilitating communication between components of digital defect tracking system 900, wherever they are housed.
  • Digital defect tracking system 900 may have access to numerous types and sources of data, as illustrated by database 980. While database 980 is illustrated as a single datastore, it is expressly contemplated that, in some embodiments, data in database 980 may be stored in multiple locations. Further, while data in database 980 is illustrated as split between datastores 990, 982 and 985, it is expressly contemplated that these divisions are purely for organizational understanding and not intended to limit how data used in embodiments herein may be stored or organized. Additionally, while database 980 is illustrated as including a number of different types of data useful for embodiments herein, it is not intended to be an exhaustive list. Other data 901 that may be useful in embodiments herein may also be accessible in some embodiments.
  • a detected defects datastore 990 may receive and house information about defects detected on one or more vehicles, including defect type 991, defect location 992 (which may be relative or absolute), repair plans 993 for detected defects, statuses of repairs 994, as well as any other useful data 995.
  • Defect information in datastore 990 may include historically repaired defect information as well, e.g. defects addressed by a particular technician, time and / or touches taken, abrasive materials used, tools used, etc. Defect information in datastore 990 may be particularly useful for improving future repair operations and determining which technicians would benefit from additional coaching, etc.
  • a vehicle database 982 may include relevant information about vehicles on which repairs may, or have been, conducted.
  • 3D models 983 of different vehicles may be particularly useful for locating defects on a vehicle in real space.
  • 3D models 983 may be provided directly from a manufacturer or otherwise generated - for example using a computer vision system, photogrammetry or another suitable technique.
  • Other vehicle information 984 such as previous repair history for similar vehicles or for a specific vehicle, may also be accessible.
  • a repair database 985 may include information about abrasive articles 986, including which are available, which have been used for similar repairs historically, etc. Additionally, abrasive article information may also include abrasive article types and grit sizes 987 available. In some embodiments wear levels or rates 988 are tracked for individual abrasive articles. Repair database 985 may also include information about tools 989 that may be available, including backup pads available, operational angles, rotational speed, etc. Optional repair plans, or historical repair plans 999 may also be accessible.
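The datastores described above could be modeled as simple records keyed by the referenced fields (defect type 991, location 992, status 994; abrasive grit 987 and wear 988). A hypothetical Python sketch; the field and status names are chosen for illustration, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectRecord:
    """One entry in a detected-defects datastore (cf. fields 991-995)."""
    defect_type: str                 # e.g. "nib", "dent", "paint smear"
    location: tuple                  # relative (image x, y) or absolute (x, y, z) on the vehicle
    status: str = "needs repair"     # "needs repair" | "repair in progress" | "repair completed"
    repair_plan: Optional[str] = None

@dataclass
class AbrasiveArticle:
    """One entry in a repair database (cf. fields 986-988)."""
    article_id: str
    grit_size: int
    wear_level: float = 0.0          # 0.0 = new, 1.0 = fully worn

# A repair workflow would update the status in place as work progresses.
record = DefectRecord("dent", (0.42, 0.77))
record.status = "repair in progress"
```

Splitting records this way mirrors the organizational (not physical) division of database 980: the same records could live in one store or several.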
  • Digital defect tracking system 900 may also include a surface analysis system 920 that, based on information received from imaging system 910 and database 980, provides analysis of a vehicle surface and / or an ongoing repair process.
  • a defect indicia receiver 960 may receive an indication of a defect detected on a vehicle surface.
  • the defect indicia may be received through an input/output component (I/O component) 974.
  • a user may indicate a defect using a touchscreen of a mobile computing device, e.g. by pointing out a defect on an image or video feed.
  • a user may indicate a defect using an augmented reality enabled device, by physically touching a vehicle, pointing to a vehicle, or otherwise indicating a location of a defect.
  • an image capture device may capture a gesture (e.g. touching or pointing) indicating a defect.
  • a defect indicia receiver 960 may also receive defect indicia through manual entry, capturing an auditory indication or recognizing a gesture.
  • Defect indicia receiver 960 may receive an indication of defect type 962, e.g. a nib, a paint smear, a dent, etc.
  • a location 964 may also be received.
  • Location 964 may be a relative location, e.g. a point identified in a captured image, or an absolute location, e.g. a coordinate set corresponding to a point on a vehicle.
  • a defect status 966, e.g. needs repair, repair in progress, or repair completed, may also be retrieved.
  • Other defect indicia 968 may also be retrieved.
  • Surface analysis system 920 may include a registration system 952.
  • Registration system 952 analyzes received images, or a received image feed and detects icons or otherwise recognizes features in a space. For example, identifiable curvature on a vehicle, such as a tire, a windshield, or a headlight may be readily identifiable by registration system 952. Additionally, items that may be present in real space may also be identifiable either using a classical feature-based detector or by using a trained machine learning model, for example a wrench, a sander, a transition between applied body filler and sanded metal, etc.
  • Registration system 952 may be able to, based on detected icons, generate a map of a space using a map generator 956 and / or a topography of a vehicle using a topography generator 954.
  • a vehicle specification retriever 950 can retrieve a 3D model 983 for a given vehicle.
  • registration system 952 may recognize enough features on a vehicle such that map generator 956 can generate a map of real space based on the 3D model alone.
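Generating a map of real space from recognized features amounts to fitting a rigid transform between landmark positions on the 3D model and their detected positions in the scene. A minimal 2D sketch of such a fit (a closed-form Kabsch/Procrustes solution; an actual registration system would work in 3D and handle outliers and scale):

```python
import math

def fit_rigid_2d(model_pts, scene_pts):
    """Estimate the rotation (radians) and translation aligning model points
    to matched scene points, via the closed-form 2D Kabsch/Procrustes fit."""
    n = len(model_pts)
    # Centroids of each point set
    mcx = sum(p[0] for p in model_pts) / n
    mcy = sum(p[1] for p in model_pts) / n
    scx = sum(p[0] for p in scene_pts) / n
    scy = sum(p[1] for p in scene_pts) / n
    # Correlation terms of the centered point sets
    dot = cross = 0.0
    for (mx, my), (sx, sy) in zip(model_pts, scene_pts):
        ax, ay = mx - mcx, my - mcy
        bx, by = sx - scx, sy - scy
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    theta = math.atan2(cross, dot)
    # Translation carries the rotated model centroid onto the scene centroid
    tx = scx - (mcx * math.cos(theta) - mcy * math.sin(theta))
    ty = scy - (mcx * math.sin(theta) + mcy * math.cos(theta))
    return theta, (tx, ty)
```

With θ and (tx, ty) recovered, any model coordinate, such as a stored defect location, can be projected into scene coordinates for display or highlighting.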
  • Registration system 952 may have other functionality 958.
  • a vehicle surface mapper 970 may, using images from image capture device 902 and a map generated by registration system 952, generate a defect map that associates detected defects with a relative location, or an absolute location on a vehicle.
  • a surface map generated by surface mapper 970 may be retrieved, in embodiments herein, for a user starting a defect repair process, continuing a defect repair process, or updating defect statuses 966, in accordance with embodiments herein.
  • Surface analysis system 920 may also include a sanding evaluator 940, in some embodiments.
  • Sanding evaluator 940 may evaluate a current sanding operation, plan a next sanding operation, and / or provide real-time feedback for a user during a sanding operation.
  • a target contour retriever 922 may retrieve a target contour for a repair.
  • target contour retriever 922 may retrieve data from vehicle specification retriever 950, or may determine, based on pictures from image capture device 902, a correct curvature for a defect area.
  • For example, for a dented portion of a vehicle (e.g. the passenger side door), a correct curvature may be determined from curvature of a corresponding portion of the vehicle opposite the dented portion (e.g. curvature of the driver side door).
  • a current contour retriever 924 may retrieve information about a defect area, for example a size of a nib or paint smear or depth information about a dent, or a current surface during a repair process, e.g. after a first body filler application.
  • a sanding parameter generator 930 may compare the current contour to the target contour and generate parameters for a next step. Parameters may be generated based on a retrieved repair plan, retrieved by repair plan retriever, which may include a target repair time as well as information about what steps are needed - e.g. body filler application, dent removal and / or sanding.
  • An MRR characterizer 942 may calculate a material removal rate (MRR) needed for a next sanding step. For example, based on time constraints, a next step may need to be completed within 30 minutes. Alternatively or additionally, MRR characterizer 942 may determine an actual material removal rate based on known parameters of a previous step. For example, based on a selected abrasive article, tool speed and force applied, a first material removal rate is predicted for a first step, and a sanding time provided to a user. However, after that first step it may be determined that significantly less material was removed than expected.
  • An actual material removal rate for the first step may be determined based on the time sanded and volume of material removed, which may then be used by sanding parameter generator when generating sanding parameters for a second sanding step.
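The material-removal bookkeeping described above is straightforward arithmetic: the rate observed in a completed step feeds the time estimate for the next one. A sketch with illustrative units and numbers (mm³ and minutes; the values are examples, not data from the disclosure):

```python
def actual_mrr(volume_removed_mm3: float, minutes_sanded: float) -> float:
    """Material removal rate observed over a completed sanding step (mm^3 / min)."""
    return volume_removed_mm3 / minutes_sanded

def minutes_for_next_step(remaining_volume_mm3: float, mrr_mm3_per_min: float) -> float:
    """Estimated sanding time for the next step at the observed removal rate."""
    return remaining_volume_mm3 / mrr_mm3_per_min

# Illustrative numbers: 3000 mm^3 removed in 10 minutes, 600 mm^3 still to remove.
rate = actual_mrr(3000.0, 10.0)                 # 300.0 mm^3/min observed
estimate = minutes_for_next_step(600.0, rate)   # 2.0 minutes at that rate
```

When the observed rate falls short of the predicted one (e.g. due to abrasive wear), the same arithmetic stretches the remaining-time estimate accordingly.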
  • a sanding evaluator may also, using known heat capacities of materials - e.g. metal, body filler, etc., generate a heat map 944 of work done, or ongoing work. The heat map may be useful for generating and providing an alert to a user, either audiovisual or haptic feedback, that burn-through is likely.
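One way such a heat map could be derived is a lumped-capacitance estimate: energy delivered into a region divided by its thermal mass gives a temperature rise, ΔT = P·t / (m·c), and an alert fires when a threshold is crossed. The power, mass, specific-heat and temperature values below are illustrative assumptions, not measured data:

```python
def temperature_rise(power_w: float, seconds: float,
                     mass_kg: float, specific_heat_j_per_kg_k: float) -> float:
    """Lumped estimate of local temperature rise: dT = P*t / (m*c).
    Ignores conduction and airflow, so it over-estimates (conservative for alerting)."""
    return (power_w * seconds) / (mass_kg * specific_heat_j_per_kg_k)

def burn_through_likely(current_temp_c: float, power_w: float, seconds: float,
                        mass_kg: float, specific_heat_j_per_kg_k: float,
                        limit_c: float) -> bool:
    """True if continuing to sand for `seconds` would push the spot past the limit."""
    projected = current_temp_c + temperature_rise(
        power_w, seconds, mass_kg, specific_heat_j_per_kg_k)
    return projected >= limit_c
```

For example, a 50 g patch absorbing 90 W for 20 s at an assumed specific heat of 900 J/kg·K projects a 40 °C rise; whether that trips the alert depends on the configured limit.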
  • Sanding evaluator 940 may have other functionality 946 as well.
  • Digital defect tracking system 900 also includes a GUI generator 972, which generates interfaces for a user to use to interact with system 900.
  • An I/O component may receive or output feedback for a user.
  • a defect overlay generator 978 may generate an overlay to display either over an image of a space (as in an AR-enabled embodiment), with the image of a space (as on a user interface of a computing device), or projected onto the space using a projector.
  • a GUI communicator 978 may communicate the generated user interface to a device for projection or display.
  • System 900 may also have other functionality 916.
  • Surface inspection systems have been described herein that include image capturing devices, such as a camera, one or more light sources, distant sensors, etc.
  • Systems and methods herein describe components for managing and executing capture of said images, and processing said images to obtain defect characterization information, and surface characterization information.
  • Systems and methods herein have been described that can store and retrieve captured images, image metadata, defect detection and characterization results, and manipulate said information to generate or improve a repair strategy.
  • Systems and methods are described herein that include devices for presenting information about a surface, including mobile computing devices with displays, augmented reality- enabled devices, and projection systems.
  • Systems described herein are expressly contemplated to be interoperable with input/output components. Based on received inputs, systems described herein are configured to change presented information in real-time.
  • Systems and methods herein enable coordination of surface repair operations in a digital system - recording defects, tracking defect repair processes, providing feedback or instruction during the repair, etc.
  • a surface imaging system herein may be useful for other specular surfaces, for example imaging a surface pre- and post-adhesive application.
  • FIG. 10 is a block diagram of a repair strategy generation architecture.
  • the remote server architecture 1000 illustrates one embodiment of an implementation of a repair strategy generator 1010.
  • remote server architecture 1000 can provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services.
  • remote servers can deliver the services over a wide area network, such as the internet, using appropriate protocols.
  • remote servers can deliver applications over a wide area network and they can be accessed through a web browser or any other computing component.
  • Software or components shown or described in FIGS. 1-10 as well as the corresponding data, can be stored on servers at a remote location.
  • the computing resources in a remote server environment can be consolidated at a remote data center location or they can be dispersed.
  • Remote server infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user.
  • the components and functions described herein can be provided from a remote server at a remote location using a remote server architecture.
  • they can be provided by a conventional server, installed on client devices directly, or in other ways.
  • a user may interact with system 810 using a user interface 1022.
  • FIG. 10 specifically shows that a digital defect tracking system can be located at a remote server location 1002. Therefore, computing device 1020 accesses those systems through remote server location 1002. Sensing system 1050 can use computing device 1020 to access user interfaces 1022 as well.
  • FIG. 10 also depicts another example of a remote server architecture.
  • FIG. 10 shows that it is also contemplated that some elements of systems described herein are disposed at remote server location 1002 while others are not.
  • storage 1030, 1040 or 1060 or sensing systems 1050 can be disposed at a location separate from location 1002 and accessed through the remote server at location 1002. Regardless of where they are located, they can be accessed directly by computing device 1020, through a network (either a wide area network or a local area network), hosted at a remote site by a service, provided as a service, or accessed by a connection service that resides in a remote location.
  • the data can be stored in substantially any location and intermittently accessed by, or forwarded to, interested parties.
  • physical carriers can be used instead of, or in addition to, electromagnetic wave carriers.
  • FIGS. 11-12 show examples of mobile devices that can be used in the embodiments shown in previous Figures.
  • FIG. 11 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that can be used as a user's or client's handheld device 1121 (e.g., as computing device 1320 in FIG. 13), in which the present system (or parts of it) can be deployed.
  • a mobile device can be deployed in the operator compartment of computing device 1120 for use in generating, processing, or displaying the data.
  • FIG. 12 is another example of a handheld or mobile device.
  • FIG. 11 provides a general block diagram of the components of a client device 1116 that can run some components shown and described herein. Client device 1116 can run some of those components, interact with them remotely, or run some and interact with others.
  • a communications link 1113 is provided that allows the handheld device to communicate with other computing devices and under some embodiments provides a channel for receiving information automatically, such as by scanning. Examples of communications link 1113 include allowing communication through one or more communication protocols, such as wireless services used to provide cellular access to a network, as well as protocols that provide local wireless connections to networks.
  • Interface 1115 and communication links 1113 communicate with a processor 1117 along a bus 1119 that is also connected to memory 1121 and input/output (I/O) components 1123, as well as clock 1125 and location system 1127.
  • I/O components 1123 are provided to facilitate input and output operations, and the device 1116 can include input components such as buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers and orientation sensors, and output components such as a display device, a speaker, and/or a printer port. Other I/O components 1123 can be used as well.
  • Clock 1125 illustratively comprises a real time clock component that outputs a time and date. It can also provide timing functions for processor 1117.
  • location system 1127 includes a component that outputs a current geographical location of device 1116.
  • This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
  • Memory 1121 stores operating system 1129, network settings 1131, applications 1133, application configuration settings 1135, data store 1137, communication drivers 1139, and communication configuration settings 1141.
  • Memory 1121 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below).
  • Memory 1121 stores computer readable instructions that, when executed by processor 1117, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 1117 can be activated by other components to facilitate their functionality as well.
  • FIG. 12 shows that the device can be a smart phone 1271.
  • Smart phone 1271 has a touch sensitive display 1273 that displays icons or tiles or other user input mechanisms 1275.
  • Mechanisms 1275 can be used by a user to run applications, make calls, perform data transfer operations, etc.
  • smart phone 1271 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.
  • FIG. 13 is a block diagram of a computing environment that can be used in embodiments shown in previous Figures.
  • FIG. 13 is one example of a computing environment in which elements of systems and methods described herein, or parts of them (for example), can be deployed.
  • an example system for implementing some embodiments includes a general-purpose computing device in the form of a computer 1310.
  • Components of computer 1310 may include, but are not limited to, a processing unit 1320 (which can comprise a processor), a system memory 1330, and a system bus 1321 that couples various system components including the system memory to the processing unit 1320.
  • the system bus 1321 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. Memory and programs described with respect to systems and methods described herein can be deployed in corresponding portions of FIG. 13.
  • Computer 1310 typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 1310 and includes both volatile/nonvolatile media and removable/non-removable media.
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile/nonvolatile and removable/non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 1310.
  • Communication media may embody computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • the system memory 1330 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 1331 and random access memory (RAM) 1332.
  • A basic input/output system 1333 (BIOS), containing the basic routines that help to transfer information between elements within computer 1310, such as during start-up, is typically stored in ROM 1331.
  • RAM 1332 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 1320.
  • FIG. 13 illustrates operating system 1334, application programs 1335, other program modules 1336, and program data 1337.
  • the computer 1310 may also include other removable/non-removable and volatile/nonvolatile computer storage media.
  • FIG. 13 illustrates a hard disk drive 1341 that reads from or writes to non-removable, nonvolatile magnetic media, nonvolatile magnetic disk 1352, an optical disk drive 1355, and nonvolatile optical disk 1356.
  • the hard disk drive 1341 is typically connected to the system bus 1321 through a non-removable memory interface such as interface 1340.
  • optical disk drive 1355 is typically connected to the system bus 1321 by a removable memory interface, such as interface 1350.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (e.g., ASICs), Application-specific Standard Products (e.g., ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
  • the drives and their associated computer storage media discussed above and illustrated in FIG. 13, provide storage of computer readable instructions, data structures, program modules and other data for the computer 1310.
  • hard disk drive 1341 is illustrated as storing operating system 1344, application programs 1345, other program modules 1346, and program data 1347. Note that these components can either be the same as or different from operating system 1334, application programs 1335, other program modules 1336, and program data 1337.
  • a user may enter commands and information into the computer 1310 through input devices such as a keyboard 1362, a microphone 1363, and a pointing device 1361, such as a mouse, trackball or touch pad.
  • Other input devices may include a joystick, game pad, satellite receiver, scanner, or the like.
  • These and other input devices are often connected to the processing unit 1320 through a user input interface 1360 that is coupled to the system bus, but may be connected by other interface and bus structures.
  • a visual display 1391 or other type of display device is also connected to the system bus 1321 via an interface, such as a video interface 1390.
  • computers may also include other peripheral output devices such as speakers 1397 and printer 1396, which may be connected through an output peripheral interface 1395.
  • the computer 1310 is operated in a networked environment using logical connections, such as a Local Area Network (LAN) or Wide Area Network (WAN), to one or more remote computers, such as a remote computer 1380.
  • the computer 1310 When used in a LAN networking environment, the computer 1310 is connected to the LAN 1371 through a network interface or adapter 1370. When used in a WAN networking environment, the computer 1310 typically includes a modem 1372 or other means for establishing communications over the WAN 1373, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device. FIG. 13 illustrates, for example, that remote application programs 1385 can reside on remote computer 1380.
  • a defect tracking system includes an image capturing device configured to capture an image of a surface, a defect indicia receiver configured to receive a defect indication, wherein the defect indication indicates a defect on the surface, a defect mapper configured to map the defect to the surface, a user interface generator configured to generate a defect map to show the detected defect, and a display component configured to display the defect map.
  • the system may be implemented such that the captured image includes the defect indication.
  • the system may be implemented such that the defect indication is detected using an image analyzer, the image analyzer being configured to identify the defect indication.
  • the system may be implemented such that the defect indicia receiver receives a user indication of the defect indication.
  • the system may be implemented such that the user indication is received through a touchscreen.
  • the system may be implemented such that the captured image includes an image captured by an augmented reality-enabled device, and wherein the defect indication is received through the augmented reality-enabled device.
  • the system may be implemented such that the defect indication includes the user touching the surface at or near the defect.
  • the system may be implemented such that the display component includes a projection system that projects the updated defect map on the surface.
  • the system may be implemented such that the defect indication is received by a user indication receiver.
  • the system may be implemented such that the user indication receiver includes the image capturing device.
  • the system may be implemented such that the image capturing device is a first image capturing device and wherein the user indication receiver is a second image capturing device.
  • the system may be implemented such that the user indication receiver detects a gesture of the user indicative of a detected defect.
  • the system may be implemented such that the user indication receiver is a microphone, and wherein the defect indication is an audible defect indication.
  • the system may be implemented such that the defect indication includes a defect type, a defect location, a defect status or a surface contour at the defect.
  • the system may be implemented such that the defect indication includes a location relative to the surface.
  • the system may be implemented such that the defect indication includes a location on the surface.
  • the system may be implemented such that the defect location includes a defect area on the surface.
  • the system may be implemented such that it includes a surface map retriever, wherein the surface map includes a three-dimensional (3D) model of the surface, and wherein the location includes a coordinate set corresponding to a point on the 3D model.
  • the system may be implemented such that it includes a registration system configured to, based on an environment image, recognize an icon in the environment, and, based on the icon, generate a registration map.
  • the system may be implemented such that it includes a surface mapper that is configured to, based on the defect map and the registration map, generate a surface map that includes the icon and the detected defect.
  • the system may be implemented such that it includes a repair plan generator that, based on the defect indication, generates a repair plan for removing the detected defect.
  • the system may be implemented such that it includes a light source.
  • the system may be implemented such that it includes a light source controller configured to control a position and orientation of the light source.
  • the system may be implemented such that it includes a movement mechanism.
  • the system may be implemented such that it includes an image capture device movement mechanism.
  • the system may be implemented such that the defect map is a second defect map, and wherein the second defect map is generated by updating a first defect map to include the mapped defect location.
  • a system for mapping surface defects on a vehicle includes a device configured to present a user interface, an environment image capture device configured to capture an image of an environment, a user interface generator configured to generate the user interface, wherein the user interface includes an input/output component configured to receive a defect indication for a defect, and a defect mapper configured to, based on the defect indication, associate the defect with a position in the image of the environment.
  • the user interface generator is configured to update the user interface in response to the associated defect position.
  • the system may be implemented such that it includes a registration system configured to detect a feature in the environment image.
  • the system may be implemented such that the detected feature includes curvature of a surface including the defect.
  • the system may be implemented such that the feature is a first feature, wherein the registration system is configured to detect a second feature in the environment image, and based on the detected first and second features, a registration map generator generates a registration map.
  • the system may be implemented such that it includes a surface specification retriever configured to retrieve a 3D model of the surface, and wherein the defect mapper generates a defect map based on the 3D model and the registration map.
  • the system may be implemented such that the defect map is overlaid onto the environmental image.
  • the system may be implemented such that the registration system detects the feature using a feature-based detection algorithm.
  • the system may be implemented such that the registration system detects the feature using a trained machine learning algorithm.
  • the system may be implemented such that the feature and the defect are different.
  • the system may be implemented such that the defect indication includes a defect type, a defect location, or a defect repair status.
  • the system may be implemented such that the defect indication is a defect repair status change, and wherein the defect repair status change is a repaired indication indicating that the defect is repaired, and wherein, based on the repaired indication, the user interface generator updates the user interface to change a representation of the defect.
  • the system may be implemented such that the changed representation includes a removal of the defect from the user interface.
  • the system may be implemented such that the changed representation includes a change in color of the defect on the user interface.
  • the system may be implemented such that the device is a mobile computing device.
  • the system may be implemented such that the device includes an image capturing device.
  • the system may be implemented such that the image capturing device is configured to take a surface image of the surface.
  • the system may be implemented such that the defect indication is detected from the surface image.
  • the system may be implemented such that the user interface is a graphical user interface and wherein the input/output component is configured to receive the defect indication through the graphical user interface.
  • the system may be implemented such that the device is an augmented-reality enabled device.
  • the system may be implemented such that the device is a projection system.
  • the system may be implemented such that the projection system includes a camera that captures the environmental image.
  • the system may be implemented such that the projection system includes a light source with a movement mechanism configured to change a position or orientation of the light source.
  • a sanding parameter generation system includes a target contour retriever that is configured to retrieve a target contour for a vehicle surface, a current contour retriever that is configured to retrieve a current contour for the vehicle surface, an image capturing device that is configured to capture an image of the vehicle surface, a sanding parameter generator that is configured to, based on a difference of the target contour and the current contour, and the captured image, generate a parameter for a sanding operation on the vehicle surface, and a communication component configured to communicate the generated parameter.
  • the system may be implemented such that it includes an image analyzer configured to determine, based on the captured image of the vehicle surface, a contact time between a tool and the vehicle surface, a work map generator configured to generate a work map for the surface, a burn-through indication generator configured to generate a burn-through indication based on the work map, and a graphical user interface generator configured to generate a user interface including the work map.
  • the system may be implemented such that the captured image is a series of captured images, wherein each of the series of captured images is associated with a timestamp.
  • the system may be implemented such that the generated work map is communicated to a user interface generator.
  • the system may be implemented such that the burn-through indication is communicated to a user interface generator.
  • the system may be implemented such that the work map includes the burn-through indication.
  • the system may be implemented such that the burn-through indication indicates that more than 50% of a coating layer on the surface has been removed.
  • the system may be implemented such that the burn-through indication indicates that more than 90% of a coating layer on the surface has been removed.
  • the system may be implemented such that the generated user interface is displayed on a mobile computing device.
  • the system may be implemented such that the generated user interface is displayed on a transparent surface by an augmented reality enabled device.
  • the system may be implemented such that the generated user interface is provided to a projection system which projects the user interface on a flat surface.
  • the system may be implemented such that the generated user interface is presented to a projection system which projects the user interface on the vehicle surface.
  • the system may be implemented such that the target contour is retrieved from a 3D model of a vehicle including the vehicle surface.
  • the system may be implemented such that the target contour is retrieved from an image of a vehicle including the vehicle surface.
  • the system may be implemented such that the image includes an opposing side of the vehicle.
  • the system may be implemented such that it includes a material removal rate characterizer configured to calculate a material removal rate for the sanding operation.
  • the system may be implemented such that the material removal rate is calculated based on the current contour and a previously recorded contour, and wherein the material removal rate is based on a detected volume of material removed from the previously recorded contour.
  • the system may be implemented such that the material removal rate is calculated based on the current contour and a target contour.
  • the system may be implemented such that the generated sanding parameter is based on the calculated material removal rate.
  • the system may be implemented such that it includes a tool specification retriever configured to retrieve a tool indication for the sanding operation.
  • the system may be implemented such that the tool indication includes a rotational speed.
  • the system may be implemented such that the tool indication includes an orbital rotational speed.
  • the system may be implemented such that the tool indication includes a random-orbital rotational speed.
  • the system may be implemented such that the tool indication includes a force applied.
  • the system may be implemented such that it includes a light source.
  • the system may be implemented such that it includes a light source movement mechanism configured to change a position or orientation of the light source.
  • the system may be implemented such that it includes a light source controller configured to change a color or intensity of the light source.
  • a method of repairing a dent in a vehicle includes generating a target contour for an area of the vehicle including the dent, capturing an image of the area of the vehicle using an image capture device, based on the image, generating a current contour for the area, and, based on the image, generating a contour differential for the area.
  • the contour differential includes a location of a filler material excess for the area.
  • the method further includes generating a sanding parameter for a sanding operation on the area, using a sanding parameter generator, wherein the sanding parameter is based on the contour differential, and generating a user interface, using a user interface generator, including the sanding parameter and displaying the user interface.
  • the method may be implemented such that it includes displaying the user interface using a projection system.
  • the method may be implemented such that the projection system includes a projector and the image capturing device.
  • the method may be implemented such that the user interface is projected onto a surface of the vehicle.
  • the method may be implemented such that the surface includes the area.
  • the method may be implemented such that the projection system includes a projector movement mechanism configured to change a position or an orientation of the projector.
  • the method may be implemented such that the projection system includes an image capturing device movement mechanism configured to change a position or orientation of the image capturing device.
  • the method may be implemented such that it includes displaying the user interface on a device with a screen.
  • the method may be implemented such that the device is an augmented reality enabled device.
  • the method may be implemented such that it includes generating a material removal rate for the sanding operation.
  • the method may be implemented such that the material removal rate is generated based on a difference between the current contour and a previously captured contour.
  • the method may be implemented such that the material removal rate is generated based on a difference between the current contour and the target contour.
  • the method may be implemented such that it includes retrieving a tool parameter related to a tool used in the sanding operation, and wherein the material removal rate is generated based on the tool parameter.
  • the method may be implemented such that it includes retrieving an abrasive article parameter related to an abrasive article used in the sanding operation, and wherein the material removal rate is generated based on the abrasive article parameter.
  • a method of repairing a surface defect on a vehicle includes receiving an indication of the surface defect, using a defect indication retriever, receiving an image of the vehicle, from a camera, wherein the image includes the surface defect, generating a location of the surface defect on the vehicle, based on the received image, generating a defect map, using a defect mapper, the defect map including the surface defect and generating a user interface for display, the user interface including the defect map.
  • the method may be implemented such that it includes receiving an indication that a status of the surface defect has changed and updating the defect map to indicate a new status of the surface defect.
  • the method may be implemented such that the user interface is a graphical user interface, and wherein the method further includes displaying the graphical user interface on a display of a device.
  • the method may be implemented such that it includes displaying the user interface on an augmented reality-enabled device.
  • the method may be implemented such that it includes displaying the user interface using a projector.
  • the method may be implemented such that the defect indication retriever includes a camera, wherein the method further includes capturing an image of the surface defect, and wherein the surface defect indication is detected within the captured image.
  • the method may be implemented such that the indication of the surface defect includes: a defect type, a defect location, or a defect status.
  • the method may be implemented such that receiving the indication includes a camera capturing an image of a user pointing to the defect.
  • the method may be implemented such that receiving the indication includes detecting a marking on the vehicle surface.
  • the method may be implemented such that receiving the indication includes receiving, through the user interface, a user input.
  • the method may be implemented such that the location is a relative location.
  • the method may be implemented such that it includes retrieving a 3D model of the vehicle.
  • the method may be implemented such that the location includes a coordinate set corresponding to a point on the 3D model.
  • a projection system for repairing a vehicle includes a projector configured to project a user interface onto a surface, an image capture system configured to capture a first image of an area containing the vehicle and a second image of the vehicle, a light source, and a defect tracking system.
  • the defect tracking system includes a defect indication receiver configured to, based on the first or second image, detect a defect on the vehicle, a defect map generator that generates, based on the second image, a defect map including a position of the detected defect with respect to the vehicle, a user interface generator configured to generate the user interface, the user interface including the defect map and wherein the defect tracking system is configured to communicate the generated user interface to the projector.
  • the system may be implemented such that the image capture system is configured to capture a series of first images.
  • the system may be implemented such that it includes a feature detector configured to, based on the first image, detect a feature in the area and, based on the detected feature, generate a map of the area.
  • the system may be implemented such that each of the series of first images is analyzed by the feature detector.
  • the system may be implemented such that it includes a vehicle specification retriever that retrieves a 3D model of the vehicle; and wherein the generated map includes the detected defect mapped to the 3D model.
  • the system may be implemented such that the image capture system includes a camera configured to capture a second area image, wherein the system is configured to, based on an analysis of the area image, detect a defect status change.
  • the system may be implemented such that the defect mapper, based on the detected defect status change, generates a new defect map that includes the defect status change.
  • the system may be implemented such that the area image analysis is done in real-time, such that the defect map is updated in real-time.
  • the system may be implemented such that the defect is a first defect, wherein the defect map illustrates a second defect, and wherein the defect mapper generates the new defect map such that a second defect is illustrated as unchanged.
  • the system may be implemented such that a first camera captures the first image and wherein a second camera captures the second image.
  • the system may be implemented such that detecting a defect includes detecting a user pointing to a position in the first image, and analyzing the second image to detect the defect.
  • the system may be implemented such that it includes assigning a first location to the position in the first image, and a second location to the detected defect.
  • the system may be implemented such that it includes a registration system configured to detect a feature in the first image and, based on the detected feature, generate an area map; and a surface mapper that, based on the area map and the second image, generates a coordinate location for the detected defect.
  • the system may be implemented such that it includes a vehicle specification retriever configured to retrieve a 3D model of the vehicle, and wherein the coordinate location corresponds to a point on the 3D model.
  • the system may be implemented such that it includes a light source and a light source controller, wherein the light source controller is configured to, based on the first or second image, change a parameter of the light source.
  • the system may be implemented such that the parameter is a position, orientation, color or intensity.
  • the system may be implemented such that it includes a microphone configured to receive an audio signal, and wherein the system is configured to, based on the audio signal, generate a new user interface.
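The defect tracking flow recited in the claim summaries above (receive a defect indication, map it to a surface position, regenerate the displayed defect map when a status changes) can be illustrated with a minimal sketch. This is not the claimed implementation; every class, field, and value below (`DefectIndication`, `DefectMap`, the coordinate convention) is a hypothetical illustration.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the claimed defect tracking flow: a defect
# indication is received, the "defect mapper" associates it with a surface
# position, and the "user interface generator" re-renders the defect map
# when a repair status changes. All names and fields are illustrative.

@dataclass
class DefectIndication:
    defect_type: str    # e.g. "nib", "dent", "scratch"
    location: tuple     # coordinate set on the surface or a 3D model
    status: str = "open"

@dataclass
class DefectMap:
    defects: list = field(default_factory=list)

    def add(self, indication: DefectIndication) -> None:
        # defect mapper: associate the indication with a surface position
        self.defects.append(indication)

    def mark_repaired(self, index: int) -> None:
        # a status change updates the representation (e.g. color or removal)
        self.defects[index].status = "repaired"

    def render(self) -> list:
        # user interface generator: one display entry per tracked defect
        return [f"{d.defect_type}@{d.location}:{d.status}" for d in self.defects]

defect_map = DefectMap()
defect_map.add(DefectIndication("nib", (0.42, 0.17)))
defect_map.add(DefectIndication("dent", (0.8, 0.55)))
defect_map.mark_repaired(0)
print(defect_map.render())
# ['nib@(0.42, 0.17):repaired', 'dent@(0.8, 0.55):open']
```

In this sketch a repaired defect stays on the map with a changed representation, matching the claim variants in which the changed representation is a color change rather than removal from the user interface.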

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

A defect tracking system that includes an image capturing device configured to capture an image of a surface. The system also includes a defect indicia receiver that is configured to receive a defect indication. The defect indication comprises a defect on the surface. The system also includes a defect mapper configured to map the defect to the surface. The system also includes a user interface generator configured to generate a defect map to show the detected defect. The system also includes a display component configured to display the defect map.

Description

SURFACE MODIFICATION SYSTEMS AND METHODS
BACKGROUND
[0001] Many vehicles such as cars, trucks, boats and planes require replacement parts during their lifetime, either due to part failure or damage. The repair industry deals with large volumes of new and used parts during vehicle repair. Within the collision repair industry, complex repairs are done everyday within repair facilities. When incorrect, damaged, or faulty parts or materials are delivered, the repair process is put on hold until the correct parts and materials are available.
SUMMARY
[0002] A defect tracking system that includes an image capturing device configured to capture an image of a surface. The system also includes a defect indicia receiver that is configured to receive a defect indication. The defect indication comprises a defect on the surface. The system also includes a defect mapper configured to map the defect to the surface. The system also includes a user interface generator configured to generate a defect map to show the detected defect. The system also includes a display component configured to display the defect map.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
[0004] FIG. 1 is a schematic of a robotic surface modification system in which embodiments of the present invention are useful.
[0005] FIG. 2 illustrates a process for repairing a vehicle in which embodiments herein may be useful.
[0006] FIGS. 3A-3D illustrate real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein.
[0007] FIGS. 4A-4D illustrate systems for real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein.
[0008] FIGS. 5A-5D illustrate systems for accurately applying body filler to a vehicle in accordance with embodiments herein.
[0009] FIG. 6 illustrates a method for determining a repair operation in accordance with embodiments herein.
[0010] FIG. 7 illustrates an example user interface for a repair technician in accordance with embodiments herein.
[0011] FIG. 8 illustrates a digital defect tracking system implemented with a projection system in accordance with embodiments herein.
[0012] FIG. 9 illustrates a defect tracking system in accordance with embodiments herein.
[0013] FIG. 10 is a defect tracking system architecture.
[0014] FIGS. 11-12 show examples of mobile devices that can be used in the embodiments shown in previous Figures.
[0015] FIG. 13 is a block diagram of a computing environment that can be used in embodiments shown in previous Figures.
DETAILED DESCRIPTION
[0016] Within the collision repair industry, complex repairs are done every day within repair facilities. Some repairs require surface modification - e.g. removing nibs on a surface - while others require significant time and materials - like pulling or filling dents. There are significant challenges presented to technicians to complete repairs in a timely manner. As many repairs are paid for by insurance companies, which usually provide a fixed compensation per repair operation, it is important that repairs are completed correctly the first time, as insurance often only covers a first repair operation, not any additional work to correct mistakes.
[0017] Many surface defects are difficult to find and track on vehicles with specular surfaces. Additionally, once defects have been identified and their locations recorded, it often takes additional time for an operator to re-locate the defect when it comes time to conduct the repair. It is possible to physically mark defects, e.g. with a wax pen or other implement, but these markings are often removed during the repair process, making it difficult to re-locate defects during a quality check. Additionally, many repair operations require multiple passes - for example, a first sanding operation will remove a mark, making the defect harder to find when a next, finer, sanding operation (e.g. moving from 240 to 320 grit) is done. It also makes it difficult to gauge the dullness of a surface before moving to even finer grits.
[0018] In some instances, a repair may need to be redone due to actions taken during the repair. For example, burning can occur during an abrading operation if too much heat is generated. A dent repair may result in too much or too little filler material being applied, resulting in extra time taken to reapply filler or in sanding away excess material.
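The over- or under-fill outcome described above can be quantified as a differential between a current contour and a target contour, as the dent repair method later in this document also recites. Below is a minimal pure-Python sketch assuming both contours are height samples on a common grid; the tolerance, sample values, and function names are invented for illustration.

```python
# Hypothetical sketch: compare a current surface contour against the target
# contour sampled on the same grid, flagging excess filler (to be sanded
# away) and low spots (needing more filler). Values are illustrative only.

TOLERANCE_MM = 0.05  # invented acceptance band, not a process specification

def contour_differential(current, target):
    """Per-sample height differences; positive values mean excess filler."""
    return [c - t for c, t in zip(current, target)]

def classify(diff, tol=TOLERANCE_MM):
    """Split sample indices into excess-filler and low-spot locations."""
    excess = [i for i, d in enumerate(diff) if d > tol]
    low = [i for i, d in enumerate(diff) if d < -tol]
    return excess, low

target = [0.0, 0.0, 0.0, 0.0, 0.0]          # desired panel profile (mm)
current = [0.0, 0.30, 0.10, -0.20, 0.02]    # measured profile after filling

diff = contour_differential(current, target)
excess, low = classify(diff)
print("excess filler at samples:", excess)  # [1, 2]
print("low spots at samples:", low)         # [3]
```

A sanding parameter generator could then target only the excess samples, while the low-spot list indicates where reapplying filler would be needed.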
[0019] Additionally, in some instances a repair needs to be done on potential replacement parts themselves, e.g. due to damage during transport, rust removal, etc. While replacement parts can often be returned, the time taken to return and receive a new replacement part may be outside a timeframe for a given repair. Therefore, a repair shop may deem it worthwhile to repair a replacement part instead of returning it. Systems and methods herein increase efficiency in the repair process by making it easier to locate and track defects and generate repair parameters based on contextual information, e.g. abrasive articles on hand, defect details, etc.
[0020] As used herein, the term “vehicle” is intended to cover a broad range of mobile structures that receive at least one coat of paint and/or clear coat during manufacturing. While many examples herein concern automobiles, it is expressly contemplated that methods and systems described herein are also applicable to trucks, trains, boats (with or without motors), airplanes, helicopters, motorcycles, etc.
[0021] The term “paint” is used herein to refer broadly to any of the various layers of e-coat, filler, primer, paint, clear coat, etc. of the vehicle that have been applied in the finishing process. Additionally, the term “paint repair” involves locating and repairing any visual artifacts (defects) on or within any of the paint layers. In some embodiments, systems and methods described herein use clear coat as the target paint repair layer. However, the systems and methods presented apply to any particular paint layer (e-coat, filler, primer, paint, clear coat, etc.) with little to no modification.
[0022] As used herein, the term “defect” refers to an area on a worksurface containing an imperfection requiring removal or repair. Defects may include any item that interrupts the visual aesthetic. For example, many vehicles have specular, or reflective, surfaces that may appear shiny or metallic after painting is completed. A “defect” can include debris trapped within one or more of the various paint layers on the work surface. Defects can also include smudges in the paint, excess paint including smears or dripping, as well as dents or scratches.
[0023] As used herein, the term “real-time” refers to data that is processed within milliseconds so that it is available virtually immediately as feedback. While some delays due to processing are inevitable, “real-time” is intended to cover systems and methods where data can be collected or entered and a user can then interact with it without noticeable delay. E.g. a user may make a data entry into a system, and the data entry is then substantially immediately available for viewing or editing.
[0024] FIG. 1 is a schematic of a vehicle repair environment in which embodiments of the present invention are useful. Environment 100 includes a lighting system 110 which may be controlled by a motion controller 112, which may receive instructions from one or more application controllers 150. The application controller may receive input, or provide output, to a user interface 160. In some embodiments, lighting system 110 also includes an inspection system that images the surface of a vehicle 130.
[0025] The first of the two main challenges, inspection of vehicle 130, is interesting due to the nature of the underlying problem domain. In general, the surface of interest is very large in comparison to the defects themselves, with the difference being multiple orders of magnitude. Additionally, each paint layer of the finishing process (e-coat, primer, paint, clear coat, etc.) differs in its visual appearance, with specularity being particularly noteworthy. Highly specular surfaces (i.e., high-gloss or highly reflective surfaces) pose unique imaging challenges. Together, these issues make inspection and location of small defects on a surface difficult.
[0026] FIG. 2 illustrates a process for repairing a vehicle in which embodiments herein may be useful. Process 200 may be completed by a repair technician as part of a vehicle intake, for example, or by a repair technician during a repair operation.
[0027] In block 210, defects requiring repair are detected. Due to the specular nature of many vehicle surfaces, it is often required to have a particular lighting setup. For instance, fluorescent lights are advantageous as they create a sharp line in the reflection. A smooth surface will cause the line in the reflection to be free of sharp, localized deviations along its length. When this line traverses a defect (e.g. either by moving the light source or a relative position of the operator with respect to the light source), the reflection will deviate according to the high curvature in the defect, aiding in detection. This is the principle behind structured light and deflectometry.
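The reflected-line principle described above can be sketched numerically: along a smooth surface, the reflected line's tracked position varies slowly from column to column, while a defect's high curvature produces a sharp local deviation. The sketch below is a pure-Python illustration, not part of the disclosed system; the window size, threshold, and sample positions are invented.

```python
# Hypothetical sketch of the reflected-line principle from block 210: the
# vertical position of a reflected fluorescent line is tracked column-by-
# column, and a sharp deviation from its local neighborhood suggests high
# curvature, i.e. a defect. Window and threshold values are invented.

def detect_deviations(line_positions, window=2, threshold=1.5):
    """Flag column indices where the line deviates sharply from neighbors."""
    flagged = []
    n = len(line_positions)
    for i, y in enumerate(line_positions):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighbors = [line_positions[j] for j in range(lo, hi) if j != i]
        local_avg = sum(neighbors) / len(neighbors)
        if abs(y - local_avg) > threshold:
            flagged.append(i)
    return flagged

# A smooth reflection except for a sharp jump at column 5 (e.g. a nib).
positions = [10.0, 10.1, 10.2, 10.2, 10.3, 14.0, 10.4, 10.4, 10.5, 10.6]
print(detect_deviations(positions))  # [5]
```

A production deflectometry system would of course work on full camera images and calibrated geometry; this only illustrates why a sharp line makes small, high-curvature defects stand out.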
[0028] A repair technician, or an inspection system, may check an entire vehicle surface for defects to be repaired, and may create a list of defects for later repair.
[0029] In block 212, for each detected defect, a repair option is selected. For example, depending on a location and size of a detected nib, a particular abrasive operation may be recommended - e.g. a particular abrasive grit for a sanding step, a particular polish for a polishing step, and a contact time for each step.
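The selection step in block 212 can be thought of as a lookup from defect attributes to a repair recipe. The sketch below illustrates that idea only; the grit sequences, contact times, and size thresholds are invented placeholders, not recommended process parameters from this disclosure.

```python
# Hypothetical sketch of block 212: map a detected defect's type and size
# to a repair recipe (operation, abrasive/material, contact time in
# seconds). All recipe values are invented for illustration.

def select_repair(defect_type, size_mm):
    if defect_type == "nib" and size_mm < 1.0:
        return {"steps": [("sand", "1500 grit", 3),
                          ("polish", "finishing compound", 10)]}
    if defect_type == "scratch":
        return {"steps": [("sand", "240 grit", 15),
                          ("sand", "320 grit", 10),
                          ("polish", "rubbing compound", 20)]}
    if defect_type == "dent":
        return {"steps": [("pull", None, 0),
                          ("fill", "body filler", 0),
                          ("sand", "180 grit", 30)]}
    return {"steps": []}  # unknown defect: flag for manual review

recipe = select_repair("nib", 0.5)
print(len(recipe["steps"]))  # 2
```

In the disclosed system this mapping could additionally be conditioned on contextual information such as the abrasive articles on hand, as paragraph [0019] suggests.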
[0030] In block 214, a particular defect is located on the vehicle surface. For example, the list may include coordinates for the defect that can be located using a 3D model of the vehicle surface. The list may include descriptions - e.g. hood defect near driver side door. The list may also use other suitable location identifying information, such as alphanumeric identifiers.
[0031] Once a defect has been located, the selected repair can be completed. For example, for dents, it may be necessary to add material 222, e.g. filler material, remove material 224, e.g. by sanding the surface, or perform another operation 226, such as pulling a dent from a surface.
[0032] After a defect has been repaired, the surface around the defect may be quality checked, as indicated in block 216. This may be a visual check by a technician or based on an image taken by the technician.
[0033] In block 218, a technician may then proceed to repair additional defects by repeating the operations in blocks 212-216. If no additional defects are present, a final quality check may be completed on the vehicle, as indicated in block 230.
[0034] As illustrated in FIG. 2, a repair technician may move through a series of defect repairs between an initial detection and a quality check. Because it can be difficult to locate defects on a surface, it may be preferred to repair defects using an iterative approach: locate a first defect 214, select a repair for the defect 212, conduct the repair - adding material 222, removing material 224, or other steps 226 - and then move along the surface to locate a second defect 214. This may avoid the time lost by first locating all defects, as illustrated in block 210, and then having to re-locate each defect, as illustrated in block 214, for each repair.
[0035] Additionally, finding defects after a repair can also present issues. Once a surface is planarized with e.g. 1200 grit, the abraded area is matte, which self-identifies the spot for future operations (refinement, polishing). However, this can be inefficient as it is not always clear which specific areas the user has already examined, and, given a panel with many matte sanding areas, there is no way to indicate that all spaces between these spots have been inspected and found to be defect free.
[0036] In some embodiments, a post repair inspection is done using an improved lighting setup. For many repair shops, positive ratings by insurance companies are key to reputation and future business. Ensuring that the vehicle is clean, free of repair material, repair-induced defects or incompletely repaired defects makes a positive rating more likely. Some post-review items for a repair shop include: dust on external panels, buffing compound in door jambs or in gaskets/moldings, upholstery burns from weld spatter, dust on interior surfaces (dashboards, seats, etc.), masking tape on the vehicle (including wheel wells and under the vehicle), overspray (paint that is indirectly deposited with a dust-like appearance) on adjacent panels and/or glass surfaces, garbage cleaned from vehicle (shop will dispose of things like fast food bags), any old parts removed from back seat, floor boards, trunk (shop sometimes stores removed parts in the vehicle after removing them), or grease or dirt on upholstery. Many of these tasks are difficult to confirm by a computer vision system due to different vehicle colors, shapes, items around the vehicle, specular surfaces, etc.
[0037] In some embodiments, a vehicle is moved from a repair area into a quality check area. The quality check area may be a space with consistent lighting in some embodiments. The quality check area may include a structured or repetitive pattern used for feature detection. The quality check area may include a reference image or backdrop.
[0038] In some embodiments, the quality check area includes a tent, for example a tent with a gridline (or other repetitive pattern) and consistent lighting for feature detection or curvature. Lighting may be adjustable, or positioned to amplify the appearance of dust, or may cycle through different colors to make buffing compound or masking tape stand out from the surroundings.
[0039] In some embodiments, a quality check area is separate from a repair station for a vehicle, such that a repair technician, once a repair is done, moves the vehicle into the quality control station. While imaging is done and analyzed, the technician may be working on a second vehicle that has been moved into the repair station. When the imaging is finished, the technician may receive an indication of areas that need to be re-repaired.
[0040] A separate lighting station may also be useful for capturing defect information prior to a repair. Pre or post repair imaging may also serve as repair documentation in the event of a customer dispute.
[0041] Another potential solution is to provide a system for a technician to track defects in real-time. A digital option for marking defects would provide a map of defects needing repair, without any markings on the vehicle that can be removed. A digital defect tracking system may also assist in tracking which areas of a vehicle have been inspected for defects, as well as which defects have been repaired and which still need repairing.
[0042] In some embodiments herein, a digital defect tracking system is implemented using an image capture system. For example, the lighting system 110 may be part of an external imaging system. A technician may interact with a display coupled to the imaging system (e.g. user interface 160) to mark defects on captured images. The imaging system, or a controller (e.g. controller 150) may be able to compare captured and marked images to a 3D model of the vehicle surface to track what surfaces of the vehicle have, and have not, been checked for defects.
[0043] In some embodiments herein, a digital defect tracking system is implemented using augmented reality, e.g. such that a technician can, in real-time, mark defects they see. The technician may use an image capturing device, for example built into glasses or a face shield, such that the technician can see the vehicle surface and indicate a defect on the surface. Such a system may allow for a technician to indicate the defect location in real-time in a digital record. It may then be possible to indicate repair operations done on the indicated defect, when and by whom.
[0044] A digital defect tracking system may allow for a technician to digitally indicate defects during a full inspection of a vehicle, prior to doing any defect repairs, in some embodiments. In other embodiments, a technician may use such a system to repair defects as detected, while tracking what areas have and have not been inspected.
[0045] It has been noted that lighting systems are another potential solution. Automotive OEMs may use deflectometry equipment to provide sufficient light to image a surface and detect defects automatically from the images. However, deflectometry equipment is an expensive solution, and humans are generally fast and accurate at identifying defects on a surface. A digital defect tracking system, such as those discussed herein, may be useful for informing an automated defect repair unit to conduct defect repair at the technician-indicated locations. A digital defect tracking system may, using images of indicated defects - either captured using an augmented reality imaging device or a stand-alone surface imaging system - generate a 3D coordinate for a robotic system to use. The 3D coordinate may be generated using machine learning techniques, in some embodiments. In some embodiments, the 3D coordinate is generated using a fiducial marker and multiple cameras to triangulate the position. In some embodiments, a 3D model of a vehicle is provided - e.g. from a manufacturer or other source - and the digital defect tracking system indicates the defect by comparing the defect indication to the 3D model. In some scenarios a vehicle is only damaged on one side (e.g. only driver's or passenger's). The non-damaged side may also be used to evaluate the repair needs of the damaged side.
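The triangulation step described above can be sketched in a few lines. This is a minimal illustration, not the disclosed implementation: the ray origins and directions are assumed inputs that would, in practice, come from calibrated camera poses and the pixel location of the fiducial marker in each camera image. The 3D coordinate is taken as the midpoint of the closest points between the two viewing rays.

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest points between two viewing rays.

    o1, o2 -- ray origins (camera centers), 3-tuples (assumed known from calibration)
    d1, d2 -- ray directions toward the fiducial, 3-tuples
    """
    w0 = tuple(a - b for a, b in zip(o1, o2))
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; cannot triangulate")
    s = (b * e - c * d) / denom  # parameter along ray 1
    t = (a * e - b * d) / denom  # parameter along ray 2
    p1 = tuple(o + s * di for o, di in zip(o1, d1))
    p2 = tuple(o + t * di for o, di in zip(o2, d2))
    # With noisy real cameras the rays rarely intersect exactly; the midpoint
    # of the closest approach is a common estimate of the marker position.
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

For example, rays from (0, 0, 0) along (1, 2, 0) and from (2, 0, 0) along (-1, 2, 0) intersect at (1, 2, 0), which `triangulate` returns.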
[0046] FIGS. 3A-3D illustrate real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein. Such a system may increase efficiency by allowing for repair technicians to keep track of which areas have and have not been inspected, where and what type of defects need to be repaired, and what defects have been repaired already.
[0047] FIG. 3 A illustrates a schematic of a vehicle 300 with several defects. A user of a digital defect tracking system may be able to, in real-time, record the location of defects as they are noted. A repair technician may first notice defect 302 which may, for example, be a nib. The technician might then move toward the driver’s side door and notice defect 304 which may be, for example, a paint smear. The technician may then notice a third defect 306.
[0048] FIG. 3B illustrates an interactive device 320, which displays a user interface through which a repair technician may interact with a defect detection and tracking system. The defect detection and tracking system may include information 322 about a vehicle 300. The repair technician may use interface 320 to indicate defects 310 as they are discovered, and enter defect details, as indicated by identified defects 330. While defect IDs and type are indicated for each detected defect, it is expressly contemplated that additional information may be displayed or otherwise accessible, such as a repair plan selected for each defect, or, if the defect has been repaired, details about the repair such as who completed the repair, when, what materials were used, etc. Defect IDs may also indicate which defects are pre-existing vs. accident-related damage. A technician may also be able to add new defects, as indicated by block 324, or mark repairs for specific defects complete, as indicated by block 326. While some functionality is illustrated in FIG. 3B, it is expressly contemplated that a digital defect tracking system may include other functionality as described in embodiments herein.
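A defect record of the kind shown as identified defects 330 can be sketched as a simple data structure. This is a hedged illustration: the field names (`defect_id`, `defect_type`, `status`, `technician`) and the two-state status model are assumptions for the sketch, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: str
    defect_type: str           # e.g. "nib", "paint smear", "dent"
    location: str              # e.g. "hood, driver side"
    preexisting: bool = False  # pre-existing vs. accident-related damage
    status: str = "open"       # "open" or "repaired"
    technician: str = ""       # who completed the repair, once marked complete

class DefectTracker:
    """Tracks defects as they are entered (block 324) and completed (block 326)."""

    def __init__(self):
        self.defects = {}

    def add(self, defect):
        self.defects[defect.defect_id] = defect

    def mark_complete(self, defect_id, technician):
        d = self.defects[defect_id]
        d.status = "repaired"
        d.technician = technician

    def remaining(self):
        return [d for d in self.defects.values() if d.status == "open"]
```

A repair plan, materials used, and timestamps could be added as further fields without changing the overall shape of the record.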
[0049] FIGS. 3C and 3D illustrate example user interfaces that a user of a digital defect tracking system may interact with. FIG. 3C illustrates a number of defects 340 that have been detected by a technician and entered into the digital defect tracking system. Also indicated is an uninspected zone 350. Defect detection and tracking systems in accordance with embodiments herein may be able to track zones of a vehicle that have, and have not, already been inspected in addition to defects detected and repaired. FIG. 3D illustrates another user interface that may be presented later in a vehicle repair. FIG. 3D illustrates vehicle 300 with several repaired defects 360, and one yet-to-be repaired defect 370. Area 350 from FIG. 3C is no longer indicated as not inspected. Inspected area 352 is now indicated as having no defects.
[0050] FIGS 3B-3D illustrate user interfaces that a repair technician may interact with on a mobile device. However, a user may interact with a digital defect tracking system using other suitable interfaces, such as an augmented reality interface, or through another image capturing system that allows for a user to indicate where, either on an image or 3D model, a defect is located. Defect locations indicated on an image may be correlated with a physical location on a vehicle. For example, the defect detection and tracking system may access a database of 3D models of vehicles, for example provided by vehicle manufacturers or another source. Using augmented reality techniques, systems and methods herein may correlate movements of a user with a coordinate on the 3D model. In other embodiments, images of a vehicle with an indicated defect may be captured. The position of the defect may be determined from the images using machine learning or other suitable techniques.
[0051] A repair technician may indicate a defect location using any suitable manner. For example, a user may point to a location on a vehicle and make an audio or haptic indication that an augmented reality system can pick up, in some embodiments. In other embodiments, a technician may capture an image, using a suitable image capture device, and indicate a defect location on the image, for example, using a touchscreen. In other embodiments, a technician may place a marker on a vehicle which is then captured by an image capturing device. An image analyzer may then associate that indicated defect position with a 3D coordinate relative to the vehicle.
[0052] FIGS. 4A-4D illustrate systems for real-time marking and tracking of defects during a vehicle repair in accordance with embodiments herein. FIGS. 3A-3D illustrate some user interfaces that a user may interact with, generated by a digital defect tracking system. FIGS. 4A-4B illustrate some examples of devices in which embodiments of a digital defect tracking system may either reside or be accessed through. FIG. 4A illustrates a schematic 400 of a repair technician 410 wearing augmented reality enabled glasses 420. Glasses 420 may include, or interact with, a camera or other image capturing device.
[0053] Glasses 420 include one or more processors that interact with a registration system, which generates icons corresponding to real points in the physical world, such as points on a vehicle undergoing repair. For example, a point where a first curve on a vehicle intersects a second curve may be useful for the registration system as guidance for where an indicated defect is located in three dimensional space. The registration system may, using the identified points, generate a map of a space. The registration system may also correlate the detected defect locations with the generated map, such that they can be overlaid over a view 422 for user 410.
[0054] An augmented reality system, such as glasses 420, either includes or has access to an augmented reality generator, implemented using the one or more processors, that integrates images captured by the image capturing device with the map generated by the registration system, such that information can be overlaid on the images from the image capturing device. The augmented reality generator provides the combined image presented to a user on a display, e.g. projected onto glasses 420 or provided through another display.
[0055] Through glasses 420, a user has a field of view 422. When a vehicle 430 needing repairs comes into field of view 422, previously marked defects 440 may be presented to user 410 using glasses 420.
[0056] While FIG. 4A illustrates an embodiment where augmented reality glasses 420 are worn by a user, it is expressly contemplated that a digital defect tracking system can be incorporated into other devices that have a glass, plastic or other suitable surface for displaying the map generated by the registration system. For example, many repair technicians may wear personal protective devices that include a transparent shield protecting a user’s eyes. For example, FIG. 4B illustrates a face shield 450. FIG. 4C illustrates a Powered Air Purifying Respirator 460. FIG. 4D illustrates safety goggles 470. Other devices, such as a welding helmet, may also be suitable.
[0057] Additionally, while augmented reality techniques are discussed herein, it is also contemplated that virtual reality systems, with an external imaging system, may be used in some embodiments herein. Other suitable technology is also envisioned.
[0058] It is noted that some augmented reality devices are designed to operate wirelessly, such that data can be stored and / or processed remotely, which may reduce power consumption requirements and allow for a device to operate longer between charges. However, it is expressly contemplated that some systems herein may include at least some local storage or analysis components. For example, a helmet, such as face shield 450, may include additional space for a power source sufficient to manage local data and / or analysis.
[0059] Using a digital defect tracking system, such as that described in FIGS. 3A-3D and / or FIGS. 4A-4D, reduces the difficulty in relocating defects for a repair. Similarly, because the defect information (location, type, etc.) is stored digitally, it cannot be wiped away, sanded away, or otherwise obscured due to other repairs in the same area. A digital defect tracking system such as those described herein can also aid in confirming that an entire vehicle surface is checked for defects, and that all located defects are repaired.
[0060] While defects are illustrated herein as small defects on a surface, it is also expressly contemplated that systems herein can be used to store relevant information for defects of larger sizes as well. For larger dents, for example, relevant information may include the amount of body filler needed for a repair, which correlates to the amount paid for defect repair. For example, a volume deviation from normal could be measured. This would help to provide consistent quantification. A volume deviation may be measured in any suitable way, using a suitable dent measuring tool, such as the Dent Viewer MD from Collision Edge™ Innovative Repair Systems, or the Dentstick, available from Dentstick. For example, a size and / or curvature and / or depth of the dent may be measured.

[0061] FIGS. 5A-5D illustrate systems for accurately applying body filler to a vehicle in accordance with embodiments herein. Some defects require application of a filler material or patch material to recreate an original curvature. Applying filler material, especially where a curvature needs to match (e.g. to maintain symmetry of a vehicle), is a blend of art and science. Filler material is applied in a wet form, and then sanded down after drying. Before filler is applied, a surface is sanded down to bare metal (e.g. all paint is removed). The filler material is a two-part material, mixed to a required ratio, and applied with tools - spatulas, spreaders or other tools - often by hand. When dry, body filler material is very hard. The goal of a repair technician is to apply enough filler material so that the target contour is achieved, without being excessive, which will require more time and material to sand down.
[0062] FIG. 5A illustrates an example process of applying body filler. Illustration 500 illustrates a first step where a surface 504, for example dented in a collision, needs to be restored to a target contour 502. Illustration 510 illustrates an intermediate point in the repair process where material has been applied such that a contour 506 is formed. Contour 506, ideally, is applied such that additional material is not needed, e.g. contour 506 should not be less than target contour 502. After a material removal step, the target contour 508 is achieved, as indicated in illustration 520.
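The comparison between an applied contour 506 and a target contour 502 can be sketched as a per-sample height check. This is a hedged illustration under assumed inputs: heights are hypothetical per-sample values (e.g. mm above a reference plane), and the tolerance is an assumed parameter, not a value from the disclosure.

```python
def fill_status(current, target, tol=0.05):
    """Classify each contour sample against the target.

    current -- sampled heights of the in-process contour (e.g. contour 506)
    target  -- sampled heights of the target contour (e.g. contour 502)
    tol     -- assumed tolerance band within which a sample is acceptable

    Returns (needs_filler, needs_sanding): sample indices below and above target.
    """
    needs_filler, needs_sanding = [], []
    for i, (cur, tgt) in enumerate(zip(current, target)):
        if cur < tgt - tol:
            needs_filler.append(i)   # contour below target: add material
        elif cur > tgt + tol:
            needs_sanding.append(i)  # contour above target: remove material
    return needs_filler, needs_sanding
```

This reflects the constraint in the text that contour 506 should not fall below target contour 502: any sample in `needs_filler` indicates under-application before sanding begins.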
[0063] Further complicating the problem, applying and sanding body filler leaves a range of surface textures and heat capacities, ranging from smooth, undamaged parts of a panel with clear coat, a scuffed clear coat portion feathered to bare metal, and a central zone of filler at a variety of heights. Currently, a curve is validated visually and by physical touch to detect deviations, flatness, etc. However, because of the variation in surface texture and heat capacity, “touch” can be misleading. Additionally, because a filler material is matte, it is difficult to visually confirm a curve because the eye can perceive curvature differently on a matte surface than on a specular surface. Therefore, in some cases a curve cannot be verified until paint is applied and the uniform glossy coat can be examined.
[0064] The process of repairing dents and verifying contours and curvature on filled surfaces can be greatly improved by using vision systems such as those described herein.
[0065] FIG. 5B illustrates a vehicle 530 where a defect area 532 has been indicated, for example using a digital defect tracking system. The digital defect tracking system may retrieve a target contour, for example from a database containing a 3D model of vehicle 530 in some embodiments. In other embodiments, assuming a passenger side of vehicle 530 is relatively undamaged, the contour of a similar area on the opposite side of the car can be imaged. Based on that image, a mirror of the contour can be generated that can serve as the target contour. It may also be possible, based on the generated target contour, to provide feedback to a technician using an AR-enabled device, or using another imaging technique, as to whether or not enough filler has been applied, or if the area is being overly sanded.
[0066] Systems and methods herein can also periodically scan the in-process area, for example as illustrated in the image of FIG. 5C, to track the filler application and amount of sanding. FIG. 5C illustrates an image 540 that may be captured during a repair. A technician 542 is applying filler 546 using an applicator 544. The filler material is not completely smooth along the surface when applied. Using audio, visual or haptic feedback, systems herein can, based on the periodic image and suitable image analysis discussed below, provide feedback and / or coaching for a technician in real-time. This can be very helpful while body filler is applied to reduce the amount of time needed to sand the hardened filler material.
[0067] Image analysis techniques may also be used to provide information not readily detectable by a human technician, such as smoothness of the surface in different areas, transition areas between materials and heat capacities, as well as key geometries - e.g. sharper radiused body lines. Using either a projection system or an augmented reality system, it is also possible to provide color-coded or otherwise indicate areas of higher and lower spots in accordance with embodiments herein.
[0068] While FIGS. 5A-5C discuss the scenario where a human operator is applying body filler, it is also expressly noted that robotic repair units face similar issues. A repair robot may have additional sensors for comparing curvature (e.g. cameras, lasers, analysis of deflectometry results, etc.), but robotic systems also have their own limitations. Real-time feedback about where a current contour is relative to a target contour may be helpful in programming a robotic repair unit to sand applied body filler to, or close to, the target contour. The reference contour geometry can be fed, along with real-time information about a current contour, to a sander programmed to achieve a desired material removal rate, a desired amount of material removed or a desired volume of material removed. The sander may stop periodically for additional contour checks and reprogramming. For example, a robot may not know the specifications of an abrasive product (e.g. whether a correct product is installed and whether it is new or worn) and, therefore, not have accurate information about how much material is being removed during a sanding operation.
[0069] A digital defect tracking system may provide current and target contour information to an abrasive repair planner, which may select parameter settings for a robotic repair unit or another suitable smart tool, such that a majority of material can be removed. In at least some embodiments, a technician may need to finalize the transition areas by smoothing and feathering.
[0070] While augmented reality techniques have been discussed herein, it is noted that augmented reality devices may be cost prohibitive for some cases. FIG. 5D illustrates an example schematic 550 where an image capturing device 552 captures an image of contour 556. While an image capture device 552 is illustrated, other sensors may be suitable, such as a laser measurement system (LIDAR or other suitable system). A contour capturing device 552 may have a field of view 554, or a range of measurement 554.
[0071] As noted previously, specular surfaces are difficult to image because of reflections. Light sources in the room with image capturing device 552 can cause problems with accurately detecting a contour. Obtaining the mirror contour from the undamaged side may prove difficult. Some techniques that can be used may include a particular light source 560. For example, light source 560 may be configured to flash known light patterns such that the surface can be captured using deflectometry techniques.
[0072] It is noted that a single topography capturing device 552 and a single light source 560 are illustrated in schematic 550. However, it is expressly contemplated that multiple of either, or both, may be suitable in other embodiments. For example, a stereo pair of cameras may be able to detect a speckle pattern applied to a surface. Additionally, image capturing device 552 may have a movement mechanism (not shown) such that it can move relative to a vehicle having contour 556. In some embodiments, photogrammetry is used to obtain a topography of a vehicle.
[0073] While the example of applying body filler has been discussed herein, it is expressly contemplated that other applications may benefit from systems and methods herein - such as examining weld fillets or conducting paintless dent removal. For example, weld parameters may be specified by a manufacturer as a total number along a joint, or spacing between them. A weld can be imaged to determine whether the technician met the required number and spacing.
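The weld verification described above reduces to a count-and-spacing check against the manufacturer specification. A minimal sketch, assuming weld positions have already been measured along the joint (e.g. in mm from one end) by the imaging step; the specification values are illustrative placeholders.

```python
def check_welds(positions, min_count, max_spacing):
    """Verify measured weld positions against a manufacturer specification.

    positions   -- measured weld locations along the joint (assumed from imaging)
    min_count   -- required total number of welds along the joint
    max_spacing -- maximum allowed gap between adjacent welds

    Returns (passed, reason).
    """
    pts = sorted(positions)
    if len(pts) < min_count:
        return False, "too few welds"
    gaps = [b - a for a, b in zip(pts, pts[1:])]
    if gaps and max(gaps) > max_spacing:
        return False, "spacing exceeds specification"
    return True, "ok"
```

For example, four welds at 0, 40, 85, and 120 mm pass a spec of at least four welds no more than 50 mm apart, while two welds 60 mm apart fail the same spacing limit.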
[0074] FIG. 6 illustrates a method for determining a repair operation in accordance with embodiments herein. Method 600 may be implemented using a digital defect tracking system, such as embodiments described herein, or using another suitable system.
[0075] At block 610 a target surface for a vehicle is retrieved. For smaller defects, the target surface may simply be the existing surface free of a small dent, a nib or a smear. For larger defects, such as a large dent or an area requiring body filler or welding, a target surface may be more difficult to estimate. Retrieving a target surface, in some embodiments, includes scanning an opposite side of a vehicle to obtain a mirror contour, as indicated in block 602. In some embodiments, retrieving a target surface includes retrieving a CAD model, either provided from a manufacturer or otherwise generated, as indicated in block 604. Retrieving a target surface may, in another embodiment, include accessing a database of topographies and curvature samples, as indicated in block 606. Other suitable methods for retrieving a target surface are also envisioned, as indicated in block 608.
[0076] At block 620 an initial surface of a vehicle is captured, for example, a topography 612 of a current surface may be obtained using images from an image capturing device. For example, images may be stitched together using photogrammetry techniques, or the images may be mapped to a CAD model. Additionally, using captured images and a fiducial, it may be possible to determine depth. Other techniques for capturing an initial surface may also be used, as indicated in block 614. For example, depth information may be obtained using laser sensors. Based on the difference between the target and initial surface, a repair plan may be generated. For example, material may need to be added to raise a current surface, or material may need to be removed to reduce the current surface.
[0077] At block 630, a current in-process surface topography is obtained. The in-process topography may be obtained using any suitable techniques including photogrammetry, fiducials, laser sensors or another suitable option.
[0078] At block 640, the current in-process surface topography is compared to the target surface to provide a measure of progress for a current repair.
[0079] At block 650, based on the comparison between the current surface and the target surface, a repair status is generated. In some embodiments, the generated status includes a percentage completion, an estimated sanding time remaining or body filler needed, or another indication.
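The repair status of block 650 can be sketched as the fraction of the initial deviation from the target surface that has been eliminated. This is a hedged illustration: aggregating by mean absolute deviation over sampled heights is an assumption for the sketch, and real topographies would be 2D grids rather than the 1D lists used here.

```python
def repair_progress(initial, current, target):
    """Fraction of repair completed, 0.0 (untouched) to 1.0 (matches target).

    initial, current, target -- sampled surface heights (assumed same sampling),
    corresponding to the surfaces captured at blocks 620, 630 and 610.
    """
    def mean_abs_dev(surface):
        return sum(abs(s, ) if False else abs(s - t) for s, t in zip(surface, target)) / len(target)

    start = mean_abs_dev(initial)
    if start == 0:
        return 1.0  # surface already matched the target
    now = mean_abs_dev(current)
    # Clamp so that overshoot (e.g. over-sanding past the target) never
    # reports progress outside [0, 1].
    return max(0.0, min(1.0, 1.0 - now / start))
```

A percentage completion for the user interface is then `repair_progress(...) * 100`; an estimated sanding time remaining could be derived by dividing the remaining deviation by an observed material removal rate.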
[0080] At block 660, a next step is determined for the repair. For an additive process, e.g. adding body filler, weld material or another material, the next step may be determined based on a repair status - e.g. if complete, move to sanding or, if not complete, add more material. For a subtractive process, e.g. sanding down to a contour, a next step may be to move to a different grit size or to continue at a current grit size, for example.
[0081] Systems and methods herein may be able to generate instructions for a next step based on available contextual information. Of particular concern is the need to prevent burn-through due to heat generated by too long a dwell time, worn abrasive articles, etc. Conditions relevant to an abrasive article are particularly important to know.
[0082] At block 680, current or available abrading specifications are retrieved. A state of an abrasive article, or article wear 682, may be obtained by scanning or imaging an abrasive article selected for use. Systems herein may also receive information about available abrasive article product families, grit sizes, polishes and other materials.
[0083] Information about selected or available tools for a sanding operation may also be retrieved, as indicated in block 684, such as tool make, model, settings like air pressure and rotational speed, interface materials like rubber or foam durometer, and user conditions such as down force, angle to surface, etc. Other information relevant to the amount of force and / or friction the tool can apply, or that may affect a material removal rate, may also be retrieved.
[0084] As information about abrading operations is collected, it may be possible to better understand the relationship between abrasive and tool conditions and the effective material removal rate, which may provide insight into tool conditions with accuracy over the range of surface area contact, and to quantify the dwell time per area during the process. This could be used to alert a technician of the risk of coating burn-through, particularly if in-process contours are obtained frequently and / or cameras capture the entire process and analyze the image feed in real-time or at frequent intervals. Real-time feedback may also be provided, for example, from pressure sensors in a backup pad, force or angle sensors in the tool, capacitive / resistive sensors in the backup pad, and by estimating foam/rubber deformation vs. force / angle to estimate surface area contact.
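Quantifying dwell time per area, as described above, can be sketched as an accumulator over observed (area, duration) samples. This is a hedged illustration: the surface-cell identifiers, sample intervals and the dwell threshold are assumed inputs, and a real system would likely derive the threshold from material, heat capacity and abrasive condition rather than a fixed constant.

```python
def accumulate_dwell(samples, max_dwell_s):
    """Accumulate tool dwell time per surface area and flag burn-through risk.

    samples     -- iterable of (cell_id, seconds) observations, e.g. from a
                   camera tracking which surface cell the tool occupies
    max_dwell_s -- assumed maximum safe cumulative dwell per cell

    Returns (totals, at_risk_cells).
    """
    totals = {}
    for cell, seconds in samples:
        totals[cell] = totals.get(cell, 0.0) + seconds
    # Cells whose cumulative dwell exceeds the threshold would trigger a
    # burn-through alert to the technician.
    at_risk = sorted(c for c, t in totals.items() if t > max_dwell_s)
    return totals, at_risk
```

In an in-process system, the alert for any cell in `at_risk` could be delivered through the same audio, visual or haptic feedback channels discussed elsewhere herein.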
[0085] Information about abrasives and tools may be obtained from sensors, as indicated in block 686, for example images of a worksite may identify a tool used by a technician. An image analysis system may recognize an abrasive article or tool from an image, either by shape or color or by detecting a barcode or other identifier. Similarly, a barcode scanner or RFID sensor may be able to detect a tool or abrasive article based on an RFID tag or barcode. Information about an abrasive article and / or tool may be entered manually in some embodiments, as indicated in block 688. For example, user interfaces are illustrated herein that a technician may use during a repair process. It may be possible to indicate, for a current repair, the abrasive article and / or tool used or planned to be used. Other sources and types of abrading specifications are also envisioned, as indicated in block 689.
[0086] At block 690, a set of abrading parameters is generated for an abrasive operation. In some embodiments, the set of abrasive parameters is provided through a user interface as a recommendation for a next step. Especially for newer repair technicians, it may be helpful to have a recommended tool, article, angle and / or dwell time. The set of abrasive parameters may include a selected abrasive article type and grit, a sanding tool and / or an attack angle or force to apply. Additionally, a dwell time may be generated based on a heat capacity, calculated based on known body filler composition and thickness. As indicated in block 692, the generated abrading parameters may be stored either in a local datastore or a datastore accessible wirelessly or through a cloud-based network.
[0087] At block 670, a set of parameters is provided to a repair technician for a next step, for example provided through a user interface. The parameters may include abrading specifications 632 if the next step is a sanding or polishing step. The parameters may include information about material to be applied, e.g. body filler, weld material, etc., as indicated in block 634. Other information may also be provided, as indicated in block 636.
[0088] The parameters may be provided as an overlay to an augmented reality screen in some embodiments, as indicated in block 674. In some embodiments, the parameters are provided to a user interface for presentation on a display or projector, as indicated in block 672. In some embodiments, the parameters are provided as an audible alert, as indicated in block 676. However, it is expressly contemplated that other communication methods are possible, as indicated in block 678.
[0089] FIG. 7 illustrates an example user interface for a repair technician in accordance with embodiments herein. While the user interface 700 is illustrated as being presented on a device with a display, such as a tablet or a mobile phone, it is expressly contemplated that such information can be presented on a display in a worksite, projected onto a surface, or otherwise presented to one or more users. A digital defect tracking system may provide functionality throughout the entirety of a vehicle repair, as illustrated in FIG. 7.
[0090] Vehicle information 720 may be presented for vehicle 710, along with other information relevant to detected defects. For example, a vehicle may be presented with highlighted defects - e.g. remaining defects 750 still needing repairs. A user may be able to select a defect using feature 702 and view or edit repair details for either completed repairs 740 or remaining repairs 750. A user may also be able to mark a repair as complete using feature 706.
[0091] Technician information 760 may be presented. A digital defect tracking system may track which technicians complete which repair steps, for example by requiring a log in, using biometrics to identify a user, or through manual entry of information.
Systems and methods herein may track additional details about a repair, such as important timing information 730. For example, based on an insurance estimate, a target repair time for completed defects 1-3 may have been 3 hours and 12 minutes. Thus far, only 2 hours and 45 minutes were used for those defects, and a repair is currently under target by 27 minutes. It may be useful to track the time taken for repairs, in addition to whether repairs were satisfactory or not, both for quality control and training purposes.
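The timing comparison above reduces to a simple signed difference between target and elapsed time. A minimal sketch (illustrative names only, not part of the disclosed system):

```python
def time_vs_target(target_minutes, elapsed_minutes):
    """Return a signed delta in minutes and a readable status string.
    A positive delta means the repair is under target."""
    delta = target_minutes - elapsed_minutes
    direction = "under" if delta >= 0 else "over"
    return delta, f"{abs(delta)} min {direction} target"

# Completed defects 1-3: target 3 h 12 min, actual 2 h 45 min.
delta, status = time_vs_target(3 * 60 + 12, 2 * 60 + 45)
```

For the example in the text, this yields a delta of 27 minutes under target.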
[0092] Augmented reality interfaces may not be readily available in all repair shops or work environments. It may instead be more economical to use a single display that may store information from multiple sources. Additionally, as many technicians have mobile phones, it may be possible for technicians to use a mobile phone application to interact with a digital defect tracking system. It may also be possible to use a mobile phone or tablet as an augmented reality enabled device.
[0093] As illustrated by technician information 760 and repair times 730, it is possible to track many technician-related statistics. In some embodiments, information like time taken for previous repairs, touch time statistics, etc., may be tracked and presented. [0094] Discussed herein are many systems and methods for a technician-user. However, it is expressly contemplated that information collected using a digital defect tracking system can also be presented to a supervisor or shop administrator in a useful format. For example, instead of a single vehicle 710, a user interface 700 may allow a supervisor to select from six different vehicles currently undergoing repairs. Instead of a list of completed and remaining repairs, a supervisor may be able to see a percent complete, a current repair being conducted, and / or time statistics 730 for multiple vehicles at once. Such information may be useful to quickly identify where different vehicles are in a repair workflow and how a current technician is doing with their repair work.
[0095] Although visual displays might be beneficial for notification and alert purposes, there are conditions where it may not be feasible for the technician (or for all technicians) to have their own personal displays. In some worksites, it is desired to bring a display to a technician and for multiple workers to share a centralized projection system display. A projection system may also provide an opportunity to project information onto surfaces that are otherwise difficult to visually annotate.
[0096] FIG. 8 illustrates a defect tracking system in accordance with embodiments herein. Embodiments herein are directed toward a system for delivering visual alerts and information to a technician via a projection system, as well as a computer vision system to detect and recognize the technician’s identity and / or to recognize where the technician is in the environment and where a suitable projection surface in view of the technician might be.
[0097] In some embodiments, a defect tracking system includes a projector, e.g. DLP, LCoS, LCD, LED, Laser, or the like. In some embodiments, a defect tracking system is configured to detect, identify, and localize an object (e.g., product, tool, car, person, etc.). Detection may be done using a computer vision system such as a camera and an edge computing device in some embodiments. In some embodiments, detection includes an image capturing device with a communication component that sends images to a cloud-based computer vision system for computations. Systems herein may include one or more traditional cameras, webcameras, RGB + depth cameras, or another suitable image capture device. In embodiments herein, the one or more image capture devices are aligned and calibrated to the one or more projectors.
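Camera-to-projector alignment on a roughly planar worksurface is often expressed as a 3x3 homography obtained during calibration. A minimal sketch of applying such a homography, assuming the matrix has already been estimated (all names and the example matrix are illustrative):

```python
def apply_homography(h, x, y):
    """Map a camera pixel (x, y) into projector coordinates using a
    3x3 planar homography h (a list of 3 rows), as produced by a
    camera-projector calibration step."""
    u = h[0][0] * x + h[0][1] * y + h[0][2]
    v = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    if w == 0:
        raise ValueError("point maps to infinity")
    return u / w, v / w

# Toy calibration: projector frame is shifted 100 px right of the camera frame.
H = [[1, 0, 100],
     [0, 1, 0],
     [0, 0, 1]]
```

In practice the homography would be estimated from corresponding point pairs (e.g. projected calibration targets detected by the camera) rather than written by hand.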
[0098] FIG. 8 illustrates one example implementation of a digital defect tracking system implemented with a projection system in accordance with embodiments herein. System 800 illustrates a projector 802 with a field of view 804, and a camera 806. However, it is expressly contemplated that embodiments herein may utilize multiple projectors 802 and / or cameras 806. Using camera 806, real-time images can be provided to a defect tracking system 810, which can conduct image analysis and then provide information for projection by projector 802. Projector 802 is illustrated as projecting information onto vehicle 840; however, information may be presented on any suitable surface. Illustrated in FIG. 8 are two defect locations 832 overlaid on vehicle 840, along with descriptions 834. While only defect indicia and type are illustrated in FIG. 8, it is expressly contemplated that a projector with sufficient resolution could display more information such as repair plan, status, equipment and materials used, etc.
[0099] Camera 806 may also detect movement or audio from technicians or others in a worksite. Voice or gesture recognition can be used to modify the information projected, and computer vision can be used to provide context-aware information based on an identification of a current task before it is performed. Similarly, alerts or other notifications may be provided audibly in some embodiments herein.
[00100] Given a current task to be performed, a worker may need a certain part or tool to complete their work. A vision system may, as discussed herein, use a registration system either to directly classify objects in the scene or to identify and read optical tags/barcodes (either visible or IR retroreflective). Classifying objects may include the registration system using a classical feature-based detector or using suitable machine learning techniques.
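Once tags or barcodes are decoded, resolving them against a registry of known shop items is straightforward. A minimal sketch (the registry contents, tag format, and function name are illustrative assumptions, not part of the disclosed system):

```python
# Hypothetical registry mapping optical-tag IDs to known shop items.
TOOL_REGISTRY = {
    "TAG-0042": {"name": "wrench", "location": "bench 2"},
    "TAG-0107": {"name": "orbital sander", "location": "cart A"},
}

def classify_tags(decoded_ids):
    """Resolve decoded tag/barcode IDs into registered tool records,
    returning matched records and unrecognized IDs separately."""
    matches, unknown = [], []
    for tag in decoded_ids:
        record = TOOL_REGISTRY.get(tag)
        if record:
            matches.append(record)
        else:
            unknown.append(tag)
    return matches, unknown
```

A feature-based or learned detector would produce the decoded IDs (or object classes) that feed this lookup step.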
[00101] Embodiments herein allow for a defect tracking system to locate a needed item in the scene. For example, camera 806 may locate wrench 803, and a controller could actuate light source 824 to highlight the area, for example by providing a floodlight-type effect surrounding the item. This would reduce unnecessary search time and frustration for a worker who has misplaced, or must track, a large quantity of specific items needed to complete a task.
[00102] Defect tracking system 810 may also control operation of, or communicate with a controller for, light sources 822 and 824. Light sources 822 and 824 may be capable of attenuated lighting. For example, light sources 824 may provide gradations of a floodlight effect based on an order of operation or based on a specific task. For example, all items needed for a first step may be highlighted similarly. Additionally, for missing items, projector 802 may provide an indication of what is needed, for example text, an image or a location (e.g. from an inventory or asset management system) of the needed item.
[00103] Defect tracking system 810 may be triggered to assess a scene for missing items, to display defect locations and / or any other information based on a variety of modalities, for example a direct voice request or gesture. E.g. “where’s my wrench” triggers the camera 806 to search for wrench 803, or a raised hand causes projector 802 to add, change or remove projected information 832, 834.
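The trigger modalities above amount to a dispatcher from a recognized phrase (or gesture) to a system action. A minimal sketch, assuming speech has already been transcribed by a separate recognizer (keywords and action names are illustrative):

```python
def handle_trigger(phrase):
    """Map a transcribed voice phrase to a defect-tracking action.
    A real system would use proper speech/gesture recognition; this
    sketch only covers the phrase-to-action dispatch step."""
    text = phrase.lower()
    if "where" in text and ("wrench" in text or "tool" in text):
        return "locate_item"        # e.g. trigger camera 806 to search
    if "show defects" in text:
        return "project_defects"    # e.g. project indicia 832, 834
    if "hide" in text or "clear" in text:
        return "clear_projection"
    return "no_action"
```

A gesture pipeline could feed the same dispatcher by emitting equivalent tokens (e.g. a raised hand mapping to "clear_projection").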
[00104] A defect tracking system including a projection system may provide real-time feedback and alerts to a worker based on requested data or information on their assigned task. Such a system would reduce the installation and upkeep burden on a shop for pseudo-augmented reality, where classical AR systems may be too cumbersome for a worker to wear during operation or may be subjected to hazards in environments where such systems are likely to be damaged.
[00105] If a technician is not in the vicinity of a vehicle/actively working on it (as detected by the camera system), an informational display could be projected onto the vehicle body or on a surface near to the vehicle to highlight details on the current vehicle - for example a VIN, a make, a model, a year, etc. Additional details that can be projected include a current position in the repair process, the scheduled timeline for upcoming stages, current touch time statistics, user statistics for the different tasks that were performed (how long different tasks took to perform/etc.) or other similar information. [00106] To invoice a product, a technician needs to know the repair order code (RO) of the vehicle being repaired so that the product / quantity / cost are properly associated with the repair order. The repair order may be stored or accessed through a body shop management software, such as 3M RepairStack™ from 3M Company® of St. Paul, MN USA. Using the RO, a technician may more easily add line items to an invoice.
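Associating product, quantity, and cost with a repair order code (RO) is essentially a keyed line-item append. A minimal sketch, not representative of any particular body shop management software's actual API (all names are illustrative):

```python
def add_line_item(invoice, repair_order, product, quantity, unit_cost):
    """Append a product line item to an invoice, keyed by repair order
    (RO) code so that product / quantity / cost are associated with
    the correct repair."""
    item = {
        "ro": repair_order,
        "product": product,
        "quantity": quantity,
        "cost": round(quantity * unit_cost, 2),
    }
    invoice.setdefault(repair_order, []).append(item)
    return item

invoice = {}
add_line_item(invoice, "RO-1234", "body filler (1 qt)", 2, 18.50)
```

Knowing the RO up front lets every material used in a repair step be invoiced against the right job without manual reconciliation.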
[00107] A benefit of using systems herein may be that a planned repair process may be generated and provided to a technician so that the technician notes the items that may be needed. In some embodiments, based on a repair plan, a body shop management system may prompt a technician based on a repair step (e.g. an amount of body filler for a ‘fill dent’ step, or an estimated time needed to ‘seam seal a weld line’, a replacement windshield for a ‘replace windshield’ step, etc.). Additional context around a repair is helpful especially for determining quantities of material such as body filler, weld fillets, etc.
[00108] In some embodiments, system 810 projects details into the physical domain, and can highlight regions of interest on the vehicle that need to be addressed as part of a repair task. As a current task is started, for example, an instructional block showing the stepwise procedure for the task could be projected either directly on the vehicle or on a nearby surface (e.g. a floor, table, etc.).
[00109] In some embodiments, camera 806 can detect a technician interacting with information projected by the projector and, in response, change what is displayed. For example, by touching a portion of a surface containing an item on a menu, or an interface such as those presented herein, a defect tracking system may facilitate the technician manipulating an interface to, for example, navigate a series of menus. Alternatively, the camera may localize the hands of an individual in physical space and relate them to where the image is displayed so as to identify a desired action when displaying this overlay. Similarly, it may be possible for a technician to walk through the menus via voice commands. Further, as sensor calibration systems become more complicated, the projection system could highlight the exact positions relative to the vehicle to place necessary markers, light sources, or sensors for calibration. This would reduce the burden on workers to map these areas correctly, as positioning likely varies between car manufacturers and vehicle models/years.
[00110] FIG. 9 illustrates a digital defect tracking system in accordance with embodiments herein. Defect tracking system 900 may be implemented in accordance with embodiments herein. For example, digital defect tracking system 900 may be used to capture, store and present information about detected defects on a vehicle. The defects may be detected automatically, by a computer vision system, or may be detected manually by a human technician and entered into digital defect tracking system 900. Digital defect tracking system 900 may be accessed using a user interface on a screen of a device such as a mobile phone, tablet or other computing device, or through an augmented reality enabled device, such that a user views an AR-overlay over the physical world, or through a projection system such that information is presented onto a worksurface.
In any of these embodiments, the information presented may be static or interactive such that a user can customize or select what they want to view.
[00111] Digital defect tracking system 900 includes or communicates with an imaging system 910. Imaging system may include devices for both capturing and displaying information. For example, imaging system may include one or more image capture devices 902 (e.g. cameras, video cameras, etc.) to capture images of a worksite including defect information, user information and / or environmental information. Imaging system 910 may also include one or more projection devices 904. As used herein, a projection device 904 includes a traditional projector that projects an image onto a surface as well as a display that provides information through a graphical user interface. For example, projection device 904, in some embodiments, is a projector that can project onto a worksurface - onto a vehicle needing repair, onto a surface near the vehicle, etc. A projector 904 may project information in an interactive manner, e.g. such that image capture device 902 captures information about a user’s movement or speech and communicates that information to a surface analysis system 920, which then provides new information to a graphical user interface generator 972 for presentation using projector 904. However, it is also expressly contemplated that projection device 904 may include presenting on an augmented reality-enabled device. For example, information may be projected onto a transparent screen such as a pair of glasses or a face shield.
[00112] Imaging system 910 may include one or more light sources 906, each of which may have different light settings such as intensity or color. Imaging system 910 may also include one or more movement mechanisms 908 that are responsible for moving any of image capture devices 902, projectors 904 and / or light sources 906. Movement mechanism 908 may include movement in any of an X-Y-Z coordinate direction. For example, any of image capture devices 902, projectors 904, or light sources 906 may be able to rotate or swivel about a mount and / or move in three dimensional space along a rail system or more freely by attachment to a moveable robotic unit.
[00113] Controller 912 may control operation of imaging system components 902-908, for example a capture rate of image capture device 902, a light intensity or color of light source 906, or a projection resolution of projector 904. Controller 912 may also control the one or more movement mechanisms 908 such that components 902-906 are positioned and oriented correctly.
[00114] Digital defect tracking system 900 is illustrated as encompassing imaging system 910, surface analysis system 920, graphical user interface generator 972 and database 980. However, it is expressly contemplated that, in some embodiments, at least some of these components are remote from one another. For example, database 980 may be accessed using a wireless or cloud-based network protocol. Similarly, digital defect tracking system 900 may include one or more processors remote from imaging system 910 that perform functions of surface analysis system 920. Similarly, GUI generator 972 may use processing circuitry or processing power housed with or separately from surface analysis system 920 or imaging system 910. Communication component 914, in some embodiments herein, is responsible for facilitating communication between components of digital defect tracking system 900, wherever they are housed.
[00115] Digital defect tracking system 900 may have access to numerous types and sources of data, as illustrated by database 980. While database 980 is illustrated as a single datastore, it is expressly contemplated that, in some embodiments, data in database 980 may be stored in multiple locations. Further, while data in database 980 is illustrated as split between datastores 990, 982 and 985, it is expressly contemplated that these divisions are purely for organizational understanding and not intended to limit how data used in embodiments herein may be stored or organized. Additionally, while database 980 is illustrated as including a number of different data types useful for embodiments herein, it is not intended to be an exhaustive list. Other data 901 that may be useful in embodiments herein may also be accessible in some embodiments.
[00116] A detected defects datastore 990 may receive and house information about defects detected on one or more vehicles, including defect type 991, defect location 992 (which may be relative or absolute), repair plans 993 for detected defects, statuses of repairs 994, as well as any other useful data 995. Defect information in datastore 990 may include historically repaired defect information as well, e.g. defects addressed by a particular technician, time and / or touches taken, abrasive materials used, tools used, etc. Defect information in datastore 990 may be particularly useful for improving future repair operations and determining which technicians would benefit from additional coaching, etc.
[00117] A vehicle database 982 may include relevant information about vehicles on which repairs may, or have been, conducted. For example, 3D models 983 of different vehicles may be particularly useful for locating defects on a vehicle in real space. 3D models 983 may be provided directly from a manufacturer or otherwise generated - for example using a computer vision system, photogrammetry or another suitable technique. Other vehicle information 984, such as previous repair history for similar vehicles or for a specific vehicle, may also be accessible.
[00118] A repair database 985 may include information about abrasive articles 986, including which are available, which have been used for similar repairs historically, etc. Additionally, abrasive article information may also include abrasive article types and grit sizes 987 available. In some embodiments wear levels or rates 988 are tracked for individual abrasive articles. Repair database 985 may also include information about tools 989 that may be available, including backup pads available, operational angles, rotational speed, etc. Optional repair plans, or historical repair plans 999 may also be accessible.
[00119] Digital defect tracking system 900 may also include a surface analysis system 920 that, based on information received from imaging system 910 and database 980, provides analysis of a vehicle surface and / or an ongoing repair process. A defect indicia receiver 960 may receive an indication of a defect detected on a vehicle surface. The defect indicia may be received through an input/output component (I/O component) 974. For example, a user may indicate a defect using a touchscreen of a mobile computing device, e.g. by pointing out a defect on an image or video feed. In other embodiments, a user may indicate a defect using an augmented reality enabled device, by physically touching a vehicle, pointing to a vehicle, or otherwise indicating a location of a defect. For a system using a projection, an image capture device may capture a gesture (e.g. touching or pointing) indicating a defect. A defect indicia receiver 960 may also receive defect indicia through manual entry, capturing an auditory indication or recognizing a gesture.
[00120] Defect indicia receiver 960 may receive an indication of defect type 962, e.g. a nib, a paint smear, a dent, etc. A location 964 may also be received. Location 964 may be a relative location, e.g. a point identified in a captured image, or an absolute location, e.g. a coordinate set corresponding to a point on a vehicle. A defect status 966, e.g. needs repair, repair in progress, or repair completed, may also be retrieved. Other defect indicia 968 may also be retrieved.
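The defect indicia above (type 962, location 964, status 966) map naturally onto a small record type. A minimal sketch of such a record (field names and status strings are illustrative, not part of the disclosed system):

```python
from dataclasses import dataclass, field

@dataclass
class DefectRecord:
    """One tracked defect: its type, its location on the vehicle
    (relative image point or absolute coordinate), and repair status."""
    defect_type: str                # e.g. "nib", "paint smear", "dent"
    location: tuple                 # e.g. (x, y) image point or (x, y, z) coordinate
    status: str = "needs repair"    # or "repair in progress", "repair completed"
    notes: list = field(default_factory=list)

    def mark_complete(self):
        self.status = "repair completed"
```

A datastore such as detected defects datastore 990 could then hold a collection of these records per vehicle.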
[00121] Surface analysis system 920 may include a registration system 952. Registration system 952 analyzes received images, or a received image feed and detects icons or otherwise recognizes features in a space. For example, identifiable curvature on a vehicle, such as a tire, a windshield, or a headlight may be readily identifiable by registration system 952. Additionally, items that may be present in real space may also be identifiable either using a classical feature-based detector or by using a trained machine learning model, for example a wrench, a sander, a transition between applied body filler and sanded metal, etc. Registration system 952 may be able to, based on detected icons, generate a map of a space using a map generator 956 and / or a topography of a vehicle using a topography generator 954. In embodiments where a vehicle specification retriever 950 can retrieve a 3D model 983 for a given vehicle, registration system 952 may recognize enough features on a vehicle such that map generator 956 can generate a map of real space based on the 3D model alone. Registration system 952 may have other functionality 958.
[00122] A vehicle surface mapper 970 may, using images from image capture device 902 and a map generated by registration system 952, generate a defect map that associates detected defects with a relative location, or an absolute location on a vehicle. A surface map generated by surface mapper 970 may be retrieved, in embodiments herein, for a user starting a defect repair process, continuing a defect repair process, or updating defect statuses 966, in accordance with embodiments herein.
[00123] Surface analysis system 920 may also include a sanding evaluator 940, in some embodiments. Sanding evaluator 940 may evaluate a current sanding operation, plan a next sanding operation, and / or provide real-time feedback for a user during a sanding operation.
[00124] A target contour retriever 922 may retrieve a target contour for a repair. For example, target contour retriever 922 may retrieve data from vehicle specification retriever 950, or may determine, based on pictures from image capture device 902, a correct curvature for a defect area. For example, curvature for a portion of the vehicle opposite the dented portion (e.g. curvature of the driver side door) is likely a mirror of the target curvature for a dented portion (e.g. the passenger side door).
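Deriving a target contour from the undamaged opposite side can be sketched as mirroring a sampled profile across the vehicle centerline. This is a simplified illustration (symmetric vehicle assumed; names and sample values are hypothetical):

```python
def mirrored_target_contour(opposite_profile):
    """Given a sampled height profile from the undamaged opposite side
    of the vehicle, as (lateral_offset, height) pairs measured from the
    vehicle centerline, mirror it to produce the target contour for the
    dented side."""
    return [(-offset, height) for offset, height in opposite_profile]

# Driver-side door profile -> target contour for the dented passenger-side door.
driver_profile = [(10.0, 0.0), (12.5, 0.4), (15.0, 0.9)]
target = mirrored_target_contour(driver_profile)
```

Where a 3D model 983 is available, the target contour could instead be read directly from the model, with mirroring used as a fallback.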
[00125] A current contour retriever 924 may retrieve information about a defect area, for example a size of a nib or paint smear or depth information about a dent, or a current surface during a repair process, e.g. after a first body filler application.
[00126] A sanding parameter generator 930 may compare the current contour to the target contour and generate parameters for a next step. Parameters may be generated based on a retrieved repair plan, retrieved by repair plan retriever, which may include a target repair time as well as information about what steps are needed - e.g. body filler application, dent removal and / or sanding.
[00127] An MRR characterizer 942 may calculate a material removal rate (MRR) needed for a next sanding step. For example, based on time constraints, a next step may need to be completed within 30 minutes. Alternatively or additionally, MRR characterizer 942 may determine an actual material removal rate based on known parameters of a previous step. For example, based on a selected abrasive article, tool speed and force applied, a first material removal rate is predicted for a first step, and a sanding time provided to a user. However, after that first step it may be determined that significantly less material was removed than expected. An actual material removal rate for the first step may be determined based on the time sanded and volume of material removed, which may then be used by sanding parameter generator 930 when generating sanding parameters for a second sanding step. A sanding evaluator may also, using known heat capacities of materials - e.g. metal, body filler, etc. - generate a heat map 944 of work done, or ongoing work. The heat map may be useful for generating and providing an alert to a user, either audiovisual or haptic feedback, that burn-through is likely. Sanding evaluator 940 may have other functionality 946 as well.
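The MRR feedback loop described above reduces to two divisions: measure the rate actually achieved, then reuse it to predict the next step's duration. A minimal sketch (units and values are illustrative):

```python
def actual_mrr(volume_removed_mm3, sanding_time_s):
    """Material removal rate actually achieved in a completed step."""
    return volume_removed_mm3 / sanding_time_s

def next_step_time(remaining_volume_mm3, mrr_mm3_per_s):
    """Predicted sanding time for the next step, using the measured
    MRR in place of the nominal prediction."""
    return remaining_volume_mm3 / mrr_mm3_per_s

# Step 1 removed 600 mm^3 in 300 s, well short of a nominal prediction.
mrr = actual_mrr(600, 300)        # measured rate: 2.0 mm^3/s
eta = next_step_time(900, mrr)    # 450 s for the remaining 900 mm^3
```

The corrected estimate can then be checked against time constraints (e.g. a 30-minute budget) before the parameters are presented to the technician.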
[00128] Digital defect tracking system 900 also includes a GUI generator 972, which generates interfaces for a user to use to interact with system 900. An I/O component may receive or output feedback for a user. A defect overlay generator 978 may generate an overlay to display either over an image of a space, as in an AR-enabled embodiment, with the image of a space, as on a user interface of a computing device, or projected onto the space, using a projector. A GUI communicator 978 may communicate the generated user interface to a device for projection or display.
[00129] System 900 may also have other functionality 916.
[00130] It is expressly noted that, throughout the present description, the example of surface modification of a vehicle surface to remove paint-related defects from a surface is presented as one potential use case. However, other surface modifications are expressly contemplated, such as other abrasive operations (sanding, grinding), other additive processes (e.g. additive manufacturing, adhesive deposition, etc.), or subtractive processes (material removal, cutting, etc.)
[00131] Surface inspection systems have been described herein that include image capturing devices, such as a camera, one or more light sources, distance sensors, etc. Systems and methods herein describe components for managing and executing capture of said images, and processing said images to obtain defect characterization information and surface characterization information. Systems and methods herein have been described that can store and retrieve captured images, image metadata, defect detection and characterization results, and manipulate said information to generate or improve a repair strategy. Systems and methods are described herein that include devices for presenting information about a surface, including mobile computing devices with displays, augmented reality-enabled devices, and projection systems. Systems described herein are expressly contemplated to be interoperable with input/output components. Based on received inputs, systems described herein are configured to change presented information in real-time.
[00132] Systems and methods herein enable coordination of surface repair operations in a digital system - recording defects, tracking defect repair processes, providing feedback or instruction during the repair, etc.
[00133] However, it is expressly contemplated that systems and methods herein may be useful for other industries. For example, while the vehicles in the use cases described herein are being repaired after a collision, it is also contemplated that an OEM use case is relevant. Additionally contemplated are recurring or constant evaluations of internal or external processes, such as part repairs, evaluating metallic and / or paint finishes for other groups of products, or even high spatial resolution mapping of an environment using a mobile robot.
[00134] Further, it is contemplated that a surface imaging system herein may be useful for other specular surfaces, for example imaging a surface pre- and post-adhesive application.
[00135] FIG. 10 is a block diagram of a repair strategy generation architecture. The remote server architecture 1000 illustrates one embodiment of an implementation of a repair strategy generator 1010. As an example, remote server architecture 1000 can provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services. In various embodiments, remote servers can deliver the services over a wide area network, such as the internet, using appropriate protocols. For instance, remote servers can deliver applications over a wide area network and they can be accessed through a web browser or any other computing component. Software or components shown or described in FIGS. 1-10 as well as the corresponding data, can be stored on servers at a remote location. The computing resources in a remote server environment can be consolidated at a remote data center location or they can be dispersed. Remote server infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user. Thus, the components and functions described herein can be provided from a remote server at a remote location using a remote server architecture. Alternatively, they can be provided by a conventional server, installed on client devices directly, or in other ways. A user may interact with system 810 using a user interface 1022.
[00136] In the example shown in FIG. 10, some items are similar to those shown in earlier figures. FIG. 10 specifically shows that a digital defect tracking system can be located at a remote server location 1002. Therefore, computing device 1020 accesses those systems through remote server location 1002. Sensing system 1050 can use computing device 1020 to access user interfaces 1022 as well.
[00137] FIG. 10 also depicts another example of a remote server architecture. FIG. 10 shows that it is also contemplated that some elements of systems described herein are disposed at remote server location 1002 while others are not. By way of example, storage 1030, 1040 or 1060 or sensing systems 1050 can be disposed at a location separate from location 1002 and accessed through the remote server at location 1002. Regardless of where they are located, they can be accessed directly by computing device 1020, through a network (either a wide area network or a local area network), hosted at a remote site by a service, provided as a service, or accessed by a connection service that resides in a remote location. Also, the data can be stored in substantially any location and intermittently accessed by, or forwarded to, interested parties. For instance, physical carriers can be used instead of, or in addition to, electromagnetic wave carriers.
[00138] It will also be noted that the elements of systems described herein, or portions of them, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, embedded computers, industrial controllers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.
[00139] FIGS. 11-12 show examples of mobile devices that can be used in the embodiments shown in previous Figures.
[00140] FIG. 11 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that can be used as a user's or client's handheld device 1116 (e.g., as computing device 1020 in FIG. 10), in which the present system (or parts of it) can be deployed. For instance, a mobile device can be deployed in the operator compartment of computing device 1120 for use in generating, processing, or displaying the data. FIG. 12 is another example of a handheld or mobile device.
[00141] FIG. 11 provides a general block diagram of the components of a client device 1116 that can run some components shown and described herein. Client device 1116 interacts with them, or runs some and interacts with some. In the device 1116, a communications link 1113 is provided that allows the handheld device to communicate with other computing devices and under some embodiments provides a channel for receiving information automatically, such as by scanning. Examples of communications link 1113 include allowing communication through one or more communication protocols, such as wireless services used to provide cellular access to a network, as well as protocols that provide local wireless connections to networks.
[00142] In other examples, applications can be received on a removable Secure Digital (SD) card that is connected to an interface 1115. Interface 1115 and communication links 1113 communicate with a processor 1117 along a bus 1119 that is also connected to memory 1121 and input/output (I/O) components 1123, as well as clock 1125 and location system 1127.
[00143] I/O components 1123, in one embodiment, are provided to facilitate input and output operations and the device 1116 can include input components such as buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers, orientation sensors, and output components such as a display device, a speaker, and/or a printer port. Other I/O components 1123 can be used as well.
[00144] Clock 1125 illustratively comprises a real time clock component that outputs a time and date. It can also provide timing functions for processor 1117.
[00145] Illustratively, location system 1127 includes a component that outputs a current geographical location of device 1116. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
[00146] Memory 1121 stores operating system 1129, network settings 1131, applications 1133, application configuration settings 1135, data store 1137, communication drivers 1139, and communication configuration settings 1141. Memory 1121 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 1121 stores computer readable instructions that, when executed by processor 1117, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 1117 can be activated by other components to facilitate their functionality as well.
[00147] FIG. 12 shows that the device can be a smart phone 1271. Smart phone 1271 has a touch sensitive display 1273 that displays icons or tiles or other user input mechanisms 1275. Mechanisms 1275 can be used by a user to run applications, make calls, perform data transfer operations, etc. In general, smart phone 1271 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.
[00148] Note that other forms of the devices 1216 are possible.
[00149] FIG. 13 is a block diagram of a computing environment that can be used in embodiments shown in previous Figures.
[00150] FIG. 13 is one example of a computing environment in which elements of systems and methods described herein, or parts of them (for example), can be deployed. With reference to FIG. 13, an example system for implementing some embodiments includes a general-purpose computing device in the form of a computer 1310. Components of computer 1310 may include, but are not limited to, a processing unit 1320 (which can comprise a processor), a system memory 1330, and a system bus 1321 that couples various system components including the system memory to the processing unit 1320. The system bus 1321 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. Memory and programs described with respect to systems and methods described herein can be deployed in corresponding portions of FIG. 13.
[00151] Computer 1310 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 1310 and includes both volatile/nonvolatile media and removable/non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile/nonvolatile and removable/non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 1310. Communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
[00152] The system memory 1330 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 1331 and random access memory (RAM) 1332. A basic input/output system 1333 (BIOS) containing the basic routines that help to transfer information between elements within computer 1310, such as during start-up, is typically stored in ROM 1331. RAM 1332 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 1320. By way of example, and not limitation, FIG. 13 illustrates operating system 1334, application programs 1335, other program modules 1336, and program data 1337.
[00153] The computer 1310 may also include other removable/non-removable and volatile/nonvolatile computer storage media. By way of example only, FIG. 13 illustrates a hard disk drive 1341 that reads from or writes to non-removable, nonvolatile magnetic media, nonvolatile magnetic disk 1352, an optical disk drive 1355, and nonvolatile optical disk 1356. The hard disk drive 1341 is typically connected to the system bus 1321 through a non-removable memory interface such as interface 1340, and optical disk drive 1355 is typically connected to the system bus 1321 by a removable memory interface, such as interface 1350.
[00154] Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (e.g., ASICs), Application-specific Standard Products (e.g., ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
[00155] The drives and their associated computer storage media discussed above and illustrated in FIG. 13, provide storage of computer readable instructions, data structures, program modules and other data for the computer 1310. In FIG. 13, for example, hard disk drive 1341 is illustrated as storing operating system 1344, application programs 1345, other program modules 1346, and program data 1347. Note that these components can either be the same as or different from operating system 1334, application programs 1335, other program modules 1336, and program data 1337.
[00156] A user may enter commands and information into the computer 1310 through input devices such as a keyboard 1362, a microphone 1363, and a pointing device 1361, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite receiver, scanner, or the like. These and other input devices are often connected to the processing unit 1320 through a user input interface 1360 that is coupled to the system bus, but may be connected by other interface and bus structures. A visual display 1391 or other type of display device is also connected to the system bus 1321 via an interface, such as a video interface 1390. In addition to the monitor, computers may also include other peripheral output devices such as speakers 1397 and printer 1396, which may be connected through an output peripheral interface 1395.
[00157] The computer 1310 is operated in a networked environment using logical connections, such as a Local Area Network (LAN) or Wide Area Network (WAN), to one or more remote computers, such as a remote computer 1380.
[00158] When used in a LAN networking environment, the computer 1310 is connected to the LAN 1371 through a network interface or adapter 1370. When used in a WAN networking environment, the computer 1310 typically includes a modem 1372 or other means for establishing communications over the WAN 1373, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device. FIG. 13 illustrates, for example, that remote application programs 1385 can reside on remote computer 1380.
[00159] A defect tracking system includes an image capturing device configured to capture an image of a surface, a defect indicia receiver that is configured to receive a defect indication, wherein the defect indication includes a defect on the surface, a defect mapper configured to map the defect to the surface, a user interface generator configured to generate a defect map to show the detected defect, and
[00160] a display component configured to display the defect map.
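As an illustrative, non-limiting sketch of the flow described in the preceding paragraph, the defect indication, defect mapper, and defect map could be modeled as plain data structures plus a mapping step. The class names, the pixel-to-surface calibration, and the panel dimensions below are hypothetical and not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class DefectIndication:
    defect_type: str   # e.g. "dent" or "scratch"
    image_xy: tuple    # pixel coordinates in the captured image
    status: str = "open"

@dataclass
class DefectMap:
    defects: list = field(default_factory=list)

def map_defect(indication, image_to_surface):
    """Map a defect's image-pixel position onto surface coordinates."""
    sx, sy = image_to_surface(indication.image_xy)
    return {"type": indication.defect_type,
            "surface_xy": (sx, sy),
            "status": indication.status}

# Illustrative calibration: a 1000 x 500 px image covering a 2 m x 1 m panel.
to_surface = lambda xy: (xy[0] * 2.0 / 1000, xy[1] * 1.0 / 500)

defect_map = DefectMap()
defect_map.defects.append(map_defect(DefectIndication("dent", (250, 250)), to_surface))
```

The display component would then render `defect_map`, e.g. on a touchscreen or projector as described in the surrounding embodiments.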
[00161] The system may be implemented such that the captured image includes the defect indication.
[00162] The system may be implemented such that the defect indication is detected using an image analyzer, the image analyzer being configured to identify the defect indication.
[00163] The system may be implemented such that the defect indicia receiver receives a user indication of the defect indication.
[00164] The system may be implemented such that the user indication is received through a touchscreen.
[00165] The system may be implemented such that the captured image includes an image captured by an augmented reality-enabled device, and wherein the defect indication is received through the augmented reality-enabled device.
[00166] The system may be implemented such that the defect indication includes the user touching the surface at or near the defect.
[00167] The system may be implemented such that the display component includes a projection system that projects the updated defect map on the surface.
[00168] The system may be implemented such that the defect indication is received by a user indication receiver.
[00169] The system may be implemented such that the user indication receiver includes the image capturing device.
[00170] The system may be implemented such that the image capturing device is a first image capturing device and wherein the user indication receiver is a second image capturing device.
[00171] The system may be implemented such that the user indication receiver detects a gesture of the user indicative of a detected defect.
[00172] The system may be implemented such that the user indication receiver is a microphone, and wherein the defect indication is an audible defect indication.
[00173] The system may be implemented such that the defect indication includes a defect type, a defect location, a defect status or a surface contour at the defect.
[00174] The system may be implemented such that the defect indication includes a location relative to the surface.
[00175] The system may be implemented such that the defect indication includes a location on the surface.
[00176] The system may be implemented such that the defect location includes a defect area on the surface.
[00177] The system may be implemented such that it includes a surface map retriever, wherein the surface map includes a three-dimensional (3D) model of the surface, and wherein the location includes a coordinate set corresponding to a point on the 3D model.
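A minimal sketch of producing a coordinate set on the 3D model, as the surface map retriever embodiment above describes, assuming the model is available as a list of vertices and using a simple nearest-vertex lookup. The vertex list and measured point below are illustrative only:

```python
import math

def nearest_model_point(point, vertices):
    """Return the 3D-model vertex closest to a measured defect location."""
    return min(vertices, key=lambda v: math.dist(point, v))

# Hypothetical coarse mesh of a slightly curved panel (coordinates in meters).
panel_vertices = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0.2)]

# Defect measured near the panel's raised corner.
loc = nearest_model_point((0.9, 0.95, 0.1), panel_vertices)
```

A production system would more likely project onto mesh faces rather than snap to vertices; the vertex lookup just makes the "coordinate set corresponding to a point on the 3D model" idea concrete.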
[00178] The system may be implemented such that it includes a registration system configured to, based on an environment image, recognize an icon in the environment, and, based on the icon, generate a registration map.
[00179] The system may be implemented such that it includes a surface mapper that is configured to, based on the defect map and the registration map, generate a surface map that includes the icon and the detected defect.
[00180] The system may be implemented such that it includes a repair plan generator that, based on the defect indication, generates a repair plan for removing the detected defect.
[00181] The system may be implemented such that it includes a light source.
[00182] The system may be implemented such that it includes a light source controller configured to control a position and orientation of the light source.
[00183] The system may be implemented such that it includes a movement mechanism.
[00184] The system may be implemented such that it includes an image capture device movement mechanism.
[00185] The system may be implemented such that the defect map is a second defect map, and wherein the second defect map is generated by updating a first defect map to include the mapped defect location.
[00186] A system for mapping surface defects on a vehicle is presented that includes a device configured to present a user interface, an environment image capture device configured to capture an image of an environment, a user interface generator configured to generate the user interface, wherein the user interface includes an input/output component configured to receive a defect indication for a defect, a defect mapper configured to, based on the defect indication, associate the defect with a position in the image of the environment. The user interface generator is configured to update the user interface in response to the associated defect position.
[00187] The system may be implemented such that it includes a registration system configured to detect a feature in the environment image.
[00188] The system may be implemented such that the detected feature includes curvature of a surface including the defect.
[00189] The system may be implemented such that the feature is a first feature, wherein the registration system is configured to detect a second feature in the environment image, and based on the detected first and second features, a registration map generator generates a registration map.
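One way the registration map generator could combine two detected features, as the paragraph above describes, is to solve the 2D similarity transform (scale, rotation, translation) that carries the two image features onto their known model positions. The complex-number formulation below is a sketch under that assumption, not the claimed method:

```python
def registration_from_two_features(img_pts, model_pts):
    """Solve the 2D similarity transform mapping two detected image
    features onto their known model positions; complex numbers encode
    2D points, so one complex factor captures scale + rotation."""
    a1, a2 = (complex(*p) for p in img_pts)
    b1, b2 = (complex(*p) for p in model_pts)
    s = (b2 - b1) / (a2 - a1)   # combined scale and rotation
    t = b1 - s * a1             # translation
    return lambda p: s * complex(*p) + t

# Two features detected at image (0,0) and (1,0), known to sit at
# model positions (2,1) and (2,2): a 90-degree rotation plus shift.
reg = registration_from_two_features([(0, 0), (1, 0)], [(2, 1), (2, 2)])
mapped = reg((0.5, 0))  # any other image point can now be registered
```

With this registration map in hand, the defect mapper of the following embodiment could convert image-space defect positions into model-space coordinates.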
[00190] The system may be implemented such that it includes a surface specification retriever configured to retrieve a 3D model of the surface, and wherein the defect mapper generates a defect map based on the 3D model and the registration map.
[00191] The system may be implemented such that the defect map is overlaid onto the environmental image.
[00192] The system may be implemented such that the registration system detects the feature using a feature-based detection algorithm.
[00193] The system may be implemented such that the registration system detects the feature using a trained machine learning algorithm.
[00194] The system may be implemented such that the feature and the defect are different.
[00195] The system may be implemented such that the defect indication includes a defect type, a defect location, or a defect repair status.
[00196] The system may be implemented such that the defect indication is a defect repair status change, and wherein the defect repair status change is a repaired indication indicating that the defect is repaired, and wherein, based on the repaired indication, the user interface generator updates the user interface to change a representation of the defect.
[00197] The system may be implemented such that the changed representation includes a removal of the defect from the user interface.
[00198] The system may be implemented such that the changed representation includes a change in color of the defect on the user interface.
[00199] The system may be implemented such that the device is a mobile computing device.
[00200] The system may be implemented such that the device includes an image capturing device.
[00201] The system may be implemented such that the image capturing device is configured to take a surface image of the surface.
[00202] The system may be implemented such that the defect indication is detected from the surface image.
[00203] The system may be implemented such that the user interface is a graphical user interface and wherein the input/output component is configured to receive the defect indication through the graphical user interface.
[00204] The system may be implemented such that the device is an augmented-reality enabled device.
[00205] The system may be implemented such that the device is a projection system.
[00206] The system may be implemented such that the projection system includes a camera that captures the environmental image.
[00207] The system may be implemented such that the projection system includes a light source with a movement mechanism configured to change a position or orientation of the light source.
[00208] A sanding parameter generation system is presented that includes a target contour retriever that is configured to retrieve a target contour for a vehicle surface, a current contour retriever that is configured to retrieve a current contour for the vehicle surface, an image capturing device that is configured to capture an image of the vehicle surface, a sanding parameter generator that is configured to, based on a difference of the target contour and the current contour, and the captured image, generate a parameter for a sanding operation on the vehicle surface, and a communication component configured to communicate the generated parameter.
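A hedged sketch of the sanding parameter generator summarized above: it derives one parameter (tool speed) from the worst-case excess between the current and target contours. The base speed, gain, and limit values below are invented placeholders, not values from the disclosure:

```python
def sanding_parameter(target_contour, current_contour,
                      base_rpm=5000, rpm_per_mm=2000, max_rpm=12000):
    """Pick a sander speed from the largest excess between current and
    target contours (heights in mm sampled at matching points)."""
    excess = max(c - t for c, t in zip(current_contour, target_contour))
    if excess <= 0:
        return {"rpm": 0, "note": "at or below target; stop sanding"}
    return {"rpm": min(max_rpm, base_rpm + rpm_per_mm * excess),
            "excess_mm": excess}

# Flat target, filler standing up to 1.0 mm proud of the panel.
param = sanding_parameter([0.0, 0.0, 0.0], [0.4, 1.0, 0.2])
```

The communication component of the embodiment would then forward `param` to the tool or to the user interface generator.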
[00209] The system may be implemented such that it includes an image analyzer configured to determine, based on the captured image of the vehicle surface, a contact time between a tool and the vehicle surface, a work map generator configured to generate a work map for the surface, a burn-through indication generator configured to generate a burn-through indication based on the work map, and a graphical user interface generator configured to generate a user interface including the work map.
[00210] The system may be implemented such that the captured image is a series of captured images, wherein each of the series of captured images is associated with a timestamp.
[00211] The system may be implemented such that the generated work map is communicated to a user interface generator.
[00212] The system may be implemented such that the burn-through indication is communicated to a user interface generator.
[00213] The system may be implemented such that the work map includes the burn-through indication.
[00214] The system may be implemented such that the burn-through indication indicates that more than 50% of a coating layer on the surface has been removed.
[00215] The system may be implemented such that the burn-through indication indicates that more than 90% of a coating layer on the surface has been removed.
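The thresholds above suggest a simple rule over the work map: flag any region whose coating-removed fraction exceeds 50% (an early warning) or 90% (imminent burn-through). A minimal sketch, with an illustrative cell-keyed work map that is not part of the disclosure:

```python
def burn_through_indication(work_map, threshold=0.9):
    """Flag surface cells whose coating-removed fraction exceeds
    a threshold (0.5 for a warning, 0.9 for imminent burn-through)."""
    return [cell for cell, removed in work_map.items() if removed > threshold]

# Fractions of coating removed per (hypothetical) surface cell.
work_map = {"A1": 0.2, "A2": 0.95, "B1": 0.6}
warnings = burn_through_indication(work_map, threshold=0.5)
critical = burn_through_indication(work_map, threshold=0.9)
```

The resulting cell lists would feed the user interface generator so the flagged regions can be highlighted on the displayed work map.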
[00216] The system may be implemented such that the generated user interface is displayed on a mobile computing device.
[00217] The system may be implemented such that the generated user interface is displayed on a transparent surface by an augmented reality enabled device.
[00218] The system may be implemented such that the generated user interface is provided to a projection system which projects the user interface on a flat surface.
[00219] The system may be implemented such that the generated user interface is presented to a projection system which projects the user interface on the vehicle surface.
[00220] The system may be implemented such that the target contour is retrieved from a 3D model of a vehicle including the vehicle surface.
[00221] The system may be implemented such that the target contour is retrieved from an image of a vehicle including the vehicle surface.
[00222] The system may be implemented such that the image includes an opposing side of the vehicle.
[00223] The system may be implemented such that it includes a material removal rate characterizer configured to calculate a material removal rate for the sanding operation.
[00224] The system may be implemented such that the material removal rate is calculated based on the current contour and a previously recorded contour, and wherein the material removal rate is based on a detected volume of material removed from the previously recorded contour.
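The removal-rate calculation described above can be sketched as the volume lost between two recorded contours sampled on the same grid, divided by the elapsed time. The grid cell area and timing below are illustrative assumptions:

```python
def material_removal_rate(prev_contour, curr_contour, cell_area_mm2, dt_s):
    """Removal rate (mm^3/s) from the height drop between two contours
    sampled at the same grid points; negative drops are ignored."""
    volume = sum(max(p - c, 0) for p, c in zip(prev_contour, curr_contour))
    return volume * cell_area_mm2 / dt_s

# Heights (mm) at three grid points, 4 mm^2 per cell, 2 s apart.
rate = material_removal_rate([1.0, 0.8, 0.5], [0.6, 0.5, 0.5],
                             cell_area_mm2=4.0, dt_s=2.0)
```

A rate computed this way could then feed the sanding parameter generator, per the following embodiment, e.g. to slow the tool as the contour approaches target.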
[00225] The system may be implemented such that the material removal rate is calculated based on the current contour and a target contour.
[00226] The system may be implemented such that the generated sanding parameter is based on the calculated material removal rate.
[00227] The system may be implemented such that it includes a tool specification retriever configured to retrieve a tool indication for the sanding operation.
[00228] The system may be implemented such that the tool indication includes a rotational speed.
[00229] The system may be implemented such that the tool indication includes an orbital rotational speed.
[00230] The system may be implemented such that the tool indication includes a random-orbital rotational speed.
[00231] The system may be implemented such that the tool indication includes a force applied.
[00232] The system may be implemented such that it includes a light source.
[00233] The system may be implemented such that it includes a light source movement mechanism configured to change a position or orientation of the light source.
[00234] The system may be implemented such that it includes a light source controller configured to change a color or intensity of the light source.
[00235] A method of repairing a dent in a vehicle is presented that includes generating a target contour for an area of the vehicle including the dent, capturing an image of the area of the vehicle using an image capture device and, based on the image, generating a current contour for the area, based on the image, generating a contour differential for the area. The contour differential includes a location of a filler material excess for the area. The method further includes generating a sanding parameter for a sanding operation on the area, using a sanding parameter generator, wherein the sanding parameter is based on the contour differential, and generating a user interface, using a user interface generator, including the sanding parameter and displaying the user interface.
[00236] The method may be implemented such that it includes displaying the user interface using a projection system.
[00237] The method may be implemented such that the projection system includes a projector and the image capturing device.
[00238] The method may be implemented such that the user interface is projected onto a surface of the vehicle.
[00239] The method may be implemented such that the surface includes the area.
[00240] The method may be implemented such that the projection system includes a projector movement mechanism configured to change a position or an orientation of the projector.
[00241] The method may be implemented such that the projection system includes an image capturing device movement mechanism configured to change a position or orientation of the image capturing device.
[00242] The method may be implemented such that it includes displaying the user interface on a device with a screen.
[00243] The method may be implemented such that the device is an augmented reality enabled device.
[00244] The method may be implemented such that it includes generating a material removal rate for the sanding operation.
[00245] The method may be implemented such that the material removal rate is generated based on a difference between the current contour and a previously captured contour.
[00246] The method may be implemented such that the material removal rate is generated based on a difference between the current contour and the target contour.
[00247] The method may be implemented such that it includes retrieving a tool parameter related to a tool used in the sanding operation, and wherein the material removal rate is generated based on the tool parameter.
[00248] The method may be implemented such that it includes retrieving an abrasive article parameter related to an abrasive article used in the sanding operation, and wherein the material removal rate is generated based on the abrasive article parameter.
[00249] A method of repairing a surface defect on a vehicle is presented that includes receiving an indication of the surface defect, using a defect indication retriever, receiving an image of the vehicle, from a camera, wherein the image includes the surface defect, generating a location of the surface defect on the vehicle, based on the received image, generating a defect map, using a defect mapper, the defect map including the surface defect and generating a user interface for display, the user interface including the defect map.
[00250] The method may be implemented such that it includes receiving an indication that a status of the surface defect has changed and updating the defect map to indicate a new status of the surface defect.
[00251] The method may be implemented such that the user interface is a graphical user interface, and wherein the method further includes displaying the graphical user interface on a display of a device.
[00252] The method may be implemented such that it includes displaying the user interface on an augmented reality-enabled device.
[00253] The method may be implemented such that it includes displaying the user interface using a projector.
[00254] The method may be implemented such that the defect indication retriever includes a camera, wherein the method further includes capturing an image of the surface defect, and wherein the surface defect indication is detected within the captured image.
[00255] The method may be implemented such that the indication of the surface defect includes: a defect type, a defect location, or a defect status.
[00256] The method may be implemented such that receiving the indication includes a camera capturing an image of a user pointing to the defect.
[00257] The method may be implemented such that receiving the indication includes detecting a marking on the vehicle surface.
[00258] The method may be implemented such that receiving the indication includes receiving, through the user interface, a user input.
[00259] The method may be implemented such that the location is a relative location.
[00260] The method may be implemented such that it includes retrieving a 3D model of the vehicle.
[00261] The method may be implemented such that the location includes a coordinate set corresponding to a point on the 3D model.
[00262] A projection system for repairing a vehicle is presented that includes a projector configured to project a user interface onto a surface, an image capture system configured to capture a first image of an area containing the vehicle and a second image of the vehicle, a light source, and a defect tracking system. The defect tracking system includes a defect indication receiver configured to, based on the first or second image, detect a defect on the vehicle, a defect map generator that generates, based on the second image, a defect map including a position of the detected defect with respect to the vehicle, a user interface generator configured to generate the user interface, the user interface including the defect map and wherein the defect tracking system is configured to communicate the generated user interface to the projector.
[00263] The system may be implemented such that the image capture system is configured to capture a series of first images.
[00264] The system may be implemented such that it includes a feature detector configured to, based on the first image, detect a feature in the area and, based on the detected feature, generate a map of the area.
[00265] The system may be implemented such that each of the series of first images is analyzed by the feature detector.
[00266] The system may be implemented such that it includes a vehicle specification retriever that retrieves a 3D model of the vehicle; and wherein the generated map includes the detected defect mapped to the 3D model.
[00267] The system may be implemented such that the user interface includes a menu of options for a user, and wherein the image capture system includes a camera configured to capture a user image of a user in the area and wherein, based on an analysis of the user image, the user interface generator modifies the user interface such that the projector projects a new user interface.
[00268] The system may be implemented such that the image capture system includes a camera configured to capture a second area image, and wherein the system is configured to, based on an analysis of the second area image, detect a defect status change.
[00269] The system may be implemented such that the defect mapper, based on the detected defect status change, generates a new defect map that includes the defect status change.
[00270] The system may be implemented such that the area image analysis is done in real-time, such that the defect map is updated in real-time.
[00271] The system may be implemented such that the defect is a first defect, wherein the defect map illustrates a second defect, and wherein the defect mapper generates the new defect map such that a second defect is illustrated as unchanged.
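A sketch of the status-change update the preceding paragraphs describe: the changed defect is re-rendered while every other defect is carried over unchanged into the new defect map. The identifiers and map layout are hypothetical:

```python
def apply_status_change(defect_map, defect_id, new_status):
    """Return a new defect map with one defect's status changed and
    every other defect carried over unchanged (the input map is not
    mutated, matching the 'generates a new defect map' wording)."""
    return {d_id: (dict(d, status=new_status) if d_id == defect_id else d)
            for d_id, d in defect_map.items()}

first_map = {"d1": {"pos": (0.2, 0.5), "status": "open"},
             "d2": {"pos": (0.8, 0.1), "status": "open"}}
second_map = apply_status_change(first_map, "d1", "repaired")
```

Run per frame against real-time image analysis, this keeps the projected defect map current as defects are repaired, as in the real-time embodiment above.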
[00272] The system may be implemented such that a first camera captures the first image and wherein a second camera captures the second image.
[00273] The system may be implemented such that detecting a defect includes detecting a user pointing to a position in the first image, and analyzing the second image to detect the defect.
[00274] The system may be implemented such that it includes assigning a first location to the position in the first image, and a second location to the detected defect.
[00275] The system may be implemented such that it includes a registration system configured to detect a feature in the first image and, based on the detected feature, generate an area map; and a surface mapper that, based on the area map and the second image, generates a coordinate location for the detected defect.
[00276] The system may be implemented such that it includes a vehicle specification retriever configured to retrieve a 3D model of the vehicle, and wherein the coordinate location corresponds to a point on the 3D model.
[00277] The system may be implemented such that it includes a light source and a light source controller, wherein the light source controller is configured to, based on the first or second image, change a parameter of the light source.
[00278] The system may be implemented such that the parameter is a position, orientation, color or intensity.
[00279] The system may be implemented such that it includes a microphone configured to receive an audio signal, and wherein the user interface generator is configured to, based on the audio signal, generate a new user interface.

Claims

CLAIMS What is claimed is:
1. A defect tracking system comprising: an image capturing device configured to capture an image of a surface; a defect indicia receiver that is configured to receive a defect indication, wherein the defect indication comprises a defect on the surface; a defect mapper configured to map the defect to the surface; a user interface generator configured to generate a defect map to show the detected defect; and a display component configured to display the defect map.
2. The system of claim 1, wherein the captured image comprises the defect indication.
3. The system of claim 2, wherein the defect indication is detected using an image analyzer, the image analyzer being configured to identify the defect indication.
4. The system of claim 2, wherein the defect indicia receiver receives a user indication of the defect indication.
5. The system of claim 1, wherein the display component comprises a projection system that projects the updated defect map on the surface.
6. The system of claim 5, wherein the defect indication is received by a user indication receiver.
7. The system of claim 6, wherein the user indication receiver comprises the image capturing device.
8. The system of claim 6, wherein the image capturing device is a first image capturing device and wherein the user indication receiver is a second image capturing device.
9. The system of claim 1, wherein the defect indication comprises a defect type, a defect location, a defect status or a surface contour at the defect.
10. The system of claim 9, wherein the defect indication comprises a location relative to the surface.
11. The system of claim 9, wherein the defect indication comprises a location on the surface.
12. The system of claim 9, wherein the defect location comprises a defect area on the surface.
13. The system of claim 1, and further comprising a registration system configured to, based on an environment image, recognize an icon in the environment, and, based on the icon, generate a registration map.
14. The system of claim 1, and further comprising a repair plan generator that, based on the defect indication, generates a repair plan for removing the detected defect.
15. The system of claim 1, wherein the defect map is a second defect map, and wherein the second defect map is generated by updating a first defect map to include the mapped defect location.
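Claim 15 recites that the displayed defect map is a second defect map produced by updating a first defect map to include the newly mapped defect location. A minimal sketch of that update, under the assumption that a defect map can be represented as a list of defect records (the record fields and function name are hypothetical):

```python
# Illustrative sketch of claim 15: a second defect map is generated by
# updating a first defect map to include the mapped defect. The list-of-dicts
# representation is an assumption for illustration only.
def update_defect_map(defect_map: list, defect: dict) -> list:
    """Return a new defect map that includes the newly mapped defect,
    leaving the prior map unchanged."""
    return defect_map + [defect]


first_map = [{"type": "dent", "location": (12.0, 40.5), "status": "open"}]
second_map = update_defect_map(
    first_map, {"type": "scratch", "location": (30.2, 18.7), "status": "open"}
)
# second_map holds both defects; first_map still holds one
```

Returning a new map rather than mutating the first one matches the claim's framing of distinct first and second defect maps.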
16. A system for mapping surface defects on a vehicle, the system comprising:
    a device configured to present a user interface;
    an environment image capture device configured to capture an image of an environment;
    a user interface generator configured to generate the user interface, wherein the user interface comprises an input/output component configured to receive a defect indication for a defect;
    a defect mapper configured to, based on the defect indication, associate the defect with a position in the image of the environment; and
    wherein the user interface generator is configured to update the user interface in response to the associated defect position.
17. The system of claim 16, and further comprising a registration system configured to detect a feature in the environment image.
18. The system of claim 17, wherein the feature is a first feature, wherein the registration system is configured to detect a second feature in the environment image, and based on the detected first and second features, a registration map generator generates a registration map.
19. The system of claim 17, wherein the registration system detects the feature using a feature-based detection algorithm.
20. The system of claim 17, wherein the feature and the defect are different.
21. The system of claim 16, wherein the defect indication comprises a defect type, a defect location, or a defect repair status.
22. The system of claim 16, wherein the defect indication is a defect repair status change, and wherein the defect repair status change is a repaired indication indicating that the defect is repaired, and wherein, based on the repaired indication, the user interface generator updates the user interface to change a representation of the defect.
23. The system of claim 22, wherein the changed representation comprises a removal of the defect from the user interface.
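Claims 22 and 23 describe updating the user interface when a defect's repair status changes to "repaired", with the changed representation optionally being removal of the defect from the interface. A brief sketch of that status-driven update follows; the dictionary representation and function name are assumptions for illustration.

```python
# Sketch of claims 22-23: a repaired indication changes (here, removes)
# the defect's representation in the user interface. Names are hypothetical.
def apply_status_change(ui_defects: dict, defect_id: str, status: str) -> dict:
    """Return an updated UI defect set reflecting the status change."""
    updated = dict(ui_defects)
    if status == "repaired":
        # Claim 23: the changed representation is removal from the interface.
        updated.pop(defect_id, None)
    else:
        updated[defect_id] = {**updated[defect_id], "status": status}
    return updated


ui = {"d1": {"status": "open"}, "d2": {"status": "open"}}
ui = apply_status_change(ui, "d1", "repaired")
# "d1" no longer appears in the rendered defect map; "d2" is unchanged
```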
24. A sanding parameter generation system comprising:
    a target contour retriever that is configured to retrieve a target contour for a vehicle surface;
    a current contour retriever that is configured to retrieve a current contour for the vehicle surface;
    an image capturing device that is configured to capture an image of the vehicle surface;
    a sanding parameter generator that is configured to, based on a difference of the target contour and the current contour, and the captured image, generate a parameter for a sanding operation on the vehicle surface; and
    a communication component configured to communicate the generated parameter.
25. The system of claim 24, and further comprising:
    an image analyzer configured to determine, based on the captured image of the vehicle surface, a contact time between a tool and the vehicle surface;
    a work map generator configured to generate a work map for the surface;
    a burn-through indication generator configured to generate a burn-through indication based on the work map; and
    a graphical user interface generator configured to generate a user interface comprising the work map.
26. The system of claim 25, wherein the captured image is a series of captured images, wherein each of the series of captured images is associated with a timestamp.
27. The system of claim 25, wherein the burn-through indication is communicated to a user interface generator.
28. The system of claim 27, wherein the work map comprises the burn-through indication.
29. The system of claim 24, wherein the target contour is retrieved from an image of a vehicle comprising the vehicle surface.
30. The system of claim 29, wherein the image comprises an opposing side of the vehicle.
31. The system of claim 24, and further comprising a material removal rate characterizer configured to calculate a material removal rate for the sanding operation.
32. The system of claim 31, wherein the material removal rate is calculated based on the current contour and a previously recorded contour, and wherein the material removal rate is based on a detected volume of material removed from the previously recorded contour.
33. The system of claim 32, wherein the material removal rate is calculated based on the current contour and a target contour.
34. The system of claim 32, wherein the generated sanding parameter is based on the calculated material removal rate.
35. The system of claim 32, and further comprising a tool specification retriever configured to retrieve a tool indication for the sanding operation.
36. The system of claim 35, wherein the tool indication comprises a rotational speed, an orbital rotational speed, or a random-orbital rotational speed.
37. The system of claim 35, wherein the tool indication comprises a force applied.
38. The system of claim 24, and further comprising a light source.
39. The system of claim 38, and further comprising a light source movement mechanism configured to change a position or orientation of the light source.
40. The system of claim 38, and further comprising a light source controller configured to change a color or intensity of the light source.
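Claims 31–34 describe a material removal rate characterizer that computes the rate from the volume of material removed between a previously recorded contour and the current contour, and a sanding parameter generated from that rate. The sketch below illustrates one way such a calculation could look, assuming the contours are sampled as heightmaps; the grid representation, function names, and the proportional speed-scaling rule are all assumptions, not the claimed method.

```python
# Sketch of claims 31-34: removal rate from the volume difference between
# two contour heightmaps, then a tool-speed parameter scaled toward a
# target rate. Heightmap representation and scaling rule are illustrative.
def removal_rate(prev_heights, curr_heights, cell_area_mm2, elapsed_s):
    """Volume of material removed (mm^3) per second between two contours,
    each sampled as a flat list of cell heights (mm)."""
    removed = sum(
        max(p - c, 0.0) * cell_area_mm2
        for p, c in zip(prev_heights, curr_heights)
    )
    return removed / elapsed_s


def sanding_parameter(rate_mm3_s, target_rate_mm3_s, base_rpm=5000.0):
    """Scale tool rotational speed toward the target removal rate
    (hypothetical rule for claim 34; capped at 2x the base speed)."""
    if rate_mm3_s <= 0:
        return base_rpm
    return base_rpm * min(target_rate_mm3_s / rate_mm3_s, 2.0)


prev = [1.0, 1.0, 1.0, 1.0]   # previously recorded contour (mm)
curr = [0.9, 0.8, 1.0, 0.9]   # current contour after sanding (mm)
rate = removal_rate(prev, curr, cell_area_mm2=4.0, elapsed_s=2.0)
# rate is approximately 0.8 mm^3/s ((0.1 + 0.2 + 0.0 + 0.1) * 4.0 / 2.0)
rpm = sanding_parameter(rate, target_rate_mm3_s=0.4)
```

Only height decreases contribute to the removed volume, so measurement noise that raises a cell does not produce a negative removal rate.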

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363516974P 2023-08-01 2023-08-01
US63/516,974 2023-08-01

Publications (1)

Publication Number Publication Date
WO2025027539A1 true WO2025027539A1 (en) 2025-02-06

Family

ID=92503586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/057411 WO2025027539A1 (en) 2023-08-01 2024-07-31 Surface modification systems and methods

Country Status (1)

Country Link
WO (1) WO2025027539A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140065584A1 (en) * 2009-07-10 2014-03-06 Lincoln Global, Inc. Virtual testing and inspection of a virtual weldment
US20160339652A1 (en) * 2015-05-21 2016-11-24 The Boeing Company Remote Advanced Repair Guidance
US20170147990A1 (en) * 2015-11-23 2017-05-25 CSI Holdings I LLC Vehicle transactions using objective vehicle data
DE102018006834A1 (en) * 2018-08-29 2020-03-05 Psa Automobiles Sa System and method for supporting surface processing
WO2022038491A1 (en) * 2020-08-19 2022-02-24 3M Innovative Properties Company Robotic repair control systems and methods
US20220382262A1 (en) * 2019-10-28 2022-12-01 3M Innovative Properties Company Automated vehicle repair system
WO2023002413A1 (en) * 2021-07-21 2023-01-26 3M Innovative Properties Company Systems and methods for processing a worksurface


Similar Documents

Publication Publication Date Title
US12181862B2 (en) Automated vehicle repair system
US20230321687A1 (en) Robotic repair control systems and methods
JP7710444B2 (en) Robotic repair control system and method
CN108496124A (en) The automatic detection and robot assisted processing of surface defect
WO2017091308A1 (en) Damage assessment and repair based on objective surface data
US20240316768A1 (en) Systems and methods for processing a worksurface
JP7604190B2 (en) Inspection system, management device, inspection method, program, recording medium, and article manufacturing method
WO2025027539A1 (en) Surface modification systems and methods
US20240335943A1 (en) Systems and methods for processing a worksurface
JP2025515436A (en) Defect mapping and repair system and method
US20210097674A1 (en) System for identifying and correcting irregularities of the surfaces of an object
CN112444283B (en) Vehicle assembly detection device and vehicle assembly production system
US20250259290A1 (en) Systems and methods for post-repair inspection of a worksurface
Shi Autonomous Robotic Polishing System
WO2024141859A1 (en) Robotic surface modification systems and methods
WO2024141858A1 (en) Robotic surface modification systems and methods
KR20250129697A (en) Robot surface modification system and method
Martínez et al. A machine vision system for automated headlamp lens inspection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24759255

Country of ref document: EP

Kind code of ref document: A1