CN112444283B - Vehicle assembly detection device and vehicle assembly production system - Google Patents

Vehicle assembly detection device and vehicle assembly production system

Info

Publication number
CN112444283B
Authority
CN
China
Prior art keywords
vehicle assembly
robot
camera
laser scanner
detection device
Prior art date
Legal status
Active
Application number
CN201910822121.7A
Other languages
Chinese (zh)
Other versions
CN112444283A (en)
Inventor
陈卫华
毛瑞杰
马骥
杨洁
Current Assignee
BMW Brilliance Automotive Ltd
Original Assignee
BMW Brilliance Automotive Ltd
Priority date
Filing date
Publication date
Application filed by BMW Brilliance Automotive Ltd filed Critical BMW Brilliance Automotive Ltd
Priority to CN201910822121.7A
Publication of CN112444283A
Application granted
Publication of CN112444283B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 - Measuring or testing not otherwise provided for
    • G01D21/02 - Measuring two or more variables by means not covered by a single other subclass
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P - METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P23/00 - Machines or arrangements of machines for performing specified combinations of different metal-working operations not covered by a single other subclass
    • B23P23/06 - Metal-working plant comprising a number of associated machines or apparatus
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 - Manipulators not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/021 - Optical sensing devices
    • B25J19/023 - Optical sensing devices including video camera means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 - Vision controlled systems

Abstract

The application relates to a detection device for a vehicle assembly, comprising: a human-machine collaborative robot for moving around the vehicle assembly conveyed by a conveying unit; a camera mounted at the end of the robot for capturing images of the vehicle assembly; a three-dimensional laser scanner mounted at the end of the robot for three-dimensionally scanning the surface of the vehicle assembly; a control device for the robot; and a data processing device for processing the data acquired by the camera and the three-dimensional laser scanner in order to detect installation errors and/or appearance defects of the vehicle assembly, wherein the scanning positions of the three-dimensional laser scanner are selected and/or the movement path of the robot is planned according to the images acquired by the camera. The application further relates to a vehicle assembly production system.

Description

Vehicle assembly detection device and vehicle assembly production system
Technical Field
The present application relates to a detection apparatus for a vehicle assembly. The application also relates to a vehicle assembly production system.
Background
Currently, quality issues in vehicle assemblies such as instrument panel assemblies, for example whether components are properly installed and whether the gaps between components meet the standard, are detected manually on the production line. When the conveying unit delivers a vehicle assembly to the quality inspection station, an inspector visually checks the installed parts and measures each gap and surface difference with a feeler gauge and a surface difference gauge. However, as the variety of similar components mounted on the instrument panel assembly grows and the line speed increases, many quality problems can no longer be caught at the first opportunity by manual inspection. Problems found only after the instrument panel assembly has left the line are difficult to repair, consume long working hours, and are costly. Furthermore, the inspector must constantly move his or her body; performing these actions repeatedly over a long period can also cause occupational injury.
Disclosure of Invention
The object of the present application is to provide a detection device for a vehicle assembly that enables automated quality inspection of the vehicle assembly, in particular simultaneous automated inspection of installation errors and/or appearance defects. The application further provides a vehicle assembly production system with such a detection device.
One aspect of the application relates to a detection apparatus for a vehicle assembly, the detection apparatus comprising:
- a human-machine collaborative robot for moving around the vehicle assembly conveyed by a conveying unit;
- a camera mounted at the end of the human-machine collaborative robot for capturing images of the vehicle assembly;
- a three-dimensional laser scanner mounted at the end of the human-machine collaborative robot for three-dimensionally scanning the surface of the vehicle assembly;
- a control device for the human-machine collaborative robot;
- a data processing device for processing the data acquired by the camera and the three-dimensional laser scanner in order to detect installation errors and/or appearance defects of the vehicle assembly;
wherein the scanning positions of the three-dimensional laser scanner are selected and/or the movement path of the human-machine collaborative robot is planned according to the images acquired by the camera.
Within the scope of the application, a vehicle assembly is an assembly for a vehicle that consists of a plurality of components. Examples include vehicle interior assemblies (instrument panel assembly, door trim assembly, side wall trim assembly, roof trim assembly, back door trim assembly, luggage compartment assembly) and vehicle body surface assemblies (front and rear door assemblies, engine compartment assembly, back door assembly, and the like). These vehicle assemblies are assembled from a plurality of different components that are interconnected by plugging, overlapping, welding, bonding, bolting, etc.
In vehicle production, the assembly of the vehicle assemblies described above must be monitored to ensure that the correct components are installed in the correct manner. The specific items monitored are summarized herein as installation errors and/or appearance defects; they include the overall layout, configuration, and selection of components, the modeling, position, angle, material, and color of each component, scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, and texture on surfaces, gaps and surface differences between components, and geometric dimensional deviations of components. Particular attention is paid to the gaps and surface differences that arise at the connections described above. Such gaps and surface differences shape the aesthetic impression, represent production quality, directly reflect the vehicle grade, and can even affect the function of the vehicle assembly. Gap and surface difference detection is therefore particularly critical.
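As an illustration of gap and surface difference measurement, the following minimal Python sketch derives both quantities from a single scanner cross-section taken across a seam. The profile format, the median-based surface estimate, and the 2.0 mm depth threshold are assumptions for illustration, not the method claimed here.

```python
import numpy as np

def gap_and_flush(x, z, drop_mm=2.0):
    """Estimate gap width and surface (flush) difference from one scan profile.

    x, z    : 1-D arrays describing a height profile across a seam (mm).
    drop_mm : height drop below the nominal surface that counts as "inside
              the gap" (illustrative threshold).
    Returns (gap_mm, flush_mm).
    """
    x, z = np.asarray(x, float), np.asarray(z, float)
    surface = np.median(z)                      # nominal surface height
    inside = np.flatnonzero(z < surface - drop_mm)
    if inside.size == 0:
        return 0.0, 0.0                         # no seam found in this profile
    gap = x[inside[-1]] - x[inside[0]]          # width between the two gap edges
    left = np.median(z[:inside[0]]) if inside[0] > 0 else surface
    right = np.median(z[inside[-1] + 1:]) if inside[-1] + 1 < z.size else surface
    flush = left - right                        # step height between the two panels
    return float(gap), float(flush)
```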
According to the application, the human-machine collaborative robot is a robot designed to work together with humans; because humans may be close by or even interact with it, it is equipped with safety protection for the people around it. Compared with an ordinary industrial robot, the human-machine collaborative robot achieves motion with multiple degrees of freedom in a compact size, with high motion accuracy and high personnel safety. Therefore, no protective enclosure has to be erected around the detection device according to the application, and personnel passage and even interaction need not be prohibited.
In the present application, a human-machine collaborative robot is provided, and a camera for capturing images of the vehicle assembly and a three-dimensional laser scanner for three-dimensionally scanning the surface of the vehicle assembly are mounted at its end. The camera may be an industrial camera operating in the visible range. The three-dimensional laser scanner comprises a laser source, a scanning unit, a light-receiving sensor, a control unit, and the like, and may use the time-of-flight or triangulation ranging principle. The application is not limited to three-dimensional laser scanners; a three-dimensional structured-light scanner may also be used, because the human-machine collaborative robot gives the scanner mobility. When a three-dimensional structured-light scanner is used, the camera may even serve as part of the scanner.
Installation errors and/or appearance defects of the vehicle assembly, such as the overall layout, configuration, and selection of components, the modeling, position, angle, material, and color of each component, scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, and texture on the surface, gaps and surface differences between components, and geometric dimensional deviations of components, can be detected from the image data collected by the camera and the scan data collected by the three-dimensional laser scanner mounted at the end of the human-machine collaborative robot. In contrast to purely fixedly mounted cameras and laser scanners, the mobility of the human-machine collaborative robot enables stereoscopic detection at multiple positions over the surface of the vehicle assembly, such as top, bottom, front, back, and sides. Particularly for vehicle assemblies with irregular surfaces, such as instrument panel assemblies, detection can be performed on uneven or curved surfaces without being limited to fixed collection points. Detection can also be carried out on surfaces arranged on the inner walls of a space; for example, when inspecting a luggage compartment assembly, the human-machine collaborative robot can move its end into the luggage compartment space. Combining the human-machine collaborative robot with the camera and the three-dimensional laser scanner thus extends their optical detection range: different positions in space can be inspected, for example by shooting and scanning point by point at a plurality of positions, and in particular the detection can be carried out continuously with the motion of the robot, for example following a curved or non-linear contour, so that an uninterrupted profile along the curve or contour is obtained.
According to the application, since the camera and the three-dimensional laser scanner are installed together at the end of the human-machine collaborative robot, the image and the three-dimensional configuration of the vehicle assembly, in particular gaps, surface differences, and surface defects, can be acquired simultaneously, so that a single pass over the vehicle assembly suffices to obtain a comprehensive picture of its surface condition. This achieves not only a relatively "macroscopic" inspection of the vehicle assembly, such as whether the overall layout, configuration, selection, shape, position, angle, material, and color of the components are correct or deviate substantially from the assembly requirements, but also a relatively "microscopic" inspection: scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences, and geometric dimensional deviations of components on the surface of the vehicle assembly.
Furthermore, according to the application, the scanning positions of the three-dimensional laser scanner are selected and/or the movement path of the human-machine collaborative robot is planned according to the images acquired by the camera. The camera arranged at the end of the robot is thus not merely a simple image recorder; it is the basis for the visual guidance of the robot and/or of the three-dimensional laser scanner. The execution of the measurement, i.e. the movement of the human-machine collaborative robot and/or the scanning positions of the three-dimensional laser scanner, is controlled by the images acquired by the camera. In the present application, the camera, the three-dimensional laser scanner, and the human-machine collaborative robot are therefore not independent of one another or simply added together, but organically combined and cooperating with one another.
With the detection device according to the application, it can be verified automatically that the correct components are installed in the vehicle assembly in the correct, specification-compliant manner. First, manual inspection can be replaced, saving inspection labor costs, reducing the extra working hours caused by rework, and improving production efficiency. Moreover, since contacting measuring tools such as feeler gauges are no longer used, accidental damage to the vehicle assembly during measurement is avoided, as is the repetitive bodily movement of manual inspection and the occupational injury it can cause. Second, since the human-machine collaborative robot allows the vehicle assembly to be inspected comprehensively, the dimensions of inspection are increased relative to fixedly arranged cameras and scanners, so that any part of the surface of the vehicle assembly can be inspected without being limited to a fixed shooting range. Third, the human-machine collaborative robot markedly improves detection accuracy and efficiency: for example, the accuracy of gap detection by the three-dimensional laser scanner can reach 0.1 mm, and in combination with the robot's fast and precise positioning movements, the time required to inspect a vehicle assembly is shortened and production efficiency is improved.
According to one embodiment of the application, the camera is used to collect the overall layout, configuration, and selection of components, and the modeling, position, angle, material, and color of each component of the vehicle assembly; and/or the three-dimensional laser scanner is used to collect scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences, and geometric dimensional deviations of components on the surface of the vehicle assembly. Both the relatively "macroscopic" and the relatively "microscopic" inspection of the vehicle assembly can thus be accomplished. Specifically, a still image (photograph) or a moving image (video) of the vehicle assembly can be captured by the camera at the end of the robot, and the overall layout, configuration, selection, shape, position, angle, material, and color can be determined by recognizing the image. Taking a vehicle instrument panel assembly as an example, recognition of the camera image can establish whether the overall layout is correct; whether the correct components are installed; whether each component has the correct shape; whether each component is mounted in a position conforming to the specification; and whether each component has the correct material and color. By comparison with a pre-stored image, it can be ascertained not only whether the correct component is present in the image, but also whether the component is mounted in the correct manner (e.g. orientation, position, angle). The three-dimensional laser scanner, in turn, captures the surface details of the vehicle assembly, such as scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, and texture, and also obtains the gaps, surface differences, and geometric dimensional deviations of the components with high precision through three-dimensional scanning.
According to one embodiment of the present application, point cloud information comprising three-dimensional coordinates and color information is obtained by the camera using stereoscopic vision measurement, and this point cloud information can be coupled with the point cloud information comprising three-dimensional coordinates and laser reflection intensity obtained by the three-dimensional laser scanner. Coupling here means associating the color information and the laser reflection intensity that share the same three-dimensional coordinates. This yields a point cloud set that completely restores the surface condition of the vehicle assembly: for the three-dimensional coordinates of every surface point, corresponding laser reflection intensity and color information are available. In this way, complete surface information about the vehicle assembly is formed, from which a colored three-dimensional model of the vehicle assembly can be reconstructed. To apply the stereoscopic vision measurement, it is preferred to use images captured by a monocular camera at different positions, exploiting the mobility of the human-machine collaborative robot and/or of the vehicle assembly on the conveying unit. Alternatively, a binocular camera may be used.
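A minimal sketch of the coupling step, assuming both point clouds are already expressed in a common coordinate frame: nearest-neighbour matching on the three-dimensional coordinates associates each scanner point with the color of the closest camera point. The array layout and the max_dist threshold are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def couple_point_clouds(cam_xyz, cam_rgb, scan_xyz, scan_intensity, max_dist=0.5):
    """Associate color and laser reflection intensity sharing the same coordinates.

    cam_xyz (N,3), cam_rgb (N,3)        : from stereoscopic vision measurement.
    scan_xyz (M,3), scan_intensity (M,) : from the three-dimensional laser scanner.
    Returns a (K,7) array [x, y, z, r, g, b, I] for scanner points that have a
    camera point within max_dist (same length unit as the clouds).
    """
    tree = cKDTree(cam_xyz)
    dist, idx = tree.query(scan_xyz, k=1)       # nearest camera point per scan point
    ok = dist < max_dist
    return np.hstack([scan_xyz[ok], cam_rgb[idx[ok]],
                      np.asarray(scan_intensity, float)[ok, None]])
```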
According to one embodiment of the application, the detection device comprises a reading device for reading the number of the vehicle assembly and/or of the conveying unit; from this number, the data processing device obtains standard data for the vehicle assembly and compares them with the collected data. Knowing the number of the vehicle assembly and/or of the conveying unit also facilitates archiving the detection data for later traceability. This embodiment is particularly advantageous where mixed-line production is involved. In a mixed-line production system, vehicle assemblies produced simultaneously on one production line may belong to different vehicle models and/or to different configurations of the same vehicle model. For example, instrument panel assemblies for passenger cars and off-road vehicles may be produced in sequence, even interspersed, on the same production line. There may also be subtle differences between instrument panel assemblies for the same vehicle model, such as chrome-plated trim installed at some seams. Manual inspection may then confuse the configurations, or overlook such details entirely. According to this embodiment, standard data of the corresponding vehicle assembly, such as standard images, standard dimensions, and prescribed tolerances, can easily be obtained by reading the number. The data processing device compares the standard data with the actually acquired data to determine deviations from the standard, such as differences in position, shape, or color of components in the image, or whether the gaps at joints meet the specification. The reading device may comprise a code scanner, a number reader, an optical sensor with number recognition, an RFID tag reader, or the like. In particular, the reading device reads the number of the vehicle assembly and/or of the conveying unit by image recognition and/or wireless communication. It is particularly preferred to use the camera arranged at the end of the human-machine collaborative robot as the reading device, which simplifies the construction of the detection device according to the application.
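A sketch of the comparison against standard data keyed by a read number. The table, the measured quantity, and the tolerances are hypothetical values invented for illustration; the keys 001 and 125 merely echo the carrier numbers shown in Fig. 1.

```python
# Hypothetical standard-data table keyed by the carrier/assembly number.
STANDARDS = {
    "001": {"glovebox_gap_mm": (3.5, 0.5)},  # (nominal, tolerance), illustrative
    "125": {"glovebox_gap_mm": (3.5, 0.3)},
}

def check_against_standard(unit_number, measurements):
    """Compare acquired measurements with the standard data for this number."""
    standard = STANDARDS[unit_number]
    report = {}
    for name, measured in measurements.items():
        nominal, tol = standard[name]
        report[name] = {"measured": measured,
                        "deviation": measured - nominal,
                        "ok": abs(measured - nominal) <= tol}
    return report

# Example: a gap measured at 4.2 mm against a 3.5 +/- 0.5 mm standard fails.
print(check_against_standard("001", {"glovebox_gap_mm": 4.2}))
```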
According to one embodiment of the application, the three-dimensional laser scanner scans the vehicle assembly as a whole or only locally. The vehicle assembly as a whole may be scanned throughout to obtain comprehensive and detailed inspection information. However, to save detection time and reduce data throughput and storage, spot checks may also be performed on only a portion of the positions of the vehicle assemblies.
According to one embodiment of the application, the detection takes place while the conveying unit is stopped; a deviation of the stop position of the vehicle assembly is detected by the camera, and the control device compensates the movements of the human-machine collaborative robot according to this stop-position deviation. For example, the deviation between the position at which the conveying unit parks the vehicle assembly and the standard detection position is measured by recognition of the image acquired by the camera. If the deviation is zero, the movements and positioning of the robot during measurement follow the preset program and the preset reference positions; if it is not zero, the deviation is folded into the preset trajectory and reference positions of the robot, adjusting the movement path and the measuring points in time so that the correct positions are measured.
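A minimal sketch of this stop-position compensation; representing the compensation as a plain translational offset applied to every taught waypoint is a simplifying assumption.

```python
import numpy as np

def compensate_waypoints(preset_waypoints, stop_deviation):
    """Fold the measured stop-position deviation into the taught trajectory.

    preset_waypoints : (N,3) positions taught for the standard stop position.
    stop_deviation   : (3,) offset of the actual stop position, measured from
                       the camera image; zero when the carrier stops exactly.
    """
    points = np.asarray(preset_waypoints, float)
    d = np.asarray(stop_deviation, float)
    if np.allclose(d, 0.0):
        return points            # deviation zero: run the preset program as taught
    return points + d            # otherwise shift every waypoint by the deviation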
According to one embodiment of the application, the detection takes place while the conveying unit is moving; the movement of the vehicle assembly is captured by the camera, the control device causes the end of the human-machine collaborative robot to track the detected movement, and/or the data processing device takes the detected movement into account in the data processing. In this case, the camera provides real-time position detection of the vehicle assembly moving with the conveying unit. The robot can then track the detected motion, so that its end moves synchronously with the vehicle assembly, or the detected motion is superimposed on, or subtracted from, the detection movement the end would perform if the vehicle assembly were stationary. Superposition means that the movement executed by the robot end contains not only the detection movement component that would have been performed anyway, but additionally the component of the vehicle assembly's movement with the conveying unit, so that the end performs a larger movement than with a stationary assembly. Subtraction means, for example, that because the vehicle assembly moves uniformly in one direction, a movement of the robot end in that direction can be omitted from the original detection movement. Furthermore, the data processing device takes the detected movement into account in the data processing, in particular compensating for detected movements, vibrations, jolts, and the like.
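The superposition and subtraction described above can be sketched under the simplifying assumption that the motions are combined at the velocity level in a common frame:

```python
import numpy as np

def tracked_tip_velocity(detection_velocity, conveyor_velocity, mode="superimpose"):
    """Combine the taught detection motion with the measured conveyor motion.

    detection_velocity : (3,) tip velocity planned for a stationary assembly.
    conveyor_velocity  : (3,) assembly velocity measured from the camera images.
    mode "superimpose" : the tip additionally follows the moving assembly;
    mode "subtract"    : a component already provided by the conveyor is
                         removed from the taught motion.
    """
    v = np.asarray(detection_velocity, float)
    c = np.asarray(conveyor_velocity, float)
    return v + c if mode == "superimpose" else v - c
```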
According to one embodiment of the application, when a non-compliant installation error and/or appearance defect is determined by one of the camera and the three-dimensional laser scanner, the data processing device performs a verification using the other. In this way the same problem is double-checked in two different sets of detection data, reducing detection errors and improving detection accuracy.
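This double check amounts to routing a finding from one sensor to a verification step on the other; a schematic sketch, with the verification callables left as assumptions:

```python
def double_check(finding, verify_in_image, verify_in_scan):
    """Confirm a finding from one sensor with the data of the other.

    finding         : dict with at least a 'source' key ("camera" or "scanner").
    verify_in_image : callable(finding) -> bool, check in the camera data.
    verify_in_scan  : callable(finding) -> bool, check in the scanner data.
    Returns True only if the second, independent sensor confirms the finding.
    """
    verify = verify_in_scan if finding["source"] == "camera" else verify_in_image
    return verify(finding)
```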
According to one embodiment of the application, the robot has the camera at its end capture images of the vehicle assembly at least two different positions, and the data processing device reconstructs a three-dimensional model of the vehicle assembly from these images and the displacement between the at least two positions. Stereoscopic vision measurement is used here as well; a rough three-dimensional model of the vehicle assembly to be inspected can thus be built with the existing monocular camera alone. In this way, as many inspection tasks as possible are accomplished with the components the detection apparatus according to the present application already has.
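Under the simplifying assumptions of a calibrated camera and a pure sideways translation between the two poses (the displacement being known from the robot), the depth recovery reduces to standard stereo triangulation; a minimal sketch:

```python
def depth_from_two_views(u1_px, u2_px, baseline_mm, focal_px):
    """Depth of a feature seen from two camera poses of the robot end.

    u1_px, u2_px : horizontal pixel coordinates of the same feature in the
                   first and second image (rectified views assumed).
    baseline_mm  : tip displacement between the two poses, known from the robot.
    focal_px     : focal length in pixels, from camera calibration.
    """
    disparity = u1_px - u2_px                    # must be non-zero for finite depth
    return focal_px * baseline_mm / disparity    # Z = f * b / d
```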
According to one embodiment of the application, predefined feature points are determined in the established three-dimensional model of the vehicle assembly, and from them the scanning positions of the three-dimensional laser scanner are selected and/or the movement path the human-machine collaborative robot must execute for the scan is planned. In particular, the feature points are regions on the surface where the brightness and/or gray level changes strongly. Such points often represent edges, gaps, steps, or decorative strips, and it is precisely at these points that gaps, surface differences, and the like need to be measured. The detection device according to the application can therefore find the positions to be inspected by itself, without user-defined specification, making the detection equipment intelligent.
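A sketch of this self-selection of scan positions: candidate feature points are taken where the gray-level gradient is strong, then ordered into a short tour with a greedy nearest-neighbour heuristic. The threshold and thinning stride are illustrative, and the patent does not prescribe a particular planner.

```python
import numpy as np

def plan_scan_path(gray, grad_thresh=60.0, stride=25, start=(0, 0)):
    """Pick candidate scan positions at strong gray-level changes and order
    them into a short tour with a greedy nearest-neighbour heuristic.

    gray                : 2-D grayscale image of the assembly surface.
    grad_thresh, stride : illustrative values; stride thins the candidate set.
    """
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)                                  # gray-level change strength
    candidates = np.argwhere(mag > grad_thresh)[::stride]   # edges, gaps, steps, strips
    path, current = [], np.asarray(start, float)
    remaining = [p.astype(float) for p in candidates]
    while remaining:
        dists = [np.linalg.norm(p - current) for p in remaining]
        current = remaining.pop(int(np.argmin(dists)))      # visit the nearest next
        path.append((int(current[0]), int(current[1])))
    return path
```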
According to one embodiment of the application, the detection device comprises a gripper or a manipulator arranged at the end of the human-machine collaborative robot. The detection device is then capable not only of visual inspection but also of testing movement functions. Preferably, the gripper or manipulator performs operations on the vehicle assembly, such as pressing keys, turning knobs, opening and closing glove boxes or doors, folding sun visors, or adjusting ventilation grilles. Particularly preferably, sensors are provided in the gripper or manipulator for measuring the resistance and/or torque fed back when the vehicle assembly is actuated; these allow a further quantitative assessment of the movement functions of the vehicle assembly. The gripper or manipulator can be integrated with the camera and the three-dimensional laser scanner at the end of the human-machine collaborative robot in various ways. For example, it may be integrated with them on opposite sides of a rotatable part of the end. It is also conceivable to mount the gripper or manipulator so that it can pivot or stow away to the rear, so that it does not interfere with the detection by the camera and the three-dimensional laser scanner.
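The quantitative assessment of a movement function from the fed-back torque can be sketched as a simple tolerance-band check; the band itself is an invented example value:

```python
def check_actuation(torque_samples_nm, expected_range_nm=(0.2, 1.5)):
    """Judge a movement function (key, knob, glove box) from fed-back torque.

    torque_samples_nm : torque values recorded while the gripper operates
                        the control.
    expected_range_nm : illustrative tolerance band for the peak torque.
    """
    lo, hi = expected_range_nm
    peak = max(torque_samples_nm)
    return {"peak_nm": peak, "ok": lo <= peak <= hi}
```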
According to one embodiment of the application, the detection device comprises an archiving device for storing data about the vehicle assembly. Archiving the data allows a production file to be established for each vehicle assembly, facilitating future quality queries and quality tracing.
According to one embodiment of the application, the detection device comprises an alarm device for alerting the user when a non-compliant installation error and/or an appearance defect is determined, in particular by emitting an optical and/or acoustic fail signal. The detection device according to the application is preferably arranged after the respective assembly step, in particular immediately adjacent to the assembly station, so that inspection takes place right after assembly and correctness is known immediately. When a wrong component has been fitted, or a component has not been fitted to specification, the alarm device immediately feeds this back to the assembly personnel by light and/or sound, so that they can correct the error at once, for example by replacing the component or adjusting its installation. The optical signal can be given by switching lights on and off, by light color, or by dynamic light effects; the acoustic signal by a prompt tone, speech, music, etc. An acoustic signal is preferred, since the assembly personnel then need not spend time watching a lamp and can concentrate on the next assembly step. According to this embodiment, only vehicle assemblies in which the correct components are installed in the correct, specification-compliant manner pass on for further assembly along the production line. The pass rate is thus ensured with very high efficiency, the quality of the vehicle assemblies is improved, and the working hours spent on rework and repair are reduced. In particular, the detection device comprises a display device for displaying to the user, as an image, the detection result and/or the determined non-compliant installation errors and/or appearance defects; the determined errors and defects can be marked on the image captured by the camera. Preferably, the display device shows at least one of: the reference image, the differences between the acquired image and the reference image, and the correct component layout and configuration. The assembly personnel can thus see clearly from the display where the assembly problem lies, which facilitates replacing or refitting components.
According to a further aspect, the application also relates to a vehicle assembly production system comprising a conveying unit for conveying a vehicle assembly and a detection device according to the application. For the various embodiments and technical effects of the vehicle assembly production system, reference is made to the above description of the detection apparatus. The vehicle assembly production system of the present application achieves automated inspection and efficient production, reduces error rates, increases the pass rate of vehicle assemblies, and lowers labor and time costs.
Drawings
FIG. 1 shows a schematic view of a detection apparatus for a vehicle assembly and a detected vehicle assembly according to the present application;
FIG. 2 shows a perspective view of a detection device and detected vehicle assembly according to the present application;
FIG. 3 shows a schematic view of one embodiment of the end of the human-machine collaborative robot of a detection device according to the present application;
FIG. 4 shows a schematic view of a further embodiment of the end of the human-machine collaborative robot, with a gripper, of a detection device according to the present application;
fig. 5 shows the detection path of the detection device according to the application when the instrument panel assembly is scanned locally.
Detailed Description
Fig. 1 shows a schematic view of a detection device for a vehicle assembly according to the application, together with the vehicle assembly being inspected. Detection is illustrated in the figures using an instrument panel assembly 1 for a vehicle as an example. The instrument panel assembly 1 is carried by a conveying unit 2 configured as a hanger. Each instrument panel assembly 1 is conveyed in succession along the steel beam in the conveying direction 9 by the conveying unit 2 through the detection device according to the application. The conveying unit 2 has a motor 3 that drives its movement.
The detection device according to the present application comprises:
a human-machine collaborative robot 4 for moving around the instrument panel assembly 1 conveyed by the conveying unit 2;
a camera 5 mounted at the end of the human-machine collaborative robot 4 for capturing images of the instrument panel assembly 1;
a three-dimensional laser scanner 6 mounted at the end of the human-machine collaborative robot 4 for three-dimensionally scanning the surface of the instrument panel assembly 1;
a control device 7 for the human-machine collaborative robot 4;
a data processing device 8 for processing the data acquired by the camera 5 and the three-dimensional laser scanner 6 in order to detect installation errors and/or appearance defects of the instrument panel assembly 1.
According to the application, the scanning positions of the three-dimensional laser scanner 6 are selected and/or the movement path of the human-machine collaborative robot 4 is planned from the images acquired by the camera 5.
The camera 5 is used to collect the overall layout, configuration, and selection of components, and the modeling, position, angle, material, and color of each component of the vehicle assembly; and/or the three-dimensional laser scanner 6 is used to collect scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences, and geometric dimensional deviations of components on the surface of the vehicle assembly. In particular, point cloud information comprising three-dimensional coordinates (e.g. xyz) and color information (e.g. RGB) can be obtained by the camera 5 using stereoscopic vision measurement and coupled with the point cloud information comprising three-dimensional coordinates (e.g. xyz) and laser reflection intensity (e.g. I) obtained by the three-dimensional laser scanner 6. This yields a point cloud set that completely restores the surface condition of the instrument panel assembly 1, in which corresponding laser reflection intensity and color information exist for the three-dimensional coordinates of every surface point.
In addition, since the camera 5 and the three-dimensional laser scanner 6 are provided on the same end, double verification can also be achieved: when a non-compliant installation error and/or appearance defect is determined by one of them, the data processing device 8 performs a verification using the other.
In Fig. 1, each conveying unit 2 is marked with its own number, shown here as 001 and 125. The detection device according to the application may be provided with a reading device for reading this number; from it, the data processing device 8 obtains standard data of the instrument panel assembly 1 and compares them with the acquired data. It is particularly advantageous to use the camera 5 as the reading device.
Fig. 2 shows a perspective view of a detection device according to the application and the vehicle assembly being inspected. Only the end 10 of the human-machine collaborative robot 4 is shown schematically in this figure; the camera 5 and the three-dimensional laser scanner 6 arranged on this end 10 are omitted for clarity.
The vehicle assembly, here the instrument panel assembly 1, is conveyed along the steel beam by the schematically shown conveying unit 2 and reaches the inspection station.
After the instrument panel assembly 1 has reached the inspection station, the number of the instrument panel assembly and/or of the conveying unit can first be read; from this number, the data processing device 8 obtains the standard data, i.e. the standard image, for this instrument panel assembly. The camera 5 can then take a full view image of the instrument panel assembly 1 at a predetermined position. The data processing device 8 comprises an image processing device for performing the image comparison, for example a computer, programmable logic circuitry, a Field-Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), on which a corresponding program or instructions are installed. The image processing device may compare each part of the acquired image with the standard image; the comparison establishes whether the acquired image matches the standard image. Advantageously, only features are compared, the features comprising: icons, characters, bar codes, numbers, lines, and corners of the vehicle assembly; the size, shape, contour, position, and orientation of components; and the distance, angle, and connection of components relative to other parts of the vehicle. Extracting only these simple features from the acquired images enables a targeted test. For example, when verifying an instrument panel, differently configured panels may bear different patterns, numbers, and/or characters, so comparing these features can verify that the correct instrument panel for the vehicle configuration has been assembled, and assembled in the correct manner. Particularly preferably, the images are compared after gray-scale processing of the acquired image. This is advantageous because the data volume of a gray-scale image is greatly reduced compared with a color image, lowering the computational requirements of the comparison; moreover, the gray levels can be processed uniformly to prevent deviations caused by changes in the lighting environment. In a gray-scale comparison, the features of the components to be compared can be obtained simply by extracting pixels whose gray values lie within a threshold range; corresponding features can likewise be extracted from changes in the gray-level gradient within the image. Gray-scale processing of the acquired image therefore simplifies computation, lowers the computational requirements, and improves both verification speed and accuracy.
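A minimal sketch of the gray-scale comparison, assuming the acquired image and the stored standard image show the same view at the same resolution and that the standard image was normalized the same way; the difference threshold is illustrative:

```python
import cv2

def compare_to_standard(acquired_bgr, standard_gray, diff_thresh=40):
    """Gray-scale comparison of the acquired image with the standard image.

    Gray-scale conversion shrinks the data volume, and histogram equalization
    gives a uniform gray-level processing that suppresses deviations caused
    by changing light conditions.
    """
    gray = cv2.cvtColor(acquired_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)                  # uniform gray-level processing
    diff = cv2.absdiff(gray, standard_gray)        # per-pixel deviation
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mismatch = cv2.countNonZero(mask) / mask.size  # fraction of deviating pixels
    return mismatch, mask                          # mask marks where they differ
```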
The conveying unit 2 may park the instrument panel assembly 1 at the inspection station, or may move it slowly past the inspection station while the detection is performed.
In the case of a stop of the conveying unit, a deviation of the stop position of the instrument panel assembly 1 from the standard detection position is detected by the camera 5, and the control device 7 compensates the movements performed by the human-machine collaborative robot 4 during detection according to this stop-position deviation.
The detection is not limited to the stopped case described above: it may also take place while the conveying unit 2 is moving. The movement of the instrument panel assembly 1 is then acquired by the camera 5, the control device 7 causes the end 10 of the human-machine collaborative robot 4 to track the detected movement, and/or the data processing device 8 takes the detected movement into account in the data processing. For example, the control device 7 moves the end 10 synchronously with the instrument panel assembly 1, or superimposes the detected movement on, or subtracts it from, the detection movement the end 10 would perform with a stationary vehicle assembly. Furthermore, the data processing device 8 takes the detected movement into account in the data processing, in particular to compensate for movements, vibrations, jolts, and the like of the instrument panel assembly 1 detected by the camera 5.
Fig. 3 shows a schematic view of an embodiment of the end 10 of the human-machine collaborative robot 4 of the detection device according to the application. A camera 5 and a three-dimensional laser scanner 6 arranged side by side are shown schematically on the movable end 10. The camera 5 is an industrial camera; the three-dimensional laser scanner 6 comprises a laser source, a scanning unit, a light-receiving sensor, a control unit, and the like, and can use the time-of-flight or triangulation ranging principle. The camera 5 and the three-dimensional laser scanner 6 are not limited to the left-right side-by-side arrangement shown; they may also be arranged one above the other, or the components of the three-dimensional laser scanner 6 may be arranged separately, for example with the laser source on one side of the camera 5 and the light-receiving sensor on the other.
Fig. 4 shows a schematic view of a further embodiment of the end 10 of the human-machine collaborative robot 4 of the detection device according to the application. It differs from the embodiment shown in Fig. 3 in that the detection device comprises a schematically shown gripper 11 arranged at the end 10. The gripper can be used for gripping objects and the like. Here, the gripper 11 is integrated with the camera 5 and the laser scanner 6 on the same component, which can rotate about the axis 12. By rotation, the gripper 11 can be moved out of its stowed position, whereupon it constitutes an end effector of the human-machine collaborative robot 4 with which operations are performed on the vehicle assembly. In particular, sensors (not shown) are provided in the gripper 11 for measuring the resistance and/or torque fed back when the vehicle assembly is actuated.
The three-dimensional laser scanner 6 may scan the vehicle assembly as a whole during inspection. The control device 7 of the human-machine collaborative robot 4 then guides the end 10 back and forth over the entire surface of the instrument panel assembly 1, so that three-dimensional information about the whole surface of the instrument panel assembly 1 is obtained comprehensively.
Fig. 5 shows the detection path of the detection device according to the present application when the instrument panel assembly 1 is scanned locally. Here, for higher detection efficiency, the detection apparatus scans local gaps with the three-dimensional laser scanner 6 only at individual regions of the instrument panel assembly 1. The positions to be scanned are marked with circles in Fig. 5. In addition, the position of the glove box switch, where an opening operation is to be performed with the gripper 11, is marked with a square. During detection, the human-machine collaborative robot 4 holds its end 10 above these marked positions and scans the gap beneath it with the three-dimensional laser scanner 6 or operates the switch with the gripper 11. After one position has been scanned, the robot 4 moves the end 10 to the next position along the movement path indicated by the arrows in Fig. 5 and continues scanning.
In particular, thanks to the mobility of the human-machine collaborative robot 4, stereoscopic vision measurement can be achieved even with only one camera 5: the robot 4 has the camera 5 at its end 10 acquire images of the instrument panel assembly 1 at at least two different positions, and the data processing device 8 reconstructs a three-dimensional model of the instrument panel assembly 1 from these images and the displacement between the two positions. Predefined feature points are preferably determined in the established three-dimensional model of the vehicle assembly; these may be regions on the surface where the brightness and/or gray level changes strongly. The circles marked in Fig. 5 represent such feature points derived from the three-dimensional model. Once the feature points to be inspected have been determined, the scanning positions of the three-dimensional laser scanner 6 can be selected and a movement path of the robot 4, such as the path indicated by the arrows in Fig. 5, can be planned for the scan.
Although not shown in the figures, the detection device according to the application may also comprise an archiving device, an alarm device, and/or a display device. The archiving device stores data about the vehicle assembly; in particular, the data acquired by the three-dimensional laser scanner 6 are recorded together with the data acquired by the camera 5. The alarm device can alert the user when a non-compliant installation error and/or an appearance defect is determined. The display device can display the detection result and/or the determined non-compliant installation errors and/or appearance defects to the user as an image.
The application is not limited to the embodiments shown, but includes all technical equivalents that fall within the effective scope of the appended claims. Positional references in the description, such as up, down, left, and right, refer to the description and drawings as shown and carry over accordingly when the position changes.
Although the present application has been described with reference to preferred embodiments, it is not limited to them. Any person skilled in the art can make possible variations and modifications to the technical solution of the application using the methods and technical content disclosed above without departing from its spirit and scope; accordingly, any simple modifications, equivalent variations, and refinements of the above embodiments based on the technical substance of the present application fall within the scope of protection of its technical solution.
The features disclosed herein can be implemented not only individually but also in any combination. It should also be noted that the figures of the application are schematic and may not be to scale. In the various embodiments, the number, configuration, and/or arrangement of components is likewise not limited to the examples shown. Values given in the specification are merely reference values, which may be exceeded or undershot when dimensions are suitably chosen.

Claims (15)

1. A detection apparatus for a vehicle assembly, the detection apparatus comprising:
- a human-machine collaborative robot for moving around the vehicle assembly conveyed by a conveying unit;
- a camera mounted at the end of the human-machine collaborative robot for capturing images of the vehicle assembly;
- a three-dimensional laser scanner mounted at the end of the human-machine collaborative robot for three-dimensionally scanning the surface of the vehicle assembly;
- a control device for the human-machine collaborative robot;
- a data processing device for processing the data acquired by the camera and the three-dimensional laser scanner in order to detect installation errors and/or appearance defects of the vehicle assembly;
wherein the scanning positions of the three-dimensional laser scanner are selected and/or the movement path of the robot is planned on the basis of the images acquired by the camera, wherein the robot acquires images of the vehicle assembly at least two different positions by means of the camera at its end, wherein the data processing device reconstructs a three-dimensional model of the vehicle assembly from these images and the displacement between the at least two positions, wherein a plurality of feature points, being points of strongly varying brightness and/or gray level that represent edges, gaps, steps, or decorative strips, are determined in the three-dimensional model of the vehicle assembly, and wherein the scanning positions of the three-dimensional laser scanner are selected by means of the feature points and the movement path the robot needs to execute for the scan is planned,
wherein the detection by the detection device takes place while the conveying unit is moving, the movement of the vehicle assembly being acquired by the camera, the control device causing the end of the robot to track the detected movement, and the data processing device taking the detected movement into account in the data processing,
and wherein the camera and the three-dimensional laser scanner simultaneously acquire an image and a three-dimensional configuration of the vehicle assembly, respectively, point cloud information comprising three-dimensional coordinates and color information being obtained by the camera using stereoscopic vision measurement and being couplable with the point cloud information comprising three-dimensional coordinates and laser reflection intensity obtained by the three-dimensional laser scanner.
2. The detection apparatus of claim 1, wherein said camera is utilized to capture the overall layout, configuration, and selection of components, and the shape, position, angle, material, and color of each component of the vehicle assembly; and/or the three-dimensional laser scanner is utilized to collect scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences, and geometric dimensional deviations of components on the surface of the vehicle assembly.
3. A detection device according to claim 1 or 2, characterized in that the detection device comprises reading means for reading the number of the vehicle assembly and/or of the conveying unit, from which number the data processing means obtain standard data of the vehicle assembly and compare them with the collected data.
4. A detection device according to claim 3, characterized in that the camera is used as the reading means.
5. The detection apparatus according to claim 1 or 2, characterized in that the three-dimensional laser scanner performs a whole scan or only a partial scan of the vehicle assembly.
6. A detection device according to claim 1 or 2, wherein, when a non-compliant installation error and/or appearance defect is determined by one of the camera and the three-dimensional laser scanner, the data processing device performs a verification using the other.
7. A detection device according to claim 1 or 2, characterized in that the feature points are regions on the surface where the brightness and/or gray level varies strongly.
8. A detection device according to claim 1 or 2, characterized in that the detection device comprises a gripper or a manipulator arranged at the end of the human-machine collaborative robot.
9. The detection apparatus of claim 8, wherein operations on the vehicle assembly are performed using the gripper or manipulator.
10. A detection apparatus according to claim 9, wherein a sensor is provided in the gripper or manipulator for measuring the resistance and/or torque fed back when maneuvering the vehicle assembly.
11. A detection apparatus according to claim 1 or 2, characterized in that the detection apparatus comprises archiving means for storing data about the vehicle assembly.
12. The detection apparatus of claim 11, wherein, in the archiving means, data acquired with the three-dimensional laser scanner are recorded in combination with data acquired with the camera.
13. A detection device according to claim 1 or 2, characterized in that the detection device comprises alarm means for alerting a user when a non-compliant installation error and/or an appearance defect is determined.
14. A detection device according to claim 1 or 2, characterized in that the detection device comprises display means for displaying the detection result and/or the determined non-compliance installation errors and/or appearance defects to the user in an image.
15. A vehicle assembly production system, comprising
A conveying unit for conveying the vehicle assembly; and
the detection apparatus according to one of claims 1 to 14.
CN201910822121.7A 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system Active CN112444283B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910822121.7A CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910822121.7A CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Publications (2)

Publication Number Publication Date
CN112444283A CN112444283A (en) 2021-03-05
CN112444283B true CN112444283B (en) 2023-12-05

Family

ID=74734787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910822121.7A Active CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Country Status (1)

Country Link
CN (1) CN112444283B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016209557B4 (en) * 2016-06-01 2022-03-31 Carl Zeiss Industrielle Messtechnik Gmbh Method for identifying a workpiece, determining a measurement sequence and measuring a workpiece with a measuring device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106338245A (en) * 2016-08-15 2017-01-18 南京工业大学 Workpiece noncontact mobile measurement method
CN107643293A (en) * 2017-08-15 2018-01-30 广东工业大学 The outgoing detector and system of a kind of automotive seat
CN107507274A (en) * 2017-08-30 2017-12-22 北京图航科技有限公司 A kind of quick restoring method of public security criminal-scene three-dimensional live based on cloud computing
CN108332660A (en) * 2017-11-10 2018-07-27 广东康云多维视觉智能科技有限公司 Robot three-dimensional scanning system and scan method
CN107860338A (en) * 2017-12-08 2018-03-30 张宇航 Industrial automation three-dimensional detection system and method
CN108466265A (en) * 2018-03-12 2018-08-31 珠海市俊凯机械科技有限公司 Mechanical arm path planning and operational method, device and computer equipment
CN109493422A (en) * 2018-12-28 2019-03-19 国网新疆电力有限公司信息通信公司 A kind of substation's 3 D model construction method based on three-dimensional laser scanning technique
CN110081821A (en) * 2019-05-09 2019-08-02 西南交通大学 Intelligent high-speed rail white body assembling quality detection device and its method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
谈国新 (Tan Guoxin). 《文化资源与产业文库 民族文化资源数字化与产业化开发》 (Cultural Resources and Industry Series: Digitization and Industrial Development of Ethnic Cultural Resources). 2012, p. 102. *

Also Published As

Publication number Publication date
CN112444283A (en) 2021-03-05

Similar Documents

Publication Publication Date Title
CA2554641C (en) Method for planning an inspection path and for determining areas to be inspected
KR101782542B1 (en) System and method for inspecting painted surface of automobile
US20130057678A1 (en) Inspection system and method of defect detection on specular surfaces
EP2998927B1 (en) Method for detecting the bad positioning and the surface defects of specific components and associated detection device
US10875592B2 (en) Automobile manufacturing plant and method
US20210255117A1 (en) Methods and plants for locating points on complex surfaces
CN104385282B (en) Visual intelligent numerical control system and visual measuring method thereof
EP3388781B1 (en) System and method for detecting defects in specular or semi-specular surfaces by means of photogrammetric projection
US20190080446A1 (en) System and method for automated defect detection
ES2620786T3 (en) Procedure and system for the control of construction parts and / or functional units with an inspection device
JP4862765B2 (en) Surface inspection apparatus and surface inspection method
KR20080075506A (en) System for projecting flaws and inspection locations and associated method
EP1650530B1 (en) Three-dimensional shape measuring method and measuring apparatus thereof
CN111971522B (en) Vehicle detection system
EP3775854B1 (en) System for the detection of defects on a surface of at least a portion of a body and method thereof
JP2007523334A (en) Defect position identification method and marking system
JP6998183B2 (en) Robot system and robot control method
CN112444283B (en) Vehicle assembly detection device and vehicle assembly production system
WO2002069062A2 (en) A system and method for planning a tool path along a contoured surface
US20230115037A1 (en) Ply templating for composite fabrication with ai quality control modules
da Silva Santos et al. 3D scanning method for robotized inspection of industrial sealed parts
JP2012141223A (en) Surface flaw detecting and indicating system and work piece manufacturing method involving surface treatment
US20060280355A1 (en) Method of evaluating and designing sealing elements
US20210387289A1 (en) Method for Testing a Joint
DE102022202571B3 (en) Checking a given path of a robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant