CN112444283A - Detection apparatus for vehicle assembly and vehicle assembly production system - Google Patents


Publication number
CN112444283A
Authority
CN
China
Prior art keywords
robot
vehicle assembly
camera
human
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910822121.7A
Other languages
Chinese (zh)
Other versions
CN112444283B (en)
Inventor
陈卫华
毛瑞杰
马骥
杨洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BMW Brilliance Automotive Ltd
Original Assignee
BMW Brilliance Automotive Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BMW Brilliance Automotive Ltd filed Critical BMW Brilliance Automotive Ltd
Priority to CN201910822121.7A
Publication of CN112444283A
Application granted
Publication of CN112444283B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P23/00 Machines or arrangements of machines for performing specified combinations of different metal-working operations not covered by a single other subclass
    • B23P23/06 Metal-working plant comprising a number of associated machines or apparatus
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The invention relates to a detection apparatus for a vehicle assembly, comprising: a human-machine collaborative robot for moving around the vehicle assembly conveyed by a conveying unit; a camera mounted at the end of the collaborative robot for capturing images of the vehicle assembly; a three-dimensional laser scanner mounted at the end of the collaborative robot for three-dimensionally scanning the surface of the vehicle assembly; a control device for the collaborative robot; and a data processing device for processing the data acquired by the camera and the three-dimensional laser scanner in order to detect mounting errors and/or appearance defects of the vehicle assembly, wherein the scanning position of the three-dimensional laser scanner is selected and/or the movement path of the collaborative robot is planned on the basis of the images acquired by the camera. The invention further relates to a vehicle assembly production system.

Description

Detection apparatus for vehicle assembly and vehicle assembly production system
Technical Field
The present invention relates to a detection apparatus for a vehicle assembly. The invention also relates to a vehicle assembly production system.
Background
At present, quality issues in vehicle assemblies such as instrument panel assemblies, for example whether components are correctly installed and whether the gaps between components meet the standard, are inspected manually on the production line. When the conveying unit delivers a vehicle assembly to the quality inspection station, an inspector visually checks the mounted components and measures each gap and surface offset with a feeler gauge and a surface difference gauge, respectively. However, as the variety of similar components mounted on the instrument panel assembly grows and the line speed keeps increasing, many quality problems cannot be caught immediately by manual inspection. If a problem in the instrument panel assembly is only discovered after it has left the line, repair is difficult, time-consuming and costly. In addition, the inspector must constantly move his or her body, and such repetitive work over long periods causes occupational strain.
Disclosure of Invention
The object of the invention is to provide a detection apparatus for a vehicle assembly that enables automated quality inspection of the vehicle assembly, in particular the simultaneous automated detection of mounting errors and/or appearance defects. The invention furthermore provides a vehicle assembly production system equipped with such a detection apparatus.
One aspect of the invention relates to a detection apparatus for a vehicle assembly, the detection apparatus comprising:
- a human-machine collaborative robot for moving around a vehicle assembly transported by a transport unit;
- a camera mounted at the end of the collaborative robot for capturing images of the vehicle assembly;
- a three-dimensional laser scanner mounted at the end of the collaborative robot for three-dimensionally scanning the surface of the vehicle assembly;
- a control device for the collaborative robot;
- a data processing device for processing the data acquired by the camera and the three-dimensional laser scanner in order to detect mounting errors and/or appearance defects of the vehicle assembly;
wherein the scanning position of the three-dimensional laser scanner is selected and/or the movement path of the collaborative robot is planned on the basis of the images acquired by the camera.
Within the scope of the invention, a vehicle assembly is an assembly for a vehicle consisting of a plurality of components, for example vehicle interior trim assemblies such as the instrument panel assembly, door trim assembly, side-wall trim assembly, headliner assembly, tailgate trim assembly and luggage compartment assembly, or vehicle body surface assemblies such as the front and rear door assemblies, engine compartment assembly and tailgate assembly. These vehicle assemblies are built up from many different components that are joined to one another by plugging, overlapping, welding, bonding, bolting and the like.
The assembly of the above vehicle assemblies must be monitored during vehicle production to ensure that the correct components are installed in the correct manner. The items to be monitored are summarized here as mounting errors and/or appearance defects; they include the overall layout and configuration, the selection of components, the shape, position, angle, material and color of each component, scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness and texture of surfaces, gaps and surface differences between components, and geometric deviations of individual components. Of particular concern are the gaps and surface differences that arise on the surface of the vehicle assembly as a result of the connections described above. Such gaps and surface differences affect the aesthetic appearance, reflect production quality, directly convey the grade of the vehicle, and can even impair the function of the vehicle assembly. Detection of gaps and surface differences is therefore particularly critical.
According to the invention, a human-machine collaborative robot is a robot designed to share a common working space with humans: it takes the close presence of, and even interaction with, humans into account and provides safety protection for people nearby. Compared with an ordinary industrial robot, a collaborative robot offers a small footprint, high movement precision and high personnel safety while still providing motion in multiple degrees of freedom. It is therefore unnecessary to erect protective fencing around the detection apparatus according to the invention or to keep personnel from passing by or even interacting with it.
In the invention, a human-machine collaborative robot is provided, and a camera for acquiring images of the vehicle assembly and a three-dimensional laser scanner for three-dimensionally scanning its surface are arranged at the end of the robot. The camera may be an industrial camera operating in the visible range. The three-dimensional laser scanner comprises a laser source and scanning optics, a light-receiving sensor, a control unit and so on, and may use the time-of-flight or triangulation ranging principle. The invention is not limited to a three-dimensional laser scanner: a three-dimensional structured-light scanner may also be used, since the collaborative robot gives the scanner mobility. When a structured-light scanner is used, the camera itself may even serve as part of the scanner.
Mounting errors and/or appearance defects of the vehicle assembly, such as the overall layout and configuration, the selection, shape, position, angle, material and color of each component, surface scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps and surface differences between components, and geometric deviations, can be detected from the image data collected by the camera and the scanning data of the three-dimensional laser scanner, both mounted at the end of the collaborative robot. Compared with a camera and scanner that are only fixedly installed, the mobility of the collaborative robot permits three-dimensional detection at many positions on the surface of the vehicle assembly, such as the top, bottom, front, back and sides. In particular, vehicle assemblies with irregular surfaces, such as instrument panel assemblies, can be inspected on uneven or curved surfaces without being restricted to a fixed acquisition point. Inspection of surfaces on the inner walls of a cavity is also possible; when inspecting a luggage compartment assembly, for example, the collaborative robot can move its end into the luggage compartment.
Thus, by combining the collaborative robot with the camera and the three-dimensional laser scanner, the invention extends their optical detection range: detection can take place not only at different positions in space, for example shooting and scanning point by point at several positions, but also while the collaborative robot is moving. In particular, detection can be carried out continuously, for example following a curved or non-straight contour, so that an uninterrupted profile along that contour is obtained.
According to the invention, mounting the camera and the three-dimensional laser scanner together at the end of the collaborative robot allows the image and the three-dimensional form of the vehicle assembly, in particular gaps, surface differences and surface defects, to be acquired simultaneously, so that the surface condition can be understood comprehensively in a single pass over the vehicle assembly. In this way both a relatively "macroscopic" inspection of the vehicle assembly is achieved, such as the overall layout and configuration, the selection of components, and the shape, position, angle, material and color of each component, and a relatively "microscopic" inspection, such as scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences and geometric deviations on the surface of the vehicle assembly.
Furthermore, according to the invention, the scanning position of the three-dimensional laser scanner is selected and/or the movement path of the collaborative robot is planned on the basis of the images acquired by the camera. The camera arranged at the end of the collaborative robot is thus not merely an acquisition device but also the basis for visual guidance of the robot and/or the three-dimensional laser scanner: the images acquired by the camera control the measurement, that is, the movement of the collaborative robot and/or the scanning position of the scanner. In the invention, the camera, the three-dimensional laser scanner and the collaborative robot are therefore not independent or simply juxtaposed, but organically combined and cooperating with one another.
With the detection apparatus according to the invention, it can be checked automatically whether the correct components are installed in the vehicle assembly in the correct, specified manner. First, manual inspection can be replaced, saving inspection labor costs, reducing the extra working hours caused by rework and improving production efficiency. Because the inspector no longer uses contact measuring tools such as feeler gauges, accidental damage to the vehicle assembly during measurement is also avoided, as is the repetitive bodily movement of manual inspection and the occupational strain it causes. Second, since the collaborative robot allows the vehicle assembly to be inspected comprehensively, the inspection coverage is increased relative to fixed cameras and scanners, so that any part of the surface can be examined without being limited to a fixed field of view. Third, detection accuracy and efficiency are markedly improved: gap detection with the three-dimensional laser scanner can reach a precision of 0.1 mm, and the collaborative robot achieves fast, accurate positioning movements, shortening the inspection time per vehicle assembly and improving production efficiency.
According to one embodiment of the invention, the camera is used to capture the overall layout and configuration of the vehicle assembly, the selection of components, and their shape, position, angle, material and color; and/or the three-dimensional laser scanner is used to capture scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness, texture, gaps between components, surface differences and geometric deviations on the surface of the vehicle assembly. Both relatively "macroscopic" and relatively "microscopic" inspection of the vehicle assembly can thus be achieved. Specifically, the camera at the end of the collaborative robot can capture still images (photographs) or moving images (video) of the vehicle assembly, from which the overall layout, the configuration, and the selection, model, position, angle, material and color of each component can be recognized. Taking the instrument panel assembly as an example, image recognition can determine whether the overall layout is correct; whether the correct components are installed; whether each component has the correct shape; whether each component is mounted in the specified position; and whether each component has the correct material and color. By comparison with a pre-stored image, it can be established not only whether the correct component appears in the image but also whether it is mounted in the correct way (direction, position, angle and so on). In contrast, detail problems of the surface, such as scratches, cracks, dents, protrusions, warpage, curvature, flatness, smoothness and texture, can be captured by the three-dimensional laser scanner, and the gaps, surface differences and geometric deviations of each component can be obtained with high precision by three-dimensional scanning.
According to one embodiment of the invention, point cloud information comprising three-dimensional coordinates and color information is obtained by the camera using stereo vision measurement; this point cloud can be coupled with the point cloud comprising three-dimensional coordinates and laser reflection intensity obtained by the three-dimensional laser scanner. Coupling here means associating the color information and the laser reflection intensity that share the same three-dimensional coordinates. The result is a point cloud set that can completely reconstruct the surface condition of the vehicle assembly, in which each surface coordinate carries both a laser reflection intensity and a color value. Complete surface information about the vehicle assembly is thus formed, enabling the reconstruction of a colored three-dimensional model. For the stereo vision measurement it is preferable to use images recorded by a monocular camera at different positions, exploiting the mobility of the collaborative robot and/or the movement of the vehicle assembly by the transport unit. Alternatively, a binocular camera may be used.
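Purely as an illustration (not part of the patent disclosure), the coupling step described above could be sketched as follows; the coordinate values, the 0.1 mm matching grid and all names are invented for the example:

```python
def quantize(xyz, step=0.1):
    """Snap a 3D coordinate (mm) to a grid so nearly coincident points match."""
    return tuple(round(c / step) for c in xyz)

def couple_point_clouds(color_points, intensity_points, step=0.1):
    """color_points: [(x, y, z, (r, g, b))] from the camera;
    intensity_points: [(x, y, z, intensity)] from the laser scanner.
    Returns fused points (x, y, z, rgb, intensity) sharing the same coordinates."""
    color_map = {quantize((x, y, z), step): rgb for x, y, z, rgb in color_points}
    fused = []
    for x, y, z, intensity in intensity_points:
        key = quantize((x, y, z), step)
        if key in color_map:
            fused.append((x, y, z, color_map[key], intensity))
    return fused

# Two camera points and two scanner points; only one pair coincides in space.
camera_cloud = [(10.0, 5.0, 2.0, (200, 180, 160)), (10.1, 5.0, 2.0, (90, 90, 90))]
scanner_cloud = [(10.02, 5.01, 2.0, 0.87), (12.0, 5.0, 2.0, 0.40)]
fused = couple_point_clouds(camera_cloud, scanner_cloud)
```

A production system would use a nearest-neighbor search (e.g. a k-d tree) instead of grid quantization, but the association principle is the same.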
According to one embodiment of the invention, the detection apparatus comprises a reading device for reading the number of the vehicle assembly and/or of the transport unit; the data processing device retrieves the standard data of the vehicle assembly from this number and compares it with the acquired data. Knowing the number of the vehicle assembly and/or the transport unit also facilitates archiving the test data for later traceability. This embodiment is particularly advantageous in mixed-line production, where vehicle assemblies produced on the same line may belong to different vehicle models and/or different configurations of the same model. For example, instrument panel assemblies for passenger cars and off-road vehicles may be produced one after the other, even interleaved, on the same line, and instrument panel assemblies for the same model may differ in small details, such as a chrome trim strip at certain seams. Manual inspection may confuse these configurations or overlook such details entirely. According to this embodiment, the standard data of the corresponding vehicle assembly, such as a standard image, standard dimensions and prescribed tolerances, can easily be retrieved by reading the number. The data processing device compares the standard data with the actually acquired data and thus obtains the deviation from the standard, for example differences in the position, shape or color of components in the image, or whether the gap at a seam is within specification. The reading device may comprise a scanner, a number reader, an optical sensor with number recognition, an RFID tag reader or the like.
In particular, the reading device reads the number of the vehicle assembly and/or the transport unit by image recognition and/or wireless communication. Preferably, the camera provided at the end of the collaborative robot is itself used as the reading device, which simplifies the construction of the detection apparatus according to the invention.
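As an illustrative sketch of the lookup-and-compare step (all assembly numbers, field names and tolerance values below are invented, not from the patent):

```python
# Hypothetical standard-data table, keyed by the assembly number that the
# reading device returns. Real systems would query a production database.
STANDARD_DATA = {
    "IP-4711": {"gap_mm": 1.5, "tolerance_mm": 0.3, "chrome_trim": True},
    "IP-4712": {"gap_mm": 1.5, "tolerance_mm": 0.3, "chrome_trim": False},
}

def check_gap(assembly_number, measured_gap_mm):
    """Compare a measured seam gap with the prescribed nominal and tolerance."""
    spec = STANDARD_DATA[assembly_number]
    deviation = abs(measured_gap_mm - spec["gap_mm"])
    return deviation <= spec["tolerance_mm"]

ok = check_gap("IP-4711", 1.7)   # 0.2 mm off nominal, inside the 0.3 mm band
bad = check_gap("IP-4712", 2.0)  # 0.5 mm off nominal, out of tolerance
```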
According to one embodiment of the invention, the three-dimensional laser scanner scans the vehicle assembly either completely or only partially. A complete scan yields comprehensive and detailed detection information; to save detection time and reduce data throughput and memory, however, it is also possible to spot-check only some positions of the vehicle assembly.
According to one embodiment of the invention, the detection is carried out with the transport unit stopped: a stop-position deviation of the vehicle assembly is detected by the camera, and the control device compensates the movement of the collaborative robot on the basis of this deviation. For example, the deviation between the position at which the transport unit parks the vehicle assembly and the standard detection position is measured by recognizing the image captured by the camera. If the deviation is zero, the movement and positioning of the collaborative robot during the measurement follow the preset program and reference positions; if it is not zero, the deviation is added to the preset trajectory and reference positions of the robot, adjusting the travel path and measuring points in time so that the correct position is measured.
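A minimal sketch of this compensation, assuming for simplicity a planar offset and illustrative coordinates (the real compensation acts on the full robot trajectory):

```python
def compensate_path(waypoints, standard_stop, detected_stop):
    """Shift every preplanned waypoint by the measured stop-position deviation.
    waypoints: [(x, y)] planned for the standard stop position (mm)."""
    dx = detected_stop[0] - standard_stop[0]
    dy = detected_stop[1] - standard_stop[1]
    if dx == 0 and dy == 0:
        return list(waypoints)  # zero deviation: run the preset program as-is
    return [(x + dx, y + dy) for x, y in waypoints]

path = [(100.0, 50.0), (120.0, 50.0)]
# assembly parked 2.5 mm too far in x and 1.0 mm short in y
shifted = compensate_path(path, standard_stop=(0.0, 0.0), detected_stop=(2.5, -1.0))
```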
According to one embodiment of the invention, the detection takes place while the transport unit is moving: the movement of the vehicle assembly is detected by the camera, the control device makes the end of the collaborative robot follow the detected movement, and/or the data processing device takes the detected movement into account during data processing. The camera allows real-time position detection of the vehicle assembly moving with the transport unit. The collaborative robot can then track this movement so that its end either moves synchronously with the vehicle assembly, or superimposes the movement on, or deducts it from, the detection motion it would perform with the vehicle assembly stationary. Superimposing means that the motion executed by the robot end contains both the detection motion component and the component of the vehicle assembly's movement with the transport unit, so the robot end travels farther than it would with the assembly stationary. Deducting means, for example, that a component of the detection motion in one direction can be omitted because the vehicle assembly is already moving uniformly in that direction. Furthermore, the data processing device takes the detected motion into account in the data processing, in particular compensating for detected movements, vibrations, shaking and the like.
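The superposition case can be sketched as follows, under the simplifying (and purely illustrative) assumptions of a uniform conveyor velocity and equally spaced waypoint times:

```python
def superimpose(static_waypoints, conveyor_velocity, dt):
    """static_waypoints: [(x, y)] end positions planned for a stationary assembly.
    Adds the conveyor displacement accumulated up to each step, so the robot
    end tracks the assembly while still executing the detection motion."""
    moving = []
    for k, (x, y) in enumerate(static_waypoints):
        t = k * dt  # time at which this waypoint is reached
        moving.append((x + conveyor_velocity[0] * t, y + conveyor_velocity[1] * t))
    return moving

plan = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]       # detection motion, assembly at rest
tracked = superimpose(plan, conveyor_velocity=(5.0, 0.0), dt=1.0)
```

The "deduction" case is the mirror image: a planned motion component parallel to the conveyor direction is reduced by the same displacement instead of increased.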
According to one embodiment of the invention, when a non-compliant mounting error and/or appearance defect is determined with one of the camera and the three-dimensional laser scanner, the data processing device double-checks it with the other. In this way the same problem is verified in two independent sets of detection data, reducing detection errors and improving accuracy.
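One very simple way to express this double-check rule (the defect labels are invented for the example) is to report only findings confirmed by both sensors:

```python
def confirmed_defects(camera_flags, scanner_flags):
    """Return only the defects flagged by both the camera and the 3D scanner,
    so that a single-sensor false positive does not trigger an alarm."""
    return sorted(set(camera_flags) & set(scanner_flags))

defects = confirmed_defects({"scratch_A", "gap_B"}, {"gap_B", "dent_C"})
```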
According to one embodiment of the invention, the camera at the end of the collaborative robot acquires images of the vehicle assembly at at least two different positions, and the data processing device reconstructs a three-dimensional model of the vehicle assembly from these images and the displacement between the positions. The stereo vision measurement method is likewise used here; a coarse three-dimensional model of the vehicle assembly to be inspected can thus be built from the existing monocular camera alone. In this way as many detection tasks as possible are performed with the components the detection apparatus already has.
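At its core, depth from two views with a known robot displacement follows the classical pinhole relation Z = f * B / d (parallel views assumed); the focal length, baseline and pixel values below are invented for illustration:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a feature from two parallel views separated by baseline_mm.
    disparity_px: horizontal shift of the feature between the two images."""
    if disparity_px <= 0:
        raise ValueError("point not triangulable")
    return focal_px * baseline_mm / disparity_px

# feature seen at x = 640 px in view 1 and x = 590 px after a 100 mm robot move
z = depth_from_disparity(focal_px=1000.0, baseline_mm=100.0, disparity_px=640.0 - 590.0)
```

Real reconstruction additionally requires camera calibration and image rectification, which libraries such as OpenCV provide; the sketch only shows the geometric principle.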
According to one embodiment of the invention, predefined features are identified in the established three-dimensional model of the vehicle assembly, and the scanning position of the three-dimensional laser scanner is selected and/or the movement path to be executed by the collaborative robot for the scan is planned accordingly. In particular, such features are locations on the surface where the brightness and/or gray value changes sharply. Such locations typically represent edges, gaps, steps, trim strips and the like, and it is precisely at these locations that gaps, surface differences and so on need to be measured. The detection apparatus according to the invention can therefore find the positions to be inspected by itself, without user specification, making the detection intelligent.
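The feature selection can be illustrated on a one-dimensional brightness profile: indices where the brightness jumps sharply are treated as candidate edges or gaps for the scanner. The profile and threshold are invented for the example:

```python
def scan_candidates(brightness, threshold=50):
    """brightness: 1D list of pixel intensities sampled along a line on the model.
    Returns indices where the jump to the next pixel exceeds the threshold,
    i.e. likely edges, gaps or steps worth scanning with the laser."""
    return [i for i in range(len(brightness) - 1)
            if abs(brightness[i + 1] - brightness[i]) > threshold]

profile = [200, 198, 199, 60, 58, 59, 200, 201]  # dark gap between bright panels
edges = scan_candidates(profile)  # the two flanks of the gap
```

On a real model the same idea is applied in two dimensions with a gradient operator (e.g. Sobel), but the selection criterion is identical: large brightness/gray-value change.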
According to one embodiment of the invention, the detection apparatus comprises a gripper or manipulator arranged at the end of the collaborative robot. The detection apparatus then performs not only visual inspection but also functional testing of moving parts. The gripper or manipulator is preferably used to actuate vehicle components, for example pressing buttons, turning knobs, opening and closing the glove box or doors, folding sun visors, or toggling ventilation grilles. Particularly preferably, sensors are provided in the gripper or manipulator to measure the resistance and/or torque fed back while the vehicle component is actuated, so that its motion function can be evaluated quantitatively. The gripper or manipulator can be integrated with the camera and the three-dimensional laser scanner at the robot end in various ways: for example on opposite sides of a rotatable member at the end, or mounted so that it can be pivoted or stowed away, so that it does not interfere with the detection by the camera and the scanner.
According to one embodiment of the invention, the detection apparatus comprises an archiving device for storing the data relating to the vehicle assembly. Archiving establishes a production file for each vehicle assembly, facilitating future quality queries and traceability.
According to one embodiment of the invention, the inspection device comprises an alarm device for alerting a user, in particular emitting an optical and/or acoustic signal indicating a disabled state, when an out-of-compliance mounting error and/or an appearance defect is determined. The detection device according to the invention is preferably arranged after the respective vehicle assembly step, in particular in the immediate vicinity of the vehicle assembly site. Thus, the check can be performed immediately after the assembly is completed, so that the assembly can be immediately known to be correct or not. When the wrong component is mounted or not mounted as intended, the alarm device of the detection device immediately feeds back to the mounting person by means of light and/or sound, so that the mounting person can immediately correct the error, for example, replace the correct component or adjust the component. Here, the visual signal may be given by lighting on and off of the light, the color of the light, the dynamic effect of the light, and the like; while the auditory signal may be given by a cue tone, speech, music, etc. The emission of an audible signal is preferred, since the assembler does not have to spend time observing the corresponding light but only has to focus on the next assembly step. According to this embodiment, only vehicle assemblies in which the correct components are installed in a correct manner in conformity with the regulations can be released and assembly can then be continued along the production line. Therefore, the qualification rate can be ensured with extremely high efficiency, the quality of the vehicle assembly is improved, and the working hours consumed by reworking and maintaining are reduced. In particular, the inspection device comprises display means for displaying the inspection result and/or the determined out-of-compliance mounting errors and/or appearance defects with an image to a user. 
The display device displays at least one of: the reference image, the difference between the captured image and the reference image, and the correct component and vehicle configuration. In particular, the determined non-compliant mounting errors and/or appearance defects are marked on the display device in the image captured by the camera. An assembler can thus see at a glance where the assembly problem lies, which makes it easy to replace, rework or refit the component.
According to a further aspect, the invention also relates to a vehicle assembly production system comprising a transport unit for transporting a vehicle assembly and a detection device according to the invention. It is to be noted that, for the various embodiments and technical effects of the vehicle assembly production system according to the invention, reference can be made to the above description of the detection device. The vehicle assembly production system of the invention enables automated inspection, efficient production, reduced error rates, improved vehicle assembly pass rates, and reduced labour and time costs.
Drawings
FIG. 1 shows a schematic view of a detection apparatus for a vehicle assembly and a detected vehicle assembly according to the present invention;
FIG. 2 shows a perspective view of a detection apparatus and a vehicle assembly being detected in accordance with the present invention;
FIG. 3 shows a schematic view of one embodiment of the end of the human-machine-cooperation robot of the detection apparatus according to the present invention;
FIG. 4 shows a schematic view of a further embodiment of the end of the human-machine-cooperation robot of the detection apparatus according to the present invention;
FIG. 5 shows the detection path of the detection device according to the invention during partial scanning of the instrument panel assembly.
Detailed Description
Fig. 1 shows a schematic view of a detection device for a vehicle assembly and of a vehicle assembly being inspected according to the invention. The detection of a vehicle assembly is illustrated in the figures using the example of an instrument panel assembly 1 for a vehicle. The instrument panel assembly 1 is carried by a transport unit 2 configured as a hanger. The individual instrument panel assemblies 1 are moved in succession by the transport unit 2 along a steel beam in the transport direction 9 past the inspection device according to the invention. The transport unit 2 has a motor 3 for driving its movement.
The detection device according to the invention comprises:
a human-machine-cooperation robot 4 for moving around the dashboard assembly 1 transported by the transport unit 2;
a camera 5 mounted at the end of the human-machine-collaboration robot 4 for acquiring images of the dashboard assembly 1;
a three-dimensional laser scanner 6 mounted at the end of the human-machine-cooperation robot 4 for three-dimensional scanning of the surface of the dashboard assembly 1;
control means 7 for the human-machine-collaboration robot 4;
a data processing device 8 for processing the data acquired by the camera 5 and the three-dimensional laser scanner 6 for detecting mounting errors and/or cosmetic defects of the dashboard assembly 1;
according to the invention, the scanning position of the three-dimensional laser scanner 6 is selected and/or the movement path of the human-machine-cooperation robot 4 is planned on the basis of the images acquired by the camera 5.
The camera 5 is used to capture the overall layout, configuration, variant selection, styling, position, angle, material and colour of the individual parts of the vehicle assembly; and/or the three-dimensional laser scanner 6 is used to capture scratches, cracks, depressions, protrusions, warping, curvature, flatness, smoothness, texture, gaps between parts, flushness differences and geometric deviations of parts on the surface of the vehicle assembly. In particular, point cloud information comprising three-dimensional coordinates (e.g. xyz) and colour information (e.g. RGB) can be obtained by the camera 5 using stereo vision measurement; this can be coupled with the point cloud information comprising three-dimensional coordinates (e.g. xyz) and laser reflection intensity (e.g. I) obtained by the three-dimensional laser scanner 6. The result is a point cloud set that completely reproduces the surface condition of the instrument panel assembly 1, in which the corresponding laser reflection intensity and colour information are available for the three-dimensional coordinates of every surface point.
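The coupling of the two point clouds described above can be sketched as follows. This is a minimal illustration, not part of the patent: the function name, the brute-force nearest-neighbour search and the 5 mm matching radius are all assumptions made for the example; at production scale a KD-tree would replace the linear search.

```python
import numpy as np

def couple_point_clouds(cam_pts, cam_rgb, scan_pts, scan_intensity, max_dist=0.005):
    """Merge a camera cloud (xyz + RGB) with a scanner cloud (xyz + intensity).

    For every scanner point, the nearest camera point within max_dist (metres)
    donates its colour, yielding rows of [x, y, z, r, g, b, intensity].
    Scanner points without a close colour match are dropped.
    """
    fused = []
    for p, i in zip(scan_pts, scan_intensity):
        d = np.linalg.norm(cam_pts - p, axis=1)   # distances to all camera points
        j = int(np.argmin(d))                     # nearest camera point
        if d[j] <= max_dist:
            fused.append(np.concatenate([p, cam_rgb[j], [i]]))
    return np.array(fused)
```

Each fused row then carries, for one surface coordinate, both the laser reflection intensity and the colour information, as the description requires.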
In addition, since the camera 5 and the three-dimensional laser scanner 6 are arranged on the same end, a double check can also be realized: when one of the camera 5 and the three-dimensional laser scanner 6 determines a non-compliant mounting error and/or appearance defect, the data processing device 8 verifies it with the other.
In fig. 1, each transport unit 2 is marked with its own number, as indicated at 001 and 125 in the figure. The detection device according to the invention may be provided with reading means for reading such a number, on the basis of which the data processing means 8 acquire the standard data of the instrument panel assembly 1 and compare it with the acquired data. Particularly advantageously, a camera 5 is used as the reading device.
Fig. 2 shows a perspective view of the detection apparatus and of a vehicle assembly to be inspected according to the invention. In this figure, the end 10 of the human-machine-cooperation robot 4 is only shown schematically, and the camera 5 and the three-dimensional laser scanner 6 provided on the end 10 are omitted for reasons of clarity.
The vehicle assembly, here again the instrument panel assembly 1 by way of example, is transported along the steel beam by the schematically shown transport unit 2. In the figure, the instrument panel assembly 1 has reached the inspection station.
After the instrument panel assembly 1 has reached the inspection station, the number of the instrument panel assembly and/or of the transport unit can first be read, and the data processing device 8 retrieves the standard data, i.e. the standard image, for this instrument panel assembly on the basis of the number. The camera 5 can then take a full view of the instrument panel assembly 1 from a predetermined position. The data processing device 8 includes an image processing device for performing the image comparison, for example a computer, programmable logic circuitry, a field-programmable gate array (FPGA) or a programmable logic array (PLA) on which a corresponding program or instructions are installed. In the present invention, the image processing device may compare each part of the captured image with the standard image; the result of the comparison may be that the captured image matches or does not match the standard image. Advantageously, only features are compared, these features comprising: icons, characters, bar codes, numbers, lines and corners of the vehicle assembly; the size, shape, contour, position and orientation of a component; and the distance, angle and connection of a component relative to the rest of the vehicle. In this way, targeted testing can be achieved by extracting only these simple features from the captured image. For example, when inspecting the dashboard, differently configured dashboards may carry different patterns, numbers and/or characters, so that by comparing these features it can be verified whether the dashboard matching the vehicle configuration has been fitted, and whether it has been fitted in the correct manner. Particularly preferably, the captured images are converted to greyscale before the comparison.
Greyscale processing of the captured image is particularly advantageous here, since the data volume of a greyscale image is greatly reduced compared with a colour image, which lowers the demands on the computing performance of the comparison device. In addition, greyscale images can be processed uniformly, which prevents deviations caused by changes in the lighting environment. When comparing greyscale images, the features to be compared, in particular the component features mentioned above, can be extracted simply by selecting the pixels whose grey values lie within a threshold range. Likewise, corresponding features can be extracted from grey-value gradients within the image. Greyscale processing of the captured image therefore simplifies the computation, reduces the computing-performance requirement, and improves both the verification speed and the verification accuracy.
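The threshold-based feature extraction and comparison described above can be sketched roughly as follows. The function names, the grey-value band and the 2 % mismatch tolerance are illustrative assumptions of the example, not values taken from the patent.

```python
import numpy as np

def extract_features(gray, lo=200, hi=255):
    """Return the pixel coordinates whose grey value lies in [lo, hi].

    Bright icons, characters and edge lines typically fall in a narrow
    grey band, so thresholding alone isolates them cheaply.
    """
    ys, xs = np.nonzero((gray >= lo) & (gray <= hi))
    return set(zip(xs.tolist(), ys.tolist()))

def compare_to_reference(gray, ref_gray, tolerance=0.02):
    """Crude check: the fraction of feature pixels that differ between the
    captured image and the standard image must stay below `tolerance`."""
    a, b = extract_features(gray), extract_features(ref_gray)
    diff = len(a ^ b) / max(len(a | b), 1)   # symmetric difference over union
    return diff <= tolerance
```

In practice the comparison would be restricted to registered regions of interest (icon fields, character areas), but the thresholding principle is the same.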
It should be noted that the transport unit 2 may stop the instrument panel assembly 1 at the inspection station while the inspection is being performed, or may let it pass slowly through the inspection station.
If the transport unit stops, the deviation of the stop position of the instrument panel assembly 1 from the standard detection position is detected by the camera 5, and the control device 7 for the human-machine-cooperation robot 4 compensates the movements performed by the robot during the inspection according to this stop-position deviation.
The inspection can, however, also be performed while the transport unit 2 is moving. In that case the camera 5 captures the movement of the instrument panel assembly 1, the control device 7 causes the end 10 of the human-machine-cooperation robot 4 to track the detected movement, and/or the data processing device 8 takes the detected movement into account in the data processing. For example, the control device 7 moves the end 10 synchronously with the instrument panel assembly 1, or superimposes the conveyor movement on (or subtracts it from) the inspection movement that the end 10 would perform if the vehicle assembly were stationary. Furthermore, the data processing device 8 can take the detected movement into account in the data processing, in particular by compensating for movements, vibrations and jolts of the instrument panel assembly 1 detected by the camera 5.
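The superposition of the conveyor movement on the statically planned inspection movement can be illustrated by the following sketch. The waypoint representation, the constant-velocity assumption and all names are simplifications introduced for the example.

```python
import numpy as np

def compensated_waypoints(static_waypoints, assembly_velocity, dwell_time):
    """Shift each statically planned waypoint by the distance the assembly
    travels on the conveyor before the robot end reaches that waypoint.

    static_waypoints : (N, 3) positions planned for a stationary assembly
    assembly_velocity: (3,) velocity of the assembly measured by the camera, m/s
    dwell_time       : seconds the end spends per waypoint (assumed constant)
    Returns the (N, 3) waypoints corrected for the conveyor motion.
    """
    wp = np.asarray(static_waypoints, dtype=float)
    v = np.asarray(assembly_velocity, dtype=float)
    t = dwell_time * np.arange(len(wp))     # arrival time at each waypoint
    return wp + t[:, None] * v              # superimpose the conveyor motion
```

A real controller would track the measured motion continuously rather than assume a constant velocity, but the superposition principle is the one stated above.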
Fig. 3 shows a schematic view of an embodiment of the end 10 of the human-machine-cooperation robot 4 of the detection device according to the invention. A camera 5 and a three-dimensional laser scanner 6 arranged side by side are shown schematically on the movable end 10. Here the camera 5 is configured as an industrial camera, and the three-dimensional laser scanner 6 comprises a laser light source, a scanning unit, a light-receiving sensor, a control unit and the like, and can use the time-of-flight or triangulation principle for distance measurement. The camera 5 and the three-dimensional laser scanner 6 are not limited to the left-right side-by-side arrangement shown in the figure; an arrangement one above the other is also conceivable, or the components of the three-dimensional laser scanner 6 can be arranged separately, for example with the laser light source on one side of the camera 5 and the light-receiving sensor on the other side.
Fig. 4 shows a schematic view of a further embodiment of the end 10 of the human-machine-cooperation robot 4 of the detection device according to the invention. Fig. 4 differs from fig. 3 in that the detection device comprises a schematically shown gripper 11 arranged at the end 10 of the human-machine-cooperation robot 4. The gripper can be used to grip objects and perform similar actions. The gripper 11 is integrated with the camera 5 and the laser scanner 6 on the same component, which can rotate about an axis 12. Through this rotation the gripper 11 can be swung out of its stowed position and then forms the end effector of the human-machine-cooperation robot 4. The gripper 11 is used to perform manipulations on the vehicle assembly. In particular, sensors (not shown) are provided in the gripper 11 for measuring the resistance and/or torque fed back when the vehicle assembly is manipulated.
During inspection, the three-dimensional laser scanner 6 can scan the vehicle assembly as a whole. In this case the control device 7 of the human-machine-cooperation robot 4 controls the end 10 so that it sweeps over the entire surface of the instrument panel assembly 1 in a back-and-forth raster pattern. The three-dimensional information of the entire surface of the instrument panel assembly 1 can thereby be captured comprehensively.
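The back-and-forth sweep of the end over the whole surface can be generated, for example, by a simple boustrophedon grid. The grid representation below is an illustrative simplification; a real planner would work in the robot's task space with collision constraints.

```python
def raster_path(x_count, y_count):
    """Generate a back-and-forth (boustrophedon) sweep over an x_count by
    y_count grid of scan cells: left-to-right on even rows, right-to-left
    on odd rows, so the end never jumps back across the surface."""
    path = []
    for row in range(y_count):
        cols = range(x_count) if row % 2 == 0 else range(x_count - 1, -1, -1)
        path.extend((col, row) for col in cols)
    return path
```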
Fig. 5 shows a detection path of the detection device according to the invention when partially scanning the instrument panel assembly 1. Here, for greater efficiency, the detection device scans local gaps with the three-dimensional laser scanner 6 only at individual areas of the instrument panel assembly 1. The positions to be scanned are marked with circles in fig. 5. In addition, the position of the glove box switch, which is to be opened by the gripper 11, is indicated by a square mark. During inspection, the human-machine-cooperation robot 4 stops its end 10 over each of these marked positions and scans the gap in question with the three-dimensional laser scanner 6 or actuates the switch with the gripper 11. After one position has been scanned, the robot moves the end 10 along the path indicated by the arrows in fig. 5 to the next position and continues scanning.
It should be noted in particular that, thanks to the mobility of the human-machine-cooperation robot 4, stereoscopic measurement can be achieved even with only one camera 5: the robot has the camera 5 at its end 10 capture images of the instrument panel assembly 1 at at least two different positions, and the data processing device 8 reconstructs a three-dimensional model of the instrument panel assembly 1 from these images and from the displacement between the at least two positions. Predefined features are preferably determined in the three-dimensional model of the vehicle assembly built in this way. Such a feature may be a region of the surface in which the brightness and/or grey value changes strongly. The circles marked in fig. 5 represent features derived from the three-dimensional model. Once the features to be inspected have been determined, the scanning positions of the three-dimensional laser scanner 6 can be selected, and the movement path that the human-machine-cooperation robot 4 has to execute for the scan, such as the path indicated by the arrows in fig. 5, can be planned.
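The stereoscopic measurement with a single moved camera rests on classic two-view triangulation: the robot arm provides an exactly known baseline between the two shots. A minimal sketch for one matched feature follows; the pinhole model and the parameter names are assumptions of the example, not details from the patent.

```python
def depth_from_displacement(focal_px, baseline_m, disparity_px):
    """Two-view triangulation with a known baseline.

    focal_px     : camera focal length in pixels
    baseline_m   : displacement of the robot end between the two shots, metres
    disparity_px : horizontal shift of the matched feature between the images
    Returns the depth of the feature in metres: depth = f * B / d.
    """
    if disparity_px <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_px * baseline_m / disparity_px
```

A full reconstruction matches many such features across the two images and triangulates each of them, yielding the three-dimensional model from which the scan positions are then selected.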
Although not shown in the above figures, the detection device according to the invention may also comprise an archive device, an alarm device and/or a display device. The archive device serves to store data relating to the vehicle assembly; in particular, the data acquired with the three-dimensional laser scanner 6 are recorded in the archive device in combination with the data acquired with the camera 5. The alarm device may be used to alert a user when a non-compliant mounting error and/or appearance defect is determined. The display device may be used to graphically display the detection results and/or the determined non-compliant mounting errors and/or appearance defects to a user.
The invention is not limited to the embodiments shown but extends to all technical equivalents that fall within the scope and spirit of the appended claims. The positional terms used in the description, such as upper, lower, left and right, refer to the description and the drawings as illustrated and can be transferred to new positions in the event of a change of position.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to limit the present invention, and those skilled in the art can make variations and modifications of the present invention without departing from the spirit and scope of the present invention by using the methods and technical contents disclosed above.
The features disclosed in the present document can be implemented both individually and in any combination. It is further noted that the various drawings of the invention are schematic and may not be shown to scale. The number, configuration and/or arrangement of the components in the various embodiments is also not limited to the examples shown. The values listed in the description are reference values only, which may be exceeded or fallen below when the dimensions are suitably selected.

Claims (20)

1. A detection apparatus for a vehicle assembly, the detection apparatus comprising:
-a human-machine-cooperative robot for moving around a vehicle assembly transported by the transport unit;
-a camera mounted at the end of the human-machine-cooperative robot for capturing images of the vehicle assembly;
-a three-dimensional laser scanner mounted at the end of the human-machine-cooperative-robot for three-dimensional scanning of the surface of the vehicle assembly;
-control means for a human-machine-cooperative robot;
-data processing means for processing data acquired by the camera and the three-dimensional laser scanner for detecting mounting errors and/or appearance defects of the vehicle assembly;
and selecting the scanning position of the three-dimensional laser scanner and/or planning the movement path of the human-machine-cooperative robot according to the image acquired by the camera.
2. The inspection apparatus according to claim 1, wherein the camera is used to capture the overall layout, configuration, selection of parts, shape, position, angle, material, color of the vehicle assembly; and/or collecting scratches, cracks, depressions, protrusions, warps, radians, flatness, smoothness, texture, gaps between parts, surface differences, geometric dimensional deviations of parts on the surface of the vehicle assembly using the three-dimensional laser scanner.
3. Detection device according to one of the preceding claims, characterized in that point cloud information comprising three-dimensional coordinates and color information is obtained by the camera using stereo vision measurements, which point cloud information can be coupled with point cloud information comprising three-dimensional coordinates and laser reflection intensity obtained by a three-dimensional laser scanner.
4. Detection device according to one of the preceding claims, characterized in that the detection device comprises reading means for reading the number of the vehicle assembly and/or of the transport unit, the data processing means retrieving the standard data of the vehicle assembly on the basis of the number and comparing them with the acquired data.
5. A testing device according to claim 4, characterized in that said camera is used as said reading means.
6. Detection device according to one of the preceding claims, characterized in that the three-dimensional laser scanner scans the vehicle assembly in its entirety or only partially.
7. Detection device according to one of the preceding claims, characterized in that the detection is carried out with the transport unit stopped, a stop-position deviation of the vehicle assembly is detected by means of the camera, and the control means for the human-machine-cooperative robot compensate the movement of the human-machine-cooperative robot on the basis of the stop-position deviation.
8. A detection apparatus according to one of claims 1 to 6, characterized in that the detection is performed while the transport unit is moving, the movement of the vehicle assembly is captured by means of the camera, the control means for the human-machine-cooperative robot cause the tip of the human-machine-cooperative robot to track the detected movement, and/or the data processing means take the detected movement into account in the data processing.
9. Inspection device according to one of the preceding claims, characterized in that, when a non-compliant mounting error and/or an appearance defect is determined by one of the camera and the three-dimensional laser scanner, the data processing means verify it with the other.
10. Detection apparatus according to any one of the preceding claims, characterized in that the human-machine-cooperative robot has a camera at its tip to capture images of the vehicle assembly at at least two different positions, the data processing means reconstructing a three-dimensional model of the vehicle assembly from the images and the displacement between said at least two different positions.
11. The inspection apparatus of claim 10, wherein predefined features are determined in the established three-dimensional model of the vehicle assembly, thereby selecting a scanning position of the three-dimensional laser scanner and/or planning a path of movement to be performed by the human-machine-cooperative robot for performing the scanning.
12. The inspection apparatus of claim 11, wherein the feature is a region on the surface where brightness and/or gray scale changes strongly.
13. A testing device according to any one of the preceding claims, wherein said testing device comprises a gripper or manipulator arranged at the tip of the human-machine-cooperative robot.
14. A testing device according to claim 13, wherein the vehicle assembly is manipulated by means of said gripper or manipulator.
15. A testing device according to claim 14, wherein sensors are provided in the gripper or manipulator for measuring resistance and/or torque fed back when manipulating the vehicle assembly.
16. A testing device according to any one of the preceding claims, wherein said testing device comprises archiving means for storing data relating to the vehicle assembly.
17. A testing device according to claim 16, characterized in that the data acquired with the three-dimensional laser scanner are recorded in combination with the data acquired with the camera in the archiving means.
18. A testing device according to any one of the preceding claims, wherein said testing device comprises warning means for warning a user when an out-of-compliance mounting error and/or appearance defect is determined.
19. Inspection apparatus according to any preceding claim, characterised in that the inspection apparatus includes display means for displaying the inspection results and/or the determined out-of-compliance mounting errors and/or cosmetic defects graphically to a user.
20. A vehicle assembly production system, comprising
A transport unit for transporting the vehicle assembly; and
a test device according to any one of claims 1 to 19.
CN201910822121.7A 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system Active CN112444283B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910822121.7A CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910822121.7A CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Publications (2)

Publication Number Publication Date
CN112444283A true CN112444283A (en) 2021-03-05
CN112444283B CN112444283B (en) 2023-12-05

Family

ID=74734787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910822121.7A Active CN112444283B (en) 2019-09-02 2019-09-02 Vehicle assembly detection device and vehicle assembly production system

Country Status (1)

Country Link
CN (1) CN112444283B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114427836A (en) * 2022-02-10 2022-05-03 上汽通用五菱汽车股份有限公司 Method for controlling dimensional precision of vehicle body process

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106338245A (en) * 2016-08-15 2017-01-18 南京工业大学 Workpiece noncontact mobile measurement method
CN107507274A (en) * 2017-08-30 2017-12-22 北京图航科技有限公司 A kind of quick restoring method of public security criminal-scene three-dimensional live based on cloud computing
CN107643293A (en) * 2017-08-15 2018-01-30 广东工业大学 The outgoing detector and system of a kind of automotive seat
CN107860338A (en) * 2017-12-08 2018-03-30 张宇航 Industrial automation three-dimensional detection system and method
CN108332660A (en) * 2017-11-10 2018-07-27 广东康云多维视觉智能科技有限公司 Robot three-dimensional scanning system and scan method
CN108466265A (en) * 2018-03-12 2018-08-31 珠海市俊凯机械科技有限公司 Mechanical arm path planning and operational method, device and computer equipment
CN109493422A (en) * 2018-12-28 2019-03-19 国网新疆电力有限公司信息通信公司 A kind of substation's 3 D model construction method based on three-dimensional laser scanning technique
US20190101889A1 (en) * 2016-06-01 2019-04-04 Carl Zeiss Industrielle Messtechnik Gmbh Method for identifying a workpiece, determining a measurement sequence, and measuring a workpiece with a measurement device
CN110081821A (en) * 2019-05-09 2019-08-02 西南交通大学 Intelligent high-speed rail white body assembling quality detection device and its method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190101889A1 (en) * 2016-06-01 2019-04-04 Carl Zeiss Industrielle Messtechnik Gmbh Method for identifying a workpiece, determining a measurement sequence, and measuring a workpiece with a measurement device
CN106338245A (en) * 2016-08-15 2017-01-18 南京工业大学 Workpiece noncontact mobile measurement method
CN107643293A (en) * 2017-08-15 2018-01-30 广东工业大学 The outgoing detector and system of a kind of automotive seat
CN107507274A (en) * 2017-08-30 2017-12-22 北京图航科技有限公司 A kind of quick restoring method of public security criminal-scene three-dimensional live based on cloud computing
CN108332660A (en) * 2017-11-10 2018-07-27 广东康云多维视觉智能科技有限公司 Robot three-dimensional scanning system and scan method
CN107860338A (en) * 2017-12-08 2018-03-30 张宇航 Industrial automation three-dimensional detection system and method
CN108466265A (en) * 2018-03-12 2018-08-31 珠海市俊凯机械科技有限公司 Mechanical arm path planning and operational method, device and computer equipment
CN109493422A (en) * 2018-12-28 2019-03-19 国网新疆电力有限公司信息通信公司 A kind of substation's 3 D model construction method based on three-dimensional laser scanning technique
CN110081821A (en) * 2019-05-09 2019-08-02 西南交通大学 Intelligent high-speed rail white body assembling quality detection device and its method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAN Guoxin: Cultural Resources and Industry Series: Digitization and Industrial Development of Ethnic Cultural Resources, page 102.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114427836A (en) * 2022-02-10 2022-05-03 上汽通用五菱汽车股份有限公司 Method for controlling dimensional precision of vehicle body process

Also Published As

Publication number Publication date
CN112444283B (en) 2023-12-05

Similar Documents

Publication Publication Date Title
CN108291880B (en) Arrangement, method, device and software for checking containers
JP4862765B2 (en) Surface inspection apparatus and surface inspection method
CA2554641C (en) Method for planning an inspection path and for determining areas to be inspected
EP3388781B1 (en) System and method for detecting defects in specular or semi-specular surfaces by means of photogrammetric projection
US10976262B2 (en) Mobile and automated apparatus for the detection and classification of damages on the body of a vehicle
US7639349B2 (en) Method and system for inspecting surfaces
US8050486B2 (en) System and method for identifying a feature of a workpiece
US20130057678A1 (en) Inspection system and method of defect detection on specular surfaces
EP2998927B1 (en) Method for detecting the bad positioning and the surface defects of specific components and associated detection device
US20210255117A1 (en) Methods and plants for locating points on complex surfaces
US20200055558A1 (en) Automobile manufacturing plant and method
JP2007523334A (en) Defect position identification method and marking system
EP3775854B1 (en) System for the detection of defects on a surface of at least a portion of a body and method thereof
CN111971522B (en) Vehicle detection system
CN112444283A (en) Detection apparatus for vehicle assembly and vehicle assembly production system
JP2786070B2 (en) Inspection method and apparatus for transparent plate
US9485470B2 (en) Evaluation unit, evaluation method, measurement system for a crash test vehicle measurement and a method for performing a crash test vehicle measurement
CA3235422A1 (en) Systems and methods for controlled cleaning of vehicles
CN115963113A (en) Workpiece glue tank gluing detection method and system
CN116258667A (en) Analytical digital twins for non-destructive inspection of vehicles
US20060280355A1 (en) Method of evaluating and designing sealing elements
WO2021176386A1 (en) Method and systems for the detection and classification of defects on surfaces
JP2012141223A (en) Surface flaw detecting and indicating system and work piece manufacturing method involving surface treatment
JP2007014855A (en) Coating condition inspection method and coating condition testing apparatus
CN115203815A (en) Production speed component inspection system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant