WO2017128865A1 - Multi-lens-based intelligent manipulator and positioning assembly method - Google Patents

Multi-lens-based intelligent manipulator and positioning assembly method

Info

Publication number
WO2017128865A1
WO2017128865A1 · PCT/CN2016/108868 · CN2016108868W
Authority
WO
WIPO (PCT)
Prior art keywords
assembled
positioning
image
computer
cameras
Prior art date
Application number
PCT/CN2016/108868
Other languages
English (en)
French (fr)
Inventor
杜娟
谭健胜
冯颖
Original Assignee
South China University of Technology (华南理工大学)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology (华南理工大学)
Priority to US15/781,856 (granted as US10899014B2)
Publication of WO2017128865A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0812Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines the monitoring devices being integrated in the mounting machine, e.g. for monitoring components, leads, component placement
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081Touching devices, e.g. pressure-sensitive
    • B25J13/084Tactile sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/06Programme-controlled manipulators characterised by multi-articulated arms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/04Mounting of components, e.g. of leadless components
    • H05K13/0404Pick-and-place heads or apparatus, e.g. with jaws
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0815Controlling of component placement on the substrate during or after manufacturing
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/082Integration of non-optical monitoring devices, i.e. using non-optical inspection means, e.g. electrical means, mechanical means or X-rays
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K3/00Apparatus or processes for manufacturing printed circuits
    • H05K3/30Assembling printed circuits with electric components, e.g. with resistor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30141Printed circuit board [PCB]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/92Dynamic range modification of images or parts thereof based on global image properties
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/04Mounting of components, e.g. of leadless components
    • H05K13/0404Pick-and-place heads or apparatus, e.g. with jaws
    • H05K13/0408Incorporating a pick-up tool
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K2203/00Indexing scheme relating to apparatus or processes for manufacturing printed circuits covered by H05K3/00
    • H05K2203/01Tools for processing; Objects used during processing
    • H05K2203/0195Tool for a process not provided for in H05K3/00, e.g. tool for handling objects using suction, for deforming objects, for applying local pressure

Definitions

  • the invention relates to the field of electronic assembly, and in particular to a multi-lens-based intelligent manipulator and a positioning assembly method.
  • the present invention provides a multi-lens-based intelligent manipulator and positioning assembly method.
  • a multi-lens-based intelligent manipulator comprising a multi-joint multi-function manipulator, a CCD camera for capturing an image of the PCB to be assembled, a bio-contact device and a computer; the CCD camera is mounted on the multi-joint multi-function manipulator, the bio-contact device is mounted on the fingertips of the multi-joint multi-function manipulator, and the CCD camera, the bio-contact device and the multi-joint multi-function manipulator are connected to the computer.
  • there are specifically two CCD cameras, mounted on the left and right sides of the forearm of the multi-joint multi-function manipulator respectively.
  • the bio-contact device includes a patch made of a sensitive material and a signal measuring circuit for measuring the deformation of the patch and outputting an electrical signal; the patch covers the fingertip of the manipulator, the patch is connected to the signal measuring circuit, the signal measuring circuit is connected to the computer, and the signal measuring circuit is built into the multi-joint multi-function manipulator.
  • the shape of the patch is a finger sleeve type.
  • the sensitive material is specifically a conductive rubber.
  • the models and parameters of the two CCD cameras are identical, the coordinate systems of the two CCD cameras are coplanar, and the coordinate axes are placed in parallel.
  • a positioning assembly method for an intelligent robot includes the following steps:
  • in S2 the computer matches the two acquired PCB images against the PCB template pre-stored in the computer to determine the target area to be assembled in both images;
  • in S3 the binocular positioning algorithm measures the distance between the target area to be assembled and the fingertip of the manipulator, and the computer moves the manipulator to the target area to complete the preliminary positioning;
  • in S4 the bio-contact device on the fingertip contacts the target area preliminarily located in S3, the deformation signal generated by the patch is output to the computer through the signal measuring circuit, and the computer fine-tunes the position to achieve precise positioning and complete the assembly.
  • the S2 is specifically: filter and denoise the images of the PCB to be assembled acquired by the two CCD cameras, extract the edge of the PCB with an edge detection operator to remove the background, enhance the image with a piecewise linear transformation to improve its contrast, and finally determine the target area to be assembled with a gray-value-based template matching method.
  • the S3 measures the distance between the target area to be assembled and the fingertip of the manipulator with the binocular positioning algorithm, and the computer moves the manipulator to the target area to complete the preliminary positioning; specifically:
  • S3.1 uses Zhang Zhengyou's calibration algorithm to solve the intrinsic and extrinsic parameter matrices of the CCD cameras, then performs binocular stereo calibration to determine the relative positional relationship between the two cameras; the relative positional relationship consists of the rotation matrix R and the translation vector T;
  • S3.2 uses a gray-scale cross-correlation matching method based on template matching to complete the pixel matching of the two images;
  • S3.3 uses the binocular positioning algorithm, i.e. the imaging principle of the binocular ranging system, to measure the distance l between the target area to be assembled and the fingertip of the manipulator;
  • the distance between the projection centers of the left and right cameras is the baseline distance B; when the target point A passes through the binocular ranging system formed by the left and right cameras with parallel optical axes, it is imaged at point A1 on the left CCD image plane and at point A2 on the right CCD image plane, and its positions on the left and right image planes are x_left and x_right respectively;
  • the focal length of both cameras is f, and x is the difference between the positions of the image points of target point A formed by the binocular cameras on the left and right CCD image planes, i.e. the binocular disparity, so that the measured distance is l = f·B/x.
  • the S4 is specifically: after the preliminary positioning is completed, the bio-contact device on the fingertip contacts the PCB to be assembled; the sensitive material deforms as the pressure on it changes and the output electrical signal changes, and the computer adjusts the position of the manipulator according to the output signal and the mathematical model of pressure versus position, achieving precise positioning for assembling the shaped part.
  • the invention can complete the automatic and precise positioning of the area to be assembled of the PCB board without manual intervention, and complete the assembly work of the special-shaped parts;
  • the invention uses two CCD cameras to separately capture images of the PCB board for processing, which meets the speed and real-time requirements of industrial electronic assembly;
  • the invention adds a bio-contact device to the manipulator hand, enabling precise positioning of the assembly target area and improving the accuracy of assembling shaped parts;
  • the present invention uses a binocular positioning algorithm in the digital image processing technology in the positioning of the robot, and can effectively measure the distance between the robot and the target assembly area;
  • the invention utilizes the method of template matching to determine the target area of the shaped part assembly, and improves the accuracy of the subsequent positioning work.
  • FIG. 1 is a schematic structural view of a smart manipulator of the present invention
  • FIG. 2 is a schematic structural view of a biocontact device of the present invention
  • Figure 3a is the searched image used by the template matching method of the present invention.
  • Figure 3b is a schematic diagram of a matching template of the present invention.
  • Figure 4 is a flow chart of the operation of the present invention.
  • a multi-lens-based intelligent manipulator includes a multi-joint multi-function manipulator 1, a CCD camera 2 for capturing images of the PCB to be assembled, a bio-contact device 3 and a computer; there are specifically two CCD cameras, installed on the left and right sides of the manipulator's forearm respectively, and the specific position of the cameras can be adapted to different parts of the manipulator according to the actual situation; the bio-contact device converts the position information of the area to be assembled into an electrical signal and feeds it back to the computer.
  • the number of biocontacts can be adjusted according to actual needs.
  • the computer is responsible for processing the acquired image and the feedback electrical signal and issuing a control signal to direct the robot to complete the final positioning and assembly of the shaped piece.
  • the manipulator has a control circuit, and the control circuit is connected to the computer.
  • the two CCD cameras used in the invention have identical models and parameters; their optical axes are parallel, the coordinate systems of the binocular cameras are coplanar with coordinate axes placed in parallel, so the left and right images are acquired synchronously with consistent size and scale, and the gray-scale information of the images remains relatively complete.
  • the bio-contact device includes a patch 5 made of a sensitive material and a signal measuring circuit 4 for measuring the deformation of the patch and outputting an electrical signal; the patch 5 covers the fingertip of the manipulator, the patch is connected to the signal measuring circuit, the signal measuring circuit 4 is connected to the computer, and the signal measuring circuit is built into the multi-joint multi-function manipulator.
  • the patch is of the finger-sleeve type; three patches are used, placed on the fingertips of the manipulator.
  • the sensitive material has a pressure-electric output characteristic that can convert the received pressure into an electrical signal output.
  • the shape of the patch is a finger sleeve type, and the sensitive material used is a conductive rubber.
  • the bio-contact device made of the sensitive material is generally mounted on the fingertip of the manipulator; when the fingertip contacts the PCB board, the pressure on the sensitive material of the bio-contact device changes as it deforms, resulting in a change in the output electrical signal.
  • the changed output electrical signal caused by the deformation of the bio-contact device is measured and transmitted to the computer, which obtains the current position of the manipulator from the previously established mathematical model and outputs the corresponding control signal to adjust it.
  • Figure 3a is the searched image
  • Figure 3b is a matching template.
  • the matching template is translated over the searched image.
  • the part of the searched image covered by the template is a subgraph; the content of the subgraph is compared with that of the matching template, and where the similarity measure of the two is largest, their content is judged to be the same.
  • the subgraph at that position is the matching area being searched for.
  • the positioning assembly method implemented by the robot includes the following steps:
  • S1 acquires an image of the PCB to be assembled.
  • the PCB board to be assembled is transported to the front of the robot by the assembly line, and the brightness of the light source is adjusted.
  • images of the PCB board are captured by the left and right CCD cameras respectively; the camera mounted on the left is called the left CCD camera and the image it captures is called the left image.
  • the image captured by the right CCD camera is called the right image.
  • the S2 template matches to find the target area to be assembled.
  • the image captured by the cameras includes the PCB to be assembled, called the foreground, as well as a background part.
  • the image then undergoes segmentation; since the target (the PCB board) has a regular geometry, an edge detection operator can be used to extract the edge of the PCB and remove the background.
  • to highlight and strengthen the details and edges of the image, which benefits the subsequent template matching of the target area to be assembled, the PCB image must be enhanced to improve its contrast: the original gray value interval is converted to a set range according to a certain mapping relationship, thereby enhancing the contrast between the background image and the target image.
  • the present invention employs a piecewise linear transform enhancement method. Let the gray function of the original image be f(r,c) with gray range [0, Mf], and let the transformed image be g(r,c) with gray range [0, Mg].
  • with the gray interval [a, b] of the original image stretched onto [c, d] of the transformed image, the transformation formula can be expressed as: g(r,c) = (c/a)·f(r,c) for 0 ≤ f(r,c) < a; g(r,c) = ((d−c)/(b−a))·(f(r,c)−a) + c for a ≤ f(r,c) < b; g(r,c) = ((Mg−d)/(Mf−b))·(f(r,c)−b) + d for b ≤ f(r,c) ≤ Mf.
  • Template matching is a process of finding a template image in an image to be searched by calculating a similarity measure of the template image and the image to be searched.
  • the process of template matching can be expressed as follows: first, the similarity measure between the template image and the image to be searched is computed pixel by pixel, and then the area with the largest similarity measure is taken as the matching position; the principle is shown in FIG. 3a and FIG. 3b.
  • the present invention adopts a template matching method based on the gray value.
  • the gray-value-based template matching method uses the gray values of the image as the similarity measure: following a defined search strategy, it scans the image to be searched from top to bottom and left to right for a qualifying area, setting a search window of the specified size and performing the search comparison within that window.
  • the position of the object in the image to be searched can be described by translation.
  • the template is represented by the image t(r,c), in which a region of interest is designated T; template matching translates the template region of interest T over the image to be matched in a certain order and computes the similarity measure s between the covered region and the template region of interest. The similarity measure is described by: s(r,c) = Σ_{(u,v)∈T} t(u,v)·f(r+u, c+v) / sqrt( Σ_{(u,v)∈T} t(u,v)² · Σ_{(u,v)∈T} f(r+u, c+v)² ), where
  • s(r,c) represents the similarity measure calculated from the gray values
  • t(u,v) represents the gray value of each point in the template
  • f(r+u, c+v) represents the gray value of the image at the current position of the translated template region of interest.
  • S3 uses the binocular positioning algorithm to measure the distance between the target area to be assembled and the mechanical finger tip, and the computer control robot moves to the target area to be assembled to complete the preliminary positioning;
  • A is the internal parameter matrix of the camera.
  • M is determined by the position and orientation of the camera relative to the world coordinate system and is independent of the camera's internal parameters in the pinhole camera model; M is called the camera's external parameter matrix, and the internal and external parameter matrices of the camera are solved with Zhang Zhengyou's calibration algorithm.
  • binocular stereo calibration is then performed to calculate the relative position between the left and right cameras, i.e. the external parameters describing their relative positional relationship, consisting of the rotation matrix R and the translation vector T.
  • the three-dimensional world coordinates of a point P on the checkerboard calibration board are Xw
  • the images are acquired simultaneously using two cameras.
  • the coordinates of the P point in the left and right camera coordinate systems are X L and X R respectively
  • the external parameter matrices of the left and right cameras are (R_L, t_L) and (R_R, t_R) respectively; according to the conversion relationship between the world coordinate system and the camera coordinate systems, X_L = R_L·Xw + t_L and X_R = R_R·Xw + t_R, so that X_R = R·X_L + T with R = R_R·R_L⁻¹ and T = t_R − R·t_L;
  • the relationship parameters of the stereo cameras in the binocular vision system, i.e. their relative positional relationship, can thus be obtained, completing the calibration of the binocular cameras.
  • S3.2 uses a gray-scale cross-correlation matching method based on template matching to complete the pixel matching of the two images;
  • the left and right cameras used in the invention have identical models and parameters, their optical axes are parallel, the binocular camera coordinate systems are coplanar with coordinate axes placed in parallel, the simultaneously acquired left and right images have consistent size and scale, and the gray-scale information of the images remains relatively complete.
  • the target area to be assembled has already been determined in both images in S2; therefore, the present invention performs high-precision matching of the target image regions using a gray-scale cross-correlation matching method based on template matching.
  • the normalized cross-correlation matching algorithm judges whether a match exists from the cross-correlation function established between the template image and the search sub-image on the image to be matched; the cross-correlation function can be expressed as: N(i,j) = Σ_m Σ_n S_{i,j}(m,n)·T(m,n) / sqrt( Σ_m Σ_n S_{i,j}(m,n)² · Σ_m Σ_n T(m,n)² ), with the sums taken over m = 1..M, n = 1..N.
  • the template image is T(m, n), and the template image size is M ⁇ N.
  • the search sub-image on the reference image with (i,j) as its upper-left pixel is S_{i,j}(m,n)
  • image matching finds the position (i,j) whose search sub-image best matches the template image, i.e. it locates the upper-left pixel of the template in the reference image.
  • the cross-correlation function N(i,j) satisfies 0 ≤ N(i,j) ≤ 1, and its magnitude depends on how well the search sub-image with (i,j) as the upper-left pixel on the reference image matches the template image. The larger the cross-correlation value at a pixel, the higher that pixel's matching degree; the pixel with the largest cross-correlation value is the best-matching pixel.
  • the distance is then obtained according to the imaging principle of the binocular ranging system.
  • the left and right CCD cameras of the same model are placed in parallel on the same plane, and the distance between the projection centers of the left and right cameras is the baseline distance B.
  • when the target point A passes through the binocular ranging system formed by the left and right cameras with parallel optical axes, it is imaged at point A1 on the left CCD image plane and at point A2 on the right CCD image plane; its positions on the left and right image planes are x_left and x_right respectively. Given that the focal length of both cameras is f, the measured distance l can be derived from the triangle similarity principle: l = f·B/x, where x = x_left − x_right.
  • x is the difference between the positions of the image points of target point A on the left and right CCD image planes, also called the binocular disparity. In the ideal case where the optical axes of the binocular cameras are strictly parallel and the images of the target object are acquired simultaneously, the corresponding positions of the same target in the left and right CCD images are determined by the image matching algorithm, the binocular disparity x is calculated, and with the known focal length and baseline the distance is obtained from the above formula, completing the initial positioning of the target area to be assembled.
  • S4 brings the bio-contact device on the fingertip into contact with the target area preliminarily located in S3; the deformation signal generated by the patch is output to the computer through the signal measuring circuit, and the computer fine-tunes the position to achieve precise positioning and complete the assembly.
  • when the manipulator performs assembly it comes into contact with the PCB, and the fingertips of the manipulator are subjected to pressure.
  • the pressure differs between the case where the shaped component exactly contacts the area to be assembled and the case where it contacts other areas of the PCB; a mathematical model of pressure and position can therefore be established to adjust the position of the manipulator through the pressure on the fingertip and complete the precise positioning of the target area to be assembled.
  • the invention adopts a sensitive material to design a bio-contact device having a pressure-electric output characteristic, which can convert the received pressure into an electrical signal output.
  • the bio-contact device made of the sensitive material is generally mounted on the fingertip of the manipulator; when the fingertip contacts the PCB board, the pressure on the sensitive material of the bio-contact device changes due to deformation, resulting in a change in the output electrical signal.
  • the manipulator completes the initial positioning of the area to be assembled in step S3 and moves to that area; after the manipulator contacts the PCB board, its position is finely adjusted according to the output signal of the bio-contact device on the fingertip, finally achieving precise positioning and completing the assembly of the shaped part.
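The gray-scale cross-correlation matching used in S3.2 can be sketched directly from the N(i,j) score described above. A minimal NumPy reimplementation, assuming 8-bit gray images and an exhaustive top-to-bottom, left-to-right search — the function names and the synthetic test data are illustrative, not from the patent:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation N(i,j) between one search sub-image and the template."""
    num = np.sum(patch * template)
    den = np.sqrt(np.sum(patch ** 2) * np.sum(template ** 2))
    return num / den if den > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image; return the upper-left pixel (i,j) with the largest N(i,j)."""
    M, N = template.shape
    image = image.astype(float)
    template = template.astype(float)
    best_score, best_ij = -1.0, (0, 0)
    for i in range(image.shape[0] - M + 1):        # top to bottom
        for j in range(image.shape[1] - N + 1):    # left to right
            score = ncc(image[i:i + M, j:j + N], template)
            if score > best_score:
                best_score, best_ij = score, (i, j)
    return best_ij, best_score
```

Because the score is normalized, a sub-image that is a uniformly scaled copy of the template still scores 1, so the measure tolerates uniform gain changes in illumination; for non-negative gray values the score stays within the 0 ≤ N(i,j) ≤ 1 range stated above.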

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Signal Processing (AREA)
  • Operations Research (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

A multiple-lens-based smart mechanical arm and positioning and assembly method. The arm comprises a multi-joint multi-functional manipulator (1), CCD cameras (2) for capturing images of the PCB to be assembled, a bio-contact device (3), and a computer. The CCD cameras are mounted on the manipulator, the bio-contact device is fitted on the manipulator's fingertips, and the CCD cameras, the bio-contact device, and the manipulator are all connected to the computer. Building on binocular positioning, the smart mechanical arm adds a bio-contact device that precisely locates the target assembly area, improving the assembly accuracy for shaped components. It exploits the pressure-to-electrical-output conversion of the sensitive material in the bio-contact device together with a mathematical model relating manipulator position to pressure. The smart mechanical arm and its positioning and assembly method achieve high-precision, real-time assembly of shaped components and improve the production efficiency of the electronics assembly industry.

Description

A multiple-lens-based smart mechanical arm and positioning and assembly method

Technical Field
The present invention relates to the field of electronics assembly, and in particular to a multiple-lens-based smart mechanical arm and a positioning and assembly method.
Background
At present, in the electromechanical assembly industry, mounting shaped components onto PCBs usually relies on workers on the production line assembling them by hand. Manual assembly depends on the human eye and experience: production efficiency cannot be sustained over long working hours, the work carries a degree of subjectivity, and uniform quality across products cannot be guaranteed, so introducing automation into electronics assembly is urgent. Replacing manual assembly with automation effectively improves production efficiency, saves labor, and guarantees assembly quality.
Some companies already use manipulators in place of human hands on assembly lines, but because a manipulator has neither human eyes for positioning nor a worker's tactile experience to guide assembly, its assembly accuracy can actually be lower than that of manual assembly, and production efficiency does not improve.
Summary of the Invention
To overcome the shortcomings and deficiencies of the prior art, the present invention provides a multiple-lens-based smart mechanical arm and a positioning and assembly method.
The present invention adopts the following technical solution:
A multiple-lens-based smart mechanical arm comprises a multi-joint multi-functional manipulator, CCD cameras for capturing images of the PCB to be assembled, a bio-contact device, and a computer. The CCD cameras are mounted on the manipulator, the bio-contact device is fitted on the manipulator's fingertips, and the CCD cameras, the bio-contact device, and the manipulator are connected to the computer.
There are specifically two CCD cameras, mounted respectively on the left and right sides of the manipulator's forearm.
The bio-contact device comprises a patch made of a sensitive material and a signal-measuring circuit that measures the patch's deformation and outputs an electrical signal. The patch covers the manipulator's fingertip; the patch is connected to the signal-measuring circuit, the signal-measuring circuit is connected to the computer, and the circuit is built into the manipulator.
The patch is shaped as a fingertip sleeve.
The sensitive material is specifically conductive rubber.
The two CCD cameras are identical in model and parameters; their coordinate systems are coplanar, and the corresponding coordinate axes are parallel.
A positioning and assembly method for the smart mechanical arm comprises the following steps:
S1: the PCB to be assembled moves in front of the manipulator; the two CCD cameras are switched on to capture two images of the PCB, a left image and a right image;
S2: the computer matches the two PCB images against a PCB template stored in advance, determining in both images the target area where the shaped component is to be assembled;
S3: the binocular positioning algorithm measures the distance between the target assembly area and the manipulator's fingertip, and the computer moves the manipulator to the target area, completing initial positioning;
S4: the bio-contact device on the manipulator's fingertip touches the target area initially located in S3; the deformation-induced electrical signal from the patch is output to the computer through the signal-measuring circuit, and the computer fine-tunes the position to achieve precise positioning and complete the assembly.
S2 specifically comprises: filtering and denoising the PCB images captured by the two CCD cameras; extracting the PCB's edges with an edge-detection operator and removing the background; enhancing the images with a piecewise linear transform to raise contrast; and finally determining the target assembly area with a gray-value-based template matching method.
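The edge-extraction stage of the S2 pipeline can be illustrated with a minimal numpy-only sketch. The patent does not name a specific edge-detection operator, so the Sobel operator used here is an assumption; the function computes a gradient-magnitude map in which the regular outline of the PCB stands out against the background.

```python
import numpy as np

def sobel_edges(img):
    """Sketch of the edge-extraction step: convolve the image with the
    two Sobel kernels and return the gradient magnitude (interior pixels only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)  # horizontal gradient
            gy[i, j] = np.sum(patch * ky)  # vertical gradient
    return np.hypot(gx, gy)
```

On a synthetic image with a vertical brightness step, the magnitude is zero in flat regions and large along the step, which is the property the background-removal step relies on.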
S3 — measuring the distance between the target assembly area and the manipulator's fingertip with the binocular positioning algorithm, the computer moving the manipulator to the target area to complete initial positioning — specifically comprises:
S3.1: using Zhang Zhengyou's calibration algorithm to solve the intrinsic and extrinsic parameter matrices of the CCD cameras, then performing binocular stereo calibration to determine the relative pose of the two cameras, the relative pose comprising a rotation matrix R and a translation vector T;
S3.2: matching the pixels of the two images with a gray-level cross-correlation matching method based on template matching;
S3.3: measuring the distance l between the target assembly area and the manipulator's fingertip with the binocular positioning algorithm, i.e. the imaging principle of a binocular ranging system:
l = fB / x,  x = xleft − xright
where the distance between the projection centers of the two cameras is the baseline B; when target point A passes through the binocular ranging system formed by the left and right cameras with parallel optical axes, it is imaged at point A1 on the left CCD image plane and point A2 on the right CCD image plane, at positions xleft and xright respectively; f is the focal length of both cameras; and x is the difference between the positions of A's image points on the left and right CCD image planes.
S4 specifically comprises: after initial positioning, the bio-contact device on the manipulator's fingertip contacts the PCB to be assembled; the sensitive material deforms, changing the pressure it experiences and hence the output electrical signal; the computer adjusts the manipulator's position according to the output signal and the mathematical model of pressure versus position, achieving precise positioning for assembling the shaped component.
Beneficial effects of the present invention:
(1) the invention automatically and precisely locates the assembly area on the PCB without human intervention and completes the assembly of shaped components;
(2) two CCD cameras each capture one image of the PCB for processing, meeting the speed and real-time requirements of industrial electronics assembly;
(3) a bio-contact device added to the manipulator precisely locates the target assembly area, raising the assembly accuracy for shaped components;
(4) the binocular positioning algorithm from digital image processing is used in positioning the manipulator, effectively measuring the distance between the manipulator and the target assembly area;
(5) template matching determines the target area for assembling the shaped component, improving the accuracy of the subsequent positioning work.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the structure of the smart mechanical arm of the invention;
Figure 2 is a schematic diagram of the structure of the bio-contact device of the invention;
Figure 3a is the searched image of the template matching method of the invention;
Figure 3b is a schematic diagram of the matching template of the invention;
Figure 4 is the workflow chart of the invention.
Detailed Description
The invention is further described in detail below with reference to the embodiment and the drawings, but the implementations of the invention are not limited thereto.
Embodiment
As shown in Figure 1, a multiple-lens-based smart mechanical arm comprises a multi-joint multi-functional manipulator 1, CCD cameras 2 for capturing images of the PCB to be assembled, a bio-contact device 3, and a computer. There are specifically two CCD cameras, mounted respectively on the left and right sides of the manipulator's forearm; their exact positions may be chosen on different parts of the manipulator as circumstances require. The bio-contact device converts position information of the area to be assembled into an electrical signal fed back to the computer, and the number of bio-contacts can be adjusted to actual needs. The computer processes the captured images and the fed-back electrical signals and issues control signals, directing the manipulator to complete the final positioning and the assembly of the shaped component. The manipulator has a built-in control circuit, which is connected to the computer.
The two CCD cameras used in the invention are identical in model and parameters; the optical axes are kept essentially parallel, the binocular camera coordinate systems are coplanar with corresponding axes parallel, the synchronously captured left and right images match in size and scale, and the gray-level information of the images remains relatively complete.
As shown in Figure 2, the bio-contact device comprises a patch 5 made of a sensitive material and a signal-measuring circuit 4 that measures the sleeve's deformation and outputs an electrical signal. The patch 5 covers the manipulator's fingertip and is connected to the signal-measuring circuit; the signal-measuring circuit 4 is connected to the computer and is built into the manipulator. In this embodiment the patches are of the fingertip-sleeve type, and three patches are used, slipped over the manipulator's fingertips.
The sensitive material has a pressure-to-electrical-output characteristic: it converts the pressure it receives into an electrical signal output. In this embodiment the patch is fingertip-sleeve shaped and the sensitive material is conductive rubber. A bio-contact device made of this material is fitted like a fingertip sleeve on the manipulator's fingertip; when the fingertip touches the PCB, the pressure on the sensitive material of the device changes with its deformation, changing the output electrical signal. Through the signal-measuring circuit, the deformation-induced change in the device's output signal is measured and transmitted to the computer as the output signal; from the previously obtained mathematical model the computer derives the manipulator's current position and outputs the corresponding control signal for adjustment.
Figure 3a shows the searched image and Figure 3b the matching template. The matching template is overlaid on the searched image and translated across it; the patch of the searched image covered by the template is a sub-image. The contents of the sub-image and the matching template are compared: when their similarity measure is maximal the contents agree, and that sub-image is the matching region being sought.
As shown in Figure 4, the positioning and assembly method implemented with this manipulator comprises the following steps:
S1: capture images of the PCB to be assembled.
The PCB to be assembled is conveyed along the production line to the front of the manipulator and the light-source brightness is adjusted; the left and right CCD cameras each photograph the target PCB. The camera mounted on the left is called the left CCD camera and its image the left image; the image taken by the right CCD camera is called the right image.
S2: template matching to find the target assembly area.
PCB images acquired on an industrial floor carry considerable noise, so the images are first filtered and denoised. A captured image contains the PCB to be assembled, called the foreground, together with a background. To identify and analyze the target area it must first be extracted from the background, so the image is segmented. Since the target (the PCB) has a regular geometric shape, an edge-detection operator can extract the PCB's edges and remove the background.
To make the image more recognizable, its details and edges are highlighted and strengthened, which benefits the subsequent template matching of the target assembly area. The PCB image is therefore enhanced to raise contrast: the original gray-value range is converted according to some mapping relation, increasing the contrast between the background and the target. The invention uses a piecewise linear transform. Let the gray function of the original image be f(r,c) with range [0, Mf], and the transformed image be g(r,c) with range [0, Mg]. With input breakpoints (a, b) mapped to output breakpoints (c, d), the standard three-segment form of the transform can be written as:
g(r,c) = (c/a) · f(r,c),                      0 ≤ f(r,c) < a
g(r,c) = ((d−c)/(b−a)) · (f(r,c) − a) + c,    a ≤ f(r,c) < b
g(r,c) = ((Mg−d)/(Mf−b)) · (f(r,c) − b) + d,  b ≤ f(r,c) ≤ Mf
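The piecewise linear gray-level stretch can be sketched directly from the three-segment form. The breakpoints a, b, c, d are free parameters chosen per image; the values used below are illustrative only.

```python
import numpy as np

def piecewise_linear(f, a, b, c, d, Mf=255.0, Mg=255.0):
    """Three-segment piecewise linear gray-level transform:
    [0, a) -> [0, c), [a, b) -> [c, d), [b, Mf] -> [d, Mg]."""
    f = np.asarray(f, dtype=float)
    g = np.empty_like(f)
    lo = f < a
    mid = (f >= a) & (f < b)
    hi = f >= b
    g[lo] = (c / a) * f[lo]
    g[mid] = (d - c) / (b - a) * (f[mid] - a) + c
    g[hi] = (Mg - d) / (Mf - b) * (f[hi] - b) + d
    return g
```

Choosing a steep middle segment (d − c much larger than b − a) stretches the mid-tone range where the PCB detail lives, which is the contrast-enhancement effect the text describes.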
Template matching is the process of finding the template image within a searched image by computing a similarity measure between the two. The process can be stated as: compute the similarity measure between the template image and the searched image pixel by pixel, then take the region of maximal similarity as the match position; the principle is illustrated in Figures 3a and 3b.
After the PCB image has been enhanced, because the gray values of each region of the PCB image are distributed uniformly and fixedly, the invention adopts a gray-value-based template matching method. This method uses the gray values of the whole image as the similarity measure and, following a defined search strategy, scans the searched image from top to bottom and left to right for qualifying regions, comparing within a search window of specified size.
The position of the target in the searched image can be described by translation. The template is represented by the image t(r,c), with its region of interest denoted T. Template matching translates the template region of interest T across the image to be matched in a fixed order and computes the similarity measure s between that region of the image and the template region of interest. The similarity measure is described by:
s(r,c) = s{ t(u,v), f(r+u, c+v); (u,v) ∈ T }
where s(r,c) is the similarity measure computed from gray values, t(u,v) is the gray value at each point of the template, and f(r+u, c+v) is the gray value at the position of the image to which the template region of interest has been moved.
The simplest ways to obtain the similarity measure are the sum of absolute gray-value differences (SAD) and the sum of squared differences (SSD) between the two images, expressed respectively as:
sad(r,c) = (1/n) · Σ_{(u,v)∈T} | t(u,v) − f(r+u, c+v) |
ssd(r,c) = (1/n) · Σ_{(u,v)∈T} ( t(u,v) − f(r+u, c+v) )²
where n is the number of pixels in the template region of interest, i.e. n = |T|. For SAD and SSD, the larger the value of the measure, the greater the difference between the searched image and the template. The gray-value-based template matching method thus determines the target assembly area.
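The SAD variant above can be sketched in a few lines. Since SAD measures dissimilarity, the match position is the window that minimizes it; this exhaustive top-to-bottom, left-to-right scan mirrors the search strategy described in the text (a rectangular region of interest is assumed for simplicity).

```python
import numpy as np

def match_template_sad(image, template):
    """Slide the template over the image and return the top-left corner
    (r, c) of the window with the smallest mean absolute difference."""
    H, W = image.shape
    M, N = template.shape
    n = M * N  # n = |T|, pixels in the region of interest
    best, best_pos = np.inf, (0, 0)
    for r in range(H - M + 1):
        for c in range(W - N + 1):
            window = image[r:r + M, c:c + N]
            sad = np.abs(window - template).sum() / n
            if sad < best:
                best, best_pos = sad, (r, c)
    return best_pos, best
```

Embedding the template in an otherwise blank image recovers its position exactly, with a SAD of zero at the match.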
S3: the binocular positioning algorithm measures the distance between the target assembly area and the manipulator's fingertip; the computer moves the manipulator to the target area, completing initial positioning.
S3.1: calibrating the binocular cameras.
Since the binocular stereo system here consists of two cameras with identical parameters, the two cameras are first calibrated separately to solve their intrinsic and extrinsic parameters. The transformation between the image coordinate system and the world coordinate system is:
s · [u, v, 1]ᵀ = A · M · [Xw, Yw, Zw, 1]ᵀ,  M = [R t]
where A is the camera's intrinsic parameter matrix, and M, determined by the camera's position and orientation relative to the world coordinate system and independent of the intrinsic parameters of the pinhole camera model, is called the camera's extrinsic parameter matrix. Zhang Zhengyou's calibration algorithm is used to solve the intrinsic and extrinsic parameter matrices.
After each single camera has been calibrated and its intrinsic and extrinsic parameters obtained, a binocular stereo calibration is performed to compute the relative pose between the left and right cameras, i.e. their relative extrinsics, comprising the rotation matrix R and the translation vector T.
Suppose point P on the checkerboard calibration board has three-dimensional world coordinates Xw. With both cameras capturing images simultaneously, P's coordinates in the left and right camera coordinate systems are XL and XR, and the extrinsic matrices of the left and right cameras are (RL, tL) and (RR, tR). From the transformation between the world and camera coordinate systems:
XL = RL·Xw + tL,  XR = RR·Xw + tR
From the above, the transformation from the left camera to the right camera is obtained:
XR = R·XL + T
where
R = RR·RL⁻¹,  T = tR − R·tL
Thus, from the two cameras' respective extrinsic parameters, the relation parameters of the stereo pair in the binocular vision system — the relative pose of the stereo cameras — are obtained, completing the binocular camera calibration.
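The composition of the two extrinsics into the relative pose R = RR·RL⁻¹, T = tR − R·tL is a two-line computation; the sketch below also verifies the resulting relation XR = R·XL + T on a sample world point (the specific poses are made-up test values, not calibration results).

```python
import numpy as np

def stereo_extrinsics(RL, tL, RR, tR):
    """Relative pose of the right camera w.r.t. the left one, derived from
    each camera's extrinsics: R = RR @ RL^-1, T = tR - R @ tL."""
    R = RR @ np.linalg.inv(RL)
    T = tR - R @ tL
    return R, T
```

Because a rotation matrix is orthogonal, `np.linalg.inv(RL)` could equally be written `RL.T`; the explicit inverse keeps the sketch literal to the formula.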
S3.2: gray-level cross-correlation matching based on template matching completes the pixel matching of the two images.
The left and right cameras used in the invention are identical in model and parameters; the optical axes are kept essentially parallel, the binocular camera coordinate systems are coplanar with corresponding axes parallel; the synchronously captured left and right images match in size and scale and retain relatively complete gray-level information; and the target assembly area has already been determined in S2. The invention therefore uses gray-level cross-correlation matching based on template matching to achieve high-precision matching of the target image region.
The normalized cross-correlation matching algorithm decides whether regions match from the cross-correlation function built between the template image and the search sub-images of the image to be matched. The cross-correlation function is expressed as:
N(i,j) = Σm Σn [Si,j(m,n) − S̄i,j]·[T(m,n) − T̄] / sqrt( Σm Σn [Si,j(m,n) − S̄i,j]² · Σm Σn [T(m,n) − T̄]² )
In the above, T(m,n) is the template image, of size M×N; T̄ is the average gray value of all pixels of T(m,n); Si,j(m,n) is the search image region of the reference image whose top-left pixel is (i,j); and S̄i,j is the average gray value of all pixels of that search region. Image matching locates the top-left pixel of the target image region, i.e. of the template. The cross-correlation value N(i,j) lies in the range 0 ≤ N(i,j) ≤ 1, and its size reflects how well the search region with top-left pixel (i,j) matches the template image. The larger the cross-correlation value at a pixel, the better the match at that pixel; the pixel with the largest cross-correlation value is selected as the best-matching pixel.
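The cross-correlation function N(i,j) above can be sketched directly. Because both the window and the template are mean-centered and normalized, the score is insensitive to linear changes in brightness, which is why a window equal to an affinely rescaled copy of the template still scores (essentially) 1.

```python
import numpy as np

def ncc(window, template):
    """Normalized cross-correlation between one search window and the template."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return (w * t).sum() / denom if denom > 0 else 0.0

def match_template_ncc(image, template):
    """Return the top-left pixel (i, j) whose window maximizes N(i, j)."""
    H, W = image.shape
    M, N = template.shape
    best, best_pos = -1.0, (0, 0)
    for i in range(H - M + 1):
        for j in range(W - N + 1):
            score = ncc(image[i:i + M, j:j + N], template)
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best
```

Embedding a brightened, contrast-scaled copy of the template into a blank image, the maximizer is still the embedding position, illustrating the robustness that motivates NCC over raw SAD/SSD for the left-right pixel matching.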
After the pixel matching of the left and right images is complete, as shown in the figure, according to the imaging principle of the binocular ranging system, the two CCD cameras of identical model are placed in parallel on the same plane, and the distance between the projection centers of the left and right cameras is the baseline B.
When target point A passes through the binocular ranging system formed by the left and right cameras with parallel optical axes, it is imaged at point A1 on the left CCD image plane and point A2 on the right CCD image plane, at positions xleft and xright respectively. With both camera focal lengths known to be f, the measured distance l follows from the principle of similar triangles:
l = fB / x,  x = xleft − xright
x, the difference between the positions of A's image points on the left and right CCD image planes, is also called the binocular disparity. Thus, in the ideal state where the optical axes of the binocular cameras are strictly parallel and images of the target object are captured simultaneously, the image matching algorithm determines the corresponding positions of the same target in the left and right CCD images, the disparity x is computed, and — with the focal length and baseline known — the target distance is obtained from the formula above, completing the initial positioning of the target assembly area.
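The triangulation step l = fB/x is a one-line computation once matching has produced xleft and xright. The units in the example call are illustrative assumptions (focal length and image positions in pixels, baseline in meters, giving a distance in meters).

```python
def binocular_distance(x_left, x_right, f, B):
    """Distance l = f*B/x from the binocular ranging geometry, where the
    disparity x = x_left - x_right is the difference of the two image
    positions of target point A."""
    x = x_left - x_right
    if x == 0:
        raise ValueError("zero disparity: target at infinity or mismatched points")
    return f * B / x
```

Note the sensitivity to matching errors: since l is inversely proportional to x, a one-pixel disparity error matters most for distant (small-disparity) targets, which is why the high-precision NCC matching precedes this step.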
S4: the bio-contact device on the manipulator's fingertip touches the target area initially located in S3; the deformation-induced electrical signal from the patch is output to the computer through the signal-measuring circuit, and the computer fine-tunes the position, achieving precise positioning and completing the assembly.
During assembly the manipulator comes into contact with the PCB, and its fingertips are subjected to pressure. The pressure differs when the shaped component contacts the assembly area exactly and when it contacts other areas of the PCB, so a mathematical model of pressure versus position can be established, and the manipulator's position adjusted according to the fingertip pressure, thereby precisely locating the target assembly area.
The invention designs a bio-contact device from a sensitive material with a pressure-to-electrical-output characteristic, which converts the received pressure into an electrical signal output. A bio-contact device made of this material is fitted like a fingertip sleeve on the manipulator's fingertip; when the fingertip touches the PCB, the pressure on the conductive material of the device changes with its deformation, changing the output electrical signal. A signal-measuring circuit sits between the sensitive material and the fingertip; the deformation-induced change in the device's output signal is measured and transmitted to the computer as the output signal, and from the previously obtained mathematical model the computer derives the manipulator's current position and outputs the corresponding control signal for adjustment.
Having completed the initial positioning of the assembly area in step S3, the manipulator moves to that area; after the manipulator contacts the PCB, its position is fine-tuned according to the output signal of the bio-contact device on the fingertip, finally achieving precise positioning and completing the assembly of the shaped component.
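The patent specifies that a pressure-position mathematical model drives the fine adjustment but does not give the adjustment law. Purely as a hypothetical sketch, the loop below probes the four neighboring offsets and moves toward the one whose measured pressure is closest to the signature pressure of the correct assembly area; `pressure_at` (the sensor readout as a function of position) and `target_pressure` are illustrative assumptions, not taken from the patent.

```python
def fine_tune(position, pressure_at, target_pressure, step=0.1, max_iters=50, tol=1e-3):
    """Hypothetical fine-adjustment loop: hill-climb on the fingertip pressure
    signal until it matches the target pressure of the assembly area."""
    x, y = position
    for _ in range(max_iters):
        # candidate positions: stay put, or move one step in each direction
        candidates = [(x, y), (x + step, y), (x - step, y), (x, y + step), (x, y - step)]
        x, y = min(candidates, key=lambda p: abs(pressure_at(p) - target_pressure))
        if abs(pressure_at((x, y)) - target_pressure) < tol:
            break
    return x, y
```

In practice the pressure map would come from the signal-measuring circuit and the calibrated pressure-position model; the point of the sketch is only the feedback structure: measure, compare to the model, nudge, repeat.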
The above embodiment is a preferred implementation of the invention, but the implementations of the invention are not limited to it; any change, modification, substitution, combination, or simplification made without departing from the spirit and principle of the invention shall be an equivalent replacement and is included within the protection scope of the invention.

Claims (10)

  1. A multiple-lens-based smart mechanical arm, characterized by comprising a multi-joint multi-functional manipulator, CCD cameras for capturing images of the PCB to be assembled, a bio-contact device, and a computer, wherein the CCD cameras are mounted on the manipulator, the bio-contact device is fitted on the manipulator's fingertips, and the CCD cameras, the bio-contact device, and the manipulator are connected to the computer.
  2. The smart mechanical arm of claim 1, wherein there are specifically two CCD cameras, mounted respectively on the left and right sides of the manipulator's forearm.
  3. The smart mechanical arm of claim 1, wherein the bio-contact device comprises a patch made of a sensitive material and a signal-measuring circuit for measuring the patch's deformation and outputting an electrical signal, the patch covering the manipulator's fingertip and connected to the signal-measuring circuit, the signal-measuring circuit connected to the computer and built into the manipulator.
  4. The smart mechanical arm of claim 3, wherein the patch is shaped as a fingertip sleeve.
  5. The smart mechanical arm of claim 1, wherein the sensitive material is specifically conductive rubber.
  6. The smart mechanical arm of claim 2, wherein the two CCD cameras are identical in model and parameters, their coordinate systems coplanar and the corresponding coordinate axes parallel.
  7. A positioning and assembly method applying the smart mechanical arm of any one of claims 1-6, characterized by comprising the following steps:
    S1: the PCB to be assembled moves in front of the manipulator; the two CCD cameras are switched on to capture two images of the PCB, a left image and a right image;
    S2: the computer matches the two PCB images against a PCB template stored in advance, determining in both images the target area where the shaped component is to be assembled;
    S3: the binocular positioning algorithm measures the distance between the target assembly area and the manipulator's fingertip, and the computer moves the manipulator to the target area, completing initial positioning;
    S4: the bio-contact device on the manipulator's fingertip touches the target area initially located in S3; the deformation-induced electrical signal from the patch is output to the computer through the signal-measuring circuit, and the computer fine-tunes the position to achieve precise positioning and complete the assembly.
  8. The positioning and assembly method of claim 7, wherein S2 specifically comprises: filtering and denoising the PCB images captured by the two CCD cameras; extracting the PCB's edges with an edge-detection operator and removing the background; enhancing the images with a piecewise linear transform to raise contrast; and finally determining the target assembly area with a gray-value-based template matching method.
  9. The positioning and assembly method of claim 7, wherein S3 — measuring the distance between the target assembly area and the manipulator's fingertip with the binocular positioning algorithm, the computer moving the manipulator to the target area to complete initial positioning — specifically comprises:
    S3.1: using Zhang Zhengyou's calibration algorithm to solve the intrinsic and extrinsic parameter matrices of the CCD cameras, then performing binocular stereo calibration to determine the relative pose of the two cameras, the relative pose comprising a rotation matrix R and a translation vector T;
    S3.2: matching the pixels of the two images with a gray-level cross-correlation matching method based on template matching;
    S3.3: measuring the distance l between the target assembly area and the manipulator's fingertip with the binocular positioning algorithm, i.e. the imaging principle of the binocular ranging system:
    l = fB / x,  x = xleft − xright
    where the distance between the projection centers of the two cameras is the baseline B; when target point A passes through the binocular ranging system formed by the left and right cameras with parallel optical axes, it is imaged at point A1 on the left CCD image plane and point A2 on the right CCD image plane, at positions xleft and xright respectively; f is the focal length of both cameras; and x is the difference between the positions of A's image points on the left and right CCD image planes.
  10. The positioning and assembly method of claim 7, wherein S4 specifically comprises: after completing initial positioning, the bio-contact device on the manipulator's fingertip contacts the PCB to be assembled; the sensitive material deforms, changing the pressure it experiences and hence the output electrical signal; and the computer adjusts the manipulator's position according to the output signal and the mathematical model of pressure versus position, thereby achieving precise positioning for assembling the shaped component.
PCT/CN2016/108868 2016-01-27 2016-12-07 一种基于多镜头的智能机械手及定位装配方法 WO2017128865A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/781,856 US10899014B2 (en) 2016-01-27 2016-12-07 Multiple lens-based smart mechanical arm and positioning and assembly method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610057202.9 2016-01-27
CN201610057202.9A CN105538345B (zh) 2016-01-27 2016-01-27 一种基于多镜头的智能机械手及定位装配方法

Publications (1)

Publication Number Publication Date
WO2017128865A1 true WO2017128865A1 (zh) 2017-08-03

Family

ID=55818135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/108868 WO2017128865A1 (zh) 2016-01-27 2016-12-07 一种基于多镜头的智能机械手及定位装配方法

Country Status (3)

Country Link
US (1) US10899014B2 (zh)
CN (1) CN105538345B (zh)
WO (1) WO2017128865A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110744549A (zh) * 2019-11-11 2020-02-04 电子科技大学 一种基于人机协同的智能装配工艺
CN111951248A (zh) * 2020-08-12 2020-11-17 上海仁度生物科技有限公司 一种用于自动化核酸提取设备的定位校准装置及方法
CN115503024A (zh) * 2021-06-07 2022-12-23 中移雄安信息通信科技有限公司 720°视角的双目机器人及其工作方法

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105538345B (zh) 2016-01-27 2017-09-26 华南理工大学 一种基于多镜头的智能机械手及定位装配方法
CN105856241B (zh) * 2016-06-14 2018-09-14 上海贝特威自动化科技有限公司 一种不同尺寸电子液晶屏定位抓取方法
CN106204605B (zh) * 2016-07-15 2019-02-19 上海乐相科技有限公司 一种定位方法及装置
CN106313041B (zh) * 2016-08-03 2018-09-25 山东中清智能科技股份有限公司 喷涂机器人的双目立体视觉定位装置及定位方法
CN107703933B (zh) * 2016-08-09 2021-07-06 深圳光启合众科技有限公司 机器人的充电方法、装置和设备
CN106530276B (zh) * 2016-10-13 2019-04-09 中科金睛视觉科技(北京)有限公司 一种用于非标准件抓取的机械手定位方法以及定位系统
CN106529510B (zh) * 2016-12-12 2019-07-05 中国科学院合肥物质科学研究院 一种用于电容薄膜的褶皱识别方法及装置
CN108480239B (zh) * 2018-02-10 2019-10-18 浙江工业大学 基于立体视觉的工件快速分拣方法及装置
CN109108975A (zh) * 2018-08-31 2019-01-01 深圳市博辉特科技有限公司 一种双目视觉6轴机器人引导系统
CN109712114A (zh) * 2018-11-30 2019-05-03 无锡维胜威信息科技有限公司 一种应用于拉链缺陷检测的系统及其检测方法
US11185978B2 (en) * 2019-01-08 2021-11-30 Honda Motor Co., Ltd. Depth perception modeling for grasping objects
CN110147162B (zh) * 2019-04-17 2022-11-18 江苏大学 一种基于指尖特征的增强装配示教系统及其控制方法
CN110253596A (zh) * 2019-06-24 2019-09-20 北京理工华汇智能科技有限公司 机器人绑扎定位的方法及装置
CN110340901B (zh) * 2019-06-28 2022-09-27 深圳盈天下视觉科技有限公司 一种控制方法、控制装置及终端设备
CN110516618B (zh) * 2019-08-29 2022-04-12 苏州大学 装配机器人及基于视觉和力位混合控制的装配方法和系统
EP3787014A1 (de) * 2019-08-29 2021-03-03 Siemens Aktiengesellschaft Erfassung von prozessparametern einer montagelinie
CN110621150B (zh) * 2019-09-20 2020-11-10 上海节卡机器人科技有限公司 印制电路板的组装方法及相关装置
CN110675376A (zh) * 2019-09-20 2020-01-10 福建工程学院 一种基于模板匹配的pcb缺陷检测方法
CN110722561A (zh) * 2019-10-25 2020-01-24 北京华商三优新能源科技有限公司 一种全自动充电机器人控制方法及装置
CN111028231B (zh) * 2019-12-27 2023-06-30 易思维(杭州)科技有限公司 基于arm和fpga的工件位置获取系统
CN111267139B (zh) * 2020-02-20 2021-12-31 哈工大机器人(合肥)国际创新研究院 一种机器人智能末端执行器
CN111583342B (zh) * 2020-05-14 2024-02-23 中国科学院空天信息创新研究院 一种基于双目视觉的目标快速定位方法及装置
CN111993420A (zh) * 2020-08-10 2020-11-27 广州瑞松北斗汽车装备有限公司 一种固定式双目视觉3d引导上件系统
CN112894823B (zh) * 2021-02-08 2022-06-21 珞石(山东)智能科技有限公司 一种基于视觉伺服的机器人高精度装配方法
CN113012236B (zh) * 2021-03-31 2022-06-07 武汉理工大学 一种基于交叉式双目视觉引导的机器人智能打磨方法
CN113409343B (zh) * 2021-06-16 2022-03-15 浙江大学 一种实时固体燃料料层厚度的测量方法
US11416072B1 (en) 2021-07-20 2022-08-16 Bank Of America Corporation Data entry apparatus leveraging smart contact lenses
CN113510708B (zh) * 2021-07-28 2021-12-28 南京航空航天大学 一种基于双目视觉的接触式工业机器人自动标定系统
CN113858239B (zh) * 2021-09-24 2023-08-15 华能山东石岛湾核电有限公司 机械手
US11875323B2 (en) 2021-10-05 2024-01-16 Bank Of America Corporation Automated teller machine (ATM) transaction processing leveraging smart contact lenses
CN114905511B (zh) * 2022-05-12 2023-08-11 南京航空航天大学 一种工业机器人装配误差检测与精度补偿系统标定方法
CN116305484B (zh) * 2023-03-28 2023-10-10 山东方杰建工集团有限公司金乡二十分公司 基于bim的装配式建筑异形件模块安装定位方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1067505A (zh) * 1991-06-05 1992-12-30 北京理工大学 组合式触觉传感器
CN101924953A (zh) * 2010-09-03 2010-12-22 南京农业大学 基于基准点的简便匹配方法
JP2011011315A (ja) * 2009-07-06 2011-01-20 Canon Inc 部品組付け方法
CN203266633U (zh) * 2013-05-21 2013-11-06 洛阳理工学院 一种空间坐标定位抓取机械手
CN104933718A (zh) * 2015-06-23 2015-09-23 广东省自动化研究所 一种基于双目视觉的物理坐标定位方法
CN105538345A (zh) * 2016-01-27 2016-05-04 华南理工大学 一种基于多镜头的智能机械手及定位装配方法
CN205466320U (zh) * 2016-01-27 2016-08-17 华南理工大学 一种基于多镜头的智能机械手

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010276447A (ja) * 2009-05-28 2010-12-09 Seiko Epson Corp 位置計測装置、位置計測方法およびロボットシステム
JP6511715B2 (ja) * 2013-10-31 2019-05-15 セイコーエプソン株式会社 ロボット制御装置、ロボットシステム、及びロボット
CN104296657B (zh) * 2014-10-11 2016-09-07 三峡大学 一种基于双目视觉的石壁爆破孔检测与定位装置及定位方法
CN104476549B (zh) * 2014-11-20 2016-04-27 北京卫星环境工程研究所 基于视觉测量的机械臂运动路径补偿方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1067505A (zh) * 1991-06-05 1992-12-30 北京理工大学 组合式触觉传感器
JP2011011315A (ja) * 2009-07-06 2011-01-20 Canon Inc 部品組付け方法
CN101924953A (zh) * 2010-09-03 2010-12-22 南京农业大学 基于基准点的简便匹配方法
CN203266633U (zh) * 2013-05-21 2013-11-06 洛阳理工学院 一种空间坐标定位抓取机械手
CN104933718A (zh) * 2015-06-23 2015-09-23 广东省自动化研究所 一种基于双目视觉的物理坐标定位方法
CN105538345A (zh) * 2016-01-27 2016-05-04 华南理工大学 一种基于多镜头的智能机械手及定位装配方法
CN205466320U (zh) * 2016-01-27 2016-08-17 华南理工大学 一种基于多镜头的智能机械手

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110744549A (zh) * 2019-11-11 2020-02-04 电子科技大学 一种基于人机协同的智能装配工艺
CN111951248A (zh) * 2020-08-12 2020-11-17 上海仁度生物科技有限公司 一种用于自动化核酸提取设备的定位校准装置及方法
CN111951248B (zh) * 2020-08-12 2023-11-28 上海仁度生物科技股份有限公司 一种用于自动化核酸提取设备的定位校准装置及方法
CN115503024A (zh) * 2021-06-07 2022-12-23 中移雄安信息通信科技有限公司 720°视角的双目机器人及其工作方法

Also Published As

Publication number Publication date
CN105538345A (zh) 2016-05-04
US10899014B2 (en) 2021-01-26
CN105538345B (zh) 2017-09-26
US20180361588A1 (en) 2018-12-20

Similar Documents

Publication Publication Date Title
WO2017128865A1 (zh) 一种基于多镜头的智能机械手及定位装配方法
CN109308693B (zh) 由一台ptz相机构建的目标检测和位姿测量单双目视觉系统
CN105729468B (zh) 一种基于多深度摄像机增强的机器人工作台
CN109035309B (zh) 基于立体视觉的双目摄像头与激光雷达间的位姿配准方法
CN109658457B (zh) 一种激光与相机任意相对位姿关系的标定方法
CN110580725A (zh) 一种基于rgb-d相机的箱体分拣方法及系统
CN105835507B (zh) 一种手机盖板玻璃和液晶屏的贴合方法
US11488322B2 (en) System and method for training a model in a plurality of non-perspective cameras and determining 3D pose of an object at runtime with the same
WO2023060926A1 (zh) 一种基于3d光栅引导机器人定位抓取的方法、装置及设备
CN111721259A (zh) 基于双目视觉的水下机器人回收定位方法
Tran et al. Non-contact gap and flush measurement using monocular structured multi-line light vision for vehicle assembly
CN111598172B (zh) 基于异构深度网络融合的动态目标抓取姿态快速检测方法
Hsu et al. Development of a faster classification system for metal parts using machine vision under different lighting environments
CN111784655A (zh) 一种水下机器人回收定位方法
CN111624203A (zh) 一种基于机器视觉的继电器接点齐度非接触式测量方法
CN205466320U (zh) 一种基于多镜头的智能机械手
CN110097540A (zh) 多边形工件的视觉检测方法及装置
CN116749198A (zh) 一种基于双目立体视觉引导机械臂抓取方法
CN111145254B (zh) 一种基于双目视觉的门阀毛坯定位方法
CN113112543A (zh) 一种基于视觉移动目标的大视场二维实时定位系统及方法
CN109410272B (zh) 一种变压器螺母识别与定位装置及方法
Choi et al. Appearance-based gaze estimation using kinect
CN105138999B (zh) 基于阴影的夜间物体单目定位装置及方法
CN111536895B (zh) 外形识别装置、外形识别系统以及外形识别方法
CN112102419B (zh) 双光成像设备标定方法及系统、图像配准方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16887736

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.11.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16887736

Country of ref document: EP

Kind code of ref document: A1