CN110017769A - Part detection method and system based on industrial robot - Google Patents
- Publication number
- CN110017769A (application number CN201910185667.6A)
- Authority
- CN
- China
- Prior art keywords
- orientation
- detected
- industrial robot
- coordinate system
- shooting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Manipulator (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to the field of industrial robot technology and provides a part detection method and system based on an industrial robot. The method comprises: obtaining multiple three-dimensional projection images of a part to be detected, the images being captured from different shooting orientations; determining, from the visual orientation of the part in each image, the visual coordinate system at each shooting orientation and the relative spatial position between the industrial robot and the part; determining the transformation amount of each visual coordinate system relative to the target-object coordinate system indicated by the part; and calculating the deviation between each relative spatial position and the corresponding transformation amount, and determining from the deviation the target detection orientation at which the industrial robot detects the part. The method thus tolerates variation in part placement, lets the robot adaptively determine its target detection orientation for the part, and improves the accuracy of part detection.
Description
Technical field
The present invention relates to the field of industrial robot technology, and in particular to a part detection method and system based on an industrial robot.
Background technique
In the related art, part detection with an industrial robot is performed by equipping the robot with a vision system, acquiring images of the part with that system, and then detecting image defects. However, the inventors found in the course of practicing the application that the detected part (target object) may deviate from its set position, so the target object may fall outside the camera's field of view and the part cannot be effectively detected.
Summary of the invention
In view of this, the present invention proposes a part detection method based on an industrial robot, at least to solve the problem that the target object falls outside the camera's field of view so that the part cannot be effectively detected.
In order to achieve the above objectives, the technical scheme of the present invention is realized as follows:
A part detection method based on an industrial robot, comprising: obtaining multiple three-dimensional projection images of a part to be detected, the images being captured from different shooting orientations; determining, from the visual orientation of the part in each three-dimensional projection image, the visual coordinate system at each shooting orientation, and determining from the visual orientation the relative spatial position between the industrial robot and the part at each shooting orientation; determining the coordinate-system transformation amount of each visual coordinate system relative to the target-object coordinate system indicated by the part; and calculating, at each shooting orientation, the deviation between the relative spatial position and the corresponding transformation amount, and determining from the deviation the target detection orientation at which the industrial robot detects the part.
Further, obtaining the multiple three-dimensional projection images of the part to be detected comprises: photographing the part with a binocular-vision three-dimensional measuring instrument, wherein the part carrier is illuminated by blue light emitted by a blue-light projector.
Further, determining the visual coordinate system at each shooting orientation and the relative spatial position between the industrial robot and the part comprises: determining the visual spatial position, on the three-dimensional projection image, of each preset characteristic point on the part, wherein the characteristic points on the part can indicate the target-object coordinate system; and matching the characteristic points' visual spatial positions against the target-object coordinate system indicated by the preset characteristic points, to determine the visual coordinate system corresponding to the shooting orientation.
Further, obtaining the multiple three-dimensional projection images comprises: controlling the industrial robot to move, scanning during the movement to judge whether a part to be detected is present, and, when a part is found, controlling the robot to stop moving and to acquire, from multiple shooting orientations, multiple three-dimensional projection images of the part.
Further, determining the target detection orientation of the industrial robot from the deviation comprises: comparing the difference between the deviations at a first shooting orientation and a second shooting orientation, wherein the first shooting orientation is the orientation the industrial robot occupied at the shot preceding the second shooting orientation; judging whether the difference is less than a preset threshold; and, when the difference is less than the preset threshold, determining the first and second shooting orientations to be the target part detection orientation.
Compared with the prior art, the part detection method based on an industrial robot of the present invention has the following advantages:
In the part detection method of the present invention, multiple three-dimensional projection images of the part to be detected are acquired from different orientations; the corresponding visual coordinate systems are constructed; the transformation amount between each visual coordinate system and the target-object coordinate system, as well as the relative spatial position between the industrial robot and the part, are determined; and the target detection orientation is then determined from the deviation between the transformation amounts and the relative spatial positions at the different shooting orientations. Compared with the related art, in which the robot acquires images and detects the part from a single set position, the present invention tolerates variation in part placement and allows the robot to adaptively adjust its position to determine the target detection orientation for the part, improving the accuracy of part detection.
Another object of the present invention is to propose a part detection system based on an industrial robot, at least to solve the problem that the target object falls outside the camera's field of view so that the part cannot be effectively detected.
In order to achieve the above objectives, the technical scheme of the present invention is realized as follows:
A part detection system based on an industrial robot, comprising: an image acquisition unit for obtaining multiple three-dimensional projection images of a part to be detected, the images being captured from different shooting orientations; a visual information determination unit for determining, from the visual orientation of the part in each three-dimensional projection image, the visual coordinate system at each shooting orientation, and for determining from the visual orientation the relative spatial position between the industrial robot and the part at each shooting orientation; a coordinate system transformation determination unit for determining the coordinate-system transformation amount of each visual coordinate system relative to the target-object coordinate system indicated by the part; and a target detection orientation determination unit for calculating, at each shooting orientation, the deviation between the relative spatial position and the corresponding transformation amount, and for determining from the deviation the target detection orientation at which the industrial robot detects the part.
In some embodiments, the image acquisition unit photographs the part to be detected with a binocular-vision three-dimensional measuring instrument, wherein the part carrier is illuminated by blue light emitted by a blue-light projector.
In some embodiments, the visual information determination unit comprises: a characteristic-point visual-position determining module for determining the visual spatial positions, on the three-dimensional projection image, of the preset characteristic points on the part, wherein the characteristic points on the part can indicate the target-object coordinate system; and a matching module for matching the characteristic points' visual spatial positions against the target-object coordinate system indicated by the preset characteristic points, to determine the visual coordinate system corresponding to the shooting orientation.
In some embodiments, the image acquisition unit is also used to control the industrial robot to move, to scan during the movement to judge whether a part to be detected is present, and, when a part is found, to stop the robot and acquire, from multiple shooting orientations, multiple three-dimensional projection images of the part.
In some embodiments, the target detection orientation determination unit comprises: a deviation computing module for comparing the difference between the deviations at a first shooting orientation and a second shooting orientation, wherein the first shooting orientation is the orientation the industrial robot occupied at the shot preceding the second shooting orientation; and a threshold-judgment execution module for judging whether the difference is less than a preset threshold and, when it is, determining the first and second shooting orientations to be the target part detection orientation.
The part detection system based on an industrial robot has the same advantages over the prior art as the part detection method based on an industrial robot described above, and details are not repeated here.
Other features and advantages of the present invention will be described in detail in the detailed description section below.
Detailed description of the invention
The accompanying drawings, which constitute a part of the invention, provide a further understanding of the invention; the schematic embodiments of the invention and their description serve to explain the invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a flowchart of the part detection method based on an industrial robot according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the scanning vision system applied in the part detection method based on an industrial robot according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the binocular-vision three-dimensional measuring instrument applied in the part detection method based on an industrial robot according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the coordinate systems of the industrial robot system applied to embodiments of the present invention;
Fig. 5 is a flowchart of the coordinate-computation principle in the part detection method based on an industrial robot of one embodiment of the invention;
Fig. 6 is a flowchart of the robot-workstation detection process in the part detection method based on an industrial robot of one embodiment of the invention;
Fig. 7 is a structural block diagram of the part detection system based on an industrial robot of one embodiment of the invention.
Description of symbols:
701 image acquisition unit; 702 visual information determination unit; 703 coordinate system transformation determination unit; 704 target detection orientation determination unit; 70 part detection system based on industrial robot
Specific embodiment
It should be noted that, in the absence of conflict, the embodiments of the present invention and the features therein can be combined with each other.
In addition, the industrial robot mentioned in embodiments of the present invention refers to a robot fitted with a vision system or photographing device, widely used to detect defects of parts (such as auto parts).
The present invention will be described in detail below with reference to the accompanying drawings and in conjunction with embodiment.
As shown in Figure 1, the part detection method based on an industrial robot of one embodiment of the invention comprises:
S11, obtaining multiple three-dimensional projection images of the part to be detected.
The three-dimensional projection images are obtained by photographing the part to be detected from different shooting orientations.
The three-dimensional projection images can be acquired by combining projection and scanning. Fig. 2 shows the scanning vision system applicable to the part detection method of the embodiment: the system observes the deformed blue structured light; demodulating the collected deformed-fringe image yields the phase change containing height information; the pixel offset (disparity) between image pixels is then obtained according to the triangulation principle; and on this basis the three-dimensional information of the measured object is obtained.
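As an illustrative aside (not part of the patent text), the final triangulation step reduces to the standard relation Z = f·B/d between depth and disparity. The following sketch uses made-up focal-length, baseline, and disparity values:

```python
# Illustrative sketch of depth recovery from disparity via triangulation.
# f (pixels), B (mm), and d (pixels) below are invented example numbers,
# not parameters of the patent's scanner.
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth (mm) of a point from its structured-light/stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Example: 1200 px focal length, 100 mm baseline, 40 px disparity
z = depth_from_disparity(1200.0, 100.0, 40.0)
print(z)  # 3000.0 -> the point lies 3 m from the sensor
```

Larger disparities correspond to nearer points, which is why close-range scanners resolve depth more finely than distant ones.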
Preferably, the part to be detected may also be photographed with a binocular-vision three-dimensional measuring instrument, the part carrier being illuminated by blue light emitted from a blue-light projector. The instrument shown in Fig. 3 (comprising binocular cameras M and a blue-light grating N) takes the photographs, thereby acquiring three-dimensional projection images of the photographed object.
S12, determining, from the visual orientation of the part to be detected in each three-dimensional projection image, the visual coordinate system at each shooting orientation, and determining from the visual orientation the relative spatial position between the industrial robot and the part at each shooting orientation.
Fig. 4 illustrates the coordinate systems of the industrial robot system of the embodiment, including the robot base coordinate system (which may coincide with the world coordinate system), the TCP (Tool Center Point) coordinate system or visual coordinate system, and the user coordinate system, i.e. the target-object coordinate system indicated by the part to be detected.
It should be noted that when the industrial robot is in different orientations, the image acquisition orientation of the vision system changes accordingly, so the viewing distance and photographing angle between the vision system and the part change; the spatial position of the part identified in images taken from different orientations may therefore deviate, and the data obtained can sometimes differ enormously. Likewise, when the robot is in different shooting orientations, the relative spatial position between the robot and the part differs or changes, and the visual coordinate system in the robot vision system differs or changes as well. The shooting orientation described herein may be an orientation that steps by a set increment relative to an initial shooting orientation, or a shooting orientation the robot determines adaptively; both fall within the protection scope of the present invention.
Specifically, the relative spatial position between the industrial robot and the part can be determined directly from the size or scale of the part's three-dimensional information in the projection image. In addition, multiple visual coordinate systems, one per shooting orientation, can be determined from the visual orientation of the part in the projection images. Illustratively, a visual coordinate system can be determined as follows: first, determine the visual spatial positions, on the projection image, of the preset characteristic points on the part, where the characteristic points can indicate the target-object coordinate system — for example, three non-collinear points, such as points lying on the coordinate axes (e.g. the X and Y axes) of the target-object coordinate system, from which the corresponding target-object coordinate system can be determined; then match the characteristic points' visual spatial positions against the target-object coordinate system indicated by the preset points, to determine the visual coordinate system corresponding to the shooting orientation.
S13, determining the coordinate-system transformation amount of each visual coordinate system relative to the target-object coordinate system indicated by the part to be detected.
The coordinate-system transformation amount can serve both as a reference for the relative position between the vision system and the part to be detected and as a reference for the relative position between the target object and the robot body. Specifically, it can be determined jointly from the coordinate-origin position change and the axis-direction change (for example, the direction change of the visual coordinate system's X axis relative to the target-object coordinate system's X axis).
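One possible reading of this transformation amount — an origin offset plus an axis-direction change between two frames — can be sketched as follows. The 4×4 homogeneous-pose representation, the function name, and the example values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# Hedged sketch: express the "transformation amount" between a visual
# frame and the target-object frame as (origin offset, X-axis angle).
# Each frame is an assumed 4x4 homogeneous pose [R | t; 0 0 0 1].
def frame_offset(T_vision: np.ndarray, T_target: np.ndarray):
    """Return (translation offset, angle in radians between the two X axes)."""
    d_origin = T_vision[:3, 3] - T_target[:3, 3]
    x_v, x_t = T_vision[:3, 0], T_target[:3, 0]   # first rotation columns = X axes
    cos_a = np.clip(np.dot(x_v, x_t), -1.0, 1.0)
    return d_origin, float(np.arccos(cos_a))

T_vis = np.eye(4)                  # visual frame at the origin
T_obj = np.eye(4)
T_obj[:3, 3] = [10.0, 0.0, 0.0]    # target frame shifted 10 mm along X
d, ang = frame_offset(T_vis, T_obj)
print(d, ang)  # offset (-10, 0, 0), angle 0.0 -> pure translation, axes aligned
```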
S14, calculating the deviation between the relative spatial position and the corresponding coordinate-system transformation amount at each shooting orientation, and determining from the deviation the target detection orientation at which the industrial robot detects the part.
It is understood that, at the ideal photographing orientation, the deviation between the relative spatial position and the coordinate-system transformation amount is constant. Therefore, by calculating the deviations corresponding to multiple shooting orientations, these deviations can be used to determine whether the industrial robot has reached the target detection orientation: for example, when the deviation is constant or tends toward constant, the robot is judged to be at the part detection orientation, and the detection process for the part can formally begin.
In some embodiments, whether the industrial robot has reached the target part detection orientation can be judged as follows: first, compare the difference between the deviations at a first shooting orientation and a second shooting orientation, the first being the orientation the robot occupied at the shot preceding the second; then judge whether the difference is less than a preset threshold. When it is, the deviation has tended toward constant, and the first and second shooting orientations can be determined to be the target part detection orientation — that is, the part can be detected at either the first or the second shooting orientation.
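The stopping rule just described can be sketched as follows; the function name, threshold, and per-shot deviation values are illustrative assumptions, not values from the patent:

```python
# Sketch of the convergence test: stop repositioning once the deviation
# measured at two consecutive shooting orientations changes by less than
# a preset threshold. All numbers below are invented.
def deviations_converged(prev_dev: float, curr_dev: float, threshold: float) -> bool:
    """True when the deviation change between consecutive shots is small."""
    return abs(curr_dev - prev_dev) < threshold

history = [5.2, 1.4, 0.6, 0.55]   # made-up per-shot deviations (mm)
for prev, curr in zip(history, history[1:]):
    if deviations_converged(prev, curr, threshold=0.1):
        print("converged at deviation", curr)   # fires at the 0.6 -> 0.55 pair
        break
```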
In some embodiments, the industrial robot can move on a guide rail, so that the part to be detected can be detected dynamically, for example at a production-line measuring station: the robot is controlled to move, scanning during the movement to judge whether a part to be detected is present; when a part is found, the robot is stopped and multiple three-dimensional projection images of the part are acquired from multiple shooting orientations. The position of the target object (the part to be detected) is thus traced automatically and the robot adjusts itself to the target detection range, improving detection accuracy while maintaining line detection efficiency.
In the embodiments of the present invention, 3D visual imaging theory and robot coordinate-system principles are applied to propose an improved vision-guidance scheme, in which the robot automatically adjusts the photographing position, directly reads the vision data, and directly calculates the deviation of the target object, solving the problem that conventional vision schemes cannot guide when the deviation range is too large. Fig. 5 illustrates the coordinate-computation flow of the part detection method of one embodiment: the blue-light scanning vision system scans and outputs features, the robot calculates the deviation, and the robot performs 3D guidance according to the deviation.
First, calibrate the vision system: read the robot's current coordinate system and current position information, extract the vector from the robot's current coordinate system to the robot-tool vision system, and convert the direction values in the vector, via quaternion attitude data, into angle values. This completes the association between the robot's current coordinate system and the vision system, so that the SM communication connection is established between robot and vision system.
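The quaternion-to-angle conversion mentioned in the calibration step might, for example, look like the following sketch. The ZYX (roll-pitch-yaw) convention is our assumption; real robot controllers each document their own conventions:

```python
import math

# Hedged sketch: convert a unit quaternion (w, x, y, z) from the robot's
# pose record into ZYX (roll-pitch-yaw) Euler angles. The convention and
# names are illustrative, not taken from the patent.
def quat_to_rpy(w: float, x: float, y: float, z: float):
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion -> zero rotation about every axis
print(quat_to_rpy(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

A quaternion of (√2/2, 0, 0, √2/2) yields a yaw of π/2, i.e. a 90° rotation about Z, which is a handy sanity check of the convention.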
Then, after the calibration of the vision system is complete, the system begins searching for the target part detection orientation suitable for photographing the part:
1) Three-coordinate output.
The robot commands the vision system to scan the part and obtains the point-cloud data of one photograph, then searches the point cloud for three points. Once the three points are found, their coordinate values, based on the vision system's calibrated coordinate system, are sent directly to the robot.
2) Coordinate-system establishment and conversion.
The robot constructs a coordinate system from the three coordinate values (X-Y-Z) supplied by the vision system; for example, the three sets of coordinate values can be sorted trigonometrically and the coordinate system built with the line-of-sight (three-point) method, from which the direction and deviation of this coordinate system relative to the tool coordinate system are calculated and then converted, through each robot axis, into workpiece-coordinate-system values. Taking the first result as reference, the system obtains two reference quantities: the reference for the relative position of camera and target object, and the reference for the relative position of target object and robot body.
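Step 2) resembles the common three-point method of teaching a workpiece frame (an origin, a point on +X, and a point in the XY plane). Here is a minimal sketch under that assumption; the names and the Gram-Schmidt construction are our own illustration, not the patent's algorithm:

```python
import numpy as np

# Hedged sketch: build a right-handed workpiece frame from three scanned
# points by Gram-Schmidt orthogonalization, as in three-point teaching.
def frame_from_three_points(p0, px, pxy) -> np.ndarray:
    """Return a 3x3 rotation whose columns are the frame's X, Y, Z axes."""
    p0, px, pxy = map(np.asarray, (p0, px, pxy))
    x = px - p0
    x = x / np.linalg.norm(x)
    y = pxy - p0
    y = y - np.dot(y, x) * x          # remove the X component
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)                # right-handed Z completes the frame
    return np.column_stack([x, y, z])

R = frame_from_three_points([0, 0, 0], [2, 0, 0], [1, 3, 0])
print(R)  # identity rotation: the three points already span the world XY plane
```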
3) Photographing adjustment.
Steps 1) and 2) are repeated; the robot compares the calculated result with the reference values and, according to the deviation, adjusts its position and posture.
4) Repeated (e.g. three) adjustments.
After each photograph, the robot calculates the deviation and adjusts its position and posture accordingly, i.e. adjusts the relative position and posture between the viewing system and the target object. Fig. 6 illustrates an example robot-workstation detection process; part detection quality is thus ensured while maintaining line detection efficiency.
It should be noted that the vision-controller-based calculation method involved in the embodiments of the present invention can be implemented on various processors (or controllers), for example a general-purpose processor, and can be completed on a computer connected to the robot.
The embodiments of the present invention also propose:
I. An image solution based on the tool coordinate system.
A conventional vision system establishes a connection between the camera coordinate system and a particular workpiece (target-object) coordinate system of the robot; when the relative position between the camera and the robot base changes, the whole system must be recalibrated and the coordinate-system relationship rebuilt. Moreover, for a camera mounted on the robot arm, the photographing position can only be a single fixed pose; if the target object leaves the camera's field of view, such a system cannot change position to photograph it. In the scheme proposed here, the camera's calibrated coordinate system (the visual coordinate system), which exists in theoretical space, is associated with the establishment of the robot tool coordinate system; all image-processing coordinate values are expressed in the tool coordinate system, and all data are integrated into the robot coordinate system. In theory, the robot can photograph any target object within its working range with the vision system.
II. Multi-orientation shooting solves the low coordinate precision caused by viewing-distance variation.
When the vision system photographs the same object at different viewing distances and shooting angles, the data obtained differ enormously. A conventional guidance system photographs once from a fixed position and cannot resolve the deviation. In the scheme proposed here, after each photograph the robot calculates the object deviation (which may be inaccurate) and moves by that deviation; even if the calculation is inaccurate, the robot approaches the target value. It then photographs again at the approached position and moves by the new deviation. After about three such approach movements, the robot reaches a position and attitude almost consistent with the initial photographing position, ensuring consistent optical conditions (field of view, angle) for the camera and yielding higher-precision image data.
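The repeated approach described above can be caricatured with a toy one-dimensional model. The damping factor standing in for per-shot measurement error is an invented assumption; it only illustrates why imperfect measurements still converge:

```python
# Toy sketch of the "photograph, compute deviation, move, repeat" loop.
# The fake measurement recovers only 90% of the remaining offset per shot,
# standing in for an imperfect scan; three iterations, as in the text.
def approach(true_offset: float, rounds: int = 3) -> float:
    pose = 0.0
    for _ in range(rounds):
        measured = (true_offset - pose) * 0.9   # imperfect per-shot estimate
        pose += measured                        # move by the measured deviation
    return pose

final = approach(10.0)
print(final)  # ~9.99: within 0.01 of the 10.0 target after three shots
```

The residual error shrinks geometrically (by a factor of 0.1 per round in this toy model), which is the intuition behind stopping after a few adjustments.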
As shown in Fig. 7, the part detection system 70 based on an industrial robot of one embodiment of the invention comprises: an image acquisition unit 701 for obtaining multiple three-dimensional projection images of the part to be detected, captured from different shooting orientations; a visual information determination unit 702 for determining, from the visual orientation of the part in each three-dimensional projection image, the visual coordinate system at each shooting orientation, and for determining from the visual orientation the relative spatial position between the industrial robot and the part at each shooting orientation; a coordinate system transformation determination unit 703 for determining the transformation amount of each visual coordinate system relative to the target-object coordinate system indicated by the part; and a target detection orientation determination unit 704 for calculating, at each shooting orientation, the deviation between the relative spatial position and the corresponding transformation amount, and for determining from the deviation the target detection orientation at which the industrial robot detects the part.
In some embodiments, the image acquisition unit 701 photographs the part to be detected with a binocular-vision three-dimensional measuring instrument, wherein the part carrier is illuminated by blue light emitted by a blue-light projector.
In some embodiments, the visual information determination unit 702 includes: a feature point visual position determination module for determining the visual spatial positions, in the three-dimensional projection images, of multiple preset feature points on the part to be detected, where the feature points on the part to be detected can indicate the target object coordinate system; and a matching module for matching the visual spatial positions of the feature points with the target object coordinate system indicated by the preset feature points, so as to determine the visual coordinate system corresponding to each shooting orientation.
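One common way to realize such a matching module is a rigid Kabsch/SVD alignment between the preset feature points expressed in the target object coordinate system and their observed visual spatial positions; the recovered rotation and translation then define the visual coordinate system for that shooting orientation. This is an assumption about one possible implementation; the patent does not disclose the concrete algorithm:

```python
import numpy as np

def fit_rigid_transform(model_pts, observed_pts):
    """Estimate rotation R and translation t with observed ≈ R @ model + t
    via the Kabsch/SVD method. Both inputs are (N, 3) arrays of matched
    feature points (N >= 3, not all collinear)."""
    P = np.asarray(model_pts, dtype=float)
    Q = np.asarray(observed_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

At least three non-collinear feature points are needed to pin down the rotation, which is why the patent speaks of multiple preset feature points per part.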
In some embodiments, the image acquisition unit 701 is further configured to control the industrial robot to move, to scan during the movement in order to judge whether a part to be detected is present, and, when the scan finds a part to be detected, to stop the robot's movement and acquire multiple three-dimensional projection images of the part from multiple shooting orientations.
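The scan-then-capture behaviour above can be sketched as a small control loop. The `robot` and `camera` objects and all of their methods below are hypothetical stand-ins for the real controller and measurement APIs, which the patent does not name:

```python
def scan_and_capture(robot, camera, shooting_orientations):
    """Move the robot along its scan path until a part to be detected is
    seen, stop, then capture one three-dimensional projection image from
    each shooting orientation."""
    while not camera.part_in_view():      # scan while moving
        robot.step_along_scan_path()
    robot.stop()                          # stop movement once a part is found
    images = []
    for pose in shooting_orientations:
        robot.move_to(pose)               # assume the move blocks until done
        images.append(camera.capture())
    return images
```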
In some embodiments, the target detection orientation determination unit 704 includes: a deviation computing module for comparing the difference between the deviations under a first shooting orientation and a second shooting orientation, where the first shooting orientation is the shooting orientation occupied by the industrial robot at the shooting time immediately preceding the second shooting orientation; and a threshold judgment execution module for judging whether the difference is less than a preset threshold and, when the difference is less than the preset threshold, determining that the first shooting orientation and the second shooting orientation are target part detection orientations.
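Read literally, these two modules implement a stability test over consecutive shooting orientations: when the deviation stops changing by more than the preset threshold, both orientations of the pair qualify. A minimal sketch of that criterion, assuming the per-orientation deviations have already been computed as scalars:

```python
def select_target_orientations(deviations, threshold):
    """Return the indices of shooting orientations whose deviation differs
    from that of the previous shooting time by less than the threshold;
    both members of a qualifying pair are target part detection orientations."""
    targets = set()
    for second in range(1, len(deviations)):
        first = second - 1  # orientation at the previous shooting time
        if abs(deviations[second] - deviations[first]) < threshold:
            targets.update((first, second))
    return sorted(targets)
```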
For further details of the part detection system based on an industrial robot of the embodiments of the invention, reference may be made to the description above of the part detection method based on an industrial robot; the system achieves technical effects identical or corresponding to those of the method, so the details are not repeated here.
The foregoing are merely preferred embodiments of the invention and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall fall within its scope of protection.
Claims (10)
1. A part detection method based on an industrial robot, characterized in that the part detection method based on an industrial robot comprises:
obtaining multiple three-dimensional projection images of a part to be detected, the multiple three-dimensional projection images being obtained by shooting the part to be detected from different shooting orientations;
determining, according to the visual orientation of the part to be detected in each three-dimensional projection image, the visual coordinate system under each shooting orientation, and determining, according to the visual orientation, the relative spatial position between the industrial robot and the part to be detected under each shooting orientation;
determining the coordinate system transformation amount of each visual coordinate system relative to the target object coordinate system indicated by the part to be detected; and
calculating the deviation between the relative spatial position and the corresponding coordinate system transformation amount under each shooting orientation, and determining, according to the deviation, the target part detection orientation at which the industrial robot detects the part.
2. The part detection method based on an industrial robot according to claim 1, characterized in that obtaining multiple three-dimensional projection images of the part to be detected comprises:
photographing the part to be detected with a binocular vision three-dimensional measuring instrument, the part being illuminated by blue light emitted by a blue light projector.
3. The part detection method based on an industrial robot according to claim 1, characterized in that determining, according to the visual orientation of the part to be detected in each three-dimensional projection image, the visual coordinate system under each shooting orientation, and determining, according to the visual orientation, the relative spatial position between the industrial robot and the part to be detected under each shooting orientation comprises:
determining the visual spatial positions, in the three-dimensional projection images, of preset feature points on the part to be detected, where the feature points on the part to be detected can indicate the target object coordinate system; and
matching the visual spatial positions of the feature points with the target object coordinate system indicated by the feature points, so as to determine the visual coordinate system corresponding to each shooting orientation.
4. The part detection method based on an industrial robot according to claim 1, characterized in that obtaining multiple three-dimensional projection images of the part to be detected comprises:
controlling the industrial robot to move, and scanning during the movement to judge whether a part to be detected is present; and
when the scan finds a part to be detected, stopping the robot's movement and acquiring multiple three-dimensional projection images of the part from multiple shooting orientations.
5. The part detection method based on an industrial robot according to claim 1, characterized in that determining, according to the deviation, the target part detection orientation at which the industrial robot detects the part comprises:
comparing the difference between the deviations under a first shooting orientation and a second shooting orientation, where the first shooting orientation is the shooting orientation occupied by the industrial robot at the shooting time immediately preceding the second shooting orientation;
judging whether the difference is less than a preset threshold; and
when the difference is less than the preset threshold, determining that the first shooting orientation and the second shooting orientation are target part detection orientations.
6. A part detection system based on an industrial robot, characterized in that the part detection system based on an industrial robot comprises:
an image acquisition unit for obtaining multiple three-dimensional projection images of a part to be detected, the multiple three-dimensional projection images being obtained by shooting the part to be detected from different shooting orientations;
a visual information determination unit for determining, according to the visual orientation of the part to be detected in each three-dimensional projection image, the visual coordinate system under each shooting orientation, and determining, according to the visual orientation, the relative spatial position between the industrial robot and the part to be detected under each shooting orientation;
a coordinate system transformation determination unit for determining the coordinate system transformation amount of each visual coordinate system relative to the target object coordinate system indicated by the part to be detected; and
a target detection orientation determination unit for calculating the deviation between the relative spatial position and the corresponding coordinate system transformation amount under each shooting orientation, and determining, according to the deviation, the target part detection orientation at which the industrial robot detects the part.
7. The part detection system based on an industrial robot according to claim 6, characterized in that the image acquisition unit is configured to photograph the part to be detected with a binocular vision three-dimensional measuring instrument, the part being illuminated by blue light emitted by a blue light projector.
8. The part detection system based on an industrial robot according to claim 6, characterized in that the visual information determination unit comprises:
a feature point visual position determination module for determining the visual spatial positions, in the three-dimensional projection images, of multiple preset feature points on the part to be detected, where the feature points on the part to be detected can indicate the target object coordinate system; and
a matching module for matching the visual spatial positions of the feature points with the target object coordinate system indicated by the preset feature points, so as to determine the visual coordinate system corresponding to each shooting orientation.
9. The part detection system based on an industrial robot according to claim 6, characterized in that the image acquisition unit is further configured to control the industrial robot to move, to scan during the movement to judge whether a part to be detected is present, and, when the scan finds a part to be detected, to stop the robot's movement and acquire multiple three-dimensional projection images of the part from multiple shooting orientations.
10. The part detection system based on an industrial robot according to claim 6, characterized in that the target detection orientation determination unit comprises:
a deviation computing module for comparing the difference between the deviations under a first shooting orientation and a second shooting orientation, where the first shooting orientation is the shooting orientation occupied by the industrial robot at the shooting time immediately preceding the second shooting orientation; and
a threshold judgment execution module for judging whether the difference is less than a preset threshold and, when the difference is less than the preset threshold, determining that the first shooting orientation and the second shooting orientation are target part detection orientations.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910185667.6A CN110017769A (en) | 2019-03-12 | 2019-03-12 | Part detection method and system based on industrial robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110017769A true CN110017769A (en) | 2019-07-16 |
Family
ID=67189451
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910185667.6A Pending CN110017769A (en) | 2019-03-12 | 2019-03-12 | Part detection method and system based on industrial robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110017769A (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111145247A (en) * | 2019-12-18 | 2020-05-12 | 配天机器人技术有限公司 | Vision-based position detection method, robot and computer storage medium |
CN112729117A (en) * | 2020-12-30 | 2021-04-30 | 郑州大学 | Compressor barrel electric box threaded hole detection device based on machine vision |
CN113008153A (en) * | 2021-02-25 | 2021-06-22 | 山东建筑大学 | Time-baseline parallax calculation method for oblique photography based on cooperative change of the optical axis and the control plane |
CN113129304A (en) * | 2021-05-18 | 2021-07-16 | 郑州轻工业大学 | Part detection method based on machine vision |
CN113189010A (en) * | 2021-05-18 | 2021-07-30 | 郑州轻工业大学 | Part detection mechanism based on machine vision and use method thereof |
CN113844674A (en) * | 2020-12-30 | 2021-12-28 | 上海飞机制造有限公司 | Fastener installation state detection method and device, electronic equipment and medium |
CN114194058A (en) * | 2022-01-12 | 2022-03-18 | 开迈斯新能源科技有限公司 | Detection device and detection method for automobile charging robot |
CN114322754A (en) * | 2020-10-09 | 2022-04-12 | 维尔泰克视觉国际有限公司 | Method and system for checking repair or assembly operations |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3888362A (en) * | 1973-05-31 | 1975-06-10 | Nasa | Cooperative multiaxis sensor for teleoperation of article manipulating apparatus |
CN101239469A (en) * | 2007-02-05 | 2008-08-13 | 发那科株式会社 | Calibration device and method for robot mechanism |
CN101261118A (en) * | 2008-04-17 | 2008-09-10 | 天津大学 | Rapid automated three-dimensional appearance online measurement method and system based on a robot |
CN101395440A (en) * | 2006-03-07 | 2009-03-25 | 赛德思公司 | Method, system and computer program product for locating a measuring device and for measuring large objects |
CN101666619A (en) * | 2009-09-27 | 2010-03-10 | 长沙长泰输送包装设备有限公司 | Method for calculating absolute coordinates of work piece |
CN101788805A (en) * | 2010-01-27 | 2010-07-28 | 暨南大学 | High-accuracy machine vision two-dimensional positioning method based on motion servo correction |
CN102575926A (en) * | 2009-09-10 | 2012-07-11 | 卡尔蔡司股份公司 | Devices and methods for determining positions and measuring surfaces |
CN102564319A (en) * | 2011-12-30 | 2012-07-11 | 清华大学 | Method for detecting slip during linear delivery of wafer by using image processing technology |
CN102914293A (en) * | 2011-07-08 | 2013-02-06 | 佳能株式会社 | Information processing apparatus and information processing method |
JP5500714B2 (en) * | 2009-09-30 | 2014-05-21 | ダイハツ工業株式会社 | Movable axis position management device |
CN103822594A (en) * | 2014-02-28 | 2014-05-28 | 华南理工大学 | Workpiece scanning imaging method based on laser sensor and robot |
CN104089586A (en) * | 2014-07-16 | 2014-10-08 | 浙江大学宁波理工学院 | Image detection device and method of engine crankshaft journal shape errors |
CN105066884A (en) * | 2015-09-09 | 2015-11-18 | 大族激光科技产业集团股份有限公司 | Robot tail end positioning deviation correction method and system |
CN205058046U (en) * | 2015-08-07 | 2016-03-02 | 浙江工业大学 | Vision-positioned manipulator device for grasping injection molding machine finished products |
CN105945994A (en) * | 2016-05-10 | 2016-09-21 | 华讯方舟科技有限公司 | Calibrating method and device for robot head joint steering engine positions and robot |
CN106338245A (en) * | 2016-08-15 | 2017-01-18 | 南京工业大学 | Workpiece noncontact mobile measurement method |
CN106813570A (en) * | 2015-11-30 | 2017-06-09 | 中国科学院沈阳自动化研究所 | Three-dimensional recognition and localization method for slender cylindrical objects based on line-structured light scanning |
CN107436125A (en) * | 2017-08-03 | 2017-12-05 | 环旭电子股份有限公司 | Position finding and detection method |
CN107764193A (en) * | 2016-08-19 | 2018-03-06 | 株式会社斯库林集团 | Displacement detection device, displacement detection method and substrate processing apparatus |
CN107984201A (en) * | 2017-11-30 | 2018-05-04 | 中国地质大学(武汉) | Screw hole positioning and screw fastening/removal method based on visual servoing |
CN108133477A (en) * | 2017-12-29 | 2018-06-08 | 深圳市越疆科技有限公司 | Object detection method and intelligent robot arm |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110017769A (en) | Part detection method and system based on industrial robot | |
CN106056587B (en) | Panoramic line-laser structured-light three-dimensional imaging calibration device and method | |
CN109676243A (en) | Weld seam identification and tracking system and method based on dual-laser structured light | |
CN111775146A (en) | Visual alignment method for multi-station operation of an industrial robot arm | |
CN108177143A (en) | Robot positioning and grasping method and system based on laser vision guidance | |
CN111028340B (en) | Three-dimensional reconstruction method, device, equipment and system in precise assembly | |
CN111707189B (en) | Laser displacement sensor light beam direction calibration method based on binocular vision | |
JP2021193400A (en) | Method for measuring artefact | |
JP7191309B2 (en) | Automatic Guidance, Positioning and Real-time Correction Method for Laser Projection Marking Using Camera | |
CN109212497A (en) | Measurement and interconnection method for spatial six-degree-of-freedom vehicle radar antenna pose deviation | |
US20220230348A1 (en) | Method and apparatus for determining a three-dimensional position and pose of a fiducial marker | |
CN109444916A (en) | Drivable area determination device and method for unmanned driving | |
CN110370316A (en) | Robot TCP calibration method based on vertical reflection | |
CN108942921A (en) | Random grasping device based on deep-learning object recognition | |
CN106352871A (en) | Indoor visual positioning system and method based on artificial ceiling beacon | |
CN109773589B (en) | Method, device and equipment for online measurement and machining guidance of workpiece surface | |
CN113012238B (en) | Method for quick calibration and data fusion of multi-depth camera | |
Yamauchi et al. | Calibration of a structured light system by observing planar object from unknown viewpoints | |
WO2019087253A1 (en) | Stereo camera calibration method | |
CN110370272A (en) | Robot TCP calibration system based on vertical reflection | |
JPH11118438A (en) | Method and device for measuring three-dimensional shape | |
CN110849285A (en) | Welding spot depth measuring method, system and medium based on monocular camera | |
JP6420530B2 (en) | Information processing apparatus, measurement system, control system, light quantity determination method, program, and storage medium | |
El-Hakim | A hierarchical approach to stereo vision | |
JPH0675617A (en) | Camera view point change system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190716 |