CN105666274A - Dinner plate edging method based on vision control - Google Patents


Info

Publication number
CN105666274A
Authority
CN
China
Prior art keywords
image
service plate
edging
registration
dinner plate
Prior art date
Legal status
Granted
Application number
CN201610077868.0A
Other languages
Chinese (zh)
Other versions
CN105666274B (en)
Inventor
王平江
郭磊
张顺林
李世其
钟治魁
苏德瑜
陈达伟
Current Assignee
Quanzhou Huazhong University Of Science And Technology Institute Of Manufacturing
Huazhong University of Science and Technology
Original Assignee
Quanzhou Huazhong University Of Science And Technology Institute Of Manufacturing
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Quanzhou Huazhong University Of Science And Technology Institute Of Manufacturing and Huazhong University of Science and Technology
Priority to CN201610077868.0A
Publication of CN105666274A
Application granted
Publication of CN105666274B
Legal status: Active


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B24 - GRINDING; POLISHING
    • B24B - MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B9/00 - Machines or devices designed for grinding edges or bevels on work or for removing burrs; Accessories therefor
    • B24B17/00 - Special adaptations of machines or devices for grinding controlled by patterns, drawings, magnetic tapes or the like; Accessories therefor
    • B24B17/04 - Special adaptations involving optical auxiliary means, e.g. optical projection form grinding machines

Abstract

The invention discloses a dinner plate edging method based on vision control. The method comprises the following steps: clamping and positioning a dinner plate to be edged, and acquiring images of the plate with two cameras symmetrically arranged on its two sides; integrating the images through preprocessing, image registration and fusion, then extracting contour features to generate a complete plate contour drawing; comparing the contour drawing with a theoretical CAD model of the plate to obtain the center position and deviation angle of the current clamping; and applying a coordinate offset in the numerical control system according to that center position and deviation angle, after which the numerical control system controls the edging unit to edge the plate according to the G code corresponding to the theoretical CAD model. Because the two cameras symmetrically distributed on the two sides of the plate acquire the edge contour directly at the machining station, comprehensive acquisition and control of the plate contour is achieved and machining is finally driven by G code. The method has the advantages of high edging quality and high machining efficiency.

Description

A dinner plate edging method based on vision control
Technical field
The invention belongs to the technical field of automated edging, and more particularly relates to a dinner plate edging method based on vision control.
Background technology
Dinner plates are common utensils in daily life whose edges must be ground during production. At present, dinner plate edging methods fall broadly into two classes: manual edging and automated edging. Automated edging is widely used because of its high degree of automation and high efficiency, and within automated edging the most common approach is vision-controlled edging technology.
The prior art on vision-controlled edging includes the following schemes. CN201520216408.2 discloses an automated glass edging machining center that projects infrared light onto the workpiece with an infrared emitter and uses a scanner to sense the infrared anchor points on the workpiece contour; from these it generates a machining contour and machining trajectory, along which the spindle mechanism machines the edge. CN201420470616.0 discloses a display-glass substrate edge grinder in which a vision camera captures anchor points on the glass substrate, a machining reference route is programmed from the anchor-point coordinates, and the grinding wheel and holding fixture perform edging along that route.
However, the above vision-controlled edging technologies adopt a single-station, single-camera scheme. Because of problems such as inaccurate transmission by the drive mechanism or inaccurate clamping and positioning, edging efficiency is high but edging quality is poor. For high-grade, environmentally friendly plate products such as fine dinner plates, whose edging precision requirements are higher, these schemes cannot meet service requirements.
Summary of the invention
In view of the above shortcomings of and improvement needs in the prior art, the invention provides a dinner plate edging method based on vision control. At the machining station, two cameras symmetrically distributed on both sides of the clamped and positioned plate acquire the plate's edge contour. The images are integrated through preprocessing, registration and fusion to generate one complete plate contour drawing, which is compared against the theoretical CAD model of the plate to obtain the center position and deviation angle of the current clamping. A coordinate offset is applied according to the center position, and uniform, efficient edging of the plate is realized under control of the G code corresponding to the theoretical model. The invention can easily grind plates with all kinds of special edges, and has the advantages of good edging quality, high edging precision and high machining efficiency.
To achieve the above object, the invention proposes a dinner plate edging method based on vision control, comprising the following steps:
Clamp and position the dinner plate to be edged, and acquire contour images of the plate with two cameras symmetrically arranged on both sides of the plate;
Integrate the acquired contour images through preprocessing, image registration and fusion, then extract contour features to generate one complete plate contour drawing;
Compare the generated contour drawing with the theoretical CAD model of the plate to obtain the center position and deviation angle of the current clamping;
Apply a coordinate offset in the numerical control system according to the center position and deviation angle of the current clamping, then control the edging unit according to the G code corresponding to the theoretical CAD model to realize uniform, efficient edging of the plate.
Further preferably, the preprocessing is mean filtering, performed as follows:
If there are n pixels in the image neighborhood, the gray value of each pixel is f_i(x, y), and g(x, y) denotes the pixel value of the image after processing, then:
g(x, y) = (1/n) · Σ_{i=1}^{n} f_i(x, y).
Further preferably, the image registration is performed as follows:
Select one image as the reference image and another as the search image. On the reference image, choose an image sub-block centered on a target point as the registration template. Move the registration template over the search image in order; at each position, compare the correlation between the template and the corresponding part of the search image, until the registration position is found.
Further preferably, comparing the correlation between the registration template and the corresponding part of the search image until the registration position is found is specifically:
Let the coordinates of the top-left corner of the registration template be (x, y), and let the region of the search image covered by the template be the search sub-image, whose top-left corner has coordinates (x', y'). Compare the gray values at (x, y) and (x', y') to obtain the optimal registration point.
Further preferably, comparing the gray values at (x, y) and (x', y') to obtain the optimal registration point is specifically: compute the minimum of the squared difference D(x', y') over the search region to obtain the optimal registration point, where D(x', y') = Σ_{m=1}^{c} Σ_{n=1}^{d} [M(m, n) - N(m, n)]², in which M(m, n) and N(m, n) are respectively the gray values of the search sub-image and the actual sub-image at pixel coordinate (m, n), and c and d denote the length and width of the registration template.
Preferably, the fusion is performed by moving a window line by line over the overlapping region of two images, taking the pixel with the smallest gray-value difference in each line as an image splicing point, connecting the splicing points in order to obtain the best splicing seam, then applying weighted smoothing to transition the overlapping region and achieve a natural, smooth fusion between adjacent images.
Further preferably, extracting contour features to generate one complete plate contour drawing includes the following steps:
Divide the image into two regions, background and contour target, and take as the optimal segmentation threshold t the gray value that maximizes the between-class gray variance of background and contour target;
Set the new gray value of pixels with gray value ω_i ≤ t to 0 and of pixels with ω_i > t to 1, so that segmentation yields the binary-valued function image: F(ω_i) = 0 for ω_i ≤ t, and F(ω_i) = 1 for ω_i > t. Then use the binary-valued function image to extract the binary image contour.
Further preferably, obtaining the gray value that maximizes the between-class gray variance of background and contour target as the optimal segmentation threshold t specifically includes:
Suppose the image has L gray levels, the probability of the background is P_A, the probability of the contour target is P_B, the average gray value of the background region is ω_A, and the average gray value of the contour target region is ω_B. Then:
P_A = Σ_{i=0}^{t} P_i,  P_B = Σ_{i=t+1}^{L-1} P_i = 1 - P_A;
ω_A = Σ_{i=0}^{t} i·P_i / P_A,  ω_B = Σ_{i=t+1}^{L-1} i·P_i / P_B;
The mean gray value ω_0 of the image is then:
ω_0 = P_A·ω_A + P_B·ω_B = Σ_{i=0}^{L-1} i·P_i;
The between-class gray variance σ² of the background region and the contour target region is:
σ² = P_A(ω_A - ω_0)² + P_B(ω_B - ω_0)²;
The optimal segmentation threshold t is the gray value that maximizes the above between-class variance σ².
In general, compared with the prior art, the technical scheme conceived by the invention mainly has the following technical advantages:
1. The invention uses two cameras at a single station, symmetrically distributed on both sides of the clamped and positioned plate, so the whole plate edge can be acquired at the clamping position. This eliminates errors caused by jitter when conveyor transmission or clamping is inaccurate, makes edging precision higher, and completes full contour acquisition and control of the plate at the grinding station.
2. After the two cameras acquire the plate contour, the plate is rotated 90 degrees and acquired once more. The four contour drawings from the two acquisitions are fused to obtain the contour of the whole plate, which is registered against the existing CAD model to find the position of the plate center in the current workpiece coordinate system. The subsequent coordinate offset further reduces error and improves edging precision.
3. The acquired images are integrated through preprocessing, registration and fusion, then contour features are extracted to generate one complete plate contour drawing. The preprocessing, registration, fusion and contour-extraction modes were studied and chosen to best suit tableware edging, further improving image quality, fusion accuracy, and edging precision and efficiency.
Brief Description of the Drawings
Fig. 1 is a front view of the vision-controlled edging device;
Fig. 2 is a side view of the vision-controlled edging device;
Figs. 3(a) and 3(b) are schematic diagrams of the ideal clamping position and the actual clamping position.
Detailed Description of the Embodiments
To make the object, technical scheme and advantages of the invention clearer, the invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only explain the invention and are not intended to limit it. Moreover, the technical features involved in the embodiments described below may be combined with each other as long as they do not conflict.
The basic principle of the vision-controlled dinner plate edging method of this embodiment is as follows: an edging device automatically grinds the edge of the plate. Two cameras are used at the grinding station, symmetrically distributed on both sides of the clamped and positioned plate, and acquire the plate's edge contour. The acquired images are integrated through preprocessing, registration and fusion to generate a plate contour drawing, which is compared with the theoretical CAD model of the plate to obtain the center position and deviation angle of the current clamping. The numerical control system applies a coordinate offset and is controlled by the G code corresponding to the theoretical model, finally realizing uniform, efficient edging of the plate. As shown in Figs. 1-2, the edging device includes a cantilever beam 1 for fixing the cameras 5, a rotary servo motor 2 that drives the plate 7 to rotate, a center column 3 for clamping the plate, a grinding head 4 for edging, and a synchronous conveyor belt 6 for transporting the plate.
The method specifically includes the following steps:
(1) Clamp and position the dinner plate to be edged, and acquire images of the plate with the two cameras symmetrically arranged on both sides of the plate. The two symmetrically arranged cameras can obtain multiple images of the same scene from different viewing angles at different times. The acquisition mode is: after the plate is clamped, the two cameras acquire the plate contour; the plate is then rotated 90 degrees and acquired once more, yielding four plate contour drawings.
(2) Integrate the acquired images through preprocessing, image registration and fusion, then extract contour features to generate one complete plate contour drawing. Each processing step is described in detail below:
Preprocessing: preprocessing improves processing accuracy and speed. After the images to be spliced (here, the four images) are obtained, noise-reduction filtering is first applied so that noisy pixels with large local deviations do not affect the spliced output. This embodiment uses mean filtering, performed as follows:
If there are n pixels in the image near the current pixel, (x, y) is the two-dimensional position of the pixel, the gray value of each pixel is f_i(x, y), and g(x, y) denotes the gray value of the current pixel after processing, then:
g(x, y) = (1/n) · Σ_{i=1}^{n} f_i(x, y).
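The neighborhood mean filter above can be sketched as follows. This is a minimal NumPy sketch, not the patent's implementation; the function name, the k × k neighborhood shape and the edge-replication border handling are illustrative assumptions.

```python
import numpy as np

def mean_filter(image, k=3):
    """Mean (box) filter: each output pixel is the average of the n = k*k
    pixels in its k x k neighborhood, i.e. g(x, y) = (1/n) * sum f_i(x, y).
    Image borders are handled by replicating the edge pixels."""
    img = np.asarray(image, dtype=float)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    # accumulate the k*k shifted copies of the image, then divide by n
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)
```

A single bright noise pixel of value 9 in a zero image is averaged down to 1 by the 3 × 3 window, which is exactly the smoothing effect the preprocessing step relies on.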
Image registration: registration transforms the four preprocessed images of the same scene, obtained from different viewing angles at different times, into the same coordinate system, then extracts matching information from the images to find the best match position. This embodiment uses an exhaustive comparison method, as follows:
Take one of the images as the registration reference (the reference image) and another as the search image. On the reference image, successively choose image sub-blocks centered on target points, starting from the top-left corner, as the registration template N. Then move the template N over the search image in order; at each position, compare the correlation between N and the corresponding part of the search image, until the registration position is found.
Concretely, let the search image be an a × b rectangle and the registration template N a c × d rectangle, with the top-left corner of template N at coordinates (x, y). Intercepting block images of the same size as N from the search image yields (a - c + 1) × (b - d + 1) block images in total. The goal of registration is to find, among these (a - c + 1) × (b - d + 1) block images, the one most similar to the registration template; the reference point corresponding to that block image is the optimal registration point.
More specifically, as template N moves over the search image, the region it covers is the search sub-image M, whose top-left corner has coordinates (x', y') and serves as the reference point. Compare the gray values at (x, y) and (x', y'): if the two are identical, the difference between N(x, y) and M(x', y') is zero. In real image registration the two images are seldom exactly the same, so it suffices to make the difference between N(x, y) and M(x', y') minimal to obtain the optimal registration point.
Further, this embodiment uses the squared-difference method to obtain the optimal registration point: compute the minimum of D(x', y'). Concretely:
D(x', y') = Σ_{m=1}^{c} Σ_{n=1}^{d} [M(m, n) - N(m, n)]² = Σ Σ M(m, n)² - 2 Σ Σ M(m, n)·N(m, n) + Σ Σ N(m, n)²,
where M(m, n) and N(m, n) are respectively the gray values of the search sub-image and the actual sub-image at pixel coordinate (m, n); the term Σ Σ N(m, n)², the gray information of the template, is a constant.
Let R(x', y') = Σ Σ M(m, n)·N(m, n) / sqrt(Σ Σ M(m, n)² · Σ Σ N(m, n)²). By the Cauchy-Schwarz inequality 0 < R ≤ 1, with equality if and only if
M(1, 1)/N(1, 1) = M(1, 2)/N(1, 2) = M(1, 3)/N(1, 3) = … = M(m, n)/N(m, n),
in which case R(x', y') attains its maximum and D(x', y') its minimum. The search sub-image and the actual sub-image are then most similar, and the match at (x', y') is the most accurate.
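The exhaustive squared-difference search described above can be sketched as follows. A minimal sketch under the stated (a - c + 1) × (b - d + 1) enumeration; the function name and return convention are illustrative assumptions, and a production implementation would use an optimized routine rather than two Python loops.

```python
import numpy as np

def find_registration_point(search_img, template):
    """Exhaustive template matching with the squared-difference criterion:
    slide the c x d template N over the a x b search image, compute
    D(x', y') = sum over (m, n) of (M(m, n) - N(m, n))**2
    for each of the (a - c + 1) * (b - d + 1) search sub-images M, and
    return the top-left corner (x', y') where D is minimal, plus D itself."""
    S = np.asarray(search_img, dtype=float)
    N = np.asarray(template, dtype=float)
    a, b = S.shape
    c, d = N.shape
    best_D, best_pos = None, None
    for x in range(a - c + 1):
        for y in range(b - d + 1):
            M = S[x:x + c, y:y + d]        # search sub-image under the template
            D = np.sum((M - N) ** 2)       # squared difference
            if best_D is None or D < best_D:
                best_D, best_pos = D, (x, y)
    return best_pos, best_D
```

When the template is an exact crop of the search image, the minimum of D is zero at the crop position, which is the "difference is zero" case noted in the text.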
Fusion: after registration, obvious seam marks can appear in the overlapping region of the images, blurring edges and contours to some extent. Fusion eliminates the influence of the stitching line, luminosity and chromaticity and removes the splicing marks, so that the image reaches a visually satisfying result. This embodiment uses a linear weighting algorithm, fusing the four contour drawings pairwise, as follows:
Move a window line by line over the overlapping region of two images. In each line take the pixel with the smallest gray-value difference, i.e. the pixel most similar between neighbors during splicing, as the image splicing point; connect the splicing points in order to form the splicing line as the best splicing seam; then apply weighted smoothing to transition the overlapping region and achieve a natural, smooth fusion between adjacent images. In the same way, the remaining contour drawings are fused in turn to form one whole drawing of the plate.
More specifically, let the pixel values of the overlapping part of two adjacent images be P1(x, y) and P2(x, y), and let P3(x, y) be the pixel value after weighted smoothing. In the first third and the last third of the overlap width, P3(x, y) takes the pixel value of the respective image; in the middle third, the weighted smoothing P3(x, y) = α·P1(x, y) + (1 - α)·P2(x, y) is applied. The fused pixel value is thus:
P3(x, y) = P1(x, y)                          (0 < l < L/3)
P3(x, y) = α·P1(x, y) + (1 - α)·P2(x, y)     (L/3 < l < 2L/3)
P3(x, y) = P2(x, y)                          (2L/3 < l < L)
where l is the horizontal position of the pixel in the overlapping region, L is the horizontal overlap width of the images, and in the present invention α = (3l - L)/L.
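The piecewise weighting can be sketched per pixel as follows. A minimal sketch that transcribes the patent's three-segment formula with α = (3l - L)/L literally; the function name is an illustrative assumption.

```python
def fuse_pixel(p1, p2, l, L):
    """Piecewise linear-weighted fusion across an overlap of width L:
    the first third keeps image 1, the last third keeps image 2, and the
    middle third blends with alpha = (3*l - L) / L, giving
    P3 = alpha * P1 + (1 - alpha) * P2 as in the text."""
    if l < L / 3:
        return p1                     # left third: image 1 only
    if l > 2 * L / 3:
        return p2                     # right third: image 2 only
    alpha = (3.0 * l - L) / L         # 0..1 across the middle third
    return alpha * p1 + (1 - alpha) * p2
```

At the exact midpoint of the overlap (l = L/2), α = 0.5 and the two images contribute equally, which matches the intent of a smooth transition through the seam.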
Contour extraction: to find and display the pixels with obvious brightness change in the image, contour extraction is needed. This embodiment uses gray-level threshold segmentation: an optimal segmentation threshold is selected, then each pixel of the image is processed independently, comparing its gray value against the set threshold, thereby obtaining a binarized contour drawing. Specifically:
a) Using the maximum between-class variance algorithm, divide the image into two parts, background and contour target, according to gray characteristics; the segmentation criterion is to choose the threshold that maximizes the variance between background and target.
Suppose there are L gray levels and the probability of gray level i occurring is P_i; the image gray histogram is then given by:
O = Σ_{i=0}^{L-1} n_i,  P_i = n_i / O;
where O is the total number of image pixels, n_i is the number of pixels with gray value i, and P_i is the probability that gray value i occurs.
Let t be the optimal segmentation threshold, P_A the probability that the background occurs, and P_B the probability that the contour target occurs, with gray values ranging over 0 to L - 1. The probability P_A of the background is:
P_A = Σ_{i=0}^{t} P_i;
The probability P_B of the target is:
P_B = Σ_{i=t+1}^{L-1} P_i = 1 - P_A;
The average gray value ω_A of the background region is:
ω_A = Σ_{i=0}^{t} i·P_i / P_A;
The average gray value ω_B of the contour target region is:
ω_B = Σ_{i=t+1}^{L-1} i·P_i / P_B;
The mean gray value ω_0 of the image is:
ω_0 = P_A·ω_A + P_B·ω_B = Σ_{i=0}^{L-1} i·P_i;
The between-class variance σ² of the background region and the contour target region is:
σ² = P_A(ω_A - ω_0)² + P_B(ω_B - ω_0)²;
The optimal segmentation threshold t obtained is the gray value that maximizes the between-class gray variance σ² of the background region and the contour target region in the image.
b) Set the new gray value of all pixels with gray value ω_i ≤ t to 0 and of all pixels with ω_i > t to 1; segmentation yields the binary-valued function image:
F(ω_i) = 0 for ω_i ≤ t, and F(ω_i) = 1 for ω_i > t;
then use the binary-valued function image to extract the binary image contour.
Concretely, extracting the binary image contour from the binary-valued function image is done as follows:
extract the points where F(ω_i) = 1 as black points, stitch the black points together into a curve, and the outer contour of the workpiece is obtained.
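Steps a) and b) can be sketched as follows. A minimal NumPy sketch of the maximum between-class variance threshold and the binarization; the function names are illustrative assumptions, and the curve-stitching of the black points into a contour is omitted.

```python
import numpy as np

def otsu_threshold(image, levels=256):
    """Maximum between-class variance threshold: for each candidate t,
    compute P_A, P_B, omega_A, omega_B and
    sigma^2 = P_A*(omega_A - omega_0)**2 + P_B*(omega_B - omega_0)**2,
    and return the t that maximizes sigma^2."""
    img = np.asarray(image).ravel()
    counts = np.bincount(img, minlength=levels).astype(float)
    P = counts / counts.sum()                    # P_i = n_i / O
    i = np.arange(levels)
    omega0 = np.sum(i * P)                       # mean gray value of the image
    best_t, best_sigma = 0, -1.0
    for t in range(levels - 1):
        PA = P[:t + 1].sum()
        PB = 1.0 - PA
        if PA == 0.0 or PB == 0.0:
            continue                             # one class empty: skip
        wA = np.sum(i[:t + 1] * P[:t + 1]) / PA  # background mean
        wB = np.sum(i[t + 1:] * P[t + 1:]) / PB  # target mean
        sigma = PA * (wA - omega0) ** 2 + PB * (wB - omega0) ** 2
        if sigma > best_sigma:
            best_sigma, best_t = sigma, t
    return best_t

def binarize(image, t):
    """F(w_i) = 0 for w_i <= t, 1 for w_i > t."""
    return (np.asarray(image) > t).astype(np.uint8)
```

On a clearly bimodal image the chosen t falls between the two modes, so binarization cleanly separates background from contour target.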
(3) Compare the generated plate contour drawing with the theoretical CAD model of the plate to obtain the center position of the current clamping, specifically in order to eliminate the positional and angular deviations produced by inaccurate transmission and clamping, and compute the offset between the center position of the current clamping and the center point of the generated plate contour drawing, together with the deviation angle α. When the plate is clamped at the station, positional deviations arise from inaccurate transmission and jitter; as shown in Figs. 3(a) and (b), both center and angle deviate, so the generated plate contour drawing naturally carries positional and angular deviations. As in Fig. 3(a), the offset deviation length is the distance between points O1 and O2. As in Fig. 3(b), with no angular deflection error the grinding position is theoretically A1; after angular deflection, A1 rotates to position A2, so the current grinding position should reach point B. A rotation transformation function is therefore needed, which can rotate the machining trajectory programmed from the theoretical CAD contour by the specified angle α around the center of rotation.
(4) Apply a coordinate offset (i.e. a displacement offset and an angle offset) to the numerical control system according to the center position of the current clamping, then control the edging unit according to the G code corresponding to the theoretical CAD model to realize uniform, efficient edging of the plate. Specifically, the numerical control system translates and rotates all point positions in the known G code program. First a coordinate translation offset is applied, making the acquired contour origin coincide with the contour origin of the CAD model; then a coordinate rotation offset is applied, rotating the point positions in the CAD model's G code by the deviation angle. After the two coordinate offsets, machining can be completed according to the G code of the CAD model.
The detailed process is as follows:
First, as in Fig. 3(a), the generated contour drawing's center point O1 is offset to the center O2 of the plate's theoretical CAD contour, eliminating the positional deviation; then, as in Fig. 3(b), a counterclockwise rotation by angle α eliminates the angular deviation.
Concretely, G code coordinate-offset instructions realize the position offset and angle offset, successfully eliminating the positional deviations caused by inaccurate transmission and jitter and achieving efficient G code machining.
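The O1-to-O2 offset and deviation angle α of step (3) can be estimated from the two contours as in the following sketch. This is a hypothetical illustration, not the patent's registration method: it takes the translation as the difference of contour centroids and the angle from the rotation of a landmark point (the point farthest from the center, playing the role of A1/A2 in Fig. 3(b)); all names are illustrative assumptions.

```python
import math

def clamping_deviation(measured_pts, cad_pts):
    """Estimate (dx, dy) moving the measured center O1 onto the CAD
    center O2, and the deviation angle alpha in degrees, from two
    contours given as lists of (x, y) points."""
    def centroid(pts):
        n = float(len(pts))
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    def landmark_angle(pts, c):
        # direction of the contour point farthest from the center
        p = max(pts, key=lambda q: (q[0] - c[0]) ** 2 + (q[1] - c[1]) ** 2)
        return math.atan2(p[1] - c[1], p[0] - c[0])

    c_meas, c_cad = centroid(measured_pts), centroid(cad_pts)
    dx, dy = c_cad[0] - c_meas[0], c_cad[1] - c_meas[1]   # O1 -> O2 offset
    alpha = landmark_angle(cad_pts, c_cad) - landmark_angle(measured_pts, c_meas)
    return (dx, dy), math.degrees(alpha)
```

For a purely translated contour the estimated angle is zero and the offset is exactly the negated translation, i.e. the correction that moves O1 back onto O2.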
Specifically, the position offset translates the machining trajectory programmed from the theoretical CAD contour by the specified displacement. G code format sample:
G90G01X[100+(#50001)]Y[200+(#50001)]F800
X[150+(#50001)]Y[240+(#50001)]F700
X[180+(#50001)]Y[260+(#50001)]F900
where #50001 is a customizable user macro variable; assigning the offset, i.e. the distance between points O1 and O2, to #50001 realizes the displacement offset.
Specifically, the angle offset uses the rotation transformation function in G code, which rotates the machining trajectory programmed from the theoretical CAD contour by the specified angle α around the center of rotation. G code format sample:
G17
G68X0Y0P[#50002]
G69
where G17 selects the XOY plane of rotation; G68X0Y0P[#50002] establishes the rotation transformation, P[#50002] meaning a rotation by [#50002] degrees, with #50002 likewise a customizable user macro variable, so that assigning the deviation angle to #50002 rotates all positions by α degrees; and G69 cancels the rotation transformation.
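A small generator in the spirit of the two format samples can be sketched as follows. This is a hypothetical sketch, not the patent's post-processor: it merely emits the macro-variable assignments, the G68/G69 rotation wrapper and the offset moves in the same textual form as the samples above; the function name and argument layout are illustrative assumptions, and a real controller may require per-axis offsets rather than the single #50001 used here.

```python
def emit_biased_gcode(points, feeds, offset, alpha_deg):
    """Build a G code program that applies the displacement offset via
    user macro #50001 and the angle offset via G68 with macro #50002,
    then lists the theoretical-contour moves."""
    lines = [
        "#50001=%g" % offset,       # displacement offset (distance O1-O2)
        "#50002=%g" % alpha_deg,    # deviation angle alpha in degrees
        "G17",                      # select the XOY plane of rotation
        "G68X0Y0P[#50002]",         # rotate all positions by alpha degrees
    ]
    first = True
    for (x, y), f in zip(points, feeds):
        prefix = "G90G01" if first else ""
        lines.append("%sX[%g+(#50001)]Y[%g+(#50001)]F%g" % (prefix, x, y, f))
        first = False
    lines.append("G69")             # cancel the rotation transformation
    return "\n".join(lines)
```

Called with the sample points from the text, it reproduces lines of the same shape as the format samples, with the rotation established before the first move and cancelled after the last.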
Those skilled in the art will readily understand that the foregoing is only a preferred embodiment of the invention and is not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the invention shall be included within its protection scope.

Claims (8)

1. A dinner plate edging method based on vision control, characterized in that the method comprises the following steps:
clamping and positioning the dinner plate to be edged, and acquiring contour images of the plate with two cameras symmetrically arranged on both sides of the plate;
integrating the acquired contour images through preprocessing, image registration and fusion, then extracting contour features to generate one complete plate contour drawing;
comparing the generated plate contour drawing with the theoretical CAD model of the plate to obtain the center position and deviation angle of the current clamping;
applying a coordinate offset to the numerical control system according to the center position and deviation angle of the current clamping, then controlling the edging unit according to the G code corresponding to the theoretical CAD model to realize uniform, efficient edging of the plate.
2. The vision-controlled dinner plate edging method of claim 1, characterized in that the preprocessing is mean filtering, performed as follows:
if there are n pixels in the image, the gray value of each pixel is f_i(x, y), and g(x, y) denotes the pixel value of the image after processing, then:
g(x, y) = (1/n) · Σ_{i=1}^{n} f_i(x, y).
3. The vision-controlled dinner plate edging method of claim 1 or 2, characterized in that the image registration is performed as follows:
select one image as the reference image and another as the search image; on the reference image, choose an image sub-block centered on a target point as the registration template; move the registration template over the search image in order, and at each position compare the correlation between the template and the corresponding part of the search image, until the registration position is found.
4. The dinner plate edging method based on vision control of claim 3, characterised in that comparing the correlation between the registration template and the corresponding part of the search image until the registration position is found is specifically:
letting the coordinates of the upper-left corner of the registration template be (x, y), and calling the region of the search image covered by the template the search subgraph, whose upper-left corner has coordinates (x′, y′); comparing the gray values at (x, y) and (x′, y′) to obtain the optimal registration point.
5. The dinner plate edging method based on vision control of claim 4, characterised in that comparing the gray values at (x, y) and (x′, y′) to obtain the optimal registration point is specifically: computing the minimum of the squared difference D(x′, y′) over the search region to obtain the optimal registration point, where:
D(x′, y′) = Σ_{m=1}^{c} Σ_{n=1}^{d} [M(m, n) − N(m, n)]²;
in this formula, M(m, n) and N(m, n) are respectively the gray values at pixel coordinate (m, n) of the search subgraph and of the actual subgraph (the registration template), and c and d are the length and width of the registration template.
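The squared-difference search of claim 5 can be sketched as a brute-force template match with NumPy. This is a minimal illustration of the technique, not the patent's implementation:

```python
import numpy as np

def ssd_register(search, template):
    """Slide the registration template over the search image and return
    the upper-left corner (x', y') of the search subgraph minimizing
    D(x', y') = sum_{m,n} [M(m, n) - N(m, n)]^2, plus that minimum."""
    H, W = search.shape
    c, d = template.shape
    tmpl = template.astype(float)
    best, best_pos = None, None
    for y in range(H - c + 1):
        for x in range(W - d + 1):
            sub = search[y:y + c, x:x + d].astype(float)
            D = np.sum((sub - tmpl) ** 2)
            if best is None or D < best:
                best, best_pos = D, (x, y)
    return best_pos, best
```

A production system would typically use a vectorized or FFT-based correlation instead of this O(H·W·c·d) double loop.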
6. The dinner plate edging method based on vision control of claim 1 or 2, characterised in that the fusion is specifically: moving a window line by line over the overlap region of the two images, taking in each line the pixel with the minimum gray-value difference as an image splicing point, connecting the splicing points in sequence to obtain the best splicing seam, and then applying weighted smoothing so that the overlap region transitions gradually, achieving a natural and smooth fusion between adjacent images.
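A simplified sketch of the fusion in claim 6, assuming both inputs are the already-aligned overlap regions: per row, the column of minimum gray-value difference is taken as the splice point, and a linear weight ramp stands in for the weighted smoothing step (the patent does not specify the weighting function):

```python
import numpy as np

def fuse_overlap(left, right):
    """Fuse the overlap region of two images: in each row the column
    where the two images differ least is the splice point; the splice
    points joined top to bottom form the splicing seam.  The overlap is
    then blended with linear weights for a smooth transition."""
    diff = np.abs(left.astype(float) - right.astype(float))
    seam = diff.argmin(axis=1)                  # one splice point per row
    w = np.linspace(1.0, 0.0, left.shape[1])    # weight of the left image
    fused = w * left + (1.0 - w) * right
    return seam, fused
```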
7. The dinner plate edging method based on vision control of claim 1 or 2, characterised in that extracting contour features to generate a single complete dinner plate contour map specifically comprises the following steps:
dividing the image into two regions, background and contour target, and taking as the optimal segmentation threshold t the gray value that maximizes the between-class gray variance of background and contour target;
assigning a new gray value of 0 to every pixel whose gray value ω_i is less than or equal to t, and 1 to every pixel whose gray value exceeds t, so that segmentation yields the binary function image F(ω_i) = 0 for ω_i ≤ t, 1 for ω_i > t; then extracting the binary image contour from the binary function image.
8. The edging method based on vision control of claim 7, characterised in that obtaining the gray value that maximizes the between-class gray variance of background and contour target as the optimal segmentation threshold t specifically comprises:
letting the image have L gray levels, P_A be the probability of the background, P_B the probability of the contour target, ω_A the average gray value of the background region, and ω_B the average gray value of the contour target region; then:
P_A = Σ_{i=0}^{t} P_i,  P_B = Σ_{i=t+1}^{L−1} P_i = 1 − P_A;
ω_A = (Σ_{i=0}^{t} i·P_i) / P_A,  ω_B = (Σ_{i=t+1}^{L−1} i·P_i) / P_B;
the mean gray value ω_0 of the image is then:
ω_0 = P_A·ω_A + P_B·ω_B = Σ_{i=0}^{L−1} i·P_i;
the between-class gray variance σ² of the background region and the contour target region is:
σ² = P_A(ω_A − ω_0)² + P_B(ω_B − ω_0)²;
the gray value for which the above between-class gray variance σ² is maximal over the image is the optimal segmentation threshold t.
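The between-class variance search of claims 7 and 8 is the classic Otsu method. A direct NumPy transcription of the formulas, assuming 8-bit images (L = 256):

```python
import numpy as np

def otsu_threshold(img, L=256):
    """Return the threshold t maximizing the between-class variance
    sigma^2 = P_A*(w_A - w_0)^2 + P_B*(w_B - w_0)^2 (claim 8)."""
    hist = np.bincount(img.ravel(), minlength=L).astype(float)
    P = hist / hist.sum()          # P_i: probability of gray level i
    i = np.arange(L)
    w0 = (i * P).sum()             # overall mean gray value
    best_t, best_var = 0, -1.0
    for t in range(L - 1):
        PA = P[:t + 1].sum()
        PB = 1.0 - PA
        if PA == 0.0 or PB == 0.0:
            continue               # one class empty: variance undefined
        wA = (i[:t + 1] * P[:t + 1]).sum() / PA
        wB = (i[t + 1:] * P[t + 1:]).sum() / PB
        var = PA * (wA - w0) ** 2 + PB * (wB - w0) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(img, t):
    """F(w_i) = 0 if w_i <= t else 1 (claim 7)."""
    return (img > t).astype(np.uint8)
```

The binary image returned by `binarize` is then used for contour extraction, e.g. by tracing the 0/1 boundary.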
CN201610077868.0A 2016-02-03 2016-02-03 Dinner plate edging method based on vision control Active CN105666274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610077868.0A CN105666274B (en) 2016-02-03 2016-02-03 Dinner plate edging method based on vision control


Publications (2)

Publication Number Publication Date
CN105666274A true CN105666274A (en) 2016-06-15
CN105666274B CN105666274B (en) 2018-03-09

Family

ID=56303589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610077868.0A Active CN105666274B (en) Dinner plate edging method based on vision control

Country Status (1)

Country Link
CN (1) CN105666274B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106248349A (en) * 2016-10-10 2016-12-21 长飞光纤光缆股份有限公司 A kind of test optical fiber automatic coupler
CN107729824A (en) * 2017-09-28 2018-02-23 湖北工业大学 A kind of monocular visual positioning method for intelligent scoring of being set a table for Chinese meal dinner party table top
CN111024005A (en) * 2019-12-31 2020-04-17 芜湖哈特机器人产业技术研究院有限公司 Furniture spraying quality detection method based on vision
CN111360166A (en) * 2018-12-26 2020-07-03 郭磊 Automatic riveting equipment for disc sheet metal parts
CN112365499A (en) * 2021-01-11 2021-02-12 深兰人工智能芯片研究院(江苏)有限公司 Contour detection method, contour detection device, electronic equipment and storage medium
CN112461130A (en) * 2020-11-16 2021-03-09 北京平恒智能科技有限公司 Positioning method for visual inspection tool frame of adhesive product
CN114536156A (en) * 2020-11-25 2022-05-27 广东天机工业智能系统有限公司 Shoe upper grinding track generation method
CN114821114A (en) * 2022-03-28 2022-07-29 南京业恒达智能系统股份有限公司 Groove cutting robot image processing method based on visual system
CN114821114B (en) * 2022-03-28 2024-04-30 南京业恒达智能系统有限公司 Groove cutting robot image processing method based on vision system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3699519B2 (en) * 1996-02-02 2005-09-28 富士写真フイルム株式会社 Image processing device
JP2009125876A (en) * 2007-11-26 2009-06-11 Nakamura Tome Precision Ind Co Ltd Registration method of correction value in side edge machining apparatus of glass substrate
CN101532926A (en) * 2008-12-12 2009-09-16 齐齐哈尔华工机床制造有限公司 On-line vision detecting system for automatic impact specimen processing device and image processing method thereof
CN102306375A (en) * 2011-08-31 2012-01-04 北京航空航天大学 Segmentation method for synthetic aperture radar (SAR) and visible light pixel-level fused image
CN103220880A (en) * 2012-01-19 2013-07-24 昆山思拓机器有限公司 Method for improving flexible printed circuit (FPC) processing precision
CN203426822U (en) * 2013-08-09 2014-02-12 中村留精密工业株式会社 Grinding device of hard brittle plate
US20140285676A1 (en) * 2011-07-25 2014-09-25 Universidade De Coimbra Method and apparatus for automatic camera calibration using one or more images of a checkerboard pattern
US9002062B2 (en) * 2008-10-14 2015-04-07 Joshua Victor Aller Target and method of detecting, identifying, and determining 3-D pose of the target
CN105149794A (en) * 2015-08-18 2015-12-16 河海大学常州校区 Intelligent laser trimming system and method based on binocular vision


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG, FENGSHOU et al.: "Digital Image Processing Technology and Application", China Water & Power Press, 30 September 2014 *
WEN, ZHENG et al.: "Mastering MATLAB Intelligent Algorithms", Tsinghua University Press, 31 May 2015 *


Also Published As

Publication number Publication date
CN105666274B (en) 2018-03-09

Similar Documents

Publication Publication Date Title
CN105666274A (en) Dinner plate edging method based on vision control
CN111462135B (en) Semantic mapping method based on visual SLAM and two-dimensional semantic segmentation
CN106767399B (en) The non-contact measurement method of logistics goods volume based on binocular stereo vision and dot laser ranging
CN110569704B (en) Multi-strategy self-adaptive lane line detection method based on stereoscopic vision
CN109255813A (en) A kind of hand-held object pose real-time detection method towards man-machine collaboration
CN111721259B (en) Underwater robot recovery positioning method based on binocular vision
CN110405773A (en) A kind of floor mounting method and robot
CN105345254A (en) Calibration method for positional relation between paraxial type visual system and laser vibrating mirror machining system
CN105574812B (en) Multi-angle three-dimensional data method for registering and device
CN109947097A A robot localization method and navigation application based on vision and laser fusion
CN104748683A (en) Device and method for online and automatic measuring numerical control machine tool workpieces
CN108907526A (en) A kind of weld image characteristic recognition method with high robust
CN110136211A (en) A kind of workpiece localization method and system based on active binocular vision technology
JPH08136220A (en) Method and device for detecting position of article
CN111524115A (en) Positioning method and sorting system for steel plate cutting piece
CN107160241A (en) A kind of vision positioning system and method based on Digit Control Machine Tool
CN111784655A (en) Underwater robot recovery positioning method
CN110992410B (en) Robot vision guiding method and device based on RGB-D data fusion
KR100824744B1 Localization System and Method for Mobile Robot Using Corner's Type
CN111340942A (en) Three-dimensional reconstruction system based on unmanned aerial vehicle and method thereof
CN108416735B (en) Method and device for splicing digital X-ray images based on geometric features
Han et al. Target positioning method in binocular vision manipulator control based on improved canny operator
Sui et al. Extrinsic calibration of camera and 3D laser sensor system
CN110726402B (en) Laser point vision guiding method of non-orthogonal shafting laser total station
CN115830018B (en) Carbon block detection method and system based on deep learning and binocular vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant