CN104036494A - Fast matching computation method used for fruit picture - Google Patents

Info

Publication number
CN104036494A
CN104036494A (application CN201410217655.4A)
Authority
CN
China
Prior art keywords
point
fruit
pixel
gaussian
value
Prior art date
Legal status
Granted
Application number
CN201410217655.4A
Other languages
Chinese (zh)
Other versions
CN104036494B (en)
Inventor
饶秀勤
林文彬
应义斌
Current Assignee
Hangzhou nuotian Intelligent Technology Co.,Ltd.
Original Assignee
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201410217655.4A
Publication of CN104036494A
Application granted
Publication of CN104036494B
Status: Active

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a fast matching computation method for fruit images. By extracting an effective matching region and improving the SIFT (scale-invariant feature transform) method, fruit images can be matched quickly. A side image of the fruit is first acquired, and the left and right boundary points of the fruit contour are located in it to extract the effective region required for matching. Feature points are then extracted from the effective region, yielding feature point sets with feature point descriptors. Feature points are matched by computing the Euclidean distance between descriptors in the two feature point sets, and after a candidate match point set is obtained, mismatched points are removed to give the correct match point set. The method replaces the scale-space construction of the SIFT method with a single-layer difference-of-Gaussian image, omits rotation invariance, and reduces the dimensionality of the feature point descriptors, thereby cutting redundant computation during fruit image feature point detection; by extracting the effective overlapping region, unnecessary computation is avoided and the operating speed is improved.

Description

Fast matching computation method for fruit images
Technical field
The present invention relates to an image matching computation method, and in particular to a fast matching computation method for fruit images in the field of image processing.
Background technology
Image matching refers to identifying the same points between two or more images through a matching process. In recent years, image matching has become a key technique and research hotspot in image analysis and processing fields such as object recognition, robot mapping and navigation, image stitching, 3D model construction, gesture recognition, object tracking, and action comparison.
Acquiring fruit surface information is the basis for detecting quality indices such as the size, shape, surface color, and surface defects of fruit. The accuracy of surface color and surface defect detection depends on acquiring the full surface image of the fruit; image stitching is the key to acquiring full surface images, and image matching is the foundation of image stitching.
The SIFT (Scale Invariant Feature Transform) method is a local feature descriptor proposed by David Lowe in 1999 (David G. Lowe. Object recognition from local scale-invariant features. International Conference on Computer Vision, Corfu, Greece, 1999: 1150-1157) and further developed and refined in 2004 (David G. Lowe. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004, 60(2): 91-110). Extracted SIFT feature vectors are invariant to rotation, scaling, and brightness changes, and remain stable to a degree under viewpoint change, affine transformation, and noise.
However, fruit detection and grading impose high speed requirements on the method; although the SIFT method adapts well to matching, it is complex, computationally expensive, and time-consuming.
Summary of the invention
To solve the problems in the background art, the present invention proposes a fast matching computation method for fruit images, which replaces the complex scale-space construction of the SIFT method with a single-layer difference-of-Gaussian image to improve operating speed; the reliability of the match point computation method of the invention has been verified by experiment.
The technical solution adopted for the present invention to solve the technical problems is:
The present invention comprises the following concrete steps:
1) Original image acquisition:
Place the fruit horizontally so that its calyx-stem line is perpendicular to the horizontal plane, capture one clear side image from the side of the fruit as the left image, then rotate the fruit 60° about the calyx-stem line as axis and capture another side image as the right image;
2) Fruit boundary point extraction:
Perform background segmentation on the left image and the right image, then scan each image from top to bottom and record the leftmost pixel and the rightmost pixel of the fruit contour in each; subtracting the x-coordinate of the leftmost pixel from that of the rightmost pixel gives the maximum transverse diameter of the fruit in the left image and in the right image, respectively;
3) Effective image extraction:
In the left image, crop away the region of width 0.1 × (left-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.55 × (left-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. In the right image, crop away the region of width 0.55 × (right-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.1 × (right-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. The cropped results are the effective left image and the effective right image; denote the width of the effective left image by width;
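The cropping of step 3) can be illustrated with a short Python sketch (the function name, argument layout, and rounding are illustrative assumptions, not part of the patent):

```python
def crop_effective(img, x_left, x_right, d_max, is_left=True):
    """Crop the effective matching band from one side image (illustrative sketch).

    img     : H x W image array after background segmentation
    x_left  : x-coordinate of the leftmost fruit-contour pixel
    x_right : x-coordinate of the rightmost fruit-contour pixel
    d_max   : maximum transverse diameter, x_right - x_left
    Left image: keep the band from x_left + 0.55*d_max to x_right - 0.10*d_max;
    for the right image the two fractions are swapped.
    """
    inner, outer = (0.55, 0.10) if is_left else (0.10, 0.55)
    lo = x_left + int(round(inner * d_max))   # crop inward from the fruit's left edge
    hi = x_right - int(round(outer * d_max))  # crop inward from the fruit's right edge
    return img[:, lo:hi]
```

For the embodiment's 548-pixel diameter this keeps a band of roughly 0.35 × 548 ≈ 191 pixels, matching the effective left image width reported in the embodiment below.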
4) Feature point extraction:
4.1) Extreme point detection:
4.1.1) First pass the effective left image and the effective right image through formula 1 to obtain the initial Gaussian left image and the initial Gaussian right image:
L(x, y, σ) = G(x, y, σ) * I(x, y)   (1)
where L(x, y, σ) is the computed image, G(x, y, σ) is the Gaussian kernel, I(x, y) is the image to be computed, x is the abscissa and y the ordinate of the image to be computed, and the Gaussian kernel G(x, y, σ) is given by formula 2:
G(x, y, σ) = (1/(2πσ²)) · e^(−(x² + y²)/(2σ²))   (2)
where σ is the scale parameter;
Then enlarge the initial Gaussian left image and the initial Gaussian right image to twice their size by upsampling, obtaining the first-layer Gaussian left image and the first-layer Gaussian right image; pass these through formula 1 again, obtaining the second-layer Gaussian left image and the second-layer Gaussian right image; finally subtract the first-layer Gaussian left image from the second-layer Gaussian left image to obtain the difference-of-Gaussian left image, and subtract the first-layer Gaussian right image from the second-layer Gaussian right image to obtain the difference-of-Gaussian right image;
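The single-layer difference-of-Gaussian construction of step 4.1.1) might be sketched as follows, assuming SciPy's Gaussian filtering and the σ values (0.5 and 1.0) quoted in the embodiment:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def difference_of_gaussian(img, sigma0=0.5, sigma1=1.0):
    """Single-layer DoG replacing the full SIFT scale-space pyramid (sketch).

    sigma0 / sigma1 follow the embodiment; formula 1 is L = G(sigma) * I.
    """
    initial = gaussian_filter(img.astype(np.float64), sigma0)  # initial Gaussian image
    layer1 = zoom(initial, 2.0, order=1)                       # upsample to twice the size
    layer2 = gaussian_filter(layer1, sigma1)                   # second-layer Gaussian image
    return layer2 - layer1                                     # difference-of-Gaussian image
```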
4.1.2) For each pixel p in the difference-of-Gaussian left image and the difference-of-Gaussian right image, consider the 3 × 3 neighborhood centered on p; if the gray value of p is smaller than, or greater than, the gray value of every other pixel in this neighborhood, mark p as an extreme point;
Traverse the difference-of-Gaussian left image and the difference-of-Gaussian right image according to the above step to obtain the left-image extreme point set and the right-image extreme point set; their extreme point counts are n1 and n2, respectively;
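A minimal sketch of the 3 × 3 extremum test of step 4.1.2), assuming NumPy/SciPy (the filter shortcut also marks plateau pixels, whereas the text requires a strict comparison):

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def detect_extrema(dog):
    """Mark pixels that are maxima or minima of their 3x3 neighbourhood (sketch)."""
    is_max = dog == maximum_filter(dog, size=3)   # p >= all 8 neighbours
    is_min = dog == minimum_filter(dog, size=3)   # p <= all 8 neighbours
    ys, xs = np.nonzero(is_max | is_min)
    # drop border pixels that lack a full 3x3 neighbourhood
    h, w = dog.shape
    keep = (ys > 0) & (ys < h - 1) & (xs > 0) & (xs < w - 1)
    return list(zip(xs[keep], ys[keep]))          # (x, y) extreme points
```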
4.2) Generate feature point descriptors from the left-image and right-image extreme point sets:
Process each extreme point (x, y) in the left-image extreme point set and in the right-image extreme point set as follows:
4.2.1) In the difference-of-Gaussian left image and the difference-of-Gaussian right image, take a neighborhood centered on each extreme point of the respective extreme point set and divide it into 4 × 4 subregions, 16 subregions in total, each of width dis;
4.2.2) In each subregion, compute for each pixel the gradient magnitude by formula 3 and the gradient direction by formula 4:
m(x, y) = sqrt((f(x+1, y) − f(x−1, y))² + (f(x, y+1) − f(x, y−1))²)   (3)
θ(x, y) = arctan((f(x, y+1) − f(x, y−1)) / (f(x+1, y) − f(x−1, y)))   (4)
where f(x, y) is the gray value at (x, y) in the difference-of-Gaussian left or right image, m(x, y) is the gradient magnitude at (x, y), and θ(x, y) is the gradient direction at (x, y);
Then divide the directions from 0 to 360 degrees into 36 angle values 0°, 10°, ..., 340°, 350°, and assign the gradient direction obtained for each pixel to the corresponding angle value by rounding;
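The per-pixel gradients of formulas 3 and 4 with the 10° binning could look like this (arctan2 is used so the direction covers the full 0–360° range, a detail the patent's arctan leaves implicit):

```python
import numpy as np

def gradients(dog):
    """Per-pixel gradient magnitude (formula 3) and binned direction (formula 4), sketch."""
    dx = np.zeros_like(dog)
    dy = np.zeros_like(dog)
    dx[:, 1:-1] = dog[:, 2:] - dog[:, :-2]            # f(x+1, y) - f(x-1, y)
    dy[1:-1, :] = dog[2:, :] - dog[:-2, :]            # f(x, y+1) - f(x, y-1)
    m = np.hypot(dx, dy)                              # gradient magnitude
    theta = np.degrees(np.arctan2(dy, dx)) % 360.0    # direction in [0, 360)
    theta_binned = (np.round(theta / 10.0) % 36) * 10.0  # snap to 0, 10, ..., 350
    return m, theta_binned
```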
4.2.3) Divide the directions from 0 to 360 degrees into eight angle values 0°, 45°, ..., 270°, 315°. For each direction α among the eight angle values, accumulate the weighted magnitudes computed by formula 5 from the gradient magnitudes of the pixels in this subregion and its three neighboring subregions, giving the magnitude of this subregion in direction α:
weight = m(x, y) × e^(−(x_k² + y_k²)/(2σ_w)) × (1 − d_r) × (1 − d_c) × (1 − d_o)   (5)
where m(x, y) is the gradient magnitude of the pixel in that direction, x_k is the column distance and y_k the row distance between the pixel and the current extreme point, σ_w is a scale parameter equal to 2 × dis (twice the subregion width), d_r is the row contribution factor, d_c the column contribution factor, and d_o the direction contribution factor;
This finally yields one gradient per direction for each of the 4 × 4 subregions under the eight directions, generating a 128-dimensional feature point descriptor vector;
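One pixel's weighted contribution per formula 5 might be computed as follows (a sketch; dis = 3 follows the embodiment, and the contribution factors are assumed precomputed and clipped to [0, 1] as the text prescribes):

```python
import numpy as np

def pixel_weight(m, xk, yk, dr, dc, do, dis=3):
    """Weighted contribution of one pixel to one orientation bin (formula 5, sketch).

    xk, yk     : column / row distance from the pixel to the current extreme point
    dr, dc, do : row, column, and direction contribution factors in [0, 1]
    sigma_w = 2 * dis as stated in the text.
    """
    sigma_w = 2.0 * dis
    gauss = np.exp(-(xk**2 + yk**2) / (2.0 * sigma_w))  # distance falloff
    return m * gauss * (1.0 - dr) * (1.0 - dc) * (1.0 - do)
```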
4.3) Normalize the feature point descriptor vector:
Let the initial amplitude of the 128-dimensional descriptor vector obtained in step 4.2) be H = (h_1, h_2, ..., h_128). Normalize it according to formula 6, giving the normalized amplitude L = (l_1, l_2, ..., l_128):
l_i = h_i / Σ_{j=1..128} h_j   (6)
where i = 1, 2, 3, ..., 128 is the current vector dimension and j = 1, 2, 3, ..., 128 runs over all dimensions. If any l_i exceeds 0.2, set it to 0.2 and renormalize the amplitude of the 128-dimensional descriptor vector by formula 6 once more, obtaining the final amplitude T = (t_1, t_2, ..., t_128);
Then halve the x- and y-coordinates of every extreme point in the left-image and right-image extreme point sets (undoing the doubling introduced by upsampling), finally obtaining the left-image feature point set and the right-image feature point set with feature point descriptors;
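The normalization and 0.2 clamping of step 4.3) as a sketch (the denominator follows formula 6 as printed, a plain sum of the amplitudes; classical SIFT divides by the L2 norm instead):

```python
import numpy as np

def normalize_descriptor(h, clip=0.2):
    """Normalize a 128-d descriptor, clamp entries above 0.2, renormalize (sketch)."""
    l = h / h.sum()            # formula 6, first pass
    l = np.minimum(l, clip)    # suppress overly large gradient magnitudes
    return l / l.sum()         # formula 6 again, giving the final amplitude T
```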
4.4) Feature point descriptor dimensionality reduction:
For each subregion, project the four descriptor vectors at 45°, 135°, 225°, and 315° among its eight descriptor vectors onto their two adjacent vectors and add them in, finally obtaining a 64-dimensional feature point descriptor;
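A sketch of the 128-to-64 reduction of step 4.4), assuming each diagonal bin is split between its two neighbours with projection weight cos 45° = √2/2 (the patent says only "projected", so the weight is an assumption):

```python
import numpy as np

def reduce_descriptor(desc128):
    """Fold the four diagonal bins of each subregion into their neighbours (sketch)."""
    w = np.sqrt(2.0) / 2.0                   # projection of a 45-degree vector onto each axis
    out = np.empty(64)
    for s in range(16):                      # 4 x 4 subregions
        b = desc128[8 * s: 8 * s + 8]        # bins at 0, 45, 90, ..., 315 degrees
        axis = b[0::2].copy()                # 0, 90, 180, 270
        diag = b[1::2]                       # 45, 135, 225, 315
        for k in range(4):
            axis[k] += w * diag[k]           # e.g. 45 -> 0
            axis[(k + 1) % 4] += w * diag[k] # e.g. 45 -> 90
        out[4 * s: 4 * s + 4] = axis
    return out
```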
5) Feature point matching:
Starting from the first feature point of the left-image feature point set and continuing through the n1-th, traverse the right-image feature point set for each feature point and find the two feature points whose descriptors are nearest to it in Euclidean distance, the Euclidean distance being computed by formula 7:
d(p_left_i, p_right_j) = sqrt(Σ_{n=1..64} (Des_i(n) − Des_j(n))²)   (7)
where Des_i(n) is the amplitude of the n-th dimension of the i-th feature point in the left-image feature point set, and Des_j(n) is the amplitude of the n-th dimension of the j-th feature point in the right-image feature point set;
Suppose the feature point nearest in descriptor Euclidean distance to the current left-image feature point, and the second-nearest feature point, are found in the right-image feature point set, giving the nearest and second-nearest Euclidean distance values; divide the nearest distance by the second-nearest distance to obtain the ratio k:
If k < 0.95, the current feature point of the left-image feature point set and the right-image feature point nearest to it in Euclidean distance form a candidate match pair; otherwise discard them;
After all feature points of the left-image feature point set have been matched by the above traversal, a candidate match point set is obtained;
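The ratio-test matching of step 5) in sketch form (brute-force distances; assumes the right-image set holds at least two feature points):

```python
import numpy as np

def match_candidates(desc_left, desc_right, ratio=0.95):
    """Nearest / second-nearest ratio test over 64-d descriptors (sketch).

    desc_left : n1 x 64 array, desc_right : n2 x 64 array.
    Returns (i, j) index pairs of candidate matches.
    """
    pairs = []
    for i, d in enumerate(desc_left):
        dist = np.sqrt(((desc_right - d) ** 2).sum(axis=1))  # formula 7 to every right point
        j1, j2 = np.argsort(dist)[:2]                        # nearest and second nearest
        if dist[j1] / dist[j2] < ratio:                      # ratio k < 0.95
            pairs.append((i, j1))
    return pairs
```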
6) Match point screening:
Screen each candidate match pair in the candidate match point set obtained in step 5) by the following concrete steps:
Let width1 be the distance from the candidate match point in the effective left image to the right edge of the effective left image, and width2 the distance from the corresponding candidate match point in the effective right image to the left edge of the effective right image; add width1 and width2 to obtain the distance sum Sum, the image width of the effective left image being width. If Sum < 1.25 × width and Sum > 0.75 × width, judge this candidate pair a correct match; otherwise judge it a mismatch and discard it. This finally yields the correct match point set;
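The screening rule of step 6) as a sketch (assumes the effective left and right images share the same width, as in the embodiment):

```python
def screen_matches(pairs, pts_left, pts_right, width):
    """Keep pairs whose width1 + width2 lies in (0.75*width, 1.25*width) (sketch).

    pts_left / pts_right : (x, y) coordinates in the effective images.
    """
    kept = []
    for i, j in pairs:
        width1 = width - pts_left[i][0]   # distance to the right edge of the effective left image
        width2 = pts_right[j][0]          # distance to the left edge of the effective right image
        total = width1 + width2
        if 0.75 * width < total < 1.25 * width:
            kept.append((i, j))
    return kept
```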
7) Match point coordinate restoration:
For each correct match point in the correct match point set obtained in step 6), add 0.55 × (left-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective left image, and add 0.1 × (right-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective right image;
In step 4.2.3), the row contribution factor d_r in formula 5 equals the absolute value of the abscissa difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_r exceeds 1, d_r is taken as 1.
In step 4.2.3), the column contribution factor d_c in formula 5 equals the absolute value of the ordinate difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_c exceeds 1, d_c is taken as 1.
In step 4.2.3), the direction contribution factor d_o in formula 5 equals the absolute value of the difference between the pixel's gradient direction angle and the direction α; if the computed d_o exceeds 1, d_o is taken as 1.
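The three contribution factors might be computed as follows (a sketch; the unnormalized angle difference for d_o follows the text as printed, which clips nearly every nonzero difference in degrees to 1):

```python
def contribution_factors(x, y, cx, cy, theta, alpha, dis=3):
    """Row, column, and direction contribution factors d_r, d_c, d_o (sketch).

    (x, y)   : pixel coordinates, (cx, cy) : subregion center
    theta    : binned gradient direction of the pixel, alpha : current bin direction
    Each factor is clipped to 1 as the text prescribes.
    """
    d_r = min(abs(x - cx) / dis, 1.0)    # abscissa difference / subregion width
    d_c = min(abs(y - cy) / dis, 1.0)    # ordinate difference / subregion width
    d_o = min(abs(theta - alpha), 1.0)   # angle difference, clipped to 1
    return d_r, d_c, d_o
```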
The fruit is a snake fruit, apple, or pear.
In step 1), the left image and the right image of the fruit side have the same resolution, in the range 0.1~0.2 mm/pixel.
The beneficial effects of the invention are:
By using a single-layer difference-of-Gaussian image in place of the scale-space construction of the SIFT method, discarding rotation invariance, and reducing the dimensionality of the feature point descriptors, the present invention reduces unnecessary computation in fruit image feature point detection; by extracting the effective overlapping region, unnecessary computation is removed and the operating speed is improved.
Brief description of the drawings
Fig. 1 shows the key steps of the matching method of the present invention.
Fig. 2 shows the left image and the right image of the embodiment of the present invention.
Fig. 3 shows the leftmost and rightmost pixels of the fruit contour in the left and right images of the embodiment.
Fig. 4 is a schematic diagram of the extraction of the effective left image in the embodiment.
Fig. 5 is a schematic diagram of the extraction of the effective right image in the embodiment.
Fig. 6 shows the effective left image and the effective right image extracted in the embodiment.
Fig. 7 shows the initial Gaussian left image and the initial Gaussian right image of the embodiment.
Fig. 8 shows the first-layer Gaussian left image and the first-layer Gaussian right image of the embodiment.
Fig. 9 shows the second-layer Gaussian left image and the second-layer Gaussian right image of the embodiment.
Fig. 10 shows the difference-of-Gaussian left image and the difference-of-Gaussian right image of the embodiment.
Fig. 11 is a schematic diagram of the 3 × 3 neighborhood centered on p.
Fig. 12 shows the left-image and right-image extreme point sets extracted in the embodiment.
Fig. 13 is a schematic diagram of the subregions centered on each extreme point in the embodiment.
Fig. 14 is a schematic diagram of the gradient of each pixel within each subregion in the embodiment.
Fig. 15 is a schematic diagram of the gradient of each subregion in each direction in the embodiment.
Fig. 16 is a schematic diagram of the 64-dimensional feature vectors obtained by reducing the 128-dimensional feature vectors in the embodiment.
Fig. 17 shows the candidate match point set extracted in the embodiment.
Fig. 18 is a schematic diagram of the distance width1 from a candidate match point in the effective left image to the right edge of the effective left image, and the distance width2 from the candidate match point in the effective right image to the left edge of the effective right image, in the embodiment.
Fig. 19 shows the correct match point set extracted in the embodiment.
Fig. 20 shows the match point set finally obtained in the embodiment.
Embodiment
The present invention is further described below in conjunction with the drawings and an embodiment; the following embodiment illustrates the invention without limiting its scope. As shown in Fig. 1, the implementation comprises the following steps:
1) Original image acquisition:
Place the fruit horizontally so that its calyx-stem line is perpendicular to the horizontal plane, capture one clear side image from the side of the fruit as the left image, then rotate the fruit 60° about the calyx-stem line as axis and capture another side image as the right image;
2) Fruit boundary point extraction:
Perform background segmentation on the left image and the right image, then scan each image from top to bottom and record the leftmost pixel and the rightmost pixel of the fruit contour in each; subtracting the x-coordinate of the leftmost pixel from that of the rightmost pixel gives the maximum transverse diameter of the fruit in the left image and in the right image, respectively;
3) Effective image extraction:
In the left image, crop away the region of width 0.1 × (left-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.55 × (left-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. In the right image, crop away the region of width 0.55 × (right-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.1 × (right-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. The cropped results are the effective left image and the effective right image; denote the width of the effective left image by width;
4) Feature point extraction:
4.1) Extreme point detection:
4.1.1) First pass the effective left image and the effective right image through formula 1 to obtain the initial Gaussian left image and the initial Gaussian right image:
L(x, y, σ) = G(x, y, σ) * I(x, y)   (1)
where L(x, y, σ) is the computed image, G(x, y, σ) is the Gaussian kernel, I(x, y) is the image to be computed, x is the abscissa and y the ordinate of the image to be computed, and the Gaussian kernel G(x, y, σ) is given by formula 2:
G(x, y, σ) = (1/(2πσ²)) · e^(−(x² + y²)/(2σ²))   (2)
where σ is the scale parameter;
Then enlarge the initial Gaussian left image and the initial Gaussian right image to twice their size by upsampling, obtaining the first-layer Gaussian left image and the first-layer Gaussian right image; pass these through formula 1 again, obtaining the second-layer Gaussian left image and the second-layer Gaussian right image; finally subtract the first-layer Gaussian left image from the second-layer Gaussian left image to obtain the difference-of-Gaussian left image, and subtract the first-layer Gaussian right image from the second-layer Gaussian right image to obtain the difference-of-Gaussian right image;
4.1.2) For each pixel p in the difference-of-Gaussian left image and the difference-of-Gaussian right image, consider the 3 × 3 neighborhood centered on p; if the gray value of p is smaller than, or greater than, the gray value of every other pixel in this neighborhood, mark p as an extreme point;
Traverse the difference-of-Gaussian left image and the difference-of-Gaussian right image according to the above step to obtain the left-image extreme point set and the right-image extreme point set; their extreme point counts are n1 and n2, respectively;
4.2) Generate feature point descriptors from the left-image and right-image extreme point sets:
Process each extreme point (x, y) in the left-image extreme point set and in the right-image extreme point set as follows:
4.2.1) In the difference-of-Gaussian left image and the difference-of-Gaussian right image, take a neighborhood centered on each extreme point of the respective extreme point set and divide it into 4 × 4 subregions, 16 subregions in total, each of width dis;
4.2.2) In each subregion, compute for each pixel the gradient magnitude by formula 3 and the gradient direction by formula 4:
m(x, y) = sqrt((f(x+1, y) − f(x−1, y))² + (f(x, y+1) − f(x, y−1))²)   (3)
θ(x, y) = arctan((f(x, y+1) − f(x, y−1)) / (f(x+1, y) − f(x−1, y)))   (4)
where f(x, y) is the gray value at (x, y) in the difference-of-Gaussian left or right image, m(x, y) is the gradient magnitude at (x, y), and θ(x, y) is the gradient direction at (x, y);
Then divide the directions from 0 to 360 degrees into 36 angle values 0°, 10°, ..., 340°, 350°, and assign the gradient direction obtained for each pixel to the corresponding angle value by rounding;
4.2.3) Divide the directions from 0 to 360 degrees into eight angle values 0°, 45°, ..., 270°, 315°. For each direction α among the eight angle values, accumulate the weighted magnitudes computed by formula 5 from the gradient magnitudes of the pixels in this subregion and its three neighboring subregions, giving the magnitude of this subregion in direction α:
weight = m(x, y) × e^(−(x_k² + y_k²)/(2σ_w)) × (1 − d_r) × (1 − d_c) × (1 − d_o)   (5)
where m(x, y) is the gradient magnitude of the pixel in that direction, x_k is the column distance and y_k the row distance between the pixel and the current extreme point, σ_w is a scale parameter equal to 2 × dis (twice the subregion width), d_r is the row contribution factor, d_c the column contribution factor, and d_o the direction contribution factor;
This finally yields one gradient per direction for each of the 4 × 4 subregions under the eight directions, generating a 128-dimensional feature point descriptor vector;
4.3) Normalize the feature point descriptor vector:
Let the initial amplitude of the 128-dimensional descriptor vector obtained in step 4.2) be H = (h_1, h_2, ..., h_128). Normalize it according to formula 6, giving the normalized amplitude L = (l_1, l_2, ..., l_128):
l_i = h_i / Σ_{j=1..128} h_j   (6)
where i = 1, 2, 3, ..., 128 is the current vector dimension and j = 1, 2, 3, ..., 128 runs over all dimensions. If any l_i exceeds 0.2, set it to 0.2 and renormalize the amplitude of the 128-dimensional descriptor vector by formula 6 once more, obtaining the final amplitude T = (t_1, t_2, ..., t_128);
Then halve the x- and y-coordinates of every extreme point in the left-image and right-image extreme point sets (undoing the doubling introduced by upsampling), finally obtaining the left-image feature point set and the right-image feature point set with feature point descriptors;
4.4) Feature point descriptor dimensionality reduction:
For each subregion, project the four descriptor vectors at 45°, 135°, 225°, and 315° among its eight descriptor vectors onto their two adjacent vectors and add them in, finally obtaining a 64-dimensional feature point descriptor;
5) Feature point matching:
Starting from the first feature point of the left-image feature point set and continuing through the n1-th, traverse the right-image feature point set for each feature point and find the two feature points whose descriptors are nearest to it in Euclidean distance, the Euclidean distance being computed by formula 7:
d(p_left_i, p_right_j) = sqrt(Σ_{n=1..64} (Des_i(n) − Des_j(n))²)   (7)
where Des_i(n) is the amplitude of the n-th dimension of the i-th feature point in the left-image feature point set, and Des_j(n) is the amplitude of the n-th dimension of the j-th feature point in the right-image feature point set;
Suppose the feature point nearest in descriptor Euclidean distance to the current left-image feature point, and the second-nearest feature point, are found in the right-image feature point set, giving the nearest and second-nearest Euclidean distance values; divide the nearest distance by the second-nearest distance to obtain the ratio k;
If k < 0.95, the current feature point of the left-image feature point set and the right-image feature point nearest to it in Euclidean distance form a candidate match pair; otherwise discard them;
After all feature points of the left-image feature point set have been matched by the above traversal, the candidate match point set {match_t} is obtained.
6) Match point screening:
Screen each candidate match pair in the candidate match point set {match_t} obtained in step 5) by the following concrete steps:
Let width1 be the distance from the candidate match point in the effective left image to the right edge of the effective left image, and width2 the distance from the corresponding candidate match point in the effective right image to the left edge of the effective right image; add width1 and width2 to obtain the distance sum Sum, the image width of the effective left image being width. If Sum < 1.25 × width and Sum > 0.75 × width, judge this candidate pair a correct match; otherwise judge it a mismatch and discard it. This finally yields the correct match point set;
7) Match point coordinate restoration:
For each correct match point in the correct match point set obtained in step 6), add 0.55 × (left-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective left image, and add 0.1 × (right-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective right image;
In step 4.2.3), the row contribution factor d_r in formula 5 equals the absolute value of the abscissa difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_r exceeds 1, d_r is taken as 1.
In step 4.2.3), the column contribution factor d_c in formula 5 equals the absolute value of the ordinate difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_c exceeds 1, d_c is taken as 1.
In step 4.2.3), the direction contribution factor d_o in formula 5 equals the absolute value of the difference between the pixel's gradient direction angle and the direction α; if the computed d_o exceeds 1, d_o is taken as 1.
The method of the invention is applicable to fruits with surface spots; such spotted fruits include the snake fruit, apple, and pear.
In step 1), the left image and the right image both have a resolution between 0.1 and 0.2 mm/pixel, and the two resolutions are identical.
The specific embodiment of the invention is as follows:
A spotted snake fruit was used as the implementation object. The snake fruit was placed on a fruit tray with its calyx-stem line essentially perpendicular to the horizontal plane. The object distance between the snake fruit and the camera lens was set to 970 mm, the zoom lens of the camera was adjusted to a focal length of 25 mm, and the camera CCD size was 1/3 inch. Images were acquired and the region of interest was extracted, giving a left image and a right image each 548 pixels wide and 568 pixels high, at a resolution of 0.146 mm/pixel, as shown in Fig. 2.
As shown in Fig. 3, background segmentation was performed on the left and right images, which were then scanned from top to bottom to record the leftmost and rightmost pixels of the fruit contour in each image; subtracting the x-coordinates of the leftmost and rightmost pixels gave a maximum transverse diameter of 548 pixels for the snake fruit in the left image and 548 pixels in the right image;
As shown in Fig. 4, in the left image the region of width 0.1 × 548 pixels (the left-image snake fruit maximum transverse diameter) to the left of the rightmost pixel was cropped away, here a clipped width of 54 pixels, and the region of width 0.55 × 548 pixels to the right of the leftmost pixel was cropped away, here a clipped width of 301 pixels, leaving an effective left image of width width = 191 pixels. As shown in Fig. 5, in the right image the region of width 0.55 × 548 pixels (the right-image snake fruit maximum transverse diameter) to the left of the rightmost pixel was cropped away, here 301 pixels, and the region of width 0.1 × 548 pixels to the right of the leftmost pixel was cropped away, here 54 pixels. The effective left image and effective right image are shown in Fig. 6.
As shown in Fig. 7, the effective left image and the effective right image were passed through formula 1, with σ taken as 0.5 here, giving the initial Gaussian left image and the initial Gaussian right image.
As shown in Fig. 8, the initial Gaussian left and right images were enlarged to twice their size by upsampling, giving the first-layer Gaussian left and right images; as shown in Fig. 9, these were passed through formula 1 again, with σ taken as 1.0 here, giving the second-layer Gaussian left and right images; as shown in Fig. 10, the first-layer Gaussian left image was finally subtracted from the second-layer Gaussian left image to give the difference-of-Gaussian left image, and likewise for the right images;
As shown in Fig. 11, for each pixel p in the difference-of-Gaussian left and right images, within the 3 × 3 neighborhood centered on p, p was marked as an extreme point if its gray value was smaller than, or greater than, that of every other pixel in the neighborhood;
As shown in Fig. 12, traversing the difference-of-Gaussian left and right images by the above step gave the left-image and right-image extreme point sets, with 71 and 77 extreme points respectively;
As shown in Fig. 13, in the difference-of-Gaussian left and right images a neighborhood was taken centered on each extreme point of the respective extreme point set and divided into 4 × 4 subregions, 16 in total, each 3 pixels wide;
In each subregion the gradient magnitude of each pixel was computed by formula 3 and the gradient direction by formula 4, giving the per-pixel gradients shown in Fig. 14.
As shown in Fig. 15, the gradient of each subregion in each direction was computed and the resulting gradient magnitudes were normalized, giving the 128-dimensional feature vector of each extreme point; the x- and y-coordinates of each extreme point in the left-image and right-image extreme point sets were then halved.
As shown in Fig. 16, the 128-dimensional vectors were reduced in dimension to 64-dimensional feature vectors.
The feature points of the left-image and right-image feature point sets were matched, giving the candidate match point set {match_t}; the match points are shown in Fig. 17.
As shown in Fig. 18, taking one candidate pair of {match_t} as an example: the candidate match point in the effective left image lay width1 = 98 pixels from the right edge of the effective left image, and the corresponding candidate match point in the effective right image lay width2 = 53 pixels from the left edge of the effective right image; adding width1 and width2 gave a distance sum Sum = 151 pixels. With the effective left image width width = 191 pixels, Sum < 1.25 × 191 ≈ 238 pixels and Sum > 0.75 × 191 ≈ 143 pixels, so this candidate pair was judged a correct match. Judging the remaining candidate pairs by the same procedure gave the correct match point set, shown in Fig. 19.
For each point of the correct match point set, the x-coordinate of the point located in the effective left image was increased by 0.55 × (left-image snake fruit maximum transverse diameter), here 301 pixels, and the x-coordinate of the point located in the effective right image was increased by 0.1 × (right-image snake fruit maximum transverse diameter), here 54 pixels. The finally obtained match points are shown in Fig. 20.
Experimental verification shows that, under identical conditions, the matching time of the method of the present invention is 14.5% of the time used by the original SIFT method with a 2-octave, 2-layer scale space. The invention thus reduces unnecessary computation in fruit image feature point detection, removes unnecessary computational load, and improves operating speed.
The above embodiment serves to explain the present invention rather than to limit it; any modification or change made to the invention within the spirit of the invention and the protection scope of the claims falls within the protection scope of the present invention.

Claims (6)

1. A fast matching computation method for fruit images, characterized in that the steps of the method are as follows:
1) Original image acquisition:
Place the fruit horizontally so that its calyx-stem line is perpendicular to the horizontal plane, capture one clear side image from the side of the fruit as the left image, then rotate the fruit 60° about the calyx-stem line as axis and capture another side image as the right image;
2) Fruit boundary point extraction:
Perform background segmentation on the left image and the right image, then scan each image from top to bottom and record the leftmost pixel and the rightmost pixel of the fruit contour in each; subtracting the x-coordinate of the leftmost pixel from that of the rightmost pixel gives the maximum transverse diameter of the fruit in the left image and in the right image, respectively;
3) Effective image extraction:
In the left image, crop away the region of width 0.1 × (left-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.55 × (left-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. In the right image, crop away the region of width 0.55 × (right-image fruit maximum transverse diameter) to the left of the rightmost contour pixel together with everything to the right of that pixel, and the region of width 0.1 × (right-image fruit maximum transverse diameter) to the right of the leftmost contour pixel together with everything to the left of that pixel. The cropped results are the effective left image and the effective right image; denote the width of the effective left image by width;
4) Feature point extraction:
4.1) Extreme point detection:
4.1.1) First pass the effective left image and the effective right image through formula 1 to obtain the initial Gaussian left image and the initial Gaussian right image:
L(x, y, σ) = G(x, y, σ) * I(x, y)   (1)
where L(x, y, σ) is the computed image, G(x, y, σ) is the Gaussian kernel, I(x, y) is the image to be computed, x is the abscissa and y the ordinate of the image to be computed, and the Gaussian kernel G(x, y, σ) is given by formula 2:
G(x, y, σ) = (1/(2πσ²)) · e^(−(x² + y²)/(2σ²))   (2)
where σ is the scale parameter;
Then enlarge the initial Gaussian left image and the initial Gaussian right image to twice their size by upsampling, obtaining the first-layer Gaussian left image and the first-layer Gaussian right image; pass these through formula 1 again, obtaining the second-layer Gaussian left image and the second-layer Gaussian right image; finally subtract the first-layer Gaussian left image from the second-layer Gaussian left image to obtain the difference-of-Gaussian left image, and subtract the first-layer Gaussian right image from the second-layer Gaussian right image to obtain the difference-of-Gaussian right image;
4.1.2) For each pixel p in the difference-of-Gaussian left image and the difference-of-Gaussian right image, consider the 3 × 3 neighborhood centered on p; if the gray value of p is smaller than, or greater than, the gray value of every other pixel in this neighborhood, mark p as an extreme point;
Traverse the difference-of-Gaussian left image and the difference-of-Gaussian right image according to the above step to obtain the left-image extreme point set and the right-image extreme point set; their extreme point counts are n1 and n2, respectively;
4.2) Generate feature point descriptors from the left-image and right-image extreme point sets:
Process each extreme point (x, y) in the left-image extreme point set and in the right-image extreme point set as follows:
4.2.1) In the difference-of-Gaussian left image and the difference-of-Gaussian right image, take a neighborhood centered on each extreme point of the respective extreme point set and divide it into 4 × 4 subregions, 16 subregions in total, each of width dis;
4.2.2) In each subregion, compute for each pixel the gradient magnitude by formula 3 and the gradient direction by formula 4:
m(x, y) = sqrt((f(x+1, y) − f(x−1, y))² + (f(x, y+1) − f(x, y−1))²)   (3)
θ(x, y) = arctan((f(x, y+1) − f(x, y−1)) / (f(x+1, y) − f(x−1, y)))   (4)
where f(x, y) is the gray value at (x, y) in the difference-of-Gaussian left or right image, m(x, y) is the gradient magnitude at (x, y), and θ(x, y) is the gradient direction at (x, y);
Then divide the directions from 0 to 360 degrees into 36 angle values 0°, 10°, ..., 340°, 350°, and assign the gradient direction obtained for each pixel to the corresponding angle value by rounding;
4.2.3) Divide the directions from 0 to 360 degrees into eight angle values 0°, 45°, ..., 270°, 315°. For each direction α among the eight angle values, accumulate the weighted magnitudes computed by formula 5 from the gradient magnitudes of the pixels in this subregion and its three neighboring subregions, giving the magnitude of this subregion in direction α:
weight = m(x, y) × e^(−(x_k² + y_k²)/(2σ_w)) × (1 − d_r) × (1 − d_c) × (1 − d_o)   (5)
where m(x, y) is the gradient magnitude of the pixel in that direction, x_k is the column distance and y_k the row distance between the pixel and the current extreme point, σ_w is a scale parameter equal to 2 × dis (twice the subregion width), d_r is the row contribution factor, d_c the column contribution factor, and d_o the direction contribution factor;
This finally yields one gradient per direction for each of the 4 × 4 subregions under the eight directions, generating a 128-dimensional feature point descriptor vector;
4.3) Normalize the feature point descriptor vector:
Let the initial amplitude of the 128-dimensional descriptor vector obtained in step 4.2) be H = (h_1, h_2, ..., h_128). Normalize it according to formula 6, giving the normalized amplitude L = (l_1, l_2, ..., l_128):
l_i = h_i / Σ_{j=1..128} h_j   (6)
where i = 1, 2, 3, ..., 128 is the current vector dimension and j = 1, 2, 3, ..., 128 runs over all dimensions. If any l_i exceeds 0.2, set it to 0.2 and renormalize the amplitude of the 128-dimensional descriptor vector by formula 6 once more, obtaining the final amplitude T = (t_1, t_2, ..., t_128);
Then halve the x- and y-coordinates of every extreme point in the left-image and right-image extreme point sets, finally obtaining the left-image feature point set and the right-image feature point set with feature point descriptors;
4.4) Feature point descriptor dimensionality reduction:
For each subregion, project the four descriptor vectors at 45°, 135°, 225°, and 315° among its eight descriptor vectors onto their two adjacent vectors and add them in, finally obtaining a 64-dimensional feature point descriptor;
5) Feature point matching:
Starting from the first feature point of the left-image feature point set and continuing through the n1-th, traverse the right-image feature point set for each feature point and find the two feature points whose descriptors are nearest to it in Euclidean distance, the Euclidean distance being computed by formula 7:
d(p_left_i, p_right_j) = sqrt(Σ_{n=1..64} (Des_i(n) − Des_j(n))²)   (7)
where Des_i(n) is the amplitude of the n-th dimension of the i-th feature point in the left-image feature point set, and Des_j(n) is the amplitude of the n-th dimension of the j-th feature point in the right-image feature point set;
Suppose the feature point nearest in descriptor Euclidean distance to the current left-image feature point, and the second-nearest feature point, are found in the right-image feature point set, giving the nearest and second-nearest Euclidean distance values; divide the nearest distance by the second-nearest distance to obtain the ratio k:
If k < 0.95, the current feature point of the left-image feature point set and the right-image feature point nearest to it in Euclidean distance form a candidate match pair; otherwise discard them;
After all feature points of the left-image feature point set have been matched by the above traversal, a candidate match point set is obtained;
6) Match point screening:
Screen each candidate match pair in the candidate match point set obtained in step 5) by the following concrete steps:
Let width1 be the distance from the candidate match point in the effective left image to the right edge of the effective left image, and width2 the distance from the corresponding candidate match point in the effective right image to the left edge of the effective right image; add width1 and width2 to obtain the distance sum Sum, the image width of the effective left image being width. If Sum < 1.25 × width and Sum > 0.75 × width, judge this candidate pair a correct match; otherwise judge it a mismatch and discard it. This finally yields the correct match point set;
7) Match point coordinate restoration:
For each correct match point in the correct match point set obtained in step 6), add 0.55 × (left-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective left image, and add 0.1 × (right-image fruit maximum transverse diameter) to the x-coordinate of the match point located in the effective right image.
2. The fast matching computation method for fruit images according to claim 1, characterized in that: in step 4.2.3), the row contribution factor d_r in formula 5 equals the absolute value of the abscissa difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_r exceeds 1, d_r is taken as 1.
3. The fast matching computation method for fruit images according to claim 1, characterized in that: in step 4.2.3), the column contribution factor d_c in formula 5 equals the absolute value of the ordinate difference between the pixel and the subregion center divided by the subregion width dis; if the computed d_c exceeds 1, d_c is taken as 1.
4. The fast matching computation method for fruit images according to claim 1, characterized in that: in step 4.2.3), the direction contribution factor d_o in formula 5 equals the absolute value of the difference between the pixel's gradient direction angle and the direction α; if the computed d_o exceeds 1, d_o is taken as 1.
5. The fast matching computation method for fruit images according to claim 1, characterized in that: the fruit is a snake fruit, apple, or pear.
6. The fast matching computation method for fruit images according to claim 1, characterized in that: in step 1), the left image and the right image of the fruit side have the same resolution, in the range 0.1~0.2 mm/pixel.
CN201410217655.4A 2014-05-21 2014-05-21 Fast matching computation method for fruit images Active CN104036494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410217655.4A CN104036494B (en) 2014-05-21 2014-05-21 Fast matching computation method for fruit images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410217655.4A CN104036494B (en) 2014-05-21 2014-05-21 Fast matching computation method for fruit images

Publications (2)

Publication Number Publication Date
CN104036494A true CN104036494A (en) 2014-09-10
CN104036494B CN104036494B (en) 2016-10-12

Family

ID=51467253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410217655.4A Active CN104036494B (en) 2014-05-21 2014-05-21 Fast matching computation method for fruit images

Country Status (1)

Country Link
CN (1) CN104036494B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751455A (en) * 2015-03-13 2015-07-01 华南农业大学 Crop image dense matching method and system
CN108304883A (en) * 2018-02-12 2018-07-20 西安电子科技大学 Based on the SAR image matching process for improving SIFT
WO2020098532A1 (en) * 2018-11-12 2020-05-22 杭州萤石软件有限公司 Method for positioning mobile robot, and mobile robot
CN116542987A (en) * 2023-04-19 2023-08-04 翼存(上海)智能科技有限公司 Image clipping method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006127713A2 (en) * 2005-05-23 2006-11-30 The Penn State Research Foundation A fast 2d-3d image registration method with application to continuously guided endoscopy
CN101714254A (en) * 2009-11-16 2010-05-26 哈尔滨工业大学 Registering control point extracting method combining multi-scale SIFT and area invariant moment features
CN103177435A (en) * 2013-04-10 2013-06-26 浙江大学 Apple surface non-redundancy information image processing method based on machine vision
CN103177434A (en) * 2013-04-10 2013-06-26 浙江大学 Method for calculating rotation angle of apple by using stem images of apple

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006127713A2 (en) * 2005-05-23 2006-11-30 The Penn State Research Foundation A fast 2d-3d image registration method with application to continuously guided endoscopy
CN101714254A (en) * 2009-11-16 2010-05-26 哈尔滨工业大学 Registering control point extracting method combining multi-scale SIFT and area invariant moment features
CN103177435A (en) * 2013-04-10 2013-06-26 浙江大学 Apple surface non-redundancy information image processing method based on machine vision
CN103177434A (en) * 2013-04-10 2013-06-26 浙江大学 Method for calculating rotation angle of apple by using stem images of apple

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
David G. Lowe: "Distinctive Image Features from Scale-Invariant Keypoints", International Journal of Computer Vision *
刘立 et al.: "Fast image matching using a simplified SIFT algorithm" (采用简化SIFT算法实现快速图像匹配), 红外与激光工程 (Infrared and Laser Engineering) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751455A (en) * 2015-03-13 2015-07-01 华南农业大学 Crop image dense matching method and system
CN108304883A (en) * 2018-02-12 2018-07-20 西安电子科技大学 Based on the SAR image matching process for improving SIFT
WO2020098532A1 (en) * 2018-11-12 2020-05-22 杭州萤石软件有限公司 Method for positioning mobile robot, and mobile robot
CN116542987A (en) * 2023-04-19 2023-08-04 翼存(上海)智能科技有限公司 Image clipping method and device, electronic equipment and storage medium
CN116542987B (en) * 2023-04-19 2024-06-04 翼存(上海)智能科技有限公司 Image clipping method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN104036494B (en) 2016-10-12


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210525

Address after: 310012 room 1102, block B, Lishui digital building, 153 Lianchuang street, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou nuotian Intelligent Technology Co.,Ltd.

Address before: 310058 Yuhang Tang Road, Xihu District, Hangzhou, Zhejiang 866

Patentee before: ZHEJIANG University

TR01 Transfer of patent right