CN105469120A - Image matching method and apparatus - Google Patents


Info

Publication number
CN105469120A
CN105469120A (application CN201510937636.3A)
Authority
CN
China
Prior art keywords
particle
current
template
image
subregion
Prior art date
Legal status
Pending
Application number
CN201510937636.3A
Other languages
Chinese (zh)
Inventor
路廷文
Current Assignee
Inspur Electronic Information Industry Co Ltd
Original Assignee
Inspur Electronic Information Industry Co Ltd
Priority date
Filing date
Publication date
Application filed by Inspur Electronic Information Industry Co Ltd filed Critical Inspur Electronic Information Industry Co Ltd
Priority to CN201510937636.3A
Publication of CN105469120A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an image matching method and apparatus. The method comprises: determining a target position of a template image, presetting a threshold range, and determining the position of each image to be matched; S1: setting a first preset number of particles and initializing a current velocity and a current position of each particle; S2: for each particle, determining a transition velocity of the current particle, and determining a transition position of the current particle according to the transition velocity and the current position; S3: judging whether the number of particles within the threshold range is greater than a second preset value; if so, performing step S5, otherwise performing step S4; S4: for each particle, taking the transition velocity of the current particle as its current velocity and the transition position of the current particle as its current position, and returning to step S2; and S5: determining the image to be matched that matches the template image. The image matching method and apparatus provided by the invention can improve the processing efficiency of image matching.

Description

Image matching method and apparatus
Technical field
The present invention relates to the field of computer technology, and in particular to an image matching method and apparatus.
Background art
Image matching refers to analyzing the similarity and consistency of image content, features, structures, relations, textures, and so on, through their correspondences, in order to find similar images. With the development of information technology, image matching is applied more and more widely.
In existing image matching methods, the template image to be matched is compared one by one with all source images in an image library, and each comparison with a source image requires comparing features such as the texture and structure of the images. The matching source image is finally obtained.
As can be seen from the above, image matching methods in the prior art involve a large amount of computation and a long processing time; in short, the processing efficiency of the prior art is low.
Summary of the invention
The present invention provides an image matching method and apparatus, which can improve the processing efficiency of image matching.
In one aspect, the present invention provides an image matching method, comprising: S0: determining a target position of a template image according to gray value information of the template image, presetting a threshold range according to the target position, and determining a position of each image to be matched according to gray value information of each image to be matched; the method further comprising:
S1: setting a first preset number of particles, and initializing a current velocity and a current position of each particle;
S2: for each particle, determining a transition velocity of the current particle according to a preset weight of the current particle, the current velocity of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
S3: for each particle, determining a transition position of the current particle according to the transition velocity and the current position;
S4: judging, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, performing step S6, otherwise performing step S5;
S5: for each particle, taking the transition velocity of the current particle as the current velocity of the current particle and the transition position of the current particle as the current position of the current particle, and returning to step S2;
S6: determining the image to be matched that matches the template image according to the transition positions of the particles within the threshold range and the positions of the images to be matched.
Further, S2 comprises:
for each particle, determining the transition velocity of the current particle according to formula one:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] - present[i]) + c_2 × random2 × (gbest - present[i]) + random(v)
where v_g[i] is the transition velocity of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current velocity of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle velocity, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
Further, before S5, the method further comprises:
A1: for each particle, judging whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, performing step S5, otherwise performing step A2;
A2: for each particle, judging whether the number of times the transition position of the current particle coincides with the historical positions of the current particle is greater than a third preset value; if so, performing step A3, otherwise performing step S5;
A3: for each particle, taking the current position of a particle adjacent to the current particle as the transition position of the current particle, and performing step S5.
Further, S0 comprises: on the template image, taking any point on the template image as a template coordinate origin, extracting a first predetermined number of template subregions from the template image, and determining the coordinate value of each template pixel in each template subregion;
for each image to be matched, taking the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the matching pixel on the current image to be matched that corresponds to each template pixel, and determining the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
calculating the gray value of each template subregion of the template image, and determining the target position of the template image according to the gray values of the template subregions of the template image;
for each image to be matched, calculating the gray value of each matching subregion of the current image to be matched, and determining the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched.
Further, S3 comprises:
for each particle, determining the transition position of the current particle according to formula two:
present_g[i] = present[i] + v_g[i]
where present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition velocity of the i-th particle.
Further, before S0, the method further comprises:
on the template image, taking any point on the template image as the template coordinate origin, extracting a second predetermined number of template subregions from the template image, and determining the coordinate value of each template pixel in each template subregion;
for each source image, taking the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the source pixel on the current source image that corresponds to each template pixel, and determining the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
determining the gray value of each template subregion of the template image, and determining the gray value of each source subregion of each source image;
for each source image, judging whether there are a fourth preset number of source subregions in the current source image that satisfy a matching condition; if so, determining that the current source image is an image to be matched, otherwise determining that the current source image is not an image to be matched, wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
In another aspect, the present invention provides an image matching apparatus, comprising:
a position determining unit, configured to determine a target position of a template image according to gray value information of the template image, preset a threshold range according to the target position, and determine a position of each image to be matched according to gray value information of each image to be matched;
an initialization unit, configured to set a first preset number of particles and initialize a current velocity and a current position of each particle;
a transition velocity unit, configured to, for each particle, determine a transition velocity of the current particle according to a preset weight of the current particle, the current velocity of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
a first transition position unit, configured to, for each particle, determine a transition position of the current particle according to the transition velocity and the current position;
a first judging unit, configured to judge, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, trigger a matching unit, otherwise trigger a current position unit;
the current position unit, configured to, for each particle, take the transition velocity of the current particle as the current velocity of the current particle and the transition position of the current particle as the current position of the current particle, and trigger the transition velocity unit;
the matching unit, configured to determine the image to be matched that matches the template image according to the transition positions of the particles within the threshold range and the positions of the images to be matched.
Further, the transition velocity unit is specifically configured to, for each particle, determine the transition velocity of the current particle according to formula one:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] - present[i]) + c_2 × random2 × (gbest - present[i]) + random(v)
where v_g[i] is the transition velocity of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current velocity of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle velocity, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
Further, the apparatus further comprises:
a second judging unit, configured to, for each particle, judge whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, trigger the current position unit, otherwise trigger a third judging unit;
the third judging unit, configured to, for each particle, judge whether the number of times the transition position of the current particle coincides with the historical positions of the current particle is greater than a third preset value; if so, trigger a second transition position unit, otherwise trigger the current position unit;
the second transition position unit, configured to, for each particle, take the current position of a particle adjacent to the current particle as the transition position of the current particle, and trigger the current position unit.
Further, the position determining unit comprises:
a first partitioning unit, configured to, on the template image, take any point on the template image as a template coordinate origin, extract a first predetermined number of template subregions from the template image, and determine the coordinate value of each template pixel in each template subregion;
a second partitioning unit, configured to, for each image to be matched, take the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the matching pixel on the current image to be matched that corresponds to each template pixel, and determine the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
a target position determining unit, configured to calculate the gray value of each template subregion of the template image, and determine the target position of the template image according to the gray values of the template subregions of the template image;
a matching position determining unit, configured to, for each image to be matched, calculate the gray value of each matching subregion of the current image to be matched, and determine the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched.
Further, the first transition position unit is specifically configured to, for each particle, determine the transition position of the current particle according to formula two:
present_g[i] = present[i] + v_g[i]
where present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition velocity of the i-th particle.
Further, the apparatus further comprises:
a third partitioning unit, configured to, on the template image, take any point on the template image as the template coordinate origin, extract a second predetermined number of template subregions from the template image, and determine the coordinate value of each template pixel in each template subregion;
a fourth partitioning unit, configured to, for each source image, take the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the source pixel on the current source image that corresponds to each template pixel, and determine the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
a gray value determining unit, configured to determine the gray value of each template subregion of the template image and determine the gray value of each source subregion of each source image;
a screening unit, configured to, for each source image, judge whether there are a fourth preset number of source subregions in the current source image that satisfy a matching condition; if so, determine that the current source image is an image to be matched, otherwise determine that the current source image is not an image to be matched, wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
The present invention provides an image matching method and apparatus. Images are identified by their gray value information, and massless, volumeless particles are used as individuals that move by changing their velocity and position. The next velocity of a particle is determined by the preset weight, the current velocity, the current position, the best position among its historical positions, and the best position among the current positions of all particles, and the next position of the particle is then determined, so that the particles approach the target position. When more than a second preset number of particles have arrived within the threshold range, the image to be matched that matches the template image is determined according to the positions of the particles within the threshold range. The method and apparatus avoid matching a large number of images against the template image one by one, reduce the amount of computation and the processing time, and improve the processing efficiency of image matching.
Brief description of the drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of an image matching method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of another image matching method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of partitioning an image provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of an image matching apparatus provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of another image matching apparatus provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
As shown in Fig. 1, an embodiment of the present invention provides an image matching method, which may comprise the following steps:
S0: determining a target position of a template image according to gray value information of the template image, presetting a threshold range according to the target position, and determining a position of each image to be matched according to gray value information of each image to be matched;
S1: setting a first preset number of particles, and initializing a current velocity and a current position of each particle;
S2: for each particle, determining a transition velocity of the current particle according to a preset weight of the current particle, the current velocity of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
S3: for each particle, determining a transition position of the current particle according to the transition velocity and the current position;
S4: judging, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, performing step S6, otherwise performing step S5;
S5: for each particle, taking the transition velocity of the current particle as the current velocity of the current particle and the transition position of the current particle as the current position of the current particle, and returning to step S2;
S6: determining the image to be matched that matches the template image according to the transition positions of the particles within the threshold range and the positions of the images to be matched.
An embodiment of the present invention provides an image matching method. Images are identified by their gray value information, and massless, volumeless particles are used as individuals that move by changing their velocity and position. The next velocity of a particle is determined by the preset weight, the current velocity, the current position, the best position among its historical positions, and the best position among the current positions of all particles, and the next position of the particle is then determined, so that the particles approach the target position. When more than the second preset number of particles have arrived within the threshold range, the image to be matched that matches the template image is determined according to the positions of the particles within the threshold range. This method avoids matching a large number of images against the template image one by one, reduces the amount of computation and the processing time, and improves the processing efficiency of image matching.
In this method, the threshold range may be the range within which the distance to the target position is less than or equal to a preset distance. The best position among all historical positions of the current particle may be the historical position closest to the target position, and the best position among the current positions of all particles may be the current position of the particle closest to the target position. When more than the second preset number of particles are within the threshold range, the method can be considered to have converged, and the matching result is output. In step S1, when the particles are initialized, the position of any image to be matched may be used as the current position of a particle.
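For illustration only (this sketch is not part of the original disclosure), the convergence test of step S4 may be written, for example, as follows, under the assumption that a position is a vector of three subregion gray values and that distance to the target position is measured as Euclidean distance:
#include <math.h>
/* Illustrative sketch: counts how many particles lie within the preset distance of the
 * target position; the function name and the distance metric are assumptions. */
int count_particles_in_threshold(const double present[][3], int particle_count,
                                 const double target[3], double preset_distance)
{
    int inside = 0;
    for (int i = 0; i < particle_count; i++) {
        double d2 = 0.0;
        for (int k = 0; k < 3; k++) {            /* three subregion gray values per position */
            double diff = present[i][k] - target[k];
            d2 += diff * diff;
        }
        if (sqrt(d2) <= preset_distance) {
            inside++;
        }
    }
    return inside;   /* compared against the second preset value to decide convergence */
}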
In a possible implementation, S2 comprises:
for each particle, determining the transition velocity of the current particle according to formula one:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] - present[i]) + c_2 × random2 × (gbest - present[i]) + random(v)
where v_g[i] is the transition velocity of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current velocity of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle velocity, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
With this implementation, each particle learns from the particle closest to the target position, so that each particle moves toward the target position as far as possible. Formula one specifies a behavior rule for each particle, and each particle moves according to this rule; by following the particle closest to the target position, a solution satisfying the requirements is finally obtained. Adding the random particle velocity term helps prevent a particle from returning to its historical positions. Here c_1 and c_2 may be set to 2, and random1 and random2 may be random numbers in (0, 1), which makes the motion of the particles more random.
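For illustration only, formula one may be sketched in C as follows for a single coordinate of the i-th particle; the helper names rand01 and rand_velocity are assumptions and are not part of the original disclosure:
#include <stdlib.h>
/* Illustrative sketch of formula one for one coordinate of particle i. */
double rand01(void) { return (double)rand() / (double)RAND_MAX; }
double transition_velocity(double w_i, double v_i, double present_i,
                           double pbest_i, double gbest,
                           double c1, double c2, double rand_velocity)
{
    double random1 = rand01();
    double random2 = rand01();
    return w_i * v_i
         + c1 * random1 * (pbest_i - present_i)   /* pull toward the particle's own best position */
         + c2 * random2 * (gbest - present_i)     /* pull toward the best position of all particles */
         + rand_velocity;                         /* random term that helps avoid revisiting positions */
}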
To prevent a particle from walking a repeated route, in a possible implementation, before S5 the method further comprises:
A1: for each particle, judging whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, performing step S5, otherwise performing step A2;
A2: for each particle, judging whether the number of times the transition position of the current particle coincides with the historical positions of the current particle is greater than a third preset value; if so, performing step A3, otherwise performing step S5;
A3: for each particle, taking the current position of a particle adjacent to the current particle as the transition position of the current particle, and performing step S5.
In this implementation, when the transition position is not the best position among all historical positions of the current particle and the number of times the transition position of the current particle coincides with its historical positions is greater than the third preset value, the current particle is determined to be walking a repeated route. To avoid this, the transition position of the current particle is set to the current position of the particle adjacent to the current particle, where the adjacent particle may be the particle with the smallest distance to the current particle. When the transition position is the best position among all historical positions of the current particle, the number of coincidences is allowed to exceed the third preset value, because in this case the current particle may already have reached the optimal position closest to the target position. This implementation prevents particles from walking repeated routes, thereby avoiding local deadlock and improving processing efficiency.
This implementation can be realized with code such as the following:
if (present[i] == cache && present[i] != pbest[i])   // the current particle is repeating a visited position and is not at its personal best
{
    present[i] = present[j];                          // a repeated route is detected, so the position is replaced with that of the adjacent particle
}
where present[j] is the current position of the particle adjacent to the current particle, and cache denotes a recorded historical position of the current particle.
In a possible implementation, S3 comprises:
for each particle, determining the transition position of the current particle according to formula two:
present_g[i] = present[i] + v_g[i]
where present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition velocity of the i-th particle.
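For illustration only, formula two amounts to a component-wise vector addition; a minimal sketch, under the assumption that a position is a vector of subregion gray values:
/* Illustrative sketch of formula two: component-wise position update for one particle. */
void transition_position(const double present_i[], const double v_g_i[],
                         double present_g_i[], int dims)
{
    for (int k = 0; k < dims; k++) {
        present_g_i[k] = present_i[k] + v_g_i[k];   /* present_g[i] = present[i] + v_g[i] */
    }
}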
In a possible implementation, S0 comprises: on the template image, taking any point on the template image as a template coordinate origin, extracting a first predetermined number of template subregions from the template image, and determining the coordinate value of each template pixel in each template subregion;
for each image to be matched, taking the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the matching pixel on the current image to be matched that corresponds to each template pixel, and determining the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
calculating the gray value of each template subregion of the template image, and determining the target position of the template image according to the gray values of the template subregions of the template image;
for each image to be matched, calculating the gray value of each matching subregion of the current image to be matched, and determining the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched.
In this implementation, a first predetermined number of template subregions are extracted from the template image, and the matching subregion corresponding to each template subregion is extracted from each image to be matched. For example, if the coordinate origin of the template image is its center, the coordinate origin of an image to be matched is also its center; if one template subregion is a circle with center (1, 1) and radius 1, the corresponding matching subregion is also a circle with center (1, 1) and radius 1. Further, suppose the template image has three template subregions A, B, and C with gray values 10, 20, and 30, and the matching subregions of an image to be matched A corresponding to template subregions A, B, and C are D, E, and F with gray values 40, 50, and 60; then the target position of the template image is (10, 20, 30) and the position of image to be matched A is (40, 50, 60).
In this implementation, a particle within the threshold range is a particle very close to the template image; that is, the gray value of each matching subregion of the image to be matched corresponding to that particle is very close to the gray value of the corresponding template subregion of the template image. This means that the matching subregions of that image to be matched are very similar to the corresponding template subregions of the template image, so it can be confirmed that the image to be matched matches the template image.
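For illustration only, a minimal sketch of turning subregions into a position vector follows, under the assumption that the gray value of a subregion is the mean intensity of its pixels (an aggregation the disclosure does not specify); the type and function names are assumptions:
/* Illustrative sketch: one position coordinate per subregion, taken as the mean pixel intensity. */
typedef struct {
    const unsigned char *pixels;  /* intensities of the pixels in this subregion */
    int pixel_count;
} Subregion;
void position_from_subregions(const Subregion subregions[], int subregion_count,
                              double position[])
{
    for (int s = 0; s < subregion_count; s++) {
        double sum = 0.0;
        for (int p = 0; p < subregions[s].pixel_count; p++) {
            sum += subregions[s].pixels[p];
        }
        position[s] = sum / subregions[s].pixel_count;  /* gray value of subregion s */
    }
}
The same routine would apply to the template image (giving the target position) and to each image to be matched (giving its position).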
To further reduce the amount of computation, in a possible implementation, before S0 the method further comprises:
on the template image, taking any point on the template image as the template coordinate origin, extracting a second predetermined number of template subregions from the template image, and determining the coordinate value of each template pixel in each template subregion;
for each source image, taking the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the source pixel on the current source image that corresponds to each template pixel, and determining the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
determining the gray value of each template subregion of the template image, and determining the gray value of each source subregion of each source image;
for each source image, judging whether there are a fourth preset number of source subregions in the current source image that satisfy a matching condition; if so, determining that the current source image is an image to be matched, otherwise determining that the current source image is not an image to be matched, wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
With this implementation, the source images that do not satisfy the condition are weeded out, and the remaining images are the images to be matched, which narrows the processing range of the subsequent processing and speeds up processing. In addition, the gray values of the template subregions of the template image in this implementation can be used to determine the target position of the template image, and the gray values of the source subregions of each source image can be used to determine the positions of the images to be matched.
Furthermore, the source images may be sorted according to the gray values of their source subregions, and the images to be matched may be determined by binary search, which can further improve processing speed.
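For illustration only, the pre-screening of source images described above may be sketched as follows; the function and parameter names are assumptions:
#include <math.h>
#include <stdbool.h>
/* Illustrative sketch: a source image is kept as an image to be matched when at least
 * `fourth_preset` of its source subregions satisfy the matching condition. */
bool is_image_to_be_matched(const double template_gray[], const double source_gray[],
                            int subregion_count, double preset_gray_range,
                            int fourth_preset)
{
    int satisfied = 0;
    for (int s = 0; s < subregion_count; s++) {
        /* matching condition: gray value difference lies within the preset range */
        if (fabs(source_gray[s] - template_gray[s]) <= preset_gray_range) {
            satisfied++;
        }
    }
    return satisfied >= fourth_preset;
}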
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the drawings and a specific embodiment.
In this embodiment, the image matching the template image is found among all source images in an image library.
As shown in Fig. 2, an embodiment of the present invention provides an image matching method, which may comprise the following steps:
Step 201: on the template image, taking any point on the template image as the template coordinate origin, extracting three template subregions from the template image, and determining the coordinate value of each template pixel in each template subregion.
For example, the coordinate origin is the center point of the template image, and the three template subregions of the template image are region A, region B, and region C; the coordinate value of a pixel A in region A is (1, 1).
As shown in Fig. 3, the images can be partitioned in the manner illustrated there: for the template image, the black portions are the extracted template subregions, and for a source image, the black portions are the extracted source subregions. If two images match, the gray values of each pair of corresponding subregions of the two images are close.
Step 202: for each source image, taking the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the source pixel on the current source image that corresponds to each template pixel, and determining the source subregion corresponding to each template subregion according to the coordinate values of all source pixels.
For example, the point on the current source image corresponding to the center point of the template image is the center point of the current source image, which is taken as the coordinate origin of the current source image. The coordinate value of pixel A in region A of the template image is (1, 1), so the coordinate of the source pixel on the current source image corresponding to pixel A is (1, 1). In this manner, the source pixel corresponding to each template pixel is determined, and the source subregion corresponding to each template subregion is then determined.
Step 203: determining the gray value of each template subregion of the template image, and determining the gray value of each source subregion of each source image.
Step 204: for each source image, judging whether there are a fourth preset number of source subregions in the current source image that satisfy the matching condition; if so, determining that the current source image is an image to be matched, otherwise determining that the current source image is not an image to be matched, wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within the preset gray value range.
For example, the gray value of template subregion A of the template image is 10, the source subregion of the current source image corresponding to region A is region D, and the gray value of region D is 13; if the preset gray value range is a difference of at most 5, region D satisfies the matching condition. The other source subregions are handled analogously.
Step 205: calculating the gray value of each template subregion of the template image, determining the target position of the template image according to the gray values of the template subregions, and presetting the threshold range according to the target position.
For example, the three template subregions of the template image are region A, region B, and region C, with gray values 10, 11, and 12, so the target position is determined to be (10, 11, 12).
Step 206: for each image to be matched, calculating the gray value of each source subregion of the current image to be matched, and determining the position of the current image to be matched according to the gray values of the source subregions of the current image to be matched.
For example, the source subregions of the current image to be matched corresponding to regions A, B, and C of the template image are region D, region E, and region F, and the gray values of these three source subregions are 14, 15, and 16, so the position of the current image to be matched is (14, 15, 16).
Step 207: setting a first preset number of particles, and initializing the current velocity and current position of each particle.
For example, 10 particles are set; for one of the particles, the current velocity may be initialized to 10 and the current position to (1, 1, 1).
Step 208: for each particle, determining the transition velocity of the current particle according to the preset weight of the current particle, the current velocity of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles.
Specifically, this step can be realized as follows:
for each particle, determining the transition velocity of the current particle according to formula one:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] - present[i]) + c_2 × random2 × (gbest - present[i]) + random(v)
where v_g[i] is the transition velocity of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current velocity of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle velocity, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
Step 209: for each particle, determining the transition position of the current particle according to the transition velocity and the current position.
Specifically, this can be realized as follows:
for each particle, determining the transition position of the current particle according to formula two:
present_g[i] = present[i] + v_g[i]
where present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition velocity of the i-th particle.
Step 210: judging, according to the transition position of each particle, whether the number of particles within the threshold range is greater than the second preset value; if so, performing step 212, otherwise performing step 211.
For example, the second preset value is 8.
Step 211: for each particle, taking the transition velocity of the current particle as the current velocity of the current particle and the transition position of the current particle as the current position of the current particle, and returning to step 208.
Step 212: determining the image to be matched that matches the template image according to the transition positions of the particles within the threshold range and the positions of the images to be matched.
For example, the transition position of a particle A within the threshold range is (14, 15, 16), and there is an image to be matched B whose position is (14, 15, 16); image to be matched B is then determined to match the template image.
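For illustration only, step 212 may be sketched as follows, under the assumption that the matched image is the one whose position vector is nearest to the transition position of a particle within the threshold range (the example above shows only the exact-equality case); the function name is an assumption:
#include <math.h>
/* Illustrative sketch of step 212: picks the image to be matched whose position vector
 * is closest to the transition position of a particle inside the threshold range. */
int best_matching_image(const double particle_pos[3],
                        const double image_positions[][3], int image_count)
{
    int best = -1;
    double best_d2 = INFINITY;
    for (int m = 0; m < image_count; m++) {
        double d2 = 0.0;
        for (int k = 0; k < 3; k++) {
            double diff = image_positions[m][k] - particle_pos[k];
            d2 += diff * diff;
        }
        if (d2 < best_d2) {
            best_d2 = d2;
            best = m;
        }
    }
    return best;   /* index of the image to be matched that matches the template image */
}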
In addition, before step 211, the method may further comprise:
A1: for each particle, judging whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, performing step 211, otherwise performing step A2;
A2: for each particle, judging whether the number of times the transition position of the current particle coincides with the historical positions of the current particle is greater than the third preset value; if so, performing step A3, otherwise performing step 211;
A3: for each particle, taking the current position of a particle adjacent to the current particle as the transition position of the current particle, and performing step 211.
The image matching method provided by this embodiment of the present invention identifies images by partitioning them and using the gray value of each subregion, and makes the particles approach the target position by having them move according to certain rules; after the particles arrive within the threshold range, the image to be matched that matches the template image is determined according to the positions of the particles. This method improves the accuracy and the matching speed of image matching.
As shown in Fig. 4 and Fig. 5, an embodiment of the present invention provides an image matching apparatus. The apparatus embodiment may be implemented in software, or in hardware, or in a combination of hardware and software. From a hardware perspective, Fig. 4 is a hardware architecture diagram of the device on which the image matching apparatus provided by the embodiment of the present invention resides; besides the processor, memory, network interface, and nonvolatile storage shown in Fig. 4, the device on which the apparatus resides may usually also comprise other hardware, such as a forwarding chip responsible for processing packets. In terms of software implementation, as shown in Fig. 5, the apparatus, as a logical entity, is formed by the CPU of the device on which it resides reading the corresponding computer program instructions from the nonvolatile storage into memory and running them. The image matching apparatus provided by this embodiment comprises:
a position determining unit 501, configured to determine a target position of a template image according to gray value information of the template image, preset a threshold range according to the target position, and determine a position of each image to be matched according to gray value information of each image to be matched;
an initialization unit 502, configured to set a first preset number of particles and initialize a current velocity and a current position of each particle;
a transition velocity unit 503, configured to, for each particle, determine a transition velocity of the current particle according to a preset weight of the current particle, the current velocity of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
a first transition position unit 504, configured to, for each particle, determine a transition position of the current particle according to the transition velocity and the current position;
a first judging unit 505, configured to judge, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, trigger a matching unit 507, otherwise trigger a current position unit 506;
the current position unit 506, configured to, for each particle, take the transition velocity of the current particle as the current velocity of the current particle and the transition position of the current particle as the current position of the current particle, and trigger the transition velocity unit 503;
the matching unit 507, configured to determine the image to be matched that matches the template image according to the transition positions of the particles within the threshold range and the positions of the images to be matched.
In a possible implementation, the transition velocity unit 503 is specifically configured to, for each particle, determine the transition velocity of the current particle according to formula one:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] - present[i]) + c_2 × random2 × (gbest - present[i]) + random(v)
where v_g[i] is the transition velocity of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current velocity of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle velocity, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
In a possible implementation, the apparatus further comprises:
a second judging unit, configured to, for each particle, judge whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, trigger the current position unit, otherwise trigger a third judging unit;
the third judging unit, configured to, for each particle, judge whether the number of times the transition position of the current particle coincides with the historical positions of the current particle is greater than a third preset value; if so, trigger a second transition position unit, otherwise trigger the current position unit;
the second transition position unit, configured to, for each particle, take the current position of a particle adjacent to the current particle as the transition position of the current particle, and trigger the current position unit.
In a possible implementation, the position determining unit 501 comprises:
a first partitioning unit, configured to, on the template image, take any point on the template image as a template coordinate origin, extract a first predetermined number of template subregions from the template image, and determine the coordinate value of each template pixel in each template subregion;
a second partitioning unit, configured to, for each image to be matched, take the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the matching pixel on the current image to be matched that corresponds to each template pixel, and determine the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
a target position determining unit, configured to calculate the gray value of each template subregion of the template image, and determine the target position of the template image according to the gray values of the template subregions of the template image;
a matching position determining unit, configured to, for each image to be matched, calculate the gray value of each matching subregion of the current image to be matched, and determine the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched.
In a possible implementation, the first transition position unit 504 is specifically configured to, for each particle, determine the transition position of the current particle according to formula two:
present_g[i] = present[i] + v_g[i]
where present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition velocity of the i-th particle.
In a possible implementation, the apparatus further comprises:
a third partitioning unit, configured to, on the template image, take any point on the template image as the template coordinate origin, extract a second predetermined number of template subregions from the template image, and determine the coordinate value of each template pixel in each template subregion;
a fourth partitioning unit, configured to, for each source image, take the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate value of each template pixel in each template subregion, the coordinate value of the source pixel on the current source image that corresponds to each template pixel, and determine the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
a gray value determining unit, configured to determine the gray value of each template subregion of the template image and determine the gray value of each source subregion of each source image;
a screening unit, configured to, for each source image, judge whether there are a fourth preset number of source subregions in the current source image that satisfy a matching condition; if so, determine that the current source image is an image to be matched, otherwise determine that the current source image is not an image to be matched, wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
The information interaction, execution process, and other details between the units in the above apparatus are based on the same concept as the method embodiments of the present invention; for particulars, reference may be made to the description of the method embodiments, which is not repeated here.
The image matching method and apparatus provided by the embodiments of the present invention have the following beneficial effects:
1. The embodiments of the present invention provide an image matching method and apparatus. Images are identified by their gray value information, and massless, volumeless particles are used as individuals that move by changing their velocity and position. The next velocity of a particle is determined by the preset weight, the current velocity, the current position, the best position among its historical positions, and the best position among the current positions of all particles, and the next position of the particle is then determined, so that the particles approach the target position. When more than the second preset number of particles have arrived within the threshold range, the image to be matched that matches the template image is determined according to the positions of the particles within the threshold range. The method and apparatus avoid matching a large number of images against the template image one by one, reduce the amount of computation and the processing time, and improve the processing efficiency of image matching.
2. In the embodiments of the present invention, when the transition position is not the best position among all historical positions of the current particle and the number of times the transition position of the current particle coincides with its historical positions is greater than the third preset value, the current particle is determined to be walking a repeated route. To avoid this, the transition position of the current particle is set to the current position of the particle adjacent to the current particle, where the adjacent particle may be the particle with the smallest distance to the current particle. When the transition position is the best position among all historical positions of the current particle, the number of coincidences is allowed to exceed the third preset value, because in this case the current particle may already have reached the optimal position closest to the target position. This prevents particles from walking repeated routes, thereby avoiding local deadlock and improving processing efficiency.
3. The embodiments of the present invention identify images by partitioning them and using the gray value of each subregion, and make the particles approach the target position by having them move according to certain rules; after the particles arrive within the threshold range, the image to be matched that matches the template image is determined according to the positions of the particles, which improves the accuracy and the matching speed of image matching.
It should be noted that, in this document, relational terms such as first and second are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article, or device that comprises the element.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
Finally, it should be noted that the above are only preferred embodiments of the present invention, intended only to illustrate the technical solutions of the present invention and not to limit the protection scope of the present invention. Any modifications, equivalent replacements, improvements, and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. An image matching method, characterized by comprising: S0: determining the target position of a template image according to gray value information of the template image, presetting a threshold range according to the target position, and determining the position of each image to be matched according to gray value information of each image to be matched; the method further comprising:
S1: setting a first preset value number of particles, and initializing a current speed and a current position of each particle;
S2: for each particle, determining a transition speed of the current particle according to a preset weight of the current particle, the current speed of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
S3: for each particle, determining a transition position of the current particle according to the transition speed and the current position;
S4: judging, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, performing step S6, otherwise performing step S5;
S5: for each particle, taking the transition speed of the current particle as the current speed of the current particle and the transition position of the current particle as the current position of the current particle, and returning to step S2;
S6: determining, according to the transition positions of the particles within the threshold range and the position of each image to be matched, the image to be matched that matches the template image.
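For illustration only (and not part of the claims), the sketch below walks through steps S1 to S6 of claim 1 in Python, assuming that positions are points in a low-dimensional feature space and that closeness to the target position measures fitness. The constants NUM_PARTICLES, MIN_IN_RANGE, THRESHOLD_RADIUS, W, C1, C2 and the max_iters cap are placeholder assumptions, not values taken from the patent.

import random
import numpy as np

NUM_PARTICLES = 30         # "first preset value": number of particles
MIN_IN_RANGE = 10          # "second preset value": particles required inside the threshold range
THRESHOLD_RADIUS = 5.0     # assumed radius of the preset threshold range around the target
W, C1, C2 = 0.7, 1.5, 1.5  # preset weight and the two preset position parameters

def pso_match(target_pos, candidate_positions, search_bounds, max_iters=200):
    """Return indices of candidate images whose positions lie nearest the
    particles that end up inside the threshold range around target_pos."""
    target_pos = np.asarray(target_pos, dtype=float)
    candidate_positions = np.asarray(candidate_positions, dtype=float)
    lo, hi = search_bounds
    dim = target_pos.shape[0]

    pos = np.random.uniform(lo, hi, (NUM_PARTICLES, dim))      # S1: current positions
    vel = np.random.uniform(-1.0, 1.0, (NUM_PARTICLES, dim))   # S1: current speeds
    pbest = pos.copy()                                          # best historical position of each particle
    gbest = pos[np.argmin(np.linalg.norm(pos - target_pos, axis=1))].copy()

    for _ in range(max_iters):                                  # iteration cap added for the sketch
        r1, r2 = random.random(), random.random()
        # S2: transition speed (formula one, omitting the random-speed term)
        vel_t = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
        pos_t = pos + vel_t                                     # S3: transition position (formula two)

        # S4: how many particles lie inside the threshold range?
        dists = np.linalg.norm(pos_t - target_pos, axis=1)
        inside = dists <= THRESHOLD_RADIUS
        if np.sum(inside) > MIN_IN_RANGE:
            # S6: match each in-range particle to the nearest candidate position
            return sorted({int(np.argmin(np.linalg.norm(candidate_positions - p, axis=1)))
                           for p in pos_t[inside]})

        # S5: accept the transition speed/position as the new current speed/position
        vel, pos = vel_t, pos_t
        better = np.linalg.norm(pos - target_pos, axis=1) < np.linalg.norm(pbest - target_pos, axis=1)
        pbest[better] = pos[better]
        gbest = pos[np.argmin(np.linalg.norm(pos - target_pos, axis=1))].copy()

    return []

A call such as pso_match(target_position, image_positions, (0.0, 255.0)) would then return the indices of the images treated as matches; in practice the positions and the threshold range come from the gray-value computation of step S0.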
2. The method according to claim 1, characterized in that step S2 comprises:
for each particle, determining the transition speed of the current particle according to formula one, where formula one is:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] − present[i]) + c_2 × random2 × (gbest − present[i]) + random(v)
wherein v_g[i] is the transition speed of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current speed of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle speed, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
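For illustration only, a direct transcription of formula one into Python follows; the default values of w_i, c_1, c_2 and the scale of the random-speed term random(v) are assumptions for the example and are not specified by the claim.

import random

def transition_speed(v_i, present_i, pbest_i, gbest, w_i=0.7, c1=1.5, c2=1.5,
                     random_speed_scale=0.1):
    """Apply formula one componentwise to the speed/position vectors of particle i."""
    random1, random2 = random.random(), random.random()
    random_v = [random.uniform(-random_speed_scale, random_speed_scale)
                for _ in v_i]  # the random particle speed term random(v)
    return [w_i * v + c1 * random1 * (pb - p) + c2 * random2 * (gb - p) + rv
            for v, p, pb, gb, rv in zip(v_i, present_i, pbest_i, gbest, random_v)]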
3. The method according to claim 1, characterized in that, before step S5, the method further comprises:
A1: for each particle, judging whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, performing step S5, otherwise performing step A2;
A2: for each particle, judging whether the number of historical positions of the current particle that are identical to its transition position is greater than a third preset value; if so, performing step A3, otherwise performing step S5;
A3: for each particle, taking the current position of the particle adjacent to the current particle as the transition position of the current particle, and performing step S5.
4. The method according to claim 1, characterized in that
step S0 comprises: on the template image, taking any point on the template image as a template coordinate origin, extracting a first predetermined number of template subregions from the template image, and determining the coordinate values of each template pixel in each template subregion;
for each image to be matched, taking the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate values of each template pixel in each template subregion, the coordinate values of the matching pixel on the current image to be matched corresponding to each template pixel, and determining the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
calculating the gray value of each template subregion of the template image, and determining the target position of the template image according to the gray values of the template subregions of the template image;
for each image to be matched, calculating the gray value of each matching subregion of the current image to be matched, and determining the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched;
and/or,
step S3 comprises:
for each particle, determining the transition position of the current particle according to formula two, where formula two is:
present_g[i] = present[i] + v_g[i]
wherein present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition speed of the i-th particle.
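For illustration only, the sketch below shows one possible reading of the subregion bookkeeping in step S0 of claim 4, in which the "position" of an image is taken to be the vector of mean gray values of its subregions; the exact mapping from gray values to positions is left to the description, so this choice, like the numpy dependency and the helper names, is an assumption.

import numpy as np

def subregion_gray_values(gray_image, origin, subregion_offsets, subregion_size):
    """gray_image: 2-D array of gray values; origin: (row, col) coordinate origin;
    subregion_offsets: (d_row, d_col) top-left offset of each subregion relative
    to the origin; subregion_size: (height, width) shared by all subregions."""
    h, w = subregion_size
    values = []
    for dr, dc in subregion_offsets:
        r, c = origin[0] + dr, origin[1] + dc
        block = gray_image[r:r + h, c:c + w]
        values.append(float(block.mean()))   # gray value of this subregion
    return np.array(values)

# The template's vector serves as the target position; each image to be matched
# gets its position from the corresponding matching subregions, e.g.:
#   target_position = subregion_gray_values(template, template_origin, offsets, size)
#   image_position  = subregion_gray_values(candidate, candidate_origin, offsets, size)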
5. The method according to claim 1, characterized in that, before step S0, the method further comprises:
on the template image, taking any point on the template image as a template coordinate origin, extracting a second predetermined number of template subregions from the template image, and determining the coordinate values of each template pixel in each template subregion;
for each source image, taking the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determining, according to the coordinate values of each template pixel in each template subregion, the coordinate values of the source pixel on the current source image corresponding to each template pixel, and determining the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
determining the gray value of each template subregion of the template image, and determining the gray value of each source subregion of each source image;
for each source image, judging whether there are at least a fourth preset value number of source subregions in the current source image that satisfy a matching condition; if so, determining that the current source image is an image to be matched, otherwise determining that the current source image is not an image to be matched; wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
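For illustration only, the pre-screening of claim 5 might be sketched as below; the default fourth_preset_value and gray_tolerance are placeholders, and the per-subregion gray values are assumed to be index-aligned between the template and the source image.

def is_image_to_be_matched(template_grays, source_grays,
                           fourth_preset_value=3, gray_tolerance=10.0):
    """Keep a source image only if enough of its source subregions have gray
    values within the preset range of the corresponding template subregions."""
    matching = sum(1 for t, s in zip(template_grays, source_grays)
                   if abs(t - s) <= gray_tolerance)   # the matching condition
    return matching >= fourth_preset_value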
6. An image matching apparatus, characterized by comprising:
a position determination unit, configured to determine the target position of a template image according to gray value information of the template image, preset a threshold range according to the target position, and determine the position of each image to be matched according to gray value information of each image to be matched;
an initialization unit, configured to set a first preset value number of particles and initialize a current speed and a current position of each particle;
a transition speed unit, configured to, for each particle, determine a transition speed of the current particle according to a preset weight of the current particle, the current speed of the current particle, the current position of the current particle, the best position among all historical positions of the current particle, and the best position among the current positions of all particles;
a first transition position unit, configured to, for each particle, determine a transition position of the current particle according to the transition speed and the current position;
a first judging unit, configured to judge, according to the transition position of each particle, whether the number of particles within the threshold range is greater than a second preset value; if so, trigger the matching unit, otherwise trigger the current position unit;
the current position unit, configured to, for each particle, take the transition speed of the current particle as the current speed of the current particle and the transition position of the current particle as the current position of the current particle, and trigger the transition speed unit;
the matching unit, configured to determine, according to the transition positions of the particles within the threshold range and the position of each image to be matched, the image to be matched that matches the template image.
7. The apparatus according to claim 6, characterized in that the transition speed unit is specifically configured to, for each particle, determine the transition speed of the current particle according to formula one, where formula one is:
v_g[i] = w_i × v[i] + c_1 × random1 × (pbest[i] − present[i]) + c_2 × random2 × (gbest − present[i]) + random(v)
wherein v_g[i] is the transition speed of the i-th particle, w_i is the preset weight of the i-th particle, v[i] is the current speed of the i-th particle, present[i] is the current position of the i-th particle, pbest[i] is the best position among all historical positions of the i-th particle, gbest is the best position among the current positions of all particles, random1 is a first random number, random2 is a second random number, random(v) is a random particle speed, c_1 is a preset first position parameter, and c_2 is a preset second position parameter.
8. The apparatus according to claim 6, characterized in that the apparatus further comprises:
a second judging unit, configured to, for each particle, judge whether the transition position of the current particle is the best position among all historical positions of the current particle; if so, trigger the current position unit, otherwise trigger a third judging unit;
the third judging unit, configured to, for each particle, judge whether the number of historical positions of the current particle that are identical to its transition position is greater than a third preset value; if so, trigger a second transition position unit, otherwise trigger the current position unit;
the second transition position unit, configured to, for each particle, take the current position of the particle adjacent to the current particle as the transition position of the current particle, and trigger the current position unit.
9. The apparatus according to claim 6, characterized in that
the position determination unit comprises:
a first partitioning unit, configured to, on the template image, take any point on the template image as a template coordinate origin, extract a first predetermined number of template subregions from the template image, and determine the coordinate values of each template pixel in each template subregion;
a second partitioning unit, configured to, for each image to be matched, take the point on the current image to be matched that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate values of each template pixel in each template subregion, the coordinate values of the matching pixel on the current image to be matched corresponding to each template pixel, and determine the matching subregion corresponding to each template subregion according to the coordinate values of all matching pixels;
a target position determining unit, configured to calculate the gray value of each template subregion of the template image, and determine the target position of the template image according to the gray values of the template subregions of the template image;
a matching position determining unit, configured to, for each image to be matched, calculate the gray value of each matching subregion of the current image to be matched, and determine the position of the current image to be matched according to the gray values of the matching subregions of the current image to be matched;
and/or,
the first transition position unit is specifically configured to, for each particle, determine the transition position of the current particle according to formula two, where formula two is:
present_g[i] = present[i] + v_g[i]
wherein present_g[i] is the transition position of the i-th particle, present[i] is the current position of the i-th particle, and v_g[i] is the transition speed of the i-th particle.
10. The apparatus according to claim 6, characterized in that the apparatus further comprises:
a third partitioning unit, configured to, on the template image, take any point on the template image as a template coordinate origin, extract a second predetermined number of template subregions from the template image, and determine the coordinate values of each template pixel in each template subregion;
a fourth partitioning unit, configured to, for each source image, take the point on the current source image that corresponds to the template coordinate origin as a coordinate origin, determine, according to the coordinate values of each template pixel in each template subregion, the coordinate values of the source pixel on the current source image corresponding to each template pixel, and determine the source subregion corresponding to each template subregion according to the coordinate values of all source pixels;
a gray value determining unit, configured to determine the gray value of each template subregion of the template image and the gray value of each source subregion of each source image;
a screening unit, configured to, for each source image, judge whether there are at least a fourth preset value number of source subregions in the current source image that satisfy a matching condition; if so, determine that the current source image is an image to be matched, otherwise determine that the current source image is not an image to be matched; wherein the matching condition comprises: the difference between the gray value of the current source subregion of the current source image and the gray value of the corresponding template subregion of the template image lies within a preset gray value range.
CN201510937636.3A 2015-12-15 2015-12-15 Image matching method and apparatus Pending CN105469120A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510937636.3A CN105469120A (en) 2015-12-15 2015-12-15 Image matching method and apparatus


Publications (1)

Publication Number Publication Date
CN105469120A true CN105469120A (en) 2016-04-06

Family

ID=55606791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510937636.3A Pending CN105469120A (en) 2015-12-15 2015-12-15 Image matching method and apparatus

Country Status (1)

Country Link
CN (1) CN105469120A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833670A (en) * 2010-04-30 2010-09-15 北京航空航天大学 Image matching method based on lateral inhibition and chaos quantum particle swarm optimization
CN102969780A (en) * 2012-10-30 2013-03-13 天津大学 Off-grid wind/solar/battery hybrid power generation system capacity optimal configuration method
CN104200226A (en) * 2014-09-01 2014-12-10 西安电子科技大学 Particle filtering target tracking method based on machine learning
CN104361135A (en) * 2014-12-11 2015-02-18 浪潮电子信息产业股份有限公司 Image retrieval method
CN104915969A (en) * 2015-05-21 2015-09-16 云南大学 Template matching tracking method based on particle swarm optimization

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110503094A (en) * 2019-08-14 2019-11-26 中国电子科技集团公司第二十八研究所 Professional certificate photo name board recognition methods, device
CN114879746A (en) * 2022-07-13 2022-08-09 山东中宇航空科技发展有限公司 Flight route optimization method for agricultural plant protection unmanned aerial vehicle
CN114879746B (en) * 2022-07-13 2022-09-20 山东中宇航空科技发展有限公司 Flight route optimization method for agricultural plant protection unmanned aerial vehicle


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160406