CN103049735A - Method for detecting particular object in image and equipment for detecting particular object in image - Google Patents

Method for detecting particular object in image and equipment for detecting particular object in image

Info

Publication number
CN103049735A
CN103049735A CN2011103107651A CN201110310765A
Authority
CN
China
Prior art keywords
certain objects
energy
outer boundary
gradient
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011103107651A
Other languages
Chinese (zh)
Other versions
CN103049735B (en)
Inventor
刘殿超
师忠超
钟诚
刘童
王刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Priority to CN201110310765.1A priority Critical patent/CN103049735B/en
Publication of CN103049735A publication Critical patent/CN103049735A/en
Application granted granted Critical
Publication of CN103049735B publication Critical patent/CN103049735B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention provides a method for detecting a particular object in an image. The method comprises the following steps: a region-of-interest estimation step of estimating, in an input image to be processed, a region containing the particular object as a region of interest; a feature determination step of determining characteristic parameters of an object in the region of interest; an object-energy determination step of determining the energy of the object according to its characteristic parameters; and a particular-object discrimination step of comparing the determined energy of the object with a predetermined threshold and, if the energy of the object is equal to or greater than the predetermined threshold, judging the object to be the particular object. The invention correspondingly provides equipment for detecting a particular object in an image.

Description

Method for detecting a particular object in an image and equipment for detecting a particular object in an image
Technical field
The present invention relates to a method for detecting a particular object in an image and to equipment for detecting a particular object in an image.
Background technology
With the development of computer technology, the field of pattern recognition in images and video has also made significant progress; the demand for techniques that detect particular objects in images keeps growing, and considerable results have been achieved.
In pattern recognition, the performance of object detection depends on the choice of key features or key-feature combinations. In recent years, a large number of practical features have been applied in the object-detection field. Classification methods based on a single feature generally achieve recognition results of lower precision but higher efficiency, with many false detections. Taking cloud detection as an example, a recognition method based on a single color feature can recognize most clouds, but at the same time many objects whose color is similar to that of clouds may be falsely detected as clouds.
Patent document 1 (US 7,480,052 B1) proposes a method for detecting clouds in satellite cloud images based on electromagnetic spectral information. For a region of a satellite cloud image, the reflectance values within the bandwidth ranges of at least three discrete electromagnetic spectral bands are compared, and the ratios between them are then evaluated to obtain the cloud-detection result. However, patent document 1 targets satellite cloud images only and determines the detection result from electromagnetic spectral reflectance values; it is therefore not a general-purpose cloud-detection method, and its applicability is limited.
Non-patent document 1 (C. I. Christodoulou, S. C. Michaelides, C. S. Pattichis, K. Kyriakou, "Classification of satellite cloud imagery based on multi-feature texture analysis and neural networks", Proceedings of the 2001 International Conference on Image Processing, vol. 1, pp. 497-500, Dept. of Computer Science, University of Cyprus) proposes a method for distinguishing different cloud classes. Nine different texture feature sets (55 features in total) are extracted, and an effective cloud classifier is trained with a neural network. The texture features in non-patent document 1 include boundary, texture, and other features, but each feature is fed into the neural-network training separately and independently; the discriminative power of the features is limited, and the processing is rather complex.
Summary of the invention
The present invention has been made in view of the above problems in the prior art, and proposes a method and equipment for detecting a particular object in an image based on an energy model.
According to one aspect of an embodiment of the invention, a method for detecting a particular object in an image is proposed, comprising: a region-of-interest estimation step of estimating, in an input image to be processed, a region containing the particular object as a region of interest; a feature determination step of determining characteristic parameters of an object in the region of interest; an object-energy determination step of determining the energy of the object according to its characteristic parameters; and a particular-object discrimination step of comparing the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, judging the object to be the particular object.
According to another aspect of an embodiment of the invention, equipment for detecting a particular object in an image is proposed, comprising: a region-of-interest estimation device that estimates, in an input image to be processed, a region containing the particular object as a region of interest; a feature determination device that determines characteristic parameters of an object in the region of interest; an object-energy determination device that determines the energy of the object according to its characteristic parameters; and a particular-object discrimination device that compares the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, judges the object to be the particular object.
The above and other objects, features, advantages, and technical and industrial significance of the present invention will be better understood by reading the following detailed description of preferred embodiments of the invention in conjunction with the accompanying drawings.
Description of drawings
Fig. 1 is an overall flowchart of the method for detecting a particular object in an image according to an embodiment of the invention.
Fig. 2 shows an example of an image to be processed.
Fig. 3 schematically shows the region of interest for the particular object "cloud" estimated after region-of-interest estimation has been applied to the image shown in Fig. 2.
Fig. 4 shows another example of an image to be processed.
Fig. 5 schematically shows the region of interest for the particular object "cloud" estimated after region-of-interest estimation has been applied to the image shown in Fig. 4.
Fig. 6 is a schematic diagram of extracting object outer-boundary characteristic parameters from the region of interest in Fig. 2.
Fig. 7 is a schematic diagram of extracting object inner-boundary characteristic parameters from the image shown in Fig. 4.
Fig. 8 is an overall block diagram of the equipment for detecting a particular object in an image according to an embodiment of the invention.
Fig. 9 is an overall block diagram of a system for detecting a particular object in an image according to an embodiment of the invention.
Embodiment
Embodiments of the invention are described below with reference to the accompanying drawings.
Fig. 1 is an overall flowchart of the method for detecting a particular object in an image according to an embodiment of the invention. As shown in Fig. 1, the method may comprise: a region-of-interest estimation step S100, in which a region containing the particular object may be estimated in the input image to be processed as a region of interest; a feature determination step S200, in which the characteristic parameters of an object in the region of interest may be determined; an object-energy determination step S300, in which the energy of the object may be determined according to its characteristic parameters; and a particular-object discrimination step S400, in which the determined energy of the object may be compared with a predetermined threshold and, if the energy of the object is greater than or equal to the threshold, the object is judged to be the particular object.
In region-of-interest estimation step S100, the image to be processed may be divided into a plurality of regions, the color features of each region obtained, and a linear classifier used to judge whether each region matches the color features of the particular object; the regions that match are combined to obtain the region of interest.
The input image to be detected, also called the image to be processed, may first undergo a preliminary detection in region-of-interest estimation step S100 that excludes images which obviously do not contain the target object to be detected, i.e. the particular object, in order to reduce the burden of the subsequent processing.
The preliminary detection may be based on any single feature such as shape, color, or size, or on a combination of several features. Generally, a preliminary detection based on a single feature is very fast and can significantly reduce the number of images to be examined, but it has a relatively low detection precision: a certain number of images that do not in fact contain the particular object will pass the preliminary detection and enter the subsequent processing.
Suppose the particular object is a cloud. In region-of-interest estimation step S100, the input image may then undergo a preliminary detection of clouds based on a single color feature. Clouds can be roughly divided into classes such as white clouds, dark clouds, and dawn or sunset clouds; a large number of positive sample images (cloud images) and negative sample images (non-cloud images) of each class can be collected to train a classifier. Color-based cloud detection may use a linear classifier obtained by extracting the RGB color features of sufficiently many positive and negative sample images and then training with a support vector machine (SVM).
Region-of-interest estimation step S100 may make the preliminary judgment on the image pixel by pixel. To reduce the complexity of the processing, however, the image may also be divided into several rectangular blocks of equal size; for example, a 1024*768 image may be divided into 32*27 rectangular blocks. Obviously, the image sizes and the numbers of rows and columns of blocks that the embodiment of the invention can handle are not limited to these values. After the image has been processed in units of pixels or blocks, suppose the classifier judges one or more of these units to be the particular object; with cloud as the particular object and white clouds, dark clouds, and dawn or sunset clouds as detection targets, both the regions of the image that actually contain clouds and the regions that do not contain clouds but contain objects of cloud-like color will be detected.
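As a concrete illustration of the block-wise preliminary detection described above, the following Python sketch divides an image into equal rectangular blocks and tests each block's mean color. The `is_cloud_color` test is a hypothetical stand-in for the SVM-trained linear color classifier of the embodiment, not the classifier itself.

```python
# Sketch of the block-wise region-of-interest estimation of step S100.
# is_cloud_color is a hypothetical linear color test standing in for
# the SVM-trained classifier described in the text.

def is_cloud_color(mean_rgb):
    """Hypothetical test: bright, low-saturation mean color."""
    r, g, b = mean_rgb
    return (r + g + b) / 3.0 > 180 and max(r, g, b) - min(r, g, b) < 30

def estimate_roi(image, rows, cols):
    """Divide `image` (2-D list of (r, g, b) tuples) into rows x cols
    blocks and return the (row, col) indices of blocks whose mean
    color matches the particular object."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    roi = []
    for i in range(rows):
        for j in range(cols):
            pixels = [image[y][x]
                      for y in range(i * bh, (i + 1) * bh)
                      for x in range(j * bw, (j + 1) * bw)]
            n = len(pixels)
            mean = tuple(sum(p[c] for p in pixels) / n for c in range(3))
            if is_cloud_color(mean):
                roi.append((i, j))
    return roi
```

A 1024*768 image divided into 32*27 blocks, as in the text, would be `estimate_roi(image, 27, 32)` under this row-major convention.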
Fig. 2 shows an example of an image to be processed. Applying region-of-interest estimation step S100 to the image shown in Fig. 2 yields its region of interest for the particular object. Fig. 3 schematically shows the region of interest for the particular object "cloud" estimated after region-of-interest estimation has been applied to the image of Fig. 2: the image to be processed is divided into rectangular blocks of several rows and columns; the blocks marked with cross-hatching are estimated to constitute the region of interest, while the blocks without cross-hatching are determined not to involve the particular object.
Fig. 4 shows another example of an image to be processed. Applying region-of-interest estimation step S100 to the image shown in Fig. 4 likewise yields its region of interest for the particular object. Fig. 5 schematically shows the region of interest for the particular object "cloud" estimated after region-of-interest estimation has been applied to the image of Fig. 4; again, the blocks marked with cross-hatching are estimated to constitute the region of interest, while the blocks without cross-hatching are determined not to involve the particular object.
The object in Fig. 2 is a single cloud; with cloud as the particular object, the region of interest estimated by step S100 indeed contains only the object cloud. Fig. 4 contains several kinds of objects, such as buildings, cars, and clouds. Although the cloud is still the particular object, step S100 may also estimate regions containing the other objects as regions of interest for cloud; although these are, judged by the final result, misdetections, they are all retained at this step.
The regions detected by step S100 as possibly containing the particular object may be called regions of interest (ROI). Step S100 may record the positions of the estimated regions of interest as additional data attached to the image data of the image to be processed; the subsequent processing can then be carried out only for the detected regions of interest, relieving the pressure on the later stages.
The regions of interest estimated in step S100 may enter feature determination step S200 directly. Alternatively, an exclusion step may first remove regions of interest that are unlikely to contain the target particular object, further reducing the burden of the later processing.
The exclusion step may, for example, exclude regions of interest whose position in the image to be processed does not match the positional characteristics of the particular object. The criteria the exclusion step may adopt are not limited to the position of the region of interest; neighborhood features or other features may also be used for verification, so that many more false detections can be excluded.
Suppose the particular object that is ultimately to be detected is a cloud, such as a white cloud, a dark cloud, or a dawn or sunset cloud; the exclusion step can then remove false detections by verifying some important auxiliary features. Position, for example, can be a simple and effective feature. If the exclusion step uses the position feature, an object in a region of interest in the upper half of the image may well be a cloud, while an object in a region of interest in the lower half is unlikely to be a cloud. As another example, if the exclusion step uses a neighborhood feature: when an object in a region of interest is surrounded by blue or gray areas (the blue area possibly being sky, the gray area possibly being an overcast sky), the object may well be a cloud, and such a region of interest can be kept; when the object is not surrounded by blue or gray areas, it is unlikely to be a cloud and may instead be, for example, a board, a car, or skin, and such a region of interest can be excluded.
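The position-based exclusion described above can be sketched as follows. The ROI representation (pixel rows `top` and `bottom`) and the upper-half rule are illustrative assumptions drawn from the cloud example in the text.

```python
# Sketch of the exclusion step: drop regions of interest whose position
# does not match the particular object. For clouds, a region whose
# vertical center lies in the lower half of the image is unlikely to
# contain a cloud. The dict-based ROI representation is an assumption.

def exclude_by_position(rois, image_height):
    """Keep only ROIs whose vertical center is in the upper half.
    Each ROI is a dict with pixel rows 'top' and 'bottom'."""
    kept = []
    for roi in rois:
        center = (roi["top"] + roi["bottom"]) / 2.0
        if center < image_height / 2.0:  # upper half: may be a cloud
            kept.append(roi)
    return kept
```

A neighborhood-feature check (blue or gray surroundings) could be added as a second filter in the same style.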
The regions of interest estimated by step S100, or those not removed by the exclusion step, enter feature determination step S200 and the subsequent processing.
In feature determination step S200, the characteristic parameters of an object in the region of interest are determined; the characteristic parameters may include object outer-boundary characteristic parameters and object inner-boundary characteristic parameters.
As for the object outer-boundary characteristic parameters: the object boundaries in the region of interest may be determined and the outer boundary of each object extracted; the number of outer-boundary points is determined from the area occupied by the object; that number of points is placed equidistantly on the outer boundary; the magnitude value and direction value of the gradients at the outer-boundary points are determined; and the gradient magnitude value, the gradient direction value, and the number of outer-boundary points are taken as the object outer-boundary characteristic parameters.
The image from which feature determination step S200 obtains the outer-boundary characteristic parameters may still be, for example, the one shown in Fig. 2. It should be understood that, whether or not the exclusion step described above has been applied, the positions of the regions of interest in Fig. 2 and related information are known at this point.
Fig. 6 is a schematic diagram of extracting object outer-boundary characteristic parameters from the region of interest in Fig. 2.
First, a boundary-detection method, for example Sobel or Canny edge detection, may be used to extract the boundary of each object in the region of interest, and the outermost boundary is chosen as the outer boundary. Fig. 6 shows the extracted outer boundaries of the objects; the outer boundaries Le1, Le2, and Le3 of three objects are schematically labeled in order to illustrate the implementation of feature determination step S200. Those skilled in the art will understand that the outer boundaries of the other objects in Fig. 2 can also be extracted. For simplicity, those outer boundaries are not labeled in Fig. 6, but they can be processed in the same way as described below for the outer boundary Le1. Below, the extraction of the outer-boundary features of the enclosed object is illustrated taking Le1 as an example.
The area "area" occupied by the object enclosed by Le1 can be obtained by mature means, and the number of outer-boundary points Ns suited to this area can be determined from an empirical function Ns = f(area) obtained by analyzing a large number of positive sample images. That is, the value of Ns can be decided according to the size of the particular object to be detected. The position of the first outer-boundary point may be fixed by any rule, for example the highest, lowest, leftmost, or rightmost point, and the remaining points arranged in turn either clockwise or counterclockwise. A suitable value of Ns makes it possible to capture as many of the key features of the object's outside as possible.
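Placing Ns points at equal spacing along an extracted boundary can be sketched as arc-length resampling of the boundary polyline. The representation of the boundary as a list of (x, y) vertices is an assumption, and the empirical function Ns = f(area) is not reproduced here.

```python
import math

# Sketch of placing ns points equidistantly along a closed outer
# boundary given as a list of (x, y) vertices, starting at vertex 0.

def equidistant_points(boundary, ns):
    """Resample the closed polyline `boundary` into `ns` points spaced
    at equal arc length, starting from boundary[0]."""
    # build segments with their cumulative arc-length offsets
    segs = []
    total = 0.0
    m = len(boundary)
    for i in range(m):
        (x0, y0), (x1, y1) = boundary[i], boundary[(i + 1) % m]
        d = math.hypot(x1 - x0, y1 - y0)
        segs.append((total, (x0, y0), (x1, y1), d))
        total += d
    step = total / ns
    points = []
    for k in range(ns):
        target = k * step
        for start, p0, p1, d in segs:
            if d > 0 and start <= target <= start + d:
                t = (target - start) / d  # position within this segment
                points.append((p0[0] + t * (p1[0] - p0[0]),
                               p0[1] + t * (p1[1] - p0[1])))
                break
    return points
```

The same routine can be reused for the inner-boundary points Nb described later, since only the point count and the input polyline differ.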
Then the Ns outer-boundary points are placed equidistantly in turn on the outer boundary Le1. Fig. 6 shows the outer-boundary points on Le1; for purposes of illustration, four outer-boundary points Pe1, Pe2, Pe3, and Pe4 are schematically labeled. For simplicity, the other outer-boundary points of this object are not labeled, but they can be processed in the same way as described below for Pe1 to Pe4.
Then the gradient at each outer-boundary point can be computed by mature means. The gradient is a vector, and the gradient value comprises a magnitude and a direction. The arrows at the outer-boundary points Pe1 to Pe4 in Fig. 6 indicate the gradient direction at each point.
The mean of the gradient magnitudes of the outer-boundary points is computed as the gradient magnitude value of the outer-boundary points. That is, the mean msm of the gradient magnitudes of all Ns outer-boundary points is computed as the gradient magnitude value; msm can be used to measure the degree of gradation of the object's outer boundary.
Before or after computing the mean gradient magnitude of the outer-boundary points, or at the same time, the distribution of the adjacent-point angle differences of the outer-boundary gradients is computed as the gradient direction value of the outer-boundary points. The 360-degree angular range may be divided into a predetermined number of angular bins; the adjacent-point angle differences are assigned to those bins, and the number of bins that receive at least one angle difference is taken as the direction value.
For example, the difference of the gradient-direction angles of every two adjacent outer-boundary points of the object is computed. Taking Fig. 6 as an example and proceeding counterclockwise (a clockwise direction would obviously also do), the gradient-direction angles at Pe1, Pe2, Pe3, and Pe4 are differenced in turn: the angle at Pe4 minus the angle at Pe3, the angle at Pe3 minus the angle at Pe2, the angle at Pe2 minus the angle at Pe1, and so on, until the adjacent-point angle differences of all Ns outer-boundary points of the object enclosed by Le1 have been computed, completing one full circuit back to the point Pe4.
The differences are then grouped into preset angle-difference bins. Each bin covers 10 degrees, so there are 36 bins in total; the distribution of the angle differences over the bins is tallied, and the number of occupied bins ds is counted. For example, suppose there are 25 outer-boundary points (Ns = 25) and therefore 25 angle differences; if these 25 angle differences fall into 19 angular bins, the gradient direction value of the outer-boundary points is ds = 19.
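The direction value ds and the magnitude value msm described above can be sketched as follows. Wrapping the angle differences into [0, 360) before binning is an assumption, since the text does not spell out a wrapping convention.

```python
# Sketch of the direction value ds: adjacent-point gradient-direction
# angle differences are grouped into 36 bins of 10 degrees, and ds is
# the number of occupied bins. Differences are wrapped into [0, 360)
# before binning (an assumption).

def direction_value(angles_deg):
    """angles_deg: gradient-direction angles of the boundary points,
    in order around the contour. Returns ds, the occupied-bin count."""
    n = len(angles_deg)
    occupied = set()
    for i in range(n):
        diff = (angles_deg[(i + 1) % n] - angles_deg[i]) % 360.0
        occupied.add(int(diff // 10) % 36)
    return len(occupied)

def magnitude_value(magnitudes):
    """msm: mean gradient magnitude over the boundary points."""
    return sum(magnitudes) / len(magnitudes)
```

A smooth, regular contour concentrates its differences in few bins (small ds), while an irregular contour spreads them over many bins (large ds), matching the irregularity measure described in the text. The same functions yield mbm and db for the inner boundary.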
The statistics of the adjacent gradient-direction differences of the object's outer-boundary points measure the irregularity of the object's outside. The object outer-boundary characteristic parameters obtained by the above processing may comprise Ns, msm, and ds; these parameters then enter the subsequent processing used to build the energy model.
The object inner-boundary characteristic parameters may be extracted before or after the above extraction of the outer-boundary characteristic parameters, or at the same time.
As for the object inner-boundary characteristic parameters: the object boundaries in the region of interest may be determined and the inner boundaries of each object extracted; the number of inner-boundary points is determined from the total length of the inner boundary; that number of points is placed equidistantly on the inner boundary; the magnitude value and direction value of the gradients at the inner-boundary points are determined; and the gradient magnitude value, the gradient direction value, and the number of inner-boundary points are taken as the object inner-boundary characteristic parameters.
The image from which feature determination step S200 obtains the inner-boundary characteristic parameters may still be, for example, the one shown in Fig. 4. It should be understood that, whether or not the exclusion step described above has been applied, the positions of the regions of interest in Fig. 4 and related information are known at this point.
Fig. 7 is a schematic diagram of extracting object inner-boundary characteristic parameters from the image shown in Fig. 4. To show the extracted inner boundaries clearly, Fig. 7 takes the form of a binary image. Those skilled in the art will appreciate from the following description that feature determination step S200 of the embodiment can equally extract the inner boundaries of the image shown in Fig. 2 and the outer boundaries of the image shown in Fig. 4; using different images to illustrate step S200 reflects the general applicability of the embodiment of the invention.
The inner texture features of each object in the region of interest are extracted. For example, a boundary-detection method, such as Sobel or Canny edge detection, may extract the boundary of each object in the region of interest, and from this the inner boundaries of each object are extracted. In fact, the extraction of the outer-boundary characteristic parameters and the extraction of the inner-boundary characteristic parameters can share the result of a single boundary detection: the outermost boundary of each object is the outer boundary, and everything within the outer boundary is inner boundary. Then, for each object, all the boundaries inside the object are imagined as a whole; the interior boundary segments are connected, for example in top-to-bottom, left-to-right order, and the features of this whole virtual interior boundary are extracted to measure the character of the object's interior texture. Fig. 7 shows the extracted boundaries of each object, including outer and inner boundaries, and schematically labels the connected whole interior boundary Li of one object in order to illustrate the implementation of feature determination step S200.
Those skilled in the art will understand that the inner boundaries of the other objects in Fig. 4 can also be extracted. For simplicity, those inner boundaries are not labeled in Fig. 7, but they can be processed in the same way as described below for the inner boundary Li. Below, the extraction of the inner texture features is illustrated taking Li as an example.
The length "length" of the inner boundary Li can be obtained by mature means, and the number of inner-boundary points Nb suited to this length can be determined from an empirical function Nb = f(length) obtained by analyzing a large number of positive sample images. That is, the value of Nb can be decided according to the inner-boundary length of the particular object to be detected. The position of the first inner-boundary point may be fixed by any rule, for example the highest, lowest, leftmost, or rightmost point, and the remaining points arranged in turn either clockwise or counterclockwise. A suitable value of Nb makes it possible to capture as many of the key features of the object's interior as possible.
Then, on inner boundary Li, arrange equidistantly successively this Nb inner boundary point, in Fig. 7, show the inner boundary point on the inner boundary Li, wherein, for illustrative purposes, 4 inner boundary point Pi1, Pi2, Pi3, Pi4 have schematically been marked, for simplicity, other inner boundary point of this object is not given label, yet it will be understood by those skilled in the art that can be to process other inner boundary point with the processing of following processing same way as for inner boundary point Pi1, Pi2, Pi3, Pi4.
Then, can calculate by ripe means the gradient of each inner boundary point, gradient is vector, and Grad comprises gradient size values and gradient direction value.The arrow at inner boundary point Pi1, Pi2, Pi3, Pi4 place represents the direction of this some place gradient among Fig. 7.
Calculate the mean value of the size of inner boundary point gradient, as the size value of described inner boundary point gradient.Namely, calculate the mean value mbm of the gradient magnitude of all Nb inner boundary point, as the size value of described inner boundary point gradient, the size value mbm of this gradient can be used for weighing the gradual change degree on interior of articles border.
Can be before or after the mean value of the size of calculating outer boundary point gradient, or meanwhile, the distribution of the consecutive point differential seat angle of calculating inner boundary point gradient is as the direction metric of described inner boundary point gradient.Wherein, the angular ranges of 360 degree can be divided into the angular interval of predetermined number, the consecutive point differential seat angle is distributed to described angular interval, the number of the described angular interval that is distributed with described consecutive point differential seat angle as the direction metric.
For example, it is poor that the gradient direction angle value of all adjacent two inner boundary points of object is done, take Fig. 7 as example, suppose by from left to bottom right order (obviously also can along other direction) to inner frontier point Pi1, Pi2, Pi3, it is poor that the gradient direction angle value at Pi4 place takes turns doing, it is the angle of the angle-Pi2 of the Pi1 gradient direction of the ordering gradient direction of ordering, the angle of the gradient direction that the angle-Pi3 of the gradient direction that Pi2 is ordered is ordered, the angle of the gradient direction that the angle-Pi4 of the gradient direction that Pi3 is ordered is ordered, calculate so successively, until finish the calculating of the consecutive point differential seat angle of whole Nb inner boundary points on the Li, the angle of the gradient direction of the angle-starting point of the gradient direction of rearmost point, thus circulate a week.
Then, difference is grouped in the default angle difference segment.Each segment covers 10 degree, has 36 segments, adds up the distribution situation of these angle differences, and counts the segment number db that these angle differences distribute.For example, supposing has 25 inner boundary points (Nb=25), and 25 angle differences are then arranged, and supposes that these 25 angle differences are distributed in 4 angular interval sections, then the direction metric db=4 of described inner boundary point gradient.
Tallying the adjacent gradient-direction differences of the inner boundary points measures the irregularity of the object's inner texture. The object inner boundary feature parameters obtained by the above processing may comprise Nb, mbm and db; these parameters then enter the subsequent processing for establishing the energy model.
Next, in the object energy determining step S300, an energy model is established using the key features of the object determined in the preceding processing, namely the parameters of its outer boundary gradient feature and of its inner boundary texture feature. Those skilled in the art will appreciate that, although the above description explains separately, for different images, how to determine the parameters of the outer boundary gradient feature and of the inner boundary texture feature of an object, when the energy model is established it is necessarily built from the outer boundary feature parameters and the inner boundary feature parameters of the same object.
Specifically, in the object energy determining step S300, the object outer boundary energy may be determined based on the object outer boundary feature parameters, the object inner boundary energy may be determined based on the object inner boundary feature parameters, and the object outer boundary energy and the object inner boundary energy may be added with predefined weights to obtain the energy of the object.
Specifically, for a given object, its energy model may be established by the following formula (1):
E_object = E_surface + k · E_body    (1)
where E_object is the total energy of the object, E_surface is the outer boundary energy of the object, and E_body is the inner boundary energy of the object; k is a weight parameter between the outer and inner energies, and may be an optimal value obtained by machine learning through training on a large number of samples.
In the case where the particular object to be detected is a cloud, E_object may be written as E_cloud, and in that case k is the result obtained by training on a large number of sample images of clouds. In the case where the particular object to be detected is some other particular object, the value of k is obtained by training on a large number of sample images of that other particular object.
Specifically, the object outer boundary energy may be determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points, and from the magnitude value of the outer boundary point gradients. For example, the object outer boundary energy E_surface may be calculated by the following formula (2):
E_surface = a^(ds/Ns) + a^(−msm)    (2)
where Ns is the number of outer boundary points of the object, msm is the magnitude value of the gradients of these Ns outer boundary points, ds is the direction metric of the gradients of these Ns outer boundary points, and a may be any value greater than 1; for example, a may be the mathematical constant e, or another constant such as 1.5, 2 or 100.
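Formula (2) maps directly to a one-line computation. This is a minimal sketch; the function and parameter names are ours, and the default base a = e is just one of the admissible choices named above:

```python
import math

def outer_boundary_energy(ns, msm, ds, a=math.e):
    """E_surface = a^(ds/ns) + a^(-msm), per formula (2).

    ns  -- number of outer boundary points (Ns)
    msm -- magnitude value (mean gradient magnitude) of those points
    ds  -- direction metric (number of occupied angle-difference intervals)
    a   -- any base greater than 1, e.g. e, 1.5, 2 or 100
    """
    return a ** (ds / ns) + a ** (-msm)
```

A blurred, irregular outer boundary (small msm, large ds) drives both terms up, so the energy grows exactly when the boundary looks cloud-like.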
Specifically, the object inner boundary energy may be determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points, and from the magnitude value of the inner boundary point gradients.
For example, the object inner boundary energy E_body may be calculated by the following formula (3):
E_body = a^(db/Nb) + a^(−mbm)    (3)
where Nb is the number of inner boundary points of the object, mbm is the magnitude value of the gradients of these Nb inner boundary points, db is the direction metric of the gradients of these Nb inner boundary points, and a has the same meaning and value as in formula (2) above.
Thus, the total energy E_object of the object may be calculated by the following formula (4):
E_object = a^(ds/Ns) + a^(−msm) + k · (a^(db/Nb) + a^(−mbm))    (4)
where each quantity has the same meaning as described above.
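Formula (4) combines the two sub-energies through the trained weight k. A minimal sketch, assuming the feature parameters have already been extracted as described (the function and parameter names are ours; k and a would come from training and configuration):

```python
import math

def object_energy(ns, msm, ds, nb, mbm, db, k, a=math.e):
    """E_object = a^(ds/ns) + a^(-msm) + k * (a^(db/nb) + a^(-mbm)),
    per formula (4)."""
    e_surface = a ** (ds / ns) + a ** (-msm)   # outer boundary energy, formula (2)
    e_body = a ** (db / nb) + a ** (-mbm)      # inner boundary energy, formula (3)
    return e_surface + k * e_body
```

Keeping the two sub-energies as separate expressions mirrors the model's structure: further key features could be added as additional weighted terms, as the specification notes later.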
It can be seen from formula (4) that when the object to be detected has a blurred, slowly varying and irregular outer boundary, together with blurred, smooth and irregular inner texture or only a small amount of inner texture, the total energy value of the object tends to become large. Conversely, in other situations, for example when the object to be detected has a clear or regular outer boundary together with clear or regular inner texture, the total energy value tends to become small. Taking a cloud as an example: the exterior of a cloud is blurred and irregular, and its inner texture is blurred and irregular or sparse; therefore, when the object to be detected is a cloud, a high total energy value is generated. The energy model established by formula (4) can thus measure the object's surface, for example the gradual-change degree and irregularity of the outer boundary, while simultaneously measuring the object's internal features, for example the sharpness and distribution of the texture.
After the total energy of the object to be detected has been calculated in the object energy determining step S300, the particular object discriminating step S400 determines whether the object is the target particular object. The predetermined threshold for judging the particular object may be obtained by machine learning: a large number of samples of the particular object are trained using the above formulas with the same parameter settings as in the detection process described above (for example, the same values of a, k, etc.), their energy model is established, and a corresponding optimal value is derived from the energy values of the samples as the predetermined threshold. If the energy of the detected object is greater than or equal to the predetermined threshold, the object is discriminated as the particular object; obviously it is equally possible to discriminate the object as the particular object only if its energy is strictly greater than the threshold. Otherwise, the object is discriminated as not being the particular object. After the objects in all regions of interest have been detected, the final processing result is obtained.
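The discriminating step S400 then reduces to a threshold test over the candidates found in the regions of interest. A sketch, assuming the threshold has already been obtained by training as described (the function name is ours):

```python
def discriminate_objects(energies, threshold):
    """Step S400 applied to every candidate object: energy >= threshold marks
    the particular object (the strict > variant mentioned in the text would
    work the same way).  Returns one boolean per candidate."""
    return [e >= threshold for e in energies]
```

For example, with a trained threshold of 2.0, candidates with energies 0.5, 2.0 and 3.1 would be discriminated as non-object, object, object respectively.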
At this point, the objects discriminated as the target particular object can be identified in the image by any mature means in the art, and output by any mature means in the art.
The present invention may also be embodied as an apparatus for detecting a particular object in an image, which may be used to implement the method for detecting a particular object in an image described above. Fig. 8 shows the overall block diagram of the apparatus for detecting a particular object in an image according to an embodiment of the invention. As shown in Fig. 8, the apparatus comprises: a region-of-interest estimating device 100, usable to implement the aforementioned region-of-interest estimating step S100, to estimate, in an input image to be processed, the region containing the particular object as a region of interest; a feature determining device 200, usable to implement the aforementioned feature determining step S200, to determine the feature parameters of an object in the region of interest; an object energy determining device 300, usable to implement the aforementioned object energy determining step S300, to determine the energy of the object according to its feature parameters; and a particular object discriminating device 400, usable to implement the aforementioned particular object discriminating step S400, to compare the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminate the object as the particular object.
The feature parameters determined by the feature determining device 200 may comprise object outer boundary feature parameters and object inner boundary feature parameters.
The object energy determining device 300 may determine the object outer boundary energy based on the object outer boundary feature parameters, determine the object inner boundary energy based on the object inner boundary feature parameters, and add the object outer boundary energy and the object inner boundary energy with predefined weights to obtain the energy of the object.
The region-of-interest estimating device 100 may divide the image to be processed into a plurality of regions, obtain the color feature of each region, use a linear classifier to judge whether each region matches the color feature of the particular object, and combine the regions that match the color feature of the particular object to obtain the region of interest.
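As an illustration of the block-wise color test performed by device 100, the following sketch assumes the per-region color feature is the mean RGB of a square block and the linear classifier is a plain dot-product decision. The block size, weights and bias are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def estimate_roi_mask(image, weights, bias, block=16):
    """Sketch of region-of-interest estimation (step S100).

    image   -- H x W x 3 array
    weights -- 3-vector of the (hypothetical) linear classifier
    bias    -- scalar bias of the classifier
    Returns a per-block boolean mask; blocks flagged True would be merged
    into the region of interest.
    """
    h, w, _ = image.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            patch = image[by * block:(by + 1) * block,
                          bx * block:(bx + 1) * block]
            color = patch.reshape(-1, 3).mean(axis=0)  # mean RGB color feature
            mask[by, bx] = float(color @ weights) + bias > 0  # linear decision
    return mask
```

In practice the weights and bias would be fitted to the color feature of the particular object (e.g. cloud colors), and adjacent positive blocks merged into connected regions of interest.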
The apparatus for detecting a particular object in an image according to an embodiment of the invention may further comprise an excluding device, usable to implement the aforementioned excluding step, to exclude, according to the position of a region of interest in the image to be processed, any region of interest that does not match the position feature of the particular object.
In the apparatus for detecting a particular object in an image according to an embodiment of the invention, the object boundary in the region of interest may be determined and the object outer boundary extracted therefrom; the number of outer boundary points is determined according to the area occupied by the object, that number of outer boundary points is set equidistantly on the object outer boundary, the magnitude value and direction metric of the outer boundary point gradients are determined, and the magnitude value and direction metric of the outer boundary point gradients together with the number of outer boundary points are taken as the object outer boundary feature parameters.
In the apparatus for detecting a particular object in an image according to an embodiment of the invention, the object boundary in the region of interest may be determined and the object inner boundary extracted therefrom; the number of inner boundary points is determined according to the total length of the object inner boundary, that number of inner boundary points is set equidistantly on the object inner boundary, the magnitude value and direction metric of the inner boundary point gradients are determined, and the magnitude value and direction metric of the inner boundary point gradients together with the number of inner boundary points are taken as the object inner boundary feature parameters.
In the apparatus for detecting a particular object in an image according to an embodiment of the invention, the mean of the magnitudes of the outer boundary point gradients may be calculated as the magnitude value of the outer boundary point gradients; the distribution of the angle differences between adjacent outer boundary point gradients may be calculated as the direction metric of the outer boundary point gradients; and the object outer boundary energy may be determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points and from the magnitude value of the outer boundary point gradients.
In the apparatus for detecting a particular object in an image according to an embodiment of the invention, the mean of the magnitudes of the inner boundary point gradients may be calculated as the magnitude value of the inner boundary point gradients; the distribution of the angle differences between adjacent inner boundary point gradients may be calculated as the direction metric of the inner boundary point gradients; and the object inner boundary energy may be determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points and from the magnitude value of the inner boundary point gradients.
In the apparatus for detecting a particular object in an image according to an embodiment of the invention, the 360-degree angular range may be divided into a predetermined number of angular intervals, the adjacent-point angle differences assigned to those intervals, and the number of intervals to which at least one angle difference is assigned taken as the direction metric.
The present invention may also be implemented as a system for detecting a particular object in an image. Fig. 9 is an overall block diagram of the system 1000 for detecting a particular object in an image according to an embodiment of the invention. As shown in Fig. 9, the system 1000 may comprise: an input device 1100 for inputting the image to be processed from outside, which may comprise, for example, a keyboard, a mouse, a scanner, or a remote input device connected via a communication network; a processing device 1200 for implementing the above method for detecting a particular object in an image according to an embodiment of the invention, or embodied as the above apparatus for detecting a particular object in an image according to an embodiment of the invention, which may comprise, for example, a central processing unit of a computer or another chip with processing capability such as a DSP; an output device 1300 for outputting to the outside the result obtained by the above particular-object detection process, which may comprise, for example, a display, a printer, or a remote output device connected via a communication network; and a storage device 1400 for storing, in a volatile or non-volatile manner, the images, results, commands, intermediate data and the like involved in the above particular-object detection process, which may comprise, for example, various volatile or non-volatile memories such as a random access memory (RAM), a read-only memory (ROM), a hard disk, or a semiconductor memory.
The outer boundary gradient feature and the inner texture feature (for example, the inner boundary gradient feature) of the object mentioned above in this specification are merely two key features among the various features applied in the field of object detection; implementation of the present invention is not limited to them, and many other features may also be used to establish the energy model. The outer and inner energies in the energy model are two sub-energies; however, an energy model for object detection with three or more key features may also be established by expanding the model to contain three or more sub-energies, in which case the corresponding energy terms are added to formula (1).
The method and apparatus for detecting a particular object in an image according to embodiments of the invention may take a cloud as the particular object, and can perform the relevant cloud detection for white clouds, dark clouds, dawn and sunset clouds, and the like; obviously they can also be applied to detect the type of object having a blurred and irregular contour and blurred inner texture features. For example, the method and apparatus according to embodiments of the invention may be applied, with the same configuration as cloud detection, to the detection of objects such as certain plush toys.
Those skilled in the art will recognize that the method and apparatus for detecting a particular object in an image according to embodiments of the invention may be used for any particular object: by training on sample images of that particular object, establishing an energy model corresponding to it, and determining a corresponding predetermined threshold, detection of that particular object in an image can be realized. The embodiments of the invention can therefore be applied to general object detection.
With the energy-model-based method and apparatus for detecting a particular object in an image according to embodiments of the invention, the false detection rate can be reduced, and other objects that would be falsely detected by prior-art means can be removed. The energy model adopted by the embodiments combines a plurality of key features, assigns them different weights, and incorporates them into the energy model through certain mathematical operations, so that the generated energy value can characterize values distinctive of the particular object. Through such an energy model, the recognition means realized by the embodiments can effectively combine key features and obtain better recognition accuracy.
Compared with the aforementioned patent document 1, the method and apparatus for detecting a particular object in an image according to embodiments of the invention establish an energy model by extracting characteristic features of the image, for example the outer boundary gradient feature and the inner texture feature, thereby effectively detecting clouds in images; they have a wider range of application and are not limited to satellite cloud images.
Compared with the aforementioned non-patent document 1, the method and apparatus for detecting a particular object in an image according to embodiments of the invention can, in addition to preliminary detection and feature verification, incorporate two or more key features (for example the outer boundary gradient and the inner texture feature) into an energy model with assigned weights so that they are organically combined, thereby obtaining better feature discrimination and reducing the processing load with a simplified algorithm.
The energy-model-based method and apparatus for detecting a particular object in an image according to embodiments of the invention propose an energy model that optimally combines two or more key features of an object, establishing outer and inner energies for the respective features and then optimally combining the two kinds of features through the weight parameter k, so as to achieve an optimal classification effect.
The energy-model-based method and apparatus for detecting a particular object in an image according to embodiments of the invention propose an optimal detection structure: first a preliminary detection based on a single feature is performed, then a feature verification is performed on important supplementary features, and then the energy model is established to perform the final detection of the specific target object, thereby realizing optimal detection efficiency and performance.
In the case where the specific target object to be detected is a cloud, the energy-model-based method and apparatus for detecting a particular object in an image according to embodiments of the invention adopt the outer boundary gradient characteristic and the inner texture characteristic of clouds as key features to perform cloud detection.
In addition, the energy-model-based method and apparatus for detecting a particular object in an image according to embodiments of the invention propose a neighbor-direction-difference statistical method to measure the irregularity of a boundary or texture.
Obviously, the specific formulas, parameters, hardware and numerical values illustrated above are examples; those skilled in the art can, within the scope of the spirit of the invention, derive other formulas, parameters, hardware and numerical values according to the teaching of this specification to realize the invention. The details of embodiments of the invention have been described above taking the cloud detection model, among image recognition models, as an example; however, those skilled in the art will appreciate that the recognition models to which the invention is applicable are not limited thereto, and the invention can be applied to the detection and recognition of models other than the cloud detection model.
The series of operations described in this specification can be executed by hardware, software, or a combination of hardware and software. When the series of operations is executed by software, the computer program may be installed into a memory within a computer built into dedicated hardware, so that the computer executes the program. Alternatively, the computer program may be installed into a general-purpose computer capable of executing various types of processing, so that the computer executes the program.
For example, the computer program may be stored in advance in a hard disk or a ROM (read-only memory) serving as a recording medium. Alternatively, the computer program may be stored (recorded) temporarily or permanently in a removable recording medium such as a floppy disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disk, a DVD (digital versatile disc), a magnetic disk, or a semiconductor memory. Such a removable recording medium can be provided as packaged software.
The present invention has been described in detail with reference to specific embodiments. However, it is evident that those skilled in the art can modify or replace the embodiments without departing from the spirit of the invention. In other words, the present invention has been disclosed by way of explanation and is not to be construed restrictively. The appended claims should be considered in order to determine the gist of the invention.

Claims (10)

1. A method for detecting a particular object in an image, comprising:
a region-of-interest estimating step of estimating, in an input image to be processed, a region containing the particular object as a region of interest;
a feature determining step of determining feature parameters of an object in the region of interest;
an object energy determining step of determining the energy of the object according to the feature parameters of the object; and
a particular object discriminating step of comparing the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminating the object as the particular object.
2. The method for detecting a particular object in an image according to claim 1, wherein
in the feature determining step, the feature parameters comprise object outer boundary feature parameters and object inner boundary feature parameters; and
in the object energy determining step, the object outer boundary energy is determined based on the object outer boundary feature parameters, the object inner boundary energy is determined based on the object inner boundary feature parameters, and the object outer boundary energy and the object inner boundary energy are added with predefined weights to obtain the energy of the object.
3. The method for detecting a particular object in an image according to claim 1, wherein
in the region-of-interest estimating step, the image to be processed is divided into a plurality of regions, the color feature of each region is obtained, a linear classifier is used to judge whether each region matches the color feature of the particular object, and the regions matching the color feature of the particular object are combined to obtain the region of interest.
4. The method for detecting a particular object in an image according to claim 1, further comprising:
an excluding step of excluding, according to the position of a region of interest in the image to be processed, any region of interest that does not match the position feature of the particular object.
5. The method for detecting a particular object in an image according to claim 2, wherein the object boundary in the region of interest is determined, the object outer boundary is extracted therefrom, the number of outer boundary points is determined according to the area occupied by the object, that number of outer boundary points is set equidistantly on the object outer boundary, the magnitude value and the direction metric of the outer boundary point gradients are determined, and the magnitude value and the direction metric of the outer boundary point gradients together with the number of outer boundary points are taken as the object outer boundary feature parameters.
6. The method for detecting a particular object in an image according to claim 2, wherein the object boundary in the region of interest is determined, the object inner boundary is extracted therefrom, the number of inner boundary points is determined according to the total length of the object inner boundary, that number of inner boundary points is set equidistantly on the object inner boundary, the magnitude value and the direction metric of the inner boundary point gradients are determined, and the magnitude value and the direction metric of the inner boundary point gradients together with the number of inner boundary points are taken as the object inner boundary feature parameters.
7. The method for detecting a particular object in an image according to claim 5, wherein
the mean of the magnitudes of the outer boundary point gradients is calculated as the magnitude value of the outer boundary point gradients; the distribution of the angle differences between adjacent outer boundary point gradients is calculated as the direction metric of the outer boundary point gradients; and the object outer boundary energy is determined from the ratio of the direction metric of the outer boundary point gradients to the number of outer boundary points and from the magnitude value of the outer boundary point gradients.
8. The method for detecting a particular object in an image according to claim 6, wherein
the mean of the magnitudes of the inner boundary point gradients is calculated as the magnitude value of the inner boundary point gradients; the distribution of the angle differences between adjacent inner boundary point gradients is calculated as the direction metric of the inner boundary point gradients; and the object inner boundary energy is determined from the ratio of the direction metric of the inner boundary point gradients to the number of inner boundary points and from the magnitude value of the inner boundary point gradients.
9. The method for detecting a particular object in an image according to claim 7 or 8, wherein
the 360-degree angular range is divided into a predetermined number of angular intervals, the adjacent-point angle differences are assigned to the angular intervals, and the number of angular intervals to which at least one angle difference is assigned is taken as the direction metric.
10. An apparatus for detecting a particular object in an image, comprising:
a region-of-interest estimating device that estimates, in an input image to be processed, a region containing the particular object as a region of interest;
a feature determining device that determines feature parameters of an object in the region of interest;
an object energy determining device that determines the energy of the object according to the feature parameters of the object; and
a particular object discriminating device that compares the determined energy of the object with a predetermined threshold and, if the energy of the object is greater than or equal to the predetermined threshold, discriminates the object as the particular object.
CN201110310765.1A 2011-10-14 2011-10-14 Method for detecting particular object in image and apparatus for detecting particular object in image — Expired - Fee Related — CN103049735B (en)


Publications (2)

Publication Number Publication Date
CN103049735A true CN103049735A (en) 2013-04-17
CN103049735B CN103049735B (en) 2016-02-03


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105989348A (en) * 2014-12-09 2016-10-05 由田新技股份有限公司 Detection method and system for using handheld device by person
CN110955243A (en) * 2019-11-28 2020-04-03 新石器慧通(北京)科技有限公司 Travel control method, travel control device, travel control apparatus, readable storage medium, and mobile device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6504951B1 (en) * 1999-11-29 2003-01-07 Eastman Kodak Company Method for detecting sky in images
US20050180635A1 (en) * 2004-02-17 2005-08-18 Trifonov Mikhail I. Method and apparatus for selecting an object in an image
CN101833750A (en) * 2010-04-15 2010-09-15 清华大学 Active contour method based on shape constraint and direction field, and system thereof
CN102122343A (en) * 2010-01-07 2011-07-13 索尼公司 Method and device for determining angle of inclination of body and estimating gesture





Legal Events

- C06 / PB01: Publication
- C10 / SE01: Entry into substantive examination
- C14 / GR01: Patent grant (granted publication date: 2016-02-03)
- CF01: Termination of patent right due to non-payment of annual fee