CN1514398A - Human detecting method, apparatus, system and storage medium - Google Patents
- Publication number
- CN1514398A (application No. CN02160407X)
- Authority
- CN
- China
- Prior art keywords
- candidate
- region
- eyes
- eye detection
- threshold value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The present invention detects candidate eye regions in a given image by the following steps: reading in an image; analyzing the image to obtain a list of candidate eye regions; selecting a candidate eye region from the candidate list as a center candidate region; determining the neighbor candidate regions of the center candidate region based on a predetermined criterion; obtaining dark regions by processing the minimum area containing all the neighbor candidate regions; identifying and deleting false candidate eye regions based on the consistency between the dark regions and the candidate eye regions; repeating the steps following the selecting step for each candidate eye region; and outputting the list for post-processing.
Description
Technical field
The present invention relates to an image processing method, and in particular to an eye detection method for detecting human eyes in an image. The invention further relates to an eye detection apparatus, an eye detection system, and a storage medium storing eye detection program code.
Background technology
Nowadays, image recognition technology is applied in many technical fields, such as satellite image analysis, automation, video compression and surveillance. Many techniques already exist for recognizing objects in an image, for example template matching methods, statistical pattern recognition methods, structural pattern recognition methods and neural network methods.
One class of objects to be recognized is the human body itself, especially the human face. The article "Face Detection and Rotations Estimation Using Color Information" by Haiyuan Wu (the 5th IEEE International Workshop on Robot and Human Communication, 1996, pp. 341-346), incorporated herein by reference, discloses a template matching method for detecting human faces. The effectiveness of that method depends heavily on the quality of the image being examined, and especially on the complexity of the lighting conditions and the background. Facial differences between ethnic groups also affect the detection result.
In some other methods, human faces are detected in an image by first detecting facial features such as the eyes, mouth and nose. The article "A Fast Approach for Detecting Human Faces in a Complex Background" by Kin-Man Lam (Proceedings of the 1998 IEEE International Symposium on Circuits and Systems, ISCAS '98, Vol. 4, pp. 85-88), incorporated herein by reference, discloses a method of detecting eyes in which some regions are first assumed to be possible eyes, and these regions are then checked against certain conditions in order to verify the real eye regions. The efficiency of this method is low, because an image contains too many possible eye regions (candidate eyes).
In order to improve on the prior art described above, the applicant developed an image processing method and apparatus, an image processing system and a storage medium (disclosed in the co-pending Chinese patent application No. 00127067.2, filed on September 15, 2000, published as CN1343479A and incorporated herein by reference). With that method, a list of the candidate eye regions in an image can be obtained. Then, by pairing the candidate eyes, a list of candidate face regions can be obtained.
However, the candidate eye regions include many false eye regions, and as a result the candidate face regions include many false face regions. These false eye regions and false face regions should be excluded.
To this end, in another co-pending Chinese patent application, No. 01132807.x, entitled "Image processing method and apparatus, image processing system and storage medium" and filed on September 6, 2001, the applicant provides a method of excluding non-face regions from the candidate face regions by analyzing an annular region of each candidate face region.
Summary of the invention
For the same purpose, the present application seeks to provide an eye detection method for detecting eye regions in a given image, and in particular for filtering out false candidate eye regions in the given image, thereby obtaining more accurate candidate eye regions and candidate face regions.
Another object of the present invention is to provide an eye detection apparatus, system and storage medium for detecting eye regions in a given image, and in particular for filtering out false candidate eye regions in the given image.
According to one aspect of the present invention, the first object is achieved by an eye detection method comprising the following steps:
a) reading in an image;
b) analyzing the image to obtain a list of candidate eye regions;
c) selecting an unprocessed candidate eye region from the list of candidate eye regions (hereinafter the "candidate list") as the center candidate eye region (hereinafter the "center candidate region");
d) determining the neighbor candidate eye regions (hereinafter the "neighbor candidate regions") of the center candidate region based on a predetermined criterion;
e) processing the minimum area containing all of the neighbor candidate regions (hereinafter the "neighborhood region") to obtain dark regions;
f) identifying and deleting false candidate eye regions based on the consistency between the dark regions and the candidate eye regions;
g) repeating steps c) through f) until no unprocessed candidate eye region remains; and
h) outputting the candidate list for subsequent processing of the image.
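Steps a) through h) can be summarized as a small processing loop. The following Python skeleton is an illustrative sketch only, not the patented implementation: the four callables stand in for the sub-procedures of steps b), d), e) and f), and the toy stand-ins at the bottom are hypothetical.

```python
def detect_eyes(image, analyze_image, find_neighbors, find_dark_regions, filter_false):
    """Skeleton of steps a)-h). Candidate regions may be any hashable records;
    the four callables stand in for the sub-procedures described later."""
    candidates = analyze_image(image)                        # step b)
    processed = set()
    while True:
        todo = [c for c in candidates if c not in processed]
        if not todo:                                         # step g): all handled
            break
        center = todo[0]                                     # step c)
        group = [center] + find_neighbors(center, candidates)  # step d)
        dark_regions = find_dark_regions(image, group)         # step e)
        keep = filter_false(dark_regions, group)               # step f)
        processed.update(group)      # center and its neighbors count as processed
        candidates = [c for c in candidates if c not in group or c in keep]
    return candidates                                        # step h)


# Toy demonstration with hypothetical stand-ins: candidates are bare labels,
# no candidate has neighbors, and the "consistency" test keeps a candidate
# only if its label appears among the dark regions.
result = detect_eyes(
    image=None,
    analyze_image=lambda img: ["a", "b", "c"],
    find_neighbors=lambda center, cands: [],
    find_dark_regions=lambda img, group: {"a", "c"},
    filter_false=lambda dark, group: [c for c in group if c in dark],
)
print(result)
```

Running the toy demonstration leaves only the candidates consistent with a dark region, here `['a', 'c']`.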
According to another aspect of the present invention, an eye detection apparatus is provided, comprising: reading means for reading in an image; candidate eye region detecting means for analyzing the image that has been read in and obtaining a list of candidate eye regions; and output means for outputting the candidate list for use in subsequent processing of the image. The apparatus is characterized in that it further comprises: selecting means for selecting an unprocessed candidate eye region from the candidate list as the center candidate region; verifying means for deleting false eye regions from the neighbor candidate regions of the center candidate region; and control means for controlling the selecting means so that every candidate eye region in the candidate list is processed. The verifying means further comprises:
neighbor candidate region determining means for determining which candidate eye regions in the candidate list should serve as the neighbor candidate regions of the center candidate region;
dark region determining means for processing the minimum area containing the neighbor candidate regions (the neighborhood region) to obtain a series of dark regions; and
a false eye region filter for identifying and deleting false candidate eye regions based on the consistency between the dark regions and the candidate eye regions.
According to yet another aspect of the present invention, an eye detection system is provided, comprising an image source, an eye detection apparatus as described above, and a subsequent processing device.
According to a further aspect of the present invention, a storage medium is provided, in which program code implementing the method of the present invention is stored.
By means of the present invention, the candidate eye regions can be verified quickly and accurately and the false eye regions excluded, so that more accurate candidate eye regions and candidate face regions are obtained.
Description of drawings
Other objects, features and advantages of the present invention will become apparent from the following detailed description of the preferred embodiments. The accompanying drawings, which constitute a part of this specification, serve together with the description to explain the embodiments and the principle of the invention. In the drawings:
Fig. 1 is a flowchart of the first embodiment of the eye detection method of the present invention;
Fig. 2 is a flowchart illustrating a preferred embodiment of the neighbor candidate region determining step 5000 shown in Fig. 1;
Fig. 3 is a flowchart illustrating an embodiment of the dark region determining step 5500 shown in Fig. 1;
Fig. 4 is a flowchart illustrating an embodiment of the false eye region filtering step 6000 shown in Fig. 1;
Fig. 5 is an example of an image processed by the eye detection method of the present invention, in which the method has proceeded to the marking step 5508 shown in Fig. 3;
Fig. 6 is the image of Fig. 5, in which the method has proceeded to the corresponding region determining step 6006 shown in Fig. 4;
Fig. 7 is the image of Fig. 5, in which the false eye region filtering step 6000 shown in Fig. 1 has been completed;
Fig. 8 is a flowchart of the second embodiment of the eye detection method of the present invention;
Fig. 9 is a schematic block diagram of the first embodiment of the eye detection system of the present invention;
Fig. 10 is a schematic block diagram of the verifying means of the eye detection system shown in Fig. 9;
Fig. 11 is an example of an image processed by the eye detection method of the present invention, used to illustrate the determination of the neighbor candidate regions; and
Fig. 12 is a block diagram illustrating an example of a computer system in which the method and apparatus of the present invention can be implemented.
Embodiment
The preferred embodiments of the present invention are described below with reference to the accompanying drawings.
An example computer system
The method of the present invention can be implemented in any information processing device, for example a personal computer (PC), a notebook computer, or a single-chip microcomputer embedded in a camera, video camera, scanner, gate control system or the like. Those of ordinary skill in the art can readily realize the method of the present invention by software, hardware and/or firmware. It should be noted in particular that carrying out some steps of the method, or combinations of steps, may require the use of input/output devices, storage devices and microprocessors such as a CPU. The description of the method below does not necessarily mention these devices, but they are in fact used.
As an example of such an information processing device, Fig. 12 shows a computer system in which the method and apparatus of the present invention can be implemented. It should be noted that the computer system shown in Fig. 12 is illustrative only and is not intended to limit the scope of the invention.
From the hardware point of view, the computer 1 comprises a CPU 6, a hard disk (HD) 5, a RAM 7, a ROM 8 and input/output devices 12. The input/output devices may include input devices such as a keyboard, touchpad, trackball and mouse, output devices such as a printer and a monitor, and input/output units such as a floppy disk drive, an optical disk drive and communication ports.
From the software point of view, the computer mainly comprises an operating system (OS) 9, input/output drivers 11 and various application programs 10. As the operating system, any commercially available operating system can be used, such as the Windows family (Windows is a trademark of Microsoft) or a Linux-based operating system. The input/output drivers drive the respective input/output devices. The application programs can be of any kind, such as word processors and image processing programs, including existing programs usable in the present invention and programs written specifically for the present invention that call such existing programs.
Thus, in the present invention, the method of the invention is carried out in the hardware of the computer by the operating system, the application programs and the input/output drivers.
In addition, the computer 1 can be connected to a digital device 3 and an application device 2. The digital device can be a camera, video camera or scanner serving as the image source 502 described below, or a digitizer for converting an analog image into a digital image. The result obtained by the present invention is output to the application device 2, which performs appropriate operations according to the result. The application device can also be the camera (or similar device) serving as the digital device, or any automatic control system such as a gate control system. The application device can also be realized inside the computer 1 as a combination of hardware and a further application program for processing the image further.
The eye detection method
The present invention is based on the following facts. As those of ordinary skill in the art know, when features are extracted from an image, the results obtained, and in particular the noise contained in them, usually vary with the processing method employed. However, results obtained with different methods usually agree with each other to a considerable degree, while their noise differs greatly. Therefore, the image can be processed with a series of methods, and then, based on a comparison of the different processing results, the real features in the image can be recognized accurately. A detailed description of the invention follows.
(first embodiment)
Referring to Fig. 1, the first embodiment of the eye detection method of the present invention is shown. The method starts at the reading step 1000, in which the image to be processed, in digital or analog format, is read in from an image source. The image source can be of any kind, such as a storage device of a PC, a camera, and so on.
In the analyzing step 2000, the image is analyzed and a list of candidate eye regions is generated by means of the method disclosed in the aforesaid Chinese patent application No. 00127067.2. If the image read in at the reading step 1000 is an analog image, it should be digitized before being analyzed. The analyzing step 2000 can also be carried out with other known methods, such as region growing, region splitting, and hybrid methods.
In the selecting step 4000, an unprocessed candidate eye region is selected from the candidate list, randomly or sequentially, as the center candidate region. An "unprocessed candidate eye region" is one that has been selected neither as a center candidate region nor as a neighbor candidate region (described below) of a center candidate region. For convenience, the selecting operation can sort the candidate eye regions in ascending or descending order of size and then select them in turn as the center candidate region.
The method then proceeds to the neighbor candidate region determining step 5000, in which a series of neighbor candidate regions are determined for the center candidate region based on a predetermined criterion. Specifically, each candidate eye region is taken from the candidate list and checked; if it meets the criterion it is added to the list of neighbor candidate regions of the center candidate region, and otherwise it is skipped. The resulting list of neighbor candidate regions is output for use in the dark region determining step 5500 shown in Fig. 1. Various criteria suitable for selecting the neighbor candidate regions will occur to those skilled in the art; some preferred criteria are given below.
The criterion can be the size difference between each candidate eye region and the center candidate region: if the size difference is not greater than a first predetermined threshold, the candidate eye region in question is taken as a neighbor candidate region. Here, "size" can be understood as the area, or the width or height, or the width and height. Taking the area as an example, the first threshold ranges from 0 to the area of the image, preferably from 0.4S to 10S, where S is the size of the center candidate region in pixels. Most preferably, the first threshold is 2S + 15 (pixels).
In a variant, the criterion can be the luminance difference between each candidate eye region and the center candidate region: if the luminance difference is not greater than a second predetermined threshold, the candidate eye region in question is taken as a neighbor candidate region. Here, the luminance can be measured on a 255-gray-level scale, and the second threshold ranges from 0 to 255, more preferably from 20 to 80; most preferably, the second threshold is 50.
In another variant, the criterion is the distance between each candidate eye region and the center candidate region: if the distance is not greater than a third predetermined threshold, the candidate eye region in question is taken as a neighbor candidate region. The distance can be the Euclidean distance or another distance, such as the horizontal coordinate difference (Diff E) or the vertical coordinate difference (Diff N), or a combination of the two. When the Euclidean distance is used, the third threshold ranges from 0 to the maximum distance between any two pixels in the image. If the distance is the combination of Diff E and Diff N, the third threshold consists of a Diff E threshold and a Diff N threshold: only when Diff E and Diff N do not exceed the Diff E threshold and the Diff N threshold respectively is the candidate eye region in question determined to be a neighbor candidate region of the center candidate region. The Diff E threshold ranges from 0 to the width of the image, and the Diff N threshold from 0 to the height of the image. More preferably, the Diff E threshold is 0.2 × AW to 5 × AW and the Diff N threshold is 0.2 × AH to 5 × AH, where AW is the width and AH the height of the center candidate region. Most preferably, the Diff E threshold is AW and the Diff N threshold is AH.
More preferably, these three criteria are used in combination; most preferably, all three criteria are used to determine the neighbor candidate regions. As is known to those of ordinary skill in the art, however, other criteria can also be used.
As an example, Fig. 11 shows six candidate eye regions a, b, c, d, e and f, of which candidate eye region c is the center candidate region. Their parameters are listed in Table 1 below:

Table 1

| Candidate eye region | a | b | c (center candidate region) | d | e | f |
| Area (pixels) | 10 | 20 | 30 | 28 | 60 | 61 |
| Area difference | 20 | 10 | | 2 | 30 | 31 |
| Brightness (gray level) | 30 | 50 | 71 | 120 | 44 | 34 |
| Luminance difference | 41 | 21 | | 49 | 27 | 37 |
| Position (center coordinates) | (168, 371) | (179, 364) | (180, 375) | (190, 373) | (172, 380) | (191, 385) |
| Euclidean distance to the center candidate region | 12.6 | 11.0 | | 10.2 | 9.4 | 14.9 |
| Coordinate difference | (12, 4) | (1, 11) | | (10, 2) | (8, 5) | (11, 10) |
Further, Table 2 below shows the determination results obtained when different criteria are adopted:

Table 2

| Criterion | Area difference | Luminance difference | Euclidean distance | Coordinate difference | Area difference, luminance difference and Euclidean distance |
| Assumed threshold | 25 | 30 | 13 | (10, 7) | 25, 30 and 13 respectively |
| Neighbor candidate regions | a, b, d | b, e | a, b, d, e | d, e | b |
As can be seen from Table 2, the more criteria are adopted, the fewer neighbor candidate regions are determined, which improves the result obtained by the method of the present invention.
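As a concrete sketch of the three criteria, the following Python function applies them to the data of Table 1 and reproduces the determinations of Table 2. The function itself and its parameter names are illustrative, not part of the patent; the region data and thresholds are taken from the tables above.

```python
import math

# Candidate eye regions from Table 1: (area in pixels, brightness in gray
# levels, center coordinates). Region c is the center candidate region.
regions = {
    "a": (10, 30, (168, 371)),
    "b": (20, 50, (179, 364)),
    "c": (30, 71, (180, 375)),   # center candidate region
    "d": (28, 120, (190, 373)),
    "e": (60, 44, (172, 380)),
    "f": (61, 34, (191, 385)),
}

def neighbors(center, regions, max_area_diff=None, max_lum_diff=None,
              max_dist=None, max_coord_diff=None):
    """Return the neighbor candidate regions of `center` under whichever
    criteria are given; criteria left as None are not applied."""
    ca, cl, (cx, cy) = regions[center]
    out = []
    for name, (area, lum, (x, y)) in regions.items():
        if name == center:
            continue
        if max_area_diff is not None and abs(area - ca) > max_area_diff:
            continue                               # area-difference criterion
        if max_lum_diff is not None and abs(lum - cl) > max_lum_diff:
            continue                               # luminance-difference criterion
        if max_dist is not None and math.hypot(x - cx, y - cy) > max_dist:
            continue                               # Euclidean-distance criterion
        if max_coord_diff is not None:             # Diff E / Diff N criterion
            dx_t, dy_t = max_coord_diff
            if abs(x - cx) > dx_t or abs(y - cy) > dy_t:
                continue
        out.append(name)
    return out

# Thresholds assumed in Table 2:
print(neighbors("c", regions, max_area_diff=25))            # ['a', 'b', 'd']
print(neighbors("c", regions, max_lum_diff=30))             # ['b', 'e']
print(neighbors("c", regions, max_dist=13))                 # ['a', 'b', 'd', 'e']
print(neighbors("c", regions, max_coord_diff=(10, 7)))      # ['d', 'e']
print(neighbors("c", regions, max_area_diff=25,
                max_lum_diff=30, max_dist=13))              # ['b']
```

The last call combines the three criteria, and as in the rightmost column of Table 2 only region b survives.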
Fig. 2 shows a preferred embodiment of the neighbor candidate region determining step 5000. After the selecting step 5002, in which a candidate eye region is selected from the candidate list, the selected candidate eye region is checked in turn in the size difference comparing step 5004, the luminance difference comparing step 5006 and the distance comparing step 5008. If the candidate eye region survives, it is added to the list of neighbor candidate regions in the adding step 5010. In the repeating step 5011, the above steps are repeated for every unprocessed candidate eye region in the candidate list. It should be noted that if the selected candidate eye region is rejected in any one of the three comparing steps, there is no need to carry out the remaining comparing steps, and the process returns directly to the selecting step 5002. It should also be understood that the three comparing steps need not be carried out in the order shown in Fig. 2, but can be carried out in any order.
Returning now to Fig. 1, the next step of the method is the processing step 5500, in which a minimum area (the neighborhood region) containing all the neighbor candidate regions is determined (the neighborhood region determining step 5502 shown in Fig. 3), and this neighborhood region is then processed to generate a series of dark regions. The neighborhood region can be of any shape, such as a rectangle, circle, ellipse or polygon, and various methods can be adopted to process it in order to obtain the dark regions. For example, the dark regions can be determined with the same method that was used to determine the candidate list, or with a different method. As mentioned above, the method disclosed in the aforesaid Chinese patent application No. 00127067.2 can be used, as well as other known methods such as region growing, region splitting, and hybrid methods.
To obtain the dark regions, the neighborhood region is preferably binarized. The binarization of the neighborhood region is shown in detail in Fig. 3. The neighborhood region determining step 5502 is followed by a binarization threshold determining step 5504. The binarization threshold can be calculated with the following formula:

Threshold = AvrBrightness × a + AvrBackground × b

where the coefficients a and b sum to 1, and AvrBrightness and AvrBackground are respectively the average brightness and the average background of all the neighbor candidate regions. The coefficient a can be 0 to 1, more preferably 0.4 to 0.9, and most preferably 0.7; the coefficient b can be 0 to 1, more preferably 0.1 to 0.6, and most preferably 0.3.
In the binarization step 5506, binarization is then carried out in the neighborhood region, converting it into a black-and-white image. In the marking step 5508, connected dark pixels are marked as dark regions.
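A minimal sketch of steps 5504-5508 under the assumptions above: the threshold is the weighted sum of the average candidate brightness and the average background (with the preferred coefficients a = 0.7, b = 0.3 as defaults), and connected dark pixels are then labeled as dark regions. The 4-connected flood fill and the toy input are illustrative choices; the patent does not fix the connectivity.

```python
def binarize_and_label(gray, avr_brightness, avr_background, a=0.7, b=0.3):
    """gray: 2-D list of gray levels (0-255). Returns a list of dark regions,
    each a set of (row, col) pixels below the binarization threshold."""
    threshold = avr_brightness * a + avr_background * b        # step 5504
    h, w = len(gray), len(gray[0])
    dark = [[gray[r][c] < threshold for c in range(w)] for r in range(h)]  # step 5506
    seen, regions = set(), []
    for r in range(h):                                         # step 5508: mark
        for c in range(w):
            if dark[r][c] and (r, c) not in seen:
                stack, region = [(r, c)], set()
                while stack:                                   # 4-connected flood fill
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < h and 0 <= x < w) or not dark[y][x]:
                        continue
                    seen.add((y, x))
                    region.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
                regions.append(region)
    return regions

# Toy neighborhood region: two dark blobs (gray 40) on a bright background
# (gray 200); the threshold works out to 40*0.7 + 200*0.3 = 88.
patch = [
    [200, 40, 200, 200, 200],
    [200, 40, 200, 40, 40],
    [200, 200, 200, 200, 40],
]
print(len(binarize_and_label(patch, avr_brightness=40, avr_background=200)))  # 2
```

On the toy patch the two connected groups of dark pixels are marked as two separate dark regions.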
This completes the processing step 5500 shown in Fig. 1. The next step of the method is the filtering step 6000, in which it is judged, based on the consistency between the dark regions and the candidate eye regions, which candidate eye regions (including the center candidate region) are false, and these false candidate eye regions are deleted. Specifically, by comparing the relative positions and sizes of the dark regions and the candidate eye regions, a candidate eye region that has a well-overlapping dark region is regarded as a "true" eye region and is retained in the candidate list. Those of ordinary skill in the art can realize such comparison and deletion operations in various ways; for example, the filtering step can make use of the distance between the center of a dark region and the center of a candidate eye region, together with the sizes of the dark region and the candidate eye region. Preferably, however, the method shown in Figs. 4-7 is followed.
Referring to Fig. 4, the first substep of the filtering step 6000 shown in Fig. 1 is the corresponding region determining step 6006, in which a corresponding dark region is determined for each neighbor candidate region. The determination proceeds as follows. First, the overlap area between each dark region and each candidate eye region is determined. For each candidate eye region, the dark region having the largest overlap area with it is taken as its corresponding dark region. If a candidate eye region has two or more dark regions with the same largest overlap area, the candidate eye region is regarded as having no corresponding dark region. The above is one example of the corresponding region determining step 6006; obviously, the determination can be implemented in other suitable ways.
The corresponding region determining step 6006 is further described below in conjunction with Figs. 5 and 6. Fig. 5 shows an example view of an image processed by the eye detection method of the present invention, in which the method has proceeded to the marking step 5508 shown in Fig. 3. In the figure, the shaded regions B, D, F, I, J and L represent candidate eye regions, and the clear regions A, C, E, G, K and M represent the aforesaid dark regions. In Fig. 6, the non-corresponding dark regions have been deleted. Specifically, dark region A is deleted because it does not overlap any candidate eye region; dark regions C and E are deleted because they overlap the same candidate eye region with the same overlap area; and dark regions H and M are deleted because their overlap areas are smaller than those of dark regions G and K respectively. In the final result, dark regions G and K remain as corresponding dark regions: dark region G corresponds to the two candidate eye regions F and I, and dark region K corresponds to the two candidate eye regions J and L.
Returning to Fig. 4, the next three substeps 6008, 6010 and 6012 delete the false candidate eye regions. These substeps are described in detail below in conjunction with Figs. 6 and 7. If a corresponding dark region corresponds to several candidate eye regions, the candidate eye regions with the smaller overlap areas are deleted, and the candidate eye region with the largest overlap area is retained in the candidate list. As shown in Fig. 6, candidate eye region J should be deleted, because another candidate eye region L overlaps the same corresponding dark region K with a larger overlap area. If several candidate eye regions overlap the same corresponding dark region with the same overlap area, all of these candidate eye regions are deleted. As shown in Fig. 6, candidate eye regions F and I overlap the corresponding dark region G with the same overlap area, and are therefore deleted. Finally, candidate eye regions that have no corresponding dark region, such as candidate eye regions B and D shown in Fig. 6, are deleted. In Fig. 7, the above substeps have been completed and only candidate eye region L remains. Obviously, the three substeps 6008, 6010 and 6012 need not follow the illustrated order, but can be carried out in any order.
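The substeps 6006 through 6012 can be sketched as follows. The representation of regions as axis-aligned rectangles and the toy layout are illustrative assumptions for this sketch; the patent only requires that overlap areas be comparable.

```python
def overlap(r1, r2):
    """Overlap area of two rectangles (x0, y0, x1, y1), exclusive upper bounds."""
    w = min(r1[2], r2[2]) - max(r1[0], r2[0])
    h = min(r1[3], r2[3]) - max(r1[1], r2[1])
    return max(w, 0) * max(h, 0)

def filter_candidates(candidates, dark_regions):
    """candidates / dark_regions: dicts name -> rectangle. Applies the
    deletion rules of substeps 6006-6012; returns surviving candidate names."""
    # Substep 6006: each candidate's corresponding dark region is the one with
    # the single largest positive overlap; a tie means no corresponding region.
    corresponding = {}
    for c, crect in candidates.items():
        areas = {d: overlap(crect, drect) for d, drect in dark_regions.items()}
        best = max(areas.values(), default=0)
        winners = [d for d, a in areas.items() if a == best and a > 0]
        corresponding[c] = winners[0] if len(winners) == 1 else None

    survivors = set()
    for d in dark_regions:
        # Candidates whose corresponding dark region is d, with overlap areas.
        claimants = [(overlap(candidates[c], dark_regions[d]), c)
                     for c in candidates if corresponding[c] == d]
        if not claimants:
            continue
        best = max(a for a, _ in claimants)
        top = [c for a, c in claimants if a == best]
        if len(top) == 1:          # substep 6008: unique largest overlap survives;
            survivors.add(top[0])  # substep 6010: equal overlaps are all deleted
    return survivors               # substep 6012: no corresponding region -> deleted

# Toy layout loosely following Figs. 5-7: J and L share dark region K, with L
# overlapping more; F and I tie on dark region G; B overlaps no dark region.
cands = {"B": (0, 0, 2, 2), "F": (10, 0, 14, 2), "I": (12, 0, 16, 2),
         "J": (20, 0, 22, 2), "L": (21, 0, 26, 2)}
darks = {"G": (11, 0, 15, 2), "K": (21, 0, 27, 2)}
print(sorted(filter_candidates(cands, darks)))   # ['L']
```

As in Fig. 7, only candidate L survives: J loses to L on K, F and I delete each other by tying on G, and B has no corresponding dark region.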
This completes the filtering step 6000 shown in Fig. 1. It is followed by the repeating step 7000, in which the steps from the selecting step 4000 to the filtering step 6000 are repeated for all the unprocessed candidate eye regions in the candidate list. After every candidate eye region in the candidate list has been processed, the method proceeds to the output step 8000, in which the candidate list is output for subsequent processing.
As mentioned above, various criteria are suitable for determining the neighbor candidate regions in the neighbor candidate region determining step 5000. Likewise, various methods can be used to process the neighborhood region in the processing step 5500 in order to obtain the dark regions, and the consistency-based operations in the filtering step 6000 can also be implemented in various ways. The specific substeps of these three steps shown in Figs. 2-7 are therefore merely examples, and the present invention should not be limited to them. In other words, the fact that the method of the present invention comprises these three steps constitutes the key of the invention, and the specific substeps are only further refinements of it.
(second embodiment)
In the second embodiment, the method of the present invention further comprises the following steps. As shown in Fig. 8, after the output step 8000 shown in Fig. 1, the method proceeds to the face determining step 404, in which candidate face regions are determined based on the remaining candidate eye regions obtained in the detecting step 402. Many methods can be used to determine a candidate face region from candidate eye regions. For example, a candidate face region can be determined from a single candidate eye region based on the relative position of the eyes inherent in the human face. As another example, the candidate eye regions can be paired based on the symmetry of a pair of eyes, and/or the distance between a pair of eyes, and/or the usual relative position of eyes in an image, and a candidate face region can then be determined from each pair of eyes based on the relative positions inherent in the human face.
Next comes the face deleting step 406, in which false face regions are deleted, for example with the method proposed in the aforesaid co-pending Chinese patent application 01132807.x. Of course, other methods can also be used to delete false face regions, for example methods based on the relative position between the face and other parts of the human body, or on the relative position of the face in the image, or structural pattern recognition methods. Finally, in the output step 408, the remaining face regions are output as the result for subsequent processing.
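One simple way to realize the eye pairing mentioned in the face determining step 404 is sketched below. All numeric ratios (tilt tolerance, plausible eye separation, face-box proportions) are illustrative assumptions, not values from the patent.

```python
def pair_eyes(eyes, max_tilt=0.3, min_sep=1.0, max_sep=4.0):
    """eyes: list of (x, y, width) eye-region centers. Pairs two eyes when
    they are roughly level and their separation is plausible relative to the
    eye width; returns candidate face rectangles (x0, y0, x1, y1)."""
    faces = []
    for i in range(len(eyes)):
        for j in range(i + 1, len(eyes)):
            (x1, y1, w1), (x2, y2, w2) = eyes[i], eyes[j]
            avg_w = (w1 + w2) / 2
            sep = abs(x2 - x1)
            if abs(y2 - y1) > max_tilt * sep:
                continue                      # not roughly level (symmetry test)
            if not (min_sep * avg_w <= sep <= max_sep * avg_w):
                continue                      # implausible eye distance
            left, right = min(x1, x2), max(x1, x2)
            top = min(y1, y2)
            # Face box derived from the eye separation, mimicking the fixed
            # eyes/face proportions of a human face (ratios assumed here).
            faces.append((left - avg_w, top - sep * 0.6,
                          right + avg_w, top + sep * 1.2))
    return faces

# Three eye regions: the first two are level and close; the third is far away.
eyes = [(100, 50, 12), (130, 52, 12), (300, 200, 10)]
print(len(pair_eyes(eyes)))   # 1
```

Only the first two eye regions form a plausible pair, yielding a single candidate face region.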
Eye detection apparatus and system
The present invention also provides an eye detection apparatus and system, which will be described in detail below. As with the foregoing method, any component of the eye detection apparatus and system of the present invention may be a component, or a combination of components, of any of the aforementioned information processing devices, or software and/or hardware and/or firmware installed in, or combined with, any of those devices. Those of ordinary skill in the art can readily implement these components. Likewise, it will be obvious to those skilled in the art that the operation of each component involves the use of input/output devices, storage devices, and microprocessors such as a CPU. These devices are not necessarily mentioned in the following description of the apparatus and system of the present invention, but they are in fact used. A computer system, as a specific example of the aforementioned information processing device, has been described above, and its description is not repeated here.
As shown in Fig. 9, an eye detection system according to the present invention comprises an image source 502, an eye detection apparatus 500 of the present invention, and a subsequent processing device 516. The image source 502 may be any storage medium, such as the storage device of a PC, or an image pickup device such as a camera or a scanner. The subsequent processing device may be an output device such as a monitor or a printer, a face determining device that determines face regions in the image based on the output of the eye detection apparatus, an automatic control system such as an access control system, or the like.
When the subsequent processing device 516 is a face determining device, it further comprises: a candidate face determining means for determining candidate face regions based on the remaining candidate eye regions output by the eye detection apparatus; a false face deleting means for deleting false face regions; and an output means for outputting the remaining face regions for subsequent processing.
As mentioned above, many methods are available to the candidate face determining means for determining candidate face regions from candidate eye regions. Likewise, many methods are available to the false face deleting means for excluding false candidate faces.
Returning now to Fig. 9, the eye detection apparatus 500 of the present invention comprises a reading means 504, a candidate eye region detecting means 506, a selecting means 507, a verifying means 508, a control means 510 and an output means 512.
The reading means 504 receives input from the image source 502. The image read in is processed by the candidate eye region detecting means 506 to generate a list of candidate eye regions. The selecting means 507 selects an unprocessed candidate eye region from the candidate list and passes it to the verifying means 508 for processing. The control means 510 receives the output of the verifying means 508 and controls the selecting means 507 to select the next unprocessed candidate eye region, if any. When no unprocessed candidate eye region remains in the candidate list, the control means 510 notifies the output means 512 to output the final list to the subsequent processing device 516.
The structure of the verifying means 508 is shown in Fig. 10, in which the dashed arrows and boxes illustrate the connections between the components of the verifying means and the other components of the eye detection apparatus 500.
As shown in Fig. 10, the verifying means 508 comprises a neighboring-candidate determining means 9020, a dark region determining means 9040 and a false eye region filter 9060. The neighboring-candidate determining means 9020 receives a candidate eye region from the selecting means 507 as the center candidate region and determines, around this center candidate region, which qualified candidate eye regions are its neighboring candidate regions. The dark region determining means 9040 then processes the neighborhood region containing all the neighboring candidate regions to obtain the dark regions. Information about the center candidate region, the neighboring candidate regions and the dark regions is sent to the false eye region filter 9060, which determines, based on the consistency between the dark regions and the candidate eye regions, which candidate eye regions (including the center candidate region) are false, and deletes them. The result is output to the control means 510, which in turn either controls the selecting means 507 as described above or outputs the result to the subsequent processing device 516.
As mentioned above, those of ordinary skill in the art will appreciate that various criteria are suitable for selecting the neighboring candidate regions in the neighboring-candidate determining means 9020. Likewise, various methods may be applied in the dark region determining means 9040 to process the neighborhood region to obtain the dark regions, and the consistency-based operation in the false eye region filter 9060 may be implemented in various ways. Therefore, the specific structures of the neighboring-candidate determining means 9020, the dark region determining means 9040 and the false eye region filter 9060 shown in Fig. 10 are merely examples, and the present invention should not be limited to them. That is, the fact that the verifying means 508 comprises the neighboring-candidate determining means 9020, the dark region determining means 9040 and the false eye region filter 9060 itself constitutes the key of the invention; the specific structures of these three means are merely further refinements.
The specific structures of these three means mentioned above are described in detail below.
As shown in Fig. 10, in a preferred embodiment, the neighboring-candidate determining means 9020 comprises a size difference comparing means 9022 for comparing the size difference between each unprocessed candidate eye region and the center candidate region with a first threshold, a luminance difference comparing means 9024 for comparing the luminance difference between each unprocessed candidate eye region and the center candidate region with a second threshold, and a distance comparing means 9026 for comparing the distance between each unprocessed candidate eye region and the center candidate region with a third threshold. Based on the comparison results, qualified candidate eye regions are added to the neighboring-candidate list. As mentioned above, the neighboring-candidate determining means may comprise only one of the three comparing means, or any combination of them; it may also comprise, or consist of, other means.
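The three comparisons performed by the comparing means 9022, 9024 and 9026 can be sketched as follows. This is a hedged illustration, not the patented implementation: the `Region` fields are assumptions, and the concrete thresholds (2 × S + 15 pixels for the size difference, 50 gray levels for the luminance difference, the center region's own width and height for the coordinate distances) are the preferred values quoted in the claims.

```python
# Minimal sketch of the three neighbor-qualification tests.
# Region is a hypothetical record; brightness is a 0-255 gray level.
from collections import namedtuple

Region = namedtuple("Region", "x y w h brightness")

def is_neighbor(center, other):
    s = center.w * center.h                       # size S of center region
    size_ok = abs(other.w * other.h - s) <= 2 * s + 15   # first threshold
    lum_ok = abs(other.brightness - center.brightness) <= 50  # second threshold
    dist_ok = (abs(other.x - center.x) <= center.w and        # DiffE <= AW
               abs(other.y - center.y) <= center.h)           # DiffN <= AH
    return size_ok and lum_ok and dist_ok
```

Any one test, or any combination, can be used on its own, mirroring the statement above that the determining means may comprise only some of the comparing means.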
The dark region determining means 9040 may be identical to the candidate eye region detecting means 506, or may be the candidate eye region detecting means 506 itself. Alternatively, the dark region determining means may comprise a binarizing means 9046 as shown in Fig. 10. In Fig. 10, the dark region determining means 9040 further comprises: a minimum area determining means 9042 for determining the minimum area containing all the neighboring candidate regions; a binarization threshold determining means for calculating the binarization threshold used by the binarizing means 9046 when binarizing that area; and a labeling means 9048 for labeling the connected dark pixels obtained after binarization as dark regions.
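The dark-region step can be sketched as follows: compute the binarization threshold from the formula given in the claims (threshold = AvrBrightness × a + AvrBackground × b, with the preferred a = 0.7, b = 0.3), binarize the minimum area, and label connected dark pixels. This is a pure-Python, 4-connectivity illustration under those assumptions, not the patented code:

```python
# Hedged sketch of binarization plus connected-component labeling.
def dark_regions(gray, avr_brightness, avr_background, a=0.7, b=0.3):
    """gray: 2-D list of gray levels (0-255) covering the minimum area.
    Returns one list of (y, x) pixels per connected dark region."""
    threshold = avr_brightness * a + avr_background * b
    h, w = len(gray), len(gray[0])
    dark = [[gray[y][x] < threshold for x in range(w)] for y in range(h)]
    labels = [[0] * w for _ in range(h)]
    regions, next_label = [], 1
    for y in range(h):
        for x in range(w):
            if dark[y][x] and labels[y][x] == 0:
                stack, pixels = [(y, x)], []
                labels[y][x] = next_label
                while stack:                      # flood-fill one component
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w and dark[ny][nx]
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = next_label
                            stack.append((ny, nx))
                regions.append(pixels)
                next_label += 1
    return regions
```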
An example of the false eye region filter 9060 comprises a corresponding dark region determining means 9062 and a screening means 9064. The former determines the corresponding dark region of each neighboring candidate region, and the latter deletes unqualified candidate eye regions, as described above.
Storage medium
The above objects of the present invention can also be achieved by running a program, or a set of programs, on any information processing device communicating with the image source 502 and the subsequent processing device 516 described above. The information processing device, image source and subsequent processing device are all well-known general-purpose devices. Therefore, the objects of the present invention can also be achieved merely by providing program code that implements the eye detection method. That is, a storage medium storing program code that implements the eye detection method constitutes the present invention.
Those skilled in the art can readily implement the eye detection method in any programming language; a detailed description of the program code is therefore omitted here.
Obviously, the storage medium may be of any type known to those skilled in the art, or of any type developed in the future, so there is no need to enumerate the various storage media here.
Although the present invention has been described above with reference to specific steps and structures, it is not limited to the details disclosed here. On the contrary, the application is intended to cover all modifications and variations that do not depart from the spirit and scope of the present invention.
Claims (39)
1. An eye detection method comprising the following steps:
a) reading in an image;
b) analyzing the image to obtain a list of candidate eye regions;
c) selecting an unprocessed candidate eye region from the candidate list as the center candidate region;
d) determining the neighboring candidate regions of the center candidate region based on predetermined criteria;
e) processing the minimum area containing all the neighboring candidate regions to obtain dark regions;
f) determining and deleting false candidate eye regions based on the consistency between the dark regions and the candidate eye regions;
g) repeating steps c) to f) until no unprocessed candidate eye region remains; and
h) outputting the candidate list for subsequent processing of the image.
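Steps a) to h) above can be sketched as a driver loop. The helper functions passed in (`detect_candidates`, `find_neighbors`, `extract_dark_regions`, `is_false`) stand for the sub-steps elaborated in the dependent claims and are assumptions of this sketch, not names from the patent:

```python
# Hypothetical driver loop for steps a)-h).
def detect_eyes(image, detect_candidates, find_neighbors,
                extract_dark_regions, is_false):
    candidates = detect_candidates(image)         # steps a)-b)
    processed = []
    while True:
        pending = [c for c in candidates if c not in processed]
        if not pending:                           # step g): all handled
            break
        center = pending[0]                       # step c)
        processed.append(center)
        neighbors = find_neighbors(center, candidates)      # step d)
        darks = extract_dark_regions(image, neighbors)      # step e)
        false = [c for c in [center] + neighbors            # step f)
                 if is_false(c, darks)]
        candidates = [c for c in candidates if c not in false]
    return candidates                             # step h)
```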
2. The eye detection method of claim 1, characterized in that the method further comprises the following steps:
determining candidate face regions based on the remaining candidate eye regions obtained in step h);
deleting false face regions; and
outputting the remaining face regions for subsequent processing.
3. The eye detection method of claim 1 or 2, characterized in that step c) comprises: sorting the candidate eye regions by size in ascending or descending order, and then selecting candidate eye regions in sequence from the candidate list as the center candidate region.
4. The eye detection method of claim 1 or 2, characterized in that step d) comprises: comparing the size difference between each unprocessed candidate eye region and the center candidate region with a first threshold, wherein, if the size difference is not greater than the first threshold, the unprocessed candidate eye region is determined to be a neighboring candidate region of the center candidate region.
5. The eye detection method of claim 4, characterized in that the size is an area value, and the first threshold ranges from 0 to the area of the image.
6. The eye detection method of claim 5, characterized in that the first threshold is 0.4 × S to 10 × S, where S is the size of the center candidate region.
7. The eye detection method of claim 6, characterized in that the first threshold is 2 × S + 15 (pixels).
8. The eye detection method of claim 1 or 2, characterized in that step d) comprises: comparing the luminance difference between each unprocessed candidate eye region and the center candidate region with a second threshold, wherein, if the luminance difference is not greater than the second threshold, the unprocessed candidate eye region is determined to be a neighboring candidate region of the center candidate region.
9. The eye detection method of claim 8, characterized in that the luminance is measured on a 255-gray-level scale, and the second threshold ranges from 0 to 255.
10. The eye detection method of claim 9, characterized in that the second threshold is 20 to 80.
11. The eye detection method of claim 10, characterized in that the second threshold is 50.
12. The eye detection method of claim 1 or 2, characterized in that step d) comprises: comparing the distance between each unprocessed candidate eye region and the center candidate region with a third threshold, wherein, if the distance is not greater than the third threshold, the unprocessed candidate eye region is determined to be a neighboring candidate region of the center candidate region.
13. The eye detection method of claim 12, characterized in that the distance is the Euclidean distance, and the third threshold ranges from 0 to the longest distance between any two pixels in the image.
14. The eye detection method of claim 12, characterized in that the distance is a coordinate difference comprising DiffE and DiffN, and the third threshold comprises a DiffE threshold and a DiffN threshold, ranging respectively from 0 to the width of the image and from 0 to the height of the image; if DiffE and DiffN are not greater than the DiffE threshold and the DiffN threshold respectively, the corresponding candidate eye region is judged to be a neighboring candidate region of the center candidate region.
15. The eye detection method of claim 14, characterized in that the DiffE threshold is 0.2 × AW to 5 × AW and the DiffN threshold is 0.2 × AH to 5 × AH, where AW is the width of the center candidate region and AH is its height.
16. The eye detection method of claim 15, characterized in that the DiffE threshold is AW and the DiffN threshold is AH.
17. The eye detection method of claim 1 or 2, characterized in that step d) comprises: comparing the size difference, luminance difference and distance between each unprocessed candidate eye region and the center candidate region with first, second and third thresholds respectively, wherein, if the size difference, luminance difference and distance are not greater than the three thresholds respectively, the unprocessed candidate eye region is determined to be a neighboring candidate region of the center candidate region.
18. The method of claim 1 or 2, characterized in that, in step e), the method used for determining the dark regions is the same as that used in step b).
19. The method of claim 1 or 2, characterized in that step e) comprises a binarization step.
20. The eye detection method of claim 19, characterized in that the binarization threshold used in the binarization step is calculated according to the following formula:

threshold = AvrBrightness × a + AvrBackground × b

where AvrBrightness and AvrBackground are respectively the average brightness and the average background of all the neighboring candidate regions, coefficients a and b each range from 0 to 1, and a + b = 1.
21. The eye detection method of claim 20, characterized in that coefficient a is 0.4-0.9 and coefficient b is 0.1-0.6.
22. The eye detection method of claim 21, characterized in that coefficient a is 0.7 and coefficient b is 0.3.
23. The eye detection method of claim 1 or 2, characterized in that step f) comprises the following steps:
determining the corresponding dark region of each neighboring candidate region; and
deleting candidate eye regions according to the following criteria:
i) if a corresponding dark region corresponds to multiple candidate eye regions, the candidate eye regions with the smaller overlapping areas are deleted;
ii) if multiple candidate eye regions overlap the same corresponding dark region with identical overlapping areas, all of them are deleted;
iii) candidate eye regions having no corresponding dark region are deleted.
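One possible reading of criteria i)-iii) is sketched below (an interpretation, not the patented code): for each dark region, only the candidate with the strictly largest overlap survives; equal overlaps delete all contenders, and candidates overlapping no dark region are dropped. Regions are illustrative `(x, y, w, h)` tuples:

```python
# Hedged sketch of the false-candidate screening criteria i)-iii).
def overlap_area(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = min(ax + aw, bx + bw) - max(ax, bx)
    h = min(ay + ah, by + bh) - max(ay, by)
    return w * h if w > 0 and h > 0 else 0

def filter_candidates(candidates, dark_regions):
    survivors = []
    for dark in dark_regions:
        overlaps = [(overlap_area(c, dark), c) for c in candidates]
        overlaps = [(a, c) for a, c in overlaps if a > 0]
        if not overlaps:
            continue
        best = max(a for a, _ in overlaps)
        winners = [c for a, c in overlaps if a == best]
        if len(winners) == 1:       # criterion i): keep the largest overlap
            survivors.append(winners[0])
        # criterion ii): identical overlaps -> delete all contenders
    # criterion iii): candidates overlapping no dark region are dropped
    return survivors
```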
24. An eye detection apparatus (500) comprising: a reading means for reading in an image; a candidate eye region detecting means for analyzing the image read in to obtain a list of candidate eye regions; and an output means for outputting the candidate eye region list for subsequent processing of the image; characterized in that the apparatus further comprises: a selecting means (507) for selecting an unprocessed candidate eye region from the candidate list as the center candidate region; a verifying means for deleting false eye regions from among the neighboring candidate regions of the center candidate region; and a control means for controlling the selecting means so that all candidate eye regions in the candidate list are processed; the verifying means further comprising:
a neighboring-candidate determining means for determining which candidate eye regions in the candidate list should serve as the neighboring candidate regions of the center candidate region;
a dark region determining means for processing the minimum area containing the neighboring candidate regions to obtain a series of dark regions; and
a false eye region filter for determining and deleting false candidate eye regions based on the consistency between the dark regions and the candidate eye regions.
25. The eye detection apparatus (500) of claim 24, characterized in that it further comprises a sorting means for sorting the candidate eye regions by size in ascending or descending order, to facilitate the selection operation of the selecting means (507).
26. The eye detection apparatus (500) of claim 24 or 25, characterized in that the neighboring-candidate determining means (9020) comprises a size difference comparing means (9022), which determines the neighboring candidate regions of the center candidate region by comparing the size difference between each unprocessed candidate eye region and the center candidate region with a first threshold.
27. The eye detection apparatus (500) of claim 24 or 25, characterized in that the neighboring-candidate determining means (9020) comprises a luminance difference comparing means (9024), which determines the neighboring candidate regions of the center candidate region by comparing the luminance difference between each unprocessed candidate eye region and the center candidate region with a second threshold.
28. The eye detection apparatus (500) of claim 24 or 25, characterized in that the neighboring-candidate determining means (9020) comprises a distance comparing means (9026), which determines the neighboring candidate regions of the center candidate region by comparing the distance between each unprocessed candidate eye region and the center candidate region with a third threshold.
29. The eye detection apparatus (500) of claim 24 or 25, characterized in that the neighboring-candidate determining means (9020) comprises a size difference comparing means (9022), a luminance difference comparing means (9024) and a distance comparing means (9026) for comparing the size difference, luminance difference and distance between each unprocessed candidate eye region and the center candidate region with first, second and third thresholds respectively, a candidate eye region whose size difference, luminance difference and distance are all not greater than the three thresholds being taken as a neighboring candidate region of the center candidate region.
30. The eye detection apparatus (500) of claim 24 or 25, characterized in that the dark region determining means (9040) is identical to the candidate eye region detecting means (506).
31. The eye detection apparatus (500) of claim 24 or 25, characterized in that the dark region determining means (9040) comprises a binarizing means (9046).
32. The eye detection apparatus (500) of claim 31, characterized in that the dark region determining means (9040) further comprises a binarization threshold determining means for calculating the binarization threshold used by the binarizing means (9046) according to the following formula:

threshold = AvrBrightness × a + AvrBackground × b

where AvrBrightness and AvrBackground are respectively the average brightness and the average background of all the neighboring candidate regions, coefficients a and b each range from 0 to 1, and a + b = 1.
33. The eye detection apparatus of claim 32, characterized in that coefficient a is 0.4-0.9 and coefficient b is 0.1-0.6.
34. The eye detection apparatus of claim 33, characterized in that coefficient a is 0.7 and coefficient b is 0.3.
35. The eye detection apparatus of claim 24 or 25, characterized in that the false eye region filter comprises:
a corresponding dark region determining means (9062) for determining a corresponding dark region for each neighboring candidate region; and
a screening means (9064) for deleting candidate eye regions according to the following criteria:
i) if a corresponding dark region corresponds to multiple candidate eye regions, the candidate eye regions with the smaller overlapping areas are deleted;
ii) if multiple candidate eye regions overlap the same corresponding dark region with identical overlapping areas, all of them are deleted;
iii) candidate eye regions having no corresponding dark region are deleted.
36. An eye detection system comprising: an image source (502), the eye detection apparatus (500) of claim 24, and a subsequent processing device (516).
37. The eye detection system of claim 36, characterized in that the subsequent processing device (516) is a face determining device comprising:
a candidate face determining means for determining candidate face regions based on the remaining candidate eye regions output by the eye detection apparatus;
a false face deleting means for deleting false face regions; and
an output means for outputting the remaining face regions for subsequent processing.
38. The eye detection system of claim 36 or 37, characterized in that the eye detection apparatus (500) is the eye detection apparatus of any one of claims 24-35.
39. A storage medium storing program code for implementing the eye detection method of any one of claims 1-23.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNB02160407XA CN100465986C (en) | 2002-12-31 | 2002-12-31 | Human detecting method, apparatus, system and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN1514398A true CN1514398A (en) | 2004-07-21 |
CN100465986C CN100465986C (en) | 2009-03-04 |
Family
ID=34237874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB02160407XA Expired - Fee Related CN100465986C (en) | 2002-12-31 | 2002-12-31 | Human detecting method, apparatus, system and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN100465986C (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001216515A (en) * | 2000-02-01 | 2001-08-10 | Matsushita Electric Ind Co Ltd | Method and device for detecting face of person |
-
2002
- 2002-12-31 CN CNB02160407XA patent/CN100465986C/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN100465986C (en) | 2009-03-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20090304 Termination date: 20171231 |
|
CF01 | Termination of patent right due to non-payment of annual fee |