GB2427777A - Object detection using pairs of point features - Google Patents

Object detection using pairs of point features

Info

Publication number
GB2427777A
GB2427777A GB0506766A
Authority
GB
United Kingdom
Prior art keywords
image
point
value
pairs
image value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0506766A
Other versions
GB0506766D0 (en)
Inventor
Richard Ian Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PEPPERDOG Ltd
Original Assignee
PEPPERDOG Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PEPPERDOG Ltd filed Critical PEPPERDOG Ltd
Priority to GB0506766A priority Critical patent/GB2427777A/en
Publication of GB0506766D0 publication Critical patent/GB0506766D0/en
Publication of GB2427777A publication Critical patent/GB2427777A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06K9/00228
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/165Detection; Localisation; Normalisation using facial parts and geometric relationships
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems

Abstract

In an image processing apparatus, image data defining input images of objects is processed to indicate the position, rotation and scale of detected objects within the images. Processing is carried out to apply a series of test stages to every possible location in the image, mediated by a supplied process mask. Each stage itself consists of a number of feature tests which are based on pairs of point features projected onto the image.

Description

2427777 Image Processing Apparatus
This invention relates to the automatic detection of objects in images.
There are many applications where a still image, or a video stream, must be searched for a particular type of object. For example, a roadside camera may be required to detect passing vehicles so that their number-plates can be read, or a CCTV camera may need to detect human faces so that a PTZ camera can capture a high resolution picture of the individual.
Some systems exist which attempt to perform this function, most notably that described by P. Viola and M. Jones in "Fast and Robust Classification using Asymmetric AdaBoost and a Detector Cascade", Neural Information Processing Systems 2002. However, existing systems are either too slow, too inaccurate (fail to detect enough valid objects or detect too many invalid objects) or too inflexible (can only detect objects with restricted position or size or rotation).
The present invention has been made with the above problems in mind.
According to the present invention there is provided a method or apparatus for automatically testing for an object at a number of possible positions, scales and rotations within an individual image or video frame.
The present invention may also utilise an externally supplied mask defining a set of image pixels which may be considered as possible object candidates and a set of image pixels which should not be considered as object candidates.
The present invention further provides instructions for configuring a programmable processing apparatus to perform such a method or to become configured as such an apparatus.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which: Figure 1 is a block diagram showing an example of notional functional components within a processing apparatus of an embodiment of the invention; Figure 2 shows the processing operations performed by the apparatus shown in Figure 1; Figure 3 shows an example of an input image to be processed by the apparatus shown in Figure 1; Figure 4 shows an example of a feature pair used during the processing of an image by the apparatus shown in Figure 1.

The main function of the invention is to detect objects. Instead of using a small number of complex tests, as is often done, this invention uses a large number of simple tests. In fact the tests used are as simple as possible. Each test consists of a comparison between the image values at only two points.
Two points is the minimum, because using a single point would require the absolute image values to be consistent, which is almost never the case.
Comparing a pair of points to test if one value is greater than the other is an extremely robust measurement: any monotonic transformation of the image will leave the measurement unchanged. If there are many points where the values are similar, such that a small amount of noise may change the value of the test, then a threshold on the difference may be applied for additional robustness.
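The two-point comparison and its invariance under monotonic transformations can be sketched as follows (a minimal illustration; the function and variable names are our own, not taken from the patent):

```python
def feature_test(image, p, q, threshold=0):
    """Pass if the image value at point p exceeds the value at point q
    by more than threshold (the patent's basic two-point test)."""
    (px, py), (qx, qy) = p, q
    return image[py][px] - image[qy][qx] > threshold

# A tiny 2x2 grey-scale "image" (rows of pixel values).
img = [[10, 200],
       [30,  40]]

# The zero-threshold test depends only on the ordering of the two values...
assert feature_test(img, (1, 0), (0, 0)) is True   # 200 > 10

# ...so any monotonic (order-preserving) transformation, such as a
# global brightness/contrast change, leaves the result unchanged.
brighter = [[2 * v + 17 for v in row] for row in img]
assert feature_test(brighter, (1, 0), (0, 0)) is True
```

With a non-zero `threshold`, near-equal values no longer flip the test under small amounts of noise, at the cost of strict invariance to arbitrary monotonic transformations.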
The particular selection of tests to be used and the choice of any required thresholds is problem dependent. Tests and thresholds are therefore determined by a suitable training scheme using either examples of the objects to be detected or some model of the objects. Such training and selection is not part of this invention: it is assumed that all tests and thresholds are coded into the apparatus or are loaded from an external source.
Referring to Figure 1, an embodiment of the invention comprises a processing apparatus 10, such as a personal computer, user input devices 11, such as a keyboard, mouse etc., and output devices 12, such as a monitor, printer or other networked device.
The processing apparatus 10 is programmed to operate in accordance with programming instructions input, for example, as data stored on some medium such as a CD 13, and/or as a signal 14 from, for example, a remote database over a link such as the internet, and/or entered by the user via user input devices 11.
The programming instructions comprise instructions to cause the processing apparatus 10 to become configured to process image data 15 and mask data 16 and user input data via input devices 11. The processing apparatus 10 is also configured to allow detected objects and other data to be recovered in response to user-defined or predefined search criteria and output to one or more output devices 12.
When programmed by the programming instructions, processing apparatus 10 effectively becomes configured into a number of functional units for performing processing operations. Examples of such functional units and their interconnections are shown in Figure 1. The illustrated units and interconnections in Figure 1 are, however, notional and are shown for illustrative purposes only to assist understanding; they do not necessarily represent the exact units and connections into which the processor, memory etc. of the processing apparatus become configured.

Referring to the functional units shown in Figure 1, central controller 17 processes inputs from the user input devices 11 and also provides control and processing for a number of other functional units.
Data store 18 stores image data and other data input to the processing apparatus 10. It may also store intermediate results and the results of previous processing.
Object detector 19 performs processing of image data to test for the presence of an object within that image data.
Output controller 20 generates data for output to another processing apparatus or output device which conveys the results of processing to the user and/or external device.
Figure 2 shows the processing operations performed in this embodiment by the processing apparatus 10.
Referring to Figure 2, at step S10 the apparatus 10 collects a set of input data comprising image data 15 and mask data 16 and user input via user input devices 11.
Figure 3 shows an example of image data 15 to be input at step S10. This image contains a single person against a background.
The mask data 16 to be input at step S10 may be fixed for all images or may be calculated from the image, or a sequence of images, by some other apparatus.
Examples of the user data input at step S10 would be the class of objects of interest, viewpoint data, or some restrictions on the possible size and orientation of objects.
Referring to Figure 2, at step S20 the apparatus 10 checks whether there are any more combinations of position, orientation and scale that have yet to be tested, subject to the supplied mask. If there are then the next combination is selected and processing proceeds to step S30; otherwise processing proceeds to step S90.
Referring to Figure 2, at step S90 the central controller 17 instructs the output controller to output the set of detected objects from data store 18 to the output devices 12.
Referring to Figure 2, at step S80 the apparatus 10 checks whether any more images remain to be processed. If there are then processing proceeds to step S10; otherwise the apparatus stops.
Referring to Figure 2, at step S30 the apparatus 10 checks whether any further test stages remain for this location, orientation and scale. If there are then processing proceeds to step S40; otherwise processing proceeds to step S100.
Referring to Figure 2, at step S100 the object detector 19 records the current location, orientation and scale in data store 18 as a detected object.
Referring to Figure 2, at step S40 the apparatus 10 sets a score counter to zero and selects the set of feature tests, weights and the threshold for the next test stage. At step S50 the apparatus tests whether the value of the score counter is greater than the selected threshold. If it is then processing proceeds to step S30; otherwise processing proceeds to step S60.
Referring to Figure 2, at step S60 the apparatus 10 tests whether there are any remaining feature tests to be carried out in the current test stage. If there are then processing proceeds to step S70; otherwise processing proceeds to step S20.
Referring to Figure 2, at step S70 the apparatus 10 selects the next feature test and applies it to the image. If the test is passed then the weight associated with the test is added to the current score counter; otherwise the counter is unchanged.
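The control flow of steps S20 to S100 can be summarised in a short Python sketch. The names below, and the `stages`/`apply_test` interfaces, are our own assumptions for illustration; the patent prescribes only the flowchart, not an implementation:

```python
def detect(image, candidates, stages, apply_test):
    """Scan candidate (position, orientation, scale) combinations (step S20)
    through a sequence of test stages (steps S30-S70); candidates surviving
    every stage are recorded as detections (step S100).

    `stages` is a list of (threshold, [(test, weight), ...]) pairs, and
    `apply_test(image, candidate, test)` returns True if a feature test
    passes -- both interfaces are assumed, not taken from the patent.
    """
    detections = []
    for candidate in candidates:                 # S20: next untested combination
        passed_all = True
        for threshold, feature_tests in stages:  # S30: next test stage
            score = 0                            # S40: reset score counter
            for test, weight in feature_tests:   # S60/S70: apply feature tests
                if apply_test(image, candidate, test):
                    score += weight              # passed tests add their weight
                if score > threshold:            # S50: stage passed, stop early
                    break
            if score <= threshold:               # tests exhausted: stage failed
                passed_all = False
                break                            # reject candidate, back to S20
        if passed_all:
            detections.append(candidate)         # S100: record detected object
    return detections
```

Note the early exits: a candidate that passes a stage's threshold skips the stage's remaining tests, and a candidate that fails any stage is rejected without running later stages, which is what makes a cascade of many simple tests fast in practice.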
Referring to Figure 4, each feature test is defined within a window 40 by a pair of points 41 and 42. Each point lies within the circle 44 that just fits inside the window, so that rotation about the centre 43 ensures that all points still lie within window 40. The window is transformed into an image region 49 by applying transformations to the points 41, 42, such as a translation 46, rotation 47 and a scale 48, and rounding to the nearest pixel.
Given two transformed points 50, 51 the feature test is passed if the image brightness at point 50 is greater than the image brightness at point 51.
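The projection of a reference-window point into image coordinates (translation 46, rotation 47, scale 48, then rounding, per Figure 4) might be written as follows. The patent gives no formulas, so the similarity-transform convention below is our assumption:

```python
import math

def transform_point(point, translation, rotation, scale):
    """Map a reference-window point into image coordinates by scaling and
    rotating about the origin (taken here as the window centre 43), then
    translating into the image and rounding to the nearest pixel."""
    x, y = point
    tx, ty = translation
    c, s = math.cos(rotation), math.sin(rotation)
    u = scale * (c * x - s * y) + tx
    v = scale * (s * x + c * y) + ty
    return round(u), round(v)

# Identity rotation and unit scale simply translate the point.
assert transform_point((1, 0), (10, 20), 0.0, 1.0) == (11, 20)
```

Because the reference points lie inside the inscribed circle 44, any rotation keeps their transformed positions inside the scaled window, as the description notes.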
A number of modifications are possible to the embodiment described above.
For example, in the embodiment above at step S70 the apparatus 10 compares the brightness of the two points. However, other image channels may be used for comparison. Also, the value of one point in one channel may be compared to the value of the other point in a different channel; in this case it is also possible for the two points to be identical.
In the embodiment above, at step S70 the test is a simple "greater than" operation. However, the test may be more complex: for example, we may require one value to exceed the other by more than some threshold, or we may require one value to exceed a threshold and the other to fall below a threshold.
In the embodiment above, at step S70 measurements are made by rounding points to the nearest pixel. However, image values may instead be interpolated and/or, in the case of large scales, images may first be subsampled to an appropriate size.
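Bilinear interpolation of an image value at a non-integer point, mentioned above as an alternative to rounding, could look like this (an illustrative sketch, not taken from the patent):

```python
def bilinear(image, x, y):
    """Interpolate the image value at non-integer coordinates (x, y)
    from the four surrounding pixels (row-major list of rows)."""
    x0, y0 = int(x), int(y)      # top-left neighbouring pixel
    fx, fy = x - x0, y - y0      # fractional offsets within the cell
    v00 = image[y0][x0]
    v10 = image[y0][x0 + 1]
    v01 = image[y0 + 1][x0]
    v11 = image[y0 + 1][x0 + 1]
    top = v00 * (1 - fx) + v10 * fx
    bottom = v01 * (1 - fx) + v11 * fx
    return top * (1 - fy) + bottom * fy

img = [[0, 10],
       [20, 30]]
assert bilinear(img, 0.5, 0.5) == 15.0   # average of the four corners
```

Interpolation matters most when the scale 48 is large, since rounding then maps many nearby sub-pixel positions onto the same pixel and discards information the comparison could use.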
In the embodiment above, processing is performed by a computer using processing routines defined by programming instructions. However, some or all of the processing could be performed using electronic or mechanical hardware.
Other changes and modifications can be made without departing from the spirit and scope of the invention.

Claims (9)

  1. Apparatus for processing image data to detect one or more objects contained within the image using feature tests based on pairs of point features.
  2. Apparatus according to claim 1, where the point features are transformed by a function or a look-up table from a reference window.
  3. Apparatus according to any preceding claim, where a test comprises an image value at one point being greater than an image value at the other point by some threshold value.
  4. Apparatus according to claim 3, where either image value is one of brightness, saturation, hue, red, green, blue or some calculated combination thereof.
  5. A method of processing image data to detect one or more objects contained within the image using feature tests based on pairs of point features.
  6. A method according to claim 5, where the point features are transformed by a function or a look-up table from a reference window.
  7. A method according to claim 5 or 6, where a test comprises an image value at one point being greater than an image value at the other point by some threshold value.
  8. A method according to claim 7, where either image value is one of brightness, saturation, hue, red, green, blue or some calculated combination thereof.
  9. A storage device storing computer-usable instructions for causing a programmable processing apparatus to become operable to perform a method according to any of claims 5 to 8.
  10. A signal conveying computer-usable instructions for causing a programmable processing apparatus to become operable to perform a method according to any of claims 5 to 8.
GB0506766A 2005-04-02 2005-04-02 Object detection using pairs of point features Withdrawn GB2427777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0506766A GB2427777A (en) 2005-04-02 2005-04-02 Object detection using pairs of point features


Publications (2)

Publication Number Publication Date
GB0506766D0 GB0506766D0 (en) 2005-05-11
GB2427777A true GB2427777A (en) 2007-01-03

Family

ID=34586635

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0506766A Withdrawn GB2427777A (en) 2005-04-02 2005-04-02 Object detection using pairs of point features

Country Status (1)

Country Link
GB (1) GB2427777A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0800145A2 (en) * 1996-04-01 1997-10-08 Siemens Aktiengesellschaft Method for recognition by computer of at least one finger-shaped object in a hand-shaped first object


Also Published As

Publication number Publication date
GB0506766D0 (en) 2005-05-11

Similar Documents

Publication Publication Date Title
US11222239B2 (en) Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
CN104919794B (en) For extracting the method and system of metadata from master-slave mode camera tracking system
US9571797B2 (en) Network equipment, network system and surveillance camera system
US7860162B2 (en) Object tracking method and object tracking apparatus
US20190180583A1 (en) Information processing system, method and computer readable medium for determining whether moving bodies appearing in first and second videos are the same or not
JP2008250908A (en) Picture discriminating method and device
CN110866512B (en) Monitoring camera shielding detection method based on video classification
KR20060119968A (en) Apparatus and method for feature recognition
US20150071529A1 (en) Learning image collection apparatus, learning apparatus, and target object detection apparatus
WO2002032129A1 (en) A method of searching recorded digital video for areas of activity
US20120020514A1 (en) Object detection apparatus and object detection method
US10455144B2 (en) Information processing apparatus, information processing method, system, and non-transitory computer-readable storage medium
JPH08251521A (en) Image input device
JP2004524901A (en) Method and apparatus for reducing correlation noise in image data
WO2022009944A1 (en) Video analysis device, wide-area monitoring system, and method for selecting camera
JPH10263126A (en) Form analyzing system for exercise using personal computer
KR101124560B1 (en) Automatic object processing method in movie and authoring apparatus for object service
GB2427777A (en) Object detection using pairs of point features
CN113781384A (en) Video quality evaluation method and device
US20230098829A1 (en) Image Processing System for Extending a Range for Image Analytics
JP5098160B2 (en) Security device, security system and security program
US11182619B2 (en) Point-of-interest determination and display
CN117121051A (en) Privacy filter based on real-time machine learning for removing reflective features from images and video
EP2267673A2 (en) Digital video recording system and self-test method thereof
US11023769B2 (en) Modifying an image based on identifying a feature

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)