GB2417801A - Image processing apparatus - Google Patents

Image processing apparatus

Info

Publication number
GB2417801A
Authority
GB
United Kingdom
Prior art keywords
images
processing
data
metrics
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0419886A
Other versions
GB0419886D0 (en)
Inventor
Richard Ian Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PEPPERDOG Ltd
Original Assignee
PEPPERDOG Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PEPPERDOG Ltd filed Critical PEPPERDOG Ltd
Priority to GB0419886A priority Critical patent/GB2417801A/en
Publication of GB0419886D0 publication Critical patent/GB0419886D0/en
Publication of GB2417801A publication Critical patent/GB2417801A/en
Withdrawn legal-status Critical Current
Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/162 - Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • G06V40/169 - Holistic features and representations, i.e. based on the facial image taken as a whole
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/155 - Use of biometric patterns for forensic purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

In an image processing apparatus, image data defining input images of people is processed to generate a selection of other images of people from a library of images. Processing is carried out to associate some specific signature data with each individual image within the library, and signature data is used to automatically select a subset of the images in the library which best meet some operator-specified criteria. The signature data includes colour histograms.

Description

Image Processing Apparatus

This invention relates to the automatic selection of a number of images of people from libraries of images.
There are many applications where an individual or an organization has access to a large library of images and wishes to select some subset of those images which best satisfies some criteria. For example, the police may have a particular suspect and require images of people who resemble the suspect for the purposes of constructing a witness book or for creating a line-up; or a modelling agency may have chosen one or more models and require images of other models who either resemble or complement those models for the purposes of constructing a portfolio or for organising a shoot.
Such selections can be done manually, by looking through the images one at a time, by viewing pages of thumbnail images, or by selecting images randomly. The process can also be partially automated by associating text annotations with the images and then performing a search in the normal way, as happens at images.google.com for example. Existing systems which use these approaches are either very time-consuming for the operator or they rely very heavily on the accuracy and relevance of the text annotations associated with the images.
The present invention has been made with the above problems in mind.
According to the present invention there is provided a method or apparatus for automatically associating some specific signature data with each individual image within a library.
The present invention also provides a method or apparatus for utilising signature data to automatically select a subset of the images in a library which best meet some operator-specified criteria. The criteria may take the form of similarity to one or more target images and/or dissimilarity from one or more target images.
The present invention may also utilise manually selected or entered search terms (such as age, height, weight, sex, ethnicity) which can relate to the signature data or some other data which are associated with the library images by some other means.
The present invention further provides instructions for configuring a programmable processing apparatus to perform such a method or to become configured as such an apparatus.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:

Figure 1 is a block diagram showing an example of notional functional components within a processing apparatus of an embodiment of the invention;

Figure 2 shows the processing operations performed by the apparatus shown in Figure 1;

Figure 3 shows an example of an input image to be processed by the apparatus shown in Figure 1;

Figure 4 shows an example of a window function calculated during the processing of an image by the apparatus shown in Figure 1.

Referring to Figure 1, an embodiment of the invention comprises a processing apparatus 10, such as a personal computer, user input devices 11, such as a keyboard, mouse etc., and output devices 12, such as a monitor, printer or other networked device.
The processing apparatus 10 is programmed to operate in accordance with programming instructions input, for example, as data stored on some medium such as a CD 13, and/or as a signal 14 from, for example, a remote database over a link such as the internet, and/or entered by the user via user input devices 11.
The programming instructions comprise instructions to cause the processing apparatus 10 to become configured to process image data 15 and other data 16 and user input data via input devices 11. The processing apparatus 10 is also configured to allow image data and other data to be recovered in response to user-defined or predefined search criteria and output to one or more output devices 12.
When programmed by the programming instructions, processing apparatus 10 effectively becomes configured into a number of functional units for performing processing operations. Examples of such functional units and their interconnections are shown in Figure 1. The illustrated units and interconnections in Figure 1 are, however, notional and are shown for illustrative purposes only to assist understanding: they do not necessarily represent the exact units and connections into which the processor, memory etc. of the processing apparatus become configured.
Referring to the functional units shown in Figure 1, central controller 17 processes inputs from the user input devices 11 and also provides control and processing for a number of other functional units.
Data store 18 stores image data and other data input to the processing apparatus 10. It may also store intermediate results and the results of previous processing.
Signature generator 19 performs processing of image data to generate signature data to be associated with that image data.
Matcher 20 performs processing on signature data and other data to compare input data with stored data and select the records which best satisfy the query criteria.
Output controller 21 generates data for output to another processing apparatus or output device which conveys the results of processing to the user and/or external device.
Figure 2 shows the processing operations performed in this embodiment by the processing apparatus 10.
Referring to Figure 2, at step S10 the apparatus 10 collects a set of input data comprising image data 15 and other data 16 and user input via user input devices 11.
Figure 3 shows an example of image data 15 to be input at step S10. This image contains a single person against a plain blue background.
An example of other data 16 to be input at step S10 would be the date and time at which the image data was captured.
Examples of the user data input at step S10 would be whether this is a new image for the library or a library query, the name of the person in the image, the sex of the person, the age of the person, and, if this is a query, details of the query criteria.
Referring to Figure 2, at step S20 the apparatus 10 processes the input data to determine the action required by the user and to extract information from the image data. All or some of the data may then be stored within the apparatus 10 in data store 18 or on some media or network accessible by the apparatus 10.
The information extracted by signature generator 19 from the image data at step S20 in this embodiment is as follows.
Firstly the image data is processed to find any faces that are present: this is done using a standard technique such as that described in "Fast and Robust Classification using Asymmetric AdaBoost and a Detector Cascade" by P. Viola and M. Jones, Neural Information Processing Systems 2002.
Referring to Figure 4, the distance between the eyes 40 is calculated and the mid-point of the face 41 is defined to be the mid-point of the eyes. A window on the image is then defined to be all image pixels which lie within a radius 42 of the mid-point 41. In this embodiment the radius 42 is defined to be 1.5 times the eye separation 40.
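As an illustration only, the circular window described above could be computed as in the following Python sketch; the function and parameter names are my own, since the embodiment specifies only the geometry (centre at the eye mid-point, radius 1.5 times the eye separation):

```python
import numpy as np

def face_window_mask(h, w, left_eye, right_eye, radius_factor=1.5):
    """Boolean mask over an h-by-w image selecting pixels inside the window.

    left_eye and right_eye are (x, y) pixel coordinates.  The window is a
    circle centred on the mid-point of the eyes, with radius equal to
    radius_factor times the eye separation (1.5 in the embodiment).
    """
    left_eye = np.asarray(left_eye, dtype=float)
    right_eye = np.asarray(right_eye, dtype=float)
    centre = (left_eye + right_eye) / 2.0
    eye_sep = np.linalg.norm(right_eye - left_eye)
    radius = radius_factor * eye_sep
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    return (xs - centre[0]) ** 2 + (ys - centre[1]) ** 2 <= radius ** 2
```

The mask can then be used to gather the window's pixels with `image[mask]` before histogramming.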
A signature is then calculated from the pixel values which lie within the window. In this embodiment the signature is a sequence of 420 numbers derived from 3 colour histograms of a) the whole window, b) the top half of the window and c) the bottom half of the window.
Colour histograms are calculated by counting the number of pixels within the window or sub-window which have the same RGB colour value. In this embodiment only the 3 most significant bits in each of R, G, B are used and only pixels with colours in which R is the largest component are counted.
Each histogram is normalised by dividing each of its 140 counts by the total number of pixels counted. The 3 normalised histograms are then concatenated to form the 420-valued signature.
Referring to Figure 2, at step S30 the apparatus 10 determines whether a query is being made. If it is then step S40 is carried out, otherwise step S10 is carried out.
Referring to Figure 2, at step S40 the apparatus 10 examines the query criteria that have been specified and matcher 20 calculates the best matches with the records that it has available.
In this embodiment an example query is "16 people most similar to the given image, whose sex is male and whose age is 20-30". For this example the apparatus 10 would search the available records for instances where the sex was male and the age was in the range 20-30. From those results the apparatus 10 would select the 16 instances with the largest similarity score between the signature of that instance and the signature of the input image data 15.
The similarity score between two signatures is defined as the negative sum of the squared differences between corresponding elements of the signatures.
A score of zero means the signatures are identical and a large negative score means the signatures are very different.
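The scoring and top-k selection described above can be sketched as follows; the record representation (a list of id/signature pairs) and the function names are illustrative assumptions, not part of the specification:

```python
import numpy as np

def similarity(sig_a, sig_b):
    """Negative sum of squared differences between two signatures:
    0 for identical signatures, increasingly negative as they diverge."""
    sig_a = np.asarray(sig_a, dtype=float)
    sig_b = np.asarray(sig_b, dtype=float)
    return -float(np.sum((sig_a - sig_b) ** 2))

def best_matches(query_sig, records, k=16):
    """Return the ids of the k records whose signatures score highest
    against the query.  `records` is a list of (record_id, signature)
    pairs, assumed to have already passed any boolean criteria (sex,
    age range, etc.)."""
    scored = sorted(records,
                    key=lambda rec: similarity(query_sig, rec[1]),
                    reverse=True)
    return [rec_id for rec_id, _ in scored[:k]]
```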
Referring to Figure 2, at step S50 the apparatus 10 uses output controller 21 to output the results it has calculated to one or more of the output devices 12.
A number of modifications are possible to the embodiment described above.
For example, in the embodiment above at step S10 the apparatus 10 collects all of the input data simultaneously. However, the data collection process may be interactive, with the apparatus 10 prompting the user for a sequence of inputs which depends on the previous inputs and the image data and other data available. Also, step S10 could be the continuation of a previous query, in which case the user may be prompted to add to, change or delete previously entered inputs.
In the embodiment above, Figure 3 shows a single image of a single person against a plain background. However, more complex images may be used in which more than one person is present and the background is cluttered; also, several images may be input at the same time as part of the same query or library record.
In the embodiment above, other data 16 input at step S10 is the date and time at which the image data was captured. However, this data may not be available, or other data, such as the location or camera type, may be available.
Also, the data normally entered by the user at S10 may instead be available as other data 16 if it has been entered previously by this or another user or captured by some other means.
In the embodiment above, user data captured at step S10 is used to determine the query criteria for step S40. However, the query may be fixed in advance or may be defined by processing carried out by the apparatus 10.
In the embodiment above, at step S20 the signature generator 19 detects faces in the image data using a standard technique. However, other techniques may be used to find faces, or the face location may be identified by the user, or the face location may be contained in other data 16. Also, an alternative signature generator could be used which does not need to know the face location.
In the embodiment above, Figure 4 shows a window defined by a circle centred on the mid-point of the face. However, other window shapes may be used, for example a square, a rectangle or an ellipse, as may other window sizes and other definitions of the mid-point.
In the embodiment above, at step S20 the signature generator 19 calculates a signature with 420 numbers using 3 colour histograms. However, other signatures could be used, with more or fewer numbers, by employing other or additional types of histogram and additional or alternative measurements calculable from the image data.
In the embodiment above, at step S40 the matcher 20 compares records using boolean (true or false) tests and a signature score based on squared differences. However, some or all of the boolean tests could be made into numerical tests: for example, instead of specifying an age range 20-30 the query could specify an age of 28 and then score records based on the actual age difference. Also, the signature score could be calculated by some other means, for example using the scalar product of the signature vectors.
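The two alternatives mentioned, a numerical age test in place of a boolean range test, and a scalar-product signature score, might look like the following sketch (function names and the linear age penalty are my own illustrative choices; the patent does not specify a scoring formula for the numerical test):

```python
import numpy as np

def age_score(target_age, record_age):
    """Numerical replacement for the boolean age-range test: the score
    falls off with the absolute age difference.  The linear penalty is
    an illustrative assumption."""
    return -abs(target_age - record_age)

def dot_product_score(sig_a, sig_b):
    """Alternative signature score using the scalar product of the
    signature vectors; higher means more similar."""
    return float(np.dot(np.asarray(sig_a, dtype=float),
                        np.asarray(sig_b, dtype=float)))
```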
In the embodiment above, processing is performed by a computer using processing routines defined by programming instructions. However, some or all of the processing could be performed using electronic or mechanical hardware.
Other changes and modifications can be made without departing from the spirit and scope of the invention.

Claims (8)

  1. Apparatus for processing image data to generate one or more metrics relating to one or more people contained within the image; further using these metrics to select a plurality of images from a database of images containing people.
  2. Apparatus according to claim 1, where the processing detects faces in the images and calculates the metrics based on the locations of detected faces.
  3. Apparatus according to any preceding claim, where the processing calculates the metrics using colour histograms.
  4. A method of processing image data to generate one or more metrics relating to one or more people contained within the image; further using these metrics to select a plurality of images from a database of images containing people.
  5. A method according to claim 4, where the processing detects faces in the images and calculates the metrics based on the locations of detected faces.
  6. A method according to claim 4 or 5, where the processing calculates the metrics using colour histograms.
  7. A storage device storing computer-usable instructions for causing a programmable processing apparatus to become operable to perform a method according to any of claims 4 to 6.
  8. A signal conveying computer-usable instructions for causing a programmable processing apparatus to become operable to perform a method according to any of claims 4 to 6.
GB0419886A 2004-09-07 2004-09-07 Image processing apparatus Withdrawn GB2417801A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0419886A GB2417801A (en) 2004-09-07 2004-09-07 Image processing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0419886A GB2417801A (en) 2004-09-07 2004-09-07 Image processing apparatus

Publications (2)

Publication Number Publication Date
GB0419886D0 GB0419886D0 (en) 2004-10-13
GB2417801A true GB2417801A (en) 2006-03-08

Family

ID=33186632

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0419886A Withdrawn GB2417801A (en) 2004-09-07 2004-09-07 Image processing apparatus

Country Status (1)

Country Link
GB (1) GB2417801A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998052119A1 (en) * 1997-05-16 1998-11-19 The Trustees Of Columbia University In The City Of New York Method and system for image retrieval
US6163622A (en) * 1997-12-18 2000-12-19 U.S. Philips Corporation Image retrieval system
EP1164506A2 (en) * 2000-05-22 2001-12-19 Eastman Kodak Company Determining sets of materials interesting for a user by analyzing images
US20020081026A1 (en) * 2000-11-07 2002-06-27 Rieko Izume Image retrieving apparatus


Also Published As

Publication number Publication date
GB0419886D0 (en) 2004-10-13

Similar Documents

Publication Publication Date Title
US9720936B2 (en) Biometric matching engine
Bala et al. Using learning to facilitate the evolution of features for recognizing visual concepts
US5644765A (en) Image retrieving method and apparatus that calculates characteristic amounts of data correlated with and identifying an image
JP2008262581A (en) System and method for determining image similarity
GB2402535A (en) Face recognition
RU2345414C1 (en) Method of creation of system of indexing for search of objects on digital images
US6950554B2 (en) Learning type image classification apparatus, method thereof and processing recording medium on which processing program is recorded
Lepsøy et al. Statistical modelling of outliers for fast visual search
WO2007129474A1 (en) Object recognition device, object recognition program, and image search service providing method
US11403875B2 (en) Processing method of learning face recognition by artificial intelligence module
CN108140107A (en) Quickly, high-precision large-scale fingerprint verification system
EP3958171A1 (en) Automatic method to determine the authenticity of a product
Ładniak et al. Search of visually similar microscopic rock images
KR20100108778A (en) Image information classification method and apparatus
JP2021174438A (en) Individual identification system, individual identification program, and recording medium
JP2004192555A (en) Information management method, device and program
GB2417801A (en) Image processing apparatus
JP2000082075A (en) Device and method for retrieving image by straight line and program recording medium thereof
JP2015187770A (en) Image recognition device, image recognition method, and program
JP6904619B1 (en) retrieval method
Ballary et al. Deep learning based facial attendance system using convolutional neural network
Anandababu et al. Structural similarity measurement with metaheuristic algorithm for content based image retrieval
JPH10171831A (en) Device for retrieving image file
WO2021220828A1 (en) Individual object identification system
Sharma et al. Image matching algorithm based on human perception

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)