APPARATUS AND METHOD FOR EVALUATING AN OBJECT
[0001] Evaluating the human perception of objects is a complex and labor-intensive process. For example, evaluating the human perception of a particular consumer product typically requires customer surveys, questionnaires, customer interviews, or the like.
 Perception can be inferred in many cases, such as for consumer products, by studying sales data to determine how well a product is selling and thereby inferring customer perception of the product. Perception of advertisements, for example, may also be inferred by studying changes in product sales data during an advertising campaign. However, measuring the effectiveness of a product or an advertisement in this way may take many weeks or months. Consequently, this delay significantly reduces opportunities for the campaign or product to be improved, modified, extended, re-enforced, or withdrawn, etc. based on the determined effectiveness.
 Examples and embodiments of the invention will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
 Figure 1 is a simplified block diagram illustrating an object evaluation system according to an example of the present invention;
 Figure 2 is a simplified flow diagram outlining example processing steps taken by an object evaluation system according to an example of the present invention;
 Figure 3 is a simplified block diagram illustrating an analysis module according to an example of the present invention;
 Figure 4 is a simplified flow diagram outlining an example method of operating an analysis module according to an example of the present invention;
 Figure 5 is a simplified flow diagram outlining an example method of operating an analysis module according to an example of the present invention; and
 Figure 6 is a simplified block diagram illustrating an example processing system on which examples of the present invention may be implemented.
 Referring now to Figure 1, there is shown an illustration of an object evaluation system 100 according to an example of the present invention. The system 100 comprises a stream 108 of video images generated, for example, by a video camera 106 placed in proximity to an object 102 to be evaluated. The video stream 108 is analyzed by an analysis module 110 which determines an evaluation for the object 102 based on the contents of the video stream 108.
[00011] In the present example the object 102 may be an advertisement or a product, although it will be appreciated that the object 102 is not limited thereto. In the case where the object 102 is an advertisement, it may be, for example, a static advertisement printed on a suitable support, a static advertisement displayed on an electronic display, such as on a television screen or computer monitor, a video advertisement, an audio advertisement, or the like. In the case where the object 102 is a product it may be, for example, a single product, a group of products, or the like.
 The video camera 106 is positioned such that it captures video images of people 104a to 104n in proximity to the object 102 and generates a video stream 108. In the illustration shown in Figure 1 the video camera 106 is placed behind the object 102 to capture video images of people moving in close proximity to the object 102. In other examples, however, the video camera 106 may be positioned in other ways, for example, to capture video images of people approaching or moving away from the object 102.
 The video camera 106 may be a single video camera, or may be a video camera of a multiple-camera video surveillance system.
 An outline method of operating the video stream analyzer 110 is now described with further reference to Figure 2.
 The video stream analyzer 110 receives (202) the video stream 108 comprising video images and processes video images in the video stream 108 to identify (204) human individuals from the video stream. Each individual may be identified, for example, through analysis of their facial or other anatomical features. For each individual identified the video stream analyzer 110 detects (206), through appropriate processing of the video stream 108, an emotion and emotion intensity level. In at least one example, the video stream analyzer analyzes video images showing faces of individuals, and determines an emotion and emotion level through analysis of facial features. The emotions detected may include, for example, happiness, sadness, surprise, fear, anger, and no emotion. An emotion level may, for instance, be attributed on a 1 to 10 scale, where 1 indicates a low level of emotion and 10 indicates a high level of emotion. A detected emotion may, for example, include a hybrid emotion. For example, an individual may be determined to be expressing a low level of surprise and a medium level of happiness. Any suitable emotion detection techniques may be used.
 The results of the emotion detection for each individual are used to calculate an object interest level. In one example, the emotion levels for each detected emotion are summed over a predetermined evaluation period and a global interest level is determined therefrom. For example, if 10 individuals are identified in the video stream their respective emotions and emotion levels may be recorded in a suitable memory, data store, log, data structure, etc., as shown in Table 1 below.
INDIVIDUAL IDENTIFIER   EMOTION     EMOTION LEVEL
1                       Happiness   7
2                       Surprise    5
3                       Happiness   6
4                       Anger       2
TABLE 1 - Example results of emotion detection
 Since the video stream 108 captures images of people in proximity to the object under evaluation 102 it can be inferred that the emotions detected from the video stream are the emotions expressed as a result of having seen or experienced the object 102.
 An overall object interest level may be calculated by, for example, averaging the emotion levels for each detected emotion and determining which emotion or emotions are the most prominent. In other examples, other suitable methods may be used.
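The averaging approach described above can be sketched in Python as follows. This is an illustrative sketch only; the record layout mirrors Table 1, and the function name and data are assumptions, not part of the described system.

```python
from collections import defaultdict

# Hypothetical records in the form of Table 1:
# (individual identifier, emotion, emotion level)
records = [
    (1, "happiness", 7),
    (2, "surprise", 5),
    (3, "happiness", 6),
    (4, "anger", 2),
]

def object_interest_level(records):
    """Average the emotion levels per detected emotion and return the
    most prominent emotion together with its average level."""
    levels_by_emotion = defaultdict(list)
    for _individual, emotion, level in records:
        levels_by_emotion[emotion].append(level)
    averages = {emotion: sum(levels) / len(levels)
                for emotion, levels in levels_by_emotion.items()}
    # The emotion with the highest average level is taken as most prominent
    prominent = max(averages, key=averages.get)
    return prominent, averages[prominent]

print(object_interest_level(records))  # ('happiness', 6.5)
```

With the Table 1 data, happiness is detected twice with levels 7 and 6, giving an average of 6.5, which exceeds the averages for surprise (5) and anger (2).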
 The video stream 108 is analyzed by the video stream analyzer for a predetermined period. In one example 24 hours of video stream may be analyzed, whereas in other examples a shorter or greater period of video stream may be analyzed.
 Figure 3 shows a video stream analyzer 110 in greater detail, according to an example of the present invention. An example method of operating the video stream analyzer 110 is further described below with reference to Figures 4 and 5.
[00021] In the present example, the video stream analyzer 110 comprises a video buffer 302, an identity analyzer 304, an emotion analyzer 306, a temporary emotion data store 308, an emotion data store 310, and an interest determination module 312.
 The video stream analyzer 110 receives (402) the video stream 108 in the video buffer 302. The video buffer 302 enables the video stream to be analyzed in other than real-time. In other examples, however, the video stream may be analyzed in real-time, in which case a smaller video buffer (or even no video buffer at all) may be required, depending on the processing capabilities of the video stream analyzer 110.
 The identity analyzer 304 analyzes (404) the video frames from the video stream stored in the video buffer 302 to identify individual people within the video stream. It will be appreciated that numerous facial recognition technologies and systems exist and may be used. For example, the identity analyzer 304 may use known image processing techniques to locate a face in an image and may use various facial feature detection techniques to locate a person's eyes, nose, mouth, ears, etc.
 The identity analyzer 304 computes a unique or substantially unique identifier for each person in the video stream. The unique identifier may be calculated based, for example, on the characteristics of each facial feature and the spatial relationship between them. In some examples a hash function may be used in the calculation of the unique identifier. It should be noted, however, that in the present example the purpose of the identity analyzer 304 is to uniquely or substantially uniquely differentiate between different people captured in the video stream 108, and not to attribute a real-life identity to each person detected.
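One way the hash-based identifier described above might be sketched is shown below. The feature names, the rounding tolerance, and the use of SHA-256 are all illustrative assumptions; any function that maps feature measurements to a stable pseudonymous identifier would serve.

```python
import hashlib

def individual_identifier(feature_vector, precision=1):
    """Derive a pseudonymous identifier from facial feature measurements.

    Rounding the measurements makes the hash tolerant of small
    measurement noise between frames. The identifier differentiates
    individuals without attributing a real-life identity to anyone.
    """
    # Sort by feature name so the identifier is independent of dict order
    canonical = ",".join(
        f"{name}={round(value, precision)}"
        for name, value in sorted(feature_vector.items())
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical measurements (e.g. in centimetres) for one detected face
features = {"eye_distance": 6.42, "nose_to_mouth": 3.11, "face_width": 14.78}
print(individual_identifier(features))
```

Note that small variations in the measurements (for example, 6.44 instead of 6.42 for the eye distance) fall away under rounding and yield the same identifier, while substantially different faces hash to different identifiers.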
 When an individual has been identified, the identity analyzer 304 passes the determined identifier to the emotion analyzer 306. In the present example the identity analyzer additionally identifies, along with the determined identifier, one or more video frames in which the individual has been identified. In the present example the reason for identifying video frames is to enable the emotion and emotion level of an individual to be determined during each video sequence in which the individual is identifiable. For example, an individual may be captured in the video stream when first coming into proximity to the object under evaluation. Initially, the individual may not notice the object and hence may only show a neutral emotion. The individual may then turn away from the object under evaluation and later turn back towards the object, this time noticing the object and showing a high level of happiness. Each of these sequences may be captured in different sequences of video frames. In at least one example the video frames are consecutive video frames.
 The identity analyzer 304 additionally determines and sends a set of image coordinates defining the area of the video image from which the individual was identified.
 The emotion analyzer 306 analyzes (406) the defined area of each of the identified video frames to determine a perceived emotion and emotion level for the individual in each of the identified video frames. The results for each video frame are recorded in a temporary emotion data store 308, along with the determined individual identifier, a video timestamp, a detected emotion, and a determined emotion level. In at least one example, the recorded video timestamp represents the actual time the video frame was taken.
 Since the facial expression of an individual may change during a sequence of video images, the emotions and emotion levels for the sequence of video images are recorded in the temporary emotion data store 308. Table 2 below shows example emotions and emotion levels of an individual who was detected in 5 sequential video frames.
TABLE 2 - Example determined emotions and emotion levels
 A representative emotion and emotion level is then determined to characterize the different emotions and emotion levels expressed by an individual during a video sequence. In the present example the representative emotion level is determined as being the highest emotion level for each detected emotion determined during a video sequence. Hence, in the current example for the identified video sequence the determined emotion is happiness and the determined emotion level is 8. In other examples the determined representative emotion level may be based on an average emotion level, a median emotion level, or any other suitable calculation. In other examples, if more than one emotion is determined for an individual during a video sequence, one or more representative emotions may be determined.
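The highest-level-per-emotion reduction described above can be sketched as follows. The per-frame observations are illustrative (the original Table 2 data is not reproduced here), chosen so that the happiness level peaks at 8 as in the example above.

```python
def representative_emotions(frame_observations):
    """Reduce per-frame (emotion, level) observations for one video
    sequence to a representative level per detected emotion, taken
    here as the highest level observed for that emotion."""
    best = {}
    for emotion, level in frame_observations:
        best[emotion] = max(level, best.get(emotion, 0))
    return best

# Hypothetical observations for one individual over 5 sequential frames
frames = [("neutral", 1), ("happiness", 4), ("happiness", 8),
          ("happiness", 6), ("surprise", 3)]
print(representative_emotions(frames))
# {'neutral': 1, 'happiness': 8, 'surprise': 3}
```

Swapping `max` for a mean or median over the per-emotion levels would give the alternative representative calculations mentioned above.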
 The determined representative emotion and emotion level, along with the individual identifier and timestamp are recorded or stored (408) in the main emotion data store 310. An example extract of the data stored in the main emotion data store 310 is shown below in Table 3.
TABLE 3 - Example extract from the Emotion Data Store
[00031] In the case where multiple emotions are detected by the emotion analyzer 306 during a video sequence, each detected emotion and corresponding determined emotion level are recorded (408) in the main emotion data store 310. For example, where the highest emotion levels are recorded in the emotion data store 310, only the highest determined emotion level for each different determined emotion is recorded.
 Once a suitable set of emotion data is stored in the emotion data store 310 the interest determination module 312 can process the data stored therein to calculate an object interest level, as described below with further reference to Figure 5.
 At 502 an evaluation period is defined. The evaluation period defines the period over which the object interest level is to be determined. For example, the evaluation period may be a period of 24 hours, may be a period corresponding to the opening hours of a shop from which the video stream 108 is received, or any other suitable period.
 At 504 the interest determination module 312 processes the data in the emotion data store 310 corresponding to the desired evaluation period to determine (506) a suitable object interest level. Once the object interest level has been determined it may, for example, be stored in a data store or log, published, or displayed on a suitable graphical user interface.
 In one example, the interest determination module 312 determines the most frequently determined emotion as being the most prominent emotion expressed during the evaluation period and averages the different emotion levels corresponding to that emotion.
 In another example, the interest determination module 312 calculates the object interest level based on an average of emotion levels for each different determined emotion.
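The frequency-based strategy described above might be sketched as follows; the record layout and data are illustrative only, mirroring the form of Table 1.

```python
from collections import Counter

def interest_by_frequency(records):
    """Take the most frequently detected emotion over the evaluation
    period as the most prominent, and average the levels recorded for
    that emotion (one possible interest-level strategy)."""
    counts = Counter(emotion for _ident, emotion, _level in records)
    prominent = counts.most_common(1)[0][0]
    levels = [level for _ident, emotion, level in records
              if emotion == prominent]
    return prominent, sum(levels) / len(levels)

records = [(1, "happiness", 7), (2, "surprise", 5),
           (3, "happiness", 6), (4, "anger", 2)]
print(interest_by_frequency(records))  # ('happiness', 6.5)
```

Here happiness is the only emotion detected more than once, so it is taken as most prominent with an average level of (7 + 6) / 2 = 6.5.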
 In a further example, one or more predetermined individual identifiers may be either not stored or, if stored, removed from the emotion data store 310, or may not be taken into account by the interest determination module 312. This may be useful, for example, to exclude personnel who work in proximity to the object under evaluation from the object interest level calculation. Such personnel are likely to appear in the video stream multiple times during the evaluation period and could introduce inaccuracies into the object interest level.
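A minimal sketch of this exclusion step is shown below; the identifiers and record layout are hypothetical, with the first field standing for the pseudonymous identifier computed by the identity analyzer.

```python
def exclude_individuals(records, excluded_ids):
    """Drop emotion records belonging to predetermined identifiers
    (e.g. in-store personnel) before the interest level is calculated."""
    return [record for record in records if record[0] not in excluded_ids]

# Hypothetical records: (individual identifier, emotion, emotion level)
records = [("a41f", "happiness", 7), ("9c2e", "surprise", 5),
           ("a41f", "happiness", 6), ("03bd", "anger", 2)]
staff_ids = {"a41f"}  # hypothetical identifier of a staff member
print(exclude_individuals(records, staff_ids))
```

The same filter could equally be applied on a demographic subset (for example, keeping only records flagged as female and adult) when such attributes are stored alongside the emotion data.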
 In a further example the identity analyzer 304 may detect additional characteristics of each individual, such as their sex, approximate age, whether an adult or child, etc. This information may be additionally stored in the emotion data store 310. The interest determination module 312 may, thereafter, calculate an object interest level based on only a subset of the data stored in the emotion data store 310. For example, the interest determination module 312 may calculate the object interest level only from individuals identified as being female and adult.
 In a still further example, only a single emotion data store may be provided.
 In a yet further example, the video stream analyzer may process more than one video stream. For example, in a shop environment one video stream may be captured facing away from the object under evaluation, and another video stream may be captured facing the object under evaluation. The video streams may, for example, be generated by custom video cameras, or may, if suitable, be video streams taken from an existing closed-circuit television (CCTV) system.
[00041] Turning now to Figure 6 there is shown a block diagram of a computer system 600 on which the analysis module 110 may be implemented in one example. For example, the analysis module 110 may be implemented by way of programming instructions, defining object evaluation software as described above, being stored on a non-transitory computer readable storage medium 604 or 606. The memory 604 and storage 606 are coupled to a processor 602, such as a microprocessor, through a communication bus 610. The instructions, when executed by the processor 602, provide the functionality of an object evaluation system as described above by executing the above-described method steps.
 It will be appreciated that embodiments of the present invention can be realized in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, devices or integrated circuits, or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are examples of machine-readable storage that are suitable for storing a program or programs that, when executed, implement examples of the present invention. Accordingly, examples provide a program comprising code for implementing a system or method as claimed in any preceding claim and a machine-readable storage storing such a program. Still further, examples of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
 All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
 Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.