GB2349761A - Apparatus for archiving still image data - Google Patents

Apparatus for archiving still image data

Info

Publication number
GB2349761A
GB2349761A GB9905158A GB9905158A GB2349761A GB 2349761 A GB2349761 A GB 2349761A GB 9905158 A GB9905158 A GB 9905158A GB 9905158 A GB9905158 A GB 9905158A GB 2349761 A GB2349761 A GB 2349761A
Authority
GB
United Kingdom
Prior art keywords
archive data
generating
image
data
archive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9905158A
Other versions
GB2349761B (en)
GB9905158D0 (en)
Inventor
Simon Micheal Rowe
Michael James Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to GB9905158A priority Critical patent/GB2349761B/en
Publication of GB9905158D0 publication Critical patent/GB9905158D0/en
Priority to US09/519,178 priority patent/US7139767B1/en
Publication of GB2349761A publication Critical patent/GB2349761A/en
Application granted granted Critical
Publication of GB2349761B publication Critical patent/GB2349761B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51Indexing; Data structures therefor; Storage structures

Abstract

In an image processing apparatus 2, input image data defining still images, such as photographs or discrete frames of video data, is stored in an archive database 50 as JPEG image data 502. Text data 520 is stored for each image, comprising text to be associated with the image. Viewing information 540 is also stored for each image, defining, for each person or animal in the image, the person, animal or object at which they are looking. Time information is also stored for each image, defining the time at which the image was recorded. The storage of the text data 520 and the viewing information 540 facilitates improved searching and retrieval of images from the archive database 50.

Description

IMAGE PROCESSING APPARATUS
The present invention relates to the field of information archiving and, more particularly, to the storage of image data for photographs or other still images.
Many databases exist for the storage of data such as image data. However, existing databases suffer from the problem that the ways in which the database can be interrogated to retrieve information are limited.
It is an object of the present invention to provide a database for the archiving of image data which facilitates improved information retrieval.
According to the present invention there is provided an apparatus or method in which image data is archived together with gaze information to facilitate information retrieval.
The present invention also provides an apparatus or method in which image data is stored together with information defining one or more subjects in the image and at what the subjects are looking.
The present invention further provides an apparatus or method for archiving image data, in which archive data is stored in association with the image data, the archive data defining one or more people or animals in the image and the person, animal or object at which they are looking.
Such a system facilitates searching of the stored information to identify an image using a query such as "find each image in which Simon is looking at Mike" or "find each image in which I am looking at mountains".
The present invention further provides an apparatus or method for generating archive data in such a system, and in addition, an apparatus or method for searching stored data in such a system.
The present invention further provides instructions, both in signal and recorded form, for configuring a programmable processing apparatus to become arranged as an apparatus, or to become operable to perform a method, in such a system.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a block diagram showing an example of notional functional components within a processing apparatus of an embodiment of the invention;
Figure 2 shows the processing operations performed to store information in the archive database;
Figure 3 shows the information displayed to a user at step S20 in Figure 2;
Figure 4 shows the processing operations performed at step S30 in Figure 2 to store the image data and information in the archive database;
Figure 5 schematically illustrates the storage of image data and information in the archive database;
Figure 6 shows the processing operations performed to retrieve information from the archive database;
Figure 7A shows the information displayed to a user at step S200 in Figure 6;
Figure 7B shows an example of information displayed to a user at step S220 in Figure 6; and
Figure 8 shows a second embodiment of the invention in which the functions of the first embodiment are performed by separate, interconnected apparatus.
Referring to Figure 1, an embodiment of the invention comprises a processing apparatus 2, such as a personal computer, user input devices 4, such as a keyboard, mouse etc., a display device 6, such as a conventional personal computer monitor, and a scanner 8 for scanning photographs to produce digital image data therefrom.
The processing apparatus 2 is programmed to operate in accordance with programming instructions input, for example, as data stored on a data storage medium, such as disk 10, and/or as a signal 12 input to the processing apparatus 2, for example from a remote database, over a datalink (not shown) such as the Internet, and/or entered by a user via a user input device 4.
The programming instructions comprise instructions to cause the processing apparatus 2 to become configured to store image data defining input images together with associated information provided by the user in a database, and to search the database to retrieve images in dependence upon search parameters input by the user.
When programmed by the programming instructions, processing apparatus 2 effectively becomes configured into a number of functional units for performing processing operations. Examples of such functional units and their interconnections are shown in Figure 1. The illustrated units and interconnections in Figure 1 are, however, notional and are shown for illustration purposes only to assist understanding; they do not necessarily represent the exact units and connections into which the processor, memory etc. of the processing apparatus becomes configured.
Referring to the functional units shown in Figure 1, central controller 20 processes inputs from the user input devices 4, and also provides control and processing for a number of the other functional units. Memory 22 is provided for use by central controller and other functional units.
Image data store 30 stores the image data representing the images input to the processing apparatus 2. This input image data is generated by scanning a photograph using scanner 8, or by downloading digital image data directly from a digital camera, from a database, or from a separate processing apparatus, etc. Such digital image data may be a photograph recorded with a digital camera or a frame of image data from a video camera.
Archive processor 40 in conjunction with central controller 20 stores image data from the image data store 30 in the archive database 50 together with information related to the image which is input by a user.
Text searcher 60, in conjunction with central controller 20, is used to search the archive database 50 to retrieve one or more images which meet search criteria specified by a user.
Image display processor 70 displays images from image data store 30 or archive database 50 on display device 6.
Output processor 80 outputs data from archive database 50, either on a storage device such as disk 90, or as a signal 92.
Figure 2 shows the processing operations performed to store image data and associated information in archive database 50.
Referring to Figure 2, at step S10, central controller 20 and image display processor 70 read image data stored in image data store 30 and display the next input image to the user on display device 6 (this being the first input image the first time step S10 is performed).
At step S20, central controller 20 causes a message to be displayed on display device 6 requesting the user to enter information about the displayed image. More particularly, in this embodiment, central controller 20 causes the screen shown in Figure 3 to be displayed to the user.
Referring to Figure 3, the user is requested to enter information 300 defining the approximate date that the image was recorded, and also information 310 comprising text to be stored in the archive database 50 in association with the image. The text may be, for example, a caption for the image, and/or factual information about when, where, how and why the image was recorded, and/or further information such as details about the subject matter of the image, etc.
In addition, the user is requested to enter information 320 defining each respective person or animal in the image and information 330 defining the person, animal or object at which each person or animal identified in information 320 is looking. This information will be used to facilitate better searching and retrieval of information from archive database 50, as will be explained below.
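Purely for illustration, the information gathered at step S20 could be held in a structure along the following lines. This is a minimal Python sketch; the field names and types are assumptions, since the embodiment defines the content of information 300, 310, 320 and 330 but not a concrete layout.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class ArchiveEntry:
        # One archived image plus the information entered at step S20.
        jpeg_data: bytes   # compressed image data 502
        recorded: str      # approximate recording date (information 300)
        text: str          # caption and factual text (information 310)
        # (subject, target) pairs built from information 320 and 330:
        # who or what each person or animal in the image is looking at.
        viewing: List[Tuple[str, str]] = field(default_factory=list)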
In response to the request for information at step S20, the user may type in the information using a user input device 4 such as a keyboard, or may enter the information using a conventional voice recognition processor, such as "Dragon Dictate" or IBM "ViaVoice".
Referring again to Figure 2, at step S30, central controller 20 and archive processor 40 perform processing to store the image data for the input image and the information entered by the user at step S20 in the archive database 50.
Figure 4 shows the processing operations performed by central controller 20 and archive processor 40 at step S30.
Referring to Figure 4, at step S100, the image data for the input image is stored in archive database 50.
Figure 5 schematically illustrates the storage of data in archive database 50. The storage structure shown in Figure 5 is notional and is provided for illustration purposes only to assist understanding; it does not necessarily represent the exact way in which data is stored in archive database 50.
Referring to Figure 5, archive database 50 stores time information represented by the horizontal axis 500, on which each unit represents a predetermined amount of time, for example one month. The image data stored at step S100 in Figure 4 is stored in archive database 50 in compressed form, this being as JPEG data 502 in this embodiment, together with timing information based on the information 300 entered by the user at step S20 (this timing information being represented in Figure 5 by the position of the image data along the horizontal axis 500).
In the example shown in Figure 5, image data for six images is stored, as indicated at 504, 506, 508, 510, 512 and 514.
Referring again to Figure 4, at step S110, the text information 310 entered by the user at step S20 is stored in the archive database 50 (indicated at 520 in Figure 5). More particularly, the text data is stored with a link to the corresponding image data, this link being represented in Figure 5 by the text data being in the same vertical column as the JPEG image data, that is, for example, text data 522 is linked to JPEG image data 504.
At step S120, each person or animal identified in the information 320 input by the user at step S20 is read, together with the person, animal or object at which they are looking defined in the corresponding information 330.
At step S130, a check is carried out to determine whether a unique reference number has already been stored in archive database 50 for each person, animal and object read at step S120. More particularly, referring to Figure 5, archive database 50 stores an identification table 530 for storing information defining people, animals and objects defined in the information 320 and 330, together with a respective unique identification number. Accordingly, at step S130, a search of table 530 is carried out to determine whether an entry already exists for each person, animal or object read at step S120.
If it is determined at step S130 that one or more of the people, animals or objects read at step S120 does not have a unique identification number, then, at step S140, a new entry is created in table 530 to list the person, animal or object and to assign a unique reference number thereto. Thus, a new entry is created for each person, animal or object for which no entry already exists in table 530. On the other hand, if it is determined at step S130 that an entry already exists in table 530 for each person, animal and object read at step S120, then step S140 is omitted.
At step S150, the viewing information entered by the user as information 320 and 330 at step S20 is stored in archive database 50 (indicated at 540 in Figure 5), together with a link to the associated text data 520 and JPEG image data 502 (this link being schematically represented in Figure 5 by the viewing information being in the same vertical column as the associated text data and JPEG image data; thus, for example, viewing information 542 is associated with text data 522 and JPEG image data 504).
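Taken together, steps S100 to S150 amount to the following, shown here as a loose Python sketch. The ArchiveEntry structure from the earlier sketch and an in-memory dictionary standing in for table 530 are assumptions; Figure 5 describes the storage only notionally.

    class ArchiveDatabase:
        # Notional stand-in for archive database 50.

        def __init__(self):
            self.entries = []    # image data 502 with linked text data 520
            self.id_table = {}   # table 530: name -> unique reference number

        def ref_number(self, name):
            # Steps S130 and S140: look the person, animal or object up in
            # table 530, creating a new entry with a fresh unique reference
            # number if none exists yet.
            if name not in self.id_table:
                self.id_table[name] = len(self.id_table) + 1
            return self.id_table[name]

        def store(self, entry):
            # Steps S100 and S110: store the JPEG data and the linked text.
            # Step S150: store the viewing information 540 as pairs of
            # unique reference numbers linked to the same entry.
            entry.viewing_ids = [(self.ref_number(s), self.ref_number(t))
                                 for s, t in entry.viewing]
            self.entries.append(entry)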
Referring again to Figure 2, at step S40, central controller 20 determines whether image data for another input image is stored in image data store 30. Steps S10 to S40 are repeated until each input image has been archived in archive database 50 as described above.
In the example shown in Figure 5, archive data for six images is stored. The data 550, 552 and 554 for three of the images is stored with time information indicating that these images were recorded between 1 November 1998 and 1 December 1998, while the data 556 for a fourth image is stored with time information indicating that it was recorded between 1 December 1998 and 1 January 1999, and the data 558 and 560 for the fifth and sixth images is stored with time information indicating that these images were recorded between 1 January 1999 and 1 February 1999.
The viewing information 540 for images 550 and 552 defines, in conjunction with the data stored in table 530, that Simon is looking at Mike in these images.
Similarly, the viewing information 540 for image 554 defines that Alex is looking at a cat in the image. The viewing information 540 for image 556 indicates that Allan is looking at the Tower of London and also that Alex is looking at the Tower of London. The viewing information 540 for image 558 defines that Simon is looking at Allan and also that Allan is looking at the Tower of London. Similarly, the viewing information 540 for image 560 defines that Mike is looking at a newspaper in the image.
Figure 6 shows the processing operations performed by central controller 20, text searcher 60 and image display processor 70 to search the archive database 50 to identify each photograph which meets the search specification entered by a user and to display the identified photographs to the user on display device 6.
Referring to Figure 6, at step S200, central controller 20 causes a message to be displayed on display device 6 requesting the user to enter information defining the search of archive database 50 that is required. More particularly, in this embodiment, central controller 20 causes the display shown in Figure 7A to appear on display device 6.
Referring to Figure 7A, the user is requested to enter information defining the image or images that he wishes to find in the archive database 50. More particularly, in this embodiment, the user is requested to enter information 700 defining a person or animal present in the image(s) to be found, information 710 defining the person, animal or object at which the person or animal identified in information 700 is looking in the image, and information 720 defining one or more key words which were present in the text information 310 entered by the user at step S20 (Figure 2). In addition, the user is able to enter time information defining a portion or portions of the database over which the search is to be carried out. More particularly, the user can enter information 730 defining a date beyond which the search should be discontinued (that is, the period before the specified date will be searched), information 740 defining a date after which the search should be carried out, and information 750 and 760 defining a start date and end date respectively between which the search is to be carried out.
In this embodiment, the user is not required to enter all of the information 700, 710 and 720 for one search, and instead may omit one or two pieces of this information.
If the user enters all of the information 700, 710 and 720, then the search will be carried out to identify each image in the archive database 50 in which the person or animal identified in information 700 is looking at the person, animal or object identified in information 710 and the key words defined in information 720 are associated with the image in text data 520. On the other hand, if information 720 is omitted, then a search is carried out to identify each image in which the person or animal identified in information 700 is looking at the person, animal or object identified in information 710, irrespective of the text which is associated with the image. If the information 710 is omitted, then a search is carried out to identify each image in which the person or animal identified in information 700 is present and the key words defined in information 720 are present in the text data 520 associated with the image. If the information 700 is omitted, then a search is carried out to identify each image in which the person, animal or object identified in information 710 is present and the key words defined in information 720 are present in the text data 520 associated with the image. If information 710 and 720 are omitted, then a search is carried out to identify each image in which the person or animal identified in information 700 is present, irrespective of the person, animal or object at which they are looking and irrespective of the text data 520 associated with the image. If the information 700 and 720 are omitted, then a search is carried out to identify each image in which the person, animal or object defined in information 710 is present. Similarly, if information 700 and 710 are omitted, then a search is carried out to identify each image for which the associated text data 520 contains the key words defined in information 720.
In addition, the user may enter all of the time information 730, 740, 750 and 760 or may omit one or more pieces of this information.
Once the user has entered all of the required information to define the search, he begins the search by clicking on area 770 using a user input device 4, such as a mouse.
Referring again to Figure 6, at step S210, the search information entered by the user is read by central controller 20 and the instructed search is carried out. More particularly, in this embodiment, central controller 20 converts any person, animal or object identified in information 700 or 710 to the corresponding unique reference number using table 530, searches the viewing information 540 to identify each image satisfying the requirements specified in information 700 and 710, and searches the text data 520 to identify which image or images of those identified on the basis of the viewing information 540 have the key words defined in information 720 associated therewith. If any time information has been entered by the user, then these searches are restricted to the dates defined by those time limits.
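In outline, the search of step S210, including the way omitted criteria relax the match as described above, might look as follows. This is a sketch against the notional ArchiveDatabase above; treating the time limits as directly comparable date strings is an assumption made to keep the example short.

    def search(db, person=None, looking_at=None, keywords=None,
               before=None, after=None):
        # Every argument is optional, mirroring information 700, 710, 720
        # and the time limits: omitted criteria are simply not applied.
        pid = db.id_table.get(person) if person else None
        tid = db.id_table.get(looking_at) if looking_at else None
        if (person and pid is None) or (looking_at and tid is None):
            return []  # a name absent from table 530 can match nothing
        results = []
        for entry in db.entries:
            if before and not (entry.recorded < before):
                continue
            if after and not (entry.recorded > after):
                continue
            # Viewing information 540: (subject, target) reference pairs.
            if pid and tid and (pid, tid) not in entry.viewing_ids:
                continue
            if pid and not tid and not any(pid in pair for pair in entry.viewing_ids):
                continue
            if tid and not pid and not any(tid in pair for pair in entry.viewing_ids):
                continue
            # Text data 520: every key word must appear in the linked text.
            if keywords and not all(k.lower() in entry.text.lower() for k in keywords):
                continue
            results.append(entry)
        return results

With the example contents of Figure 5, search(db, person="Allan", looking_at="Tower of London") would return the entries for images 556 and 558, matching the example query given with Figure 7B below.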
At step S220, central controller 20 displays a list of the relevant images identified during the search to the user on display device 6. More particularly, central controller 20 displays information such as that shown in Figure 7B to the user.
Referring to Figure 7B, a list is displayed of each image which satisfies the search parameters, together with information defining the time at which each image was recorded (this being the information 300 input by the user at step S20). The user is then able to select one of the images to be displayed on display device 6 by clicking on the required image in the list using a user input device 4, such as a mouse. In the example shown in Figure 7B, two images are listed as having been found in the search. By way of example, if the user had entered "Allan" as information 700 and "Tower of London" as information 710 to define the search, but had not entered information 720, then the two images 556 and 558 would be identified as meeting these search criteria.
At step S230, central controller 20 reads the selection made by the user at step S220, and image display processor 70 displays the selected image on the user display device 6.
Various modifications and changes can be made to the above embodiment.
For example, in the embodiment above, the text information 310 is manually entered by a user (that is, by typing or by speaking through a voice recognition processor). However, the text information may instead be input by using an optical character recognition processor to convert written text to digital data. In particular, this method may be used to archive material containing both photographs and words, such as magazines, brochures, catalogues etc., such that the input image data is generated using scanner 8 and the input text data is generated using an optical character recognition processor.
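A minimal sketch of that alternative, using the Tesseract engine via the pytesseract package (one possible optical character recognition processor; the embodiment does not name one, and the file name is hypothetical):

    import pytesseract   # requires a local Tesseract installation
    from PIL import Image

    # Scan the page with scanner 8, then recover its printed words so
    # they can be stored as text data 520 alongside the image data.
    page = Image.open("scanned_page.png")
    text_310 = pytesseract.image_to_string(page)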
In the embodiment above, at step S20 (Figure 2), the information 300 defining the time at which the input image was recorded is entered manually. However, some cameras print the recording date on the photograph and, in such cases, processing may be performed by processing apparatus 2 to read this date from the input image data.
In addition, the recording date may be incorporated in other ways as part of the input image data and read by processing apparatus 2.
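One common way a recording date is carried inside digital image data is EXIF metadata. The following sketch reads it with Pillow; the choice of library and the file name are illustrative assumptions, not part of the embodiment:

    from PIL import Image

    image = Image.open("photo.jpg")
    exif = image.getexif()
    recorded = exif.get(306)  # tag 306: DateTime in the primary IFD
    # DateTimeOriginal (tag 36867) lives in the Exif sub-IFD (pointer 0x8769);
    # prefer it when present.
    recorded = exif.get_ifd(0x8769).get(36867, recorded)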
In the embodiment above, at step S20 (Figure 2), the information 320 defining a subject person or animal is entered manually. However, image processing may instead be carried out by processing apparatus 2 to perform image identification to identify each person or animal type in the image automatically. If one or more people or animals cannot be identified using such techniques, then the user may be requested to input information 320 manually.
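As a loose sketch of such automatic identification, the open-source face_recognition library could be used for the people in an image (an assumed substitute technique, not the embodiment's method; the reference photographs and file names are hypothetical):

    import face_recognition

    # Known subjects: name -> face encoding computed from a reference photo.
    known = {
        "Simon": face_recognition.face_encodings(
            face_recognition.load_image_file("simon_reference.jpg"))[0],
        "Mike": face_recognition.face_encodings(
            face_recognition.load_image_file("mike_reference.jpg"))[0],
    }

    image = face_recognition.load_image_file("input_image.jpg")
    names_320 = []
    for encoding in face_recognition.face_encodings(image):
        for name, reference in known.items():
            # compare_faces returns one boolean per reference encoding.
            if face_recognition.compare_faces([reference], encoding)[0]:
                names_320.append(name)
    # Any face left unidentified would be referred to the user, as above.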
In the embodiment above, processing apparatus 2 includes functional components for receiving and generating data to be archived (for example, central controller 20, image data store 30 and archive processor 40), functional components for storing the archive data (for example, archive database 50), and also functional components for searching the database and retrieving information therefrom (for example, central controller 20 and text searcher 60). However, these functional components may instead be provided in separate apparatus. For example, one or more apparatus for generating data to be archived, and one or more apparatus for database searching, may be connected to one or more databases via a network, such as the Internet. Figure 8 illustrates an example configuration of such separate apparatus, comprising data input apparatus 800, a database 810, and database searching apparatus 820.
Other modifications and changes are, of course, possible without departing from the spirit and scope of the invention.
The contents of the applicant's co-pending applications filed concurrently herewith (attorney reference numbers:
2635601, 2643401, 2643601, 2643901, 2644001, 2644101,
2744201 and 2644601) are hereby incorporated by reference.

Claims (67)

1. Apparatus for archiving still image data, comprising: means for receiving image data defining an input image; means for generating first archive data defining a person or animal in the input image; means for generating second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking; and a database for storing the first and second archive data such that it is associated with image data defining the input image.
2. Apparatus according to claim 1, further comprising means for generating third archive data defining text, and wherein the database is arranged to store the third archive data such that it is associated with the first and second archive data and the stored image data defining the input image.
3. Apparatus according to claim 2, wherein the means for generating third archive data comprises means for performing optical character recognition.
4. Apparatus according to claim 2, wherein the means for generating third archive data comprises means operable to allow a user to input the text information manually.
5. Apparatus according to any preceding claim, further comprising means for generating fourth archive data defining information about when the input image was recorded, and wherein the database is arranged to store the fourth archive data such that it is associated with the stored image data defining the input image and related archive data.
6. Apparatus according to claim 5, wherein the means for generating fourth archive data comprises means for performing processing to read a recording date from the received image data.
7. Apparatus according to claim 5, wherein the means for generating fourth archive data comprises means operable to allow a user to input manually the information about when the input image was recorded.
8. Apparatus according to any preceding claim, wherein the means for generating first archive data comprises means for performing image recognition processing.
9. Apparatus according to any of claims 1 to 7, wherein the means for generating first archive data comprises means operable to allow a user to input the first archive data manually.
10. Apparatus according to any preceding claim, wherein the means for generating second archive data comprises means operable to allow a user to input the second archive data manually.
11. Apparatus according to any preceding claim, further comprising search means for searching data stored in the database in accordance with search instructions to identify each stored image which meets criteria defined in the search instructions.
12. Apparatus according to claim 11, wherein the search means is operable to search data stored in the database in accordance with search criteria relating to the second archive data.
13. Apparatus according to claim 12, wherein the search means is operable to search data stored in the database in accordance with search criteria relating to the first and second archive data.
14. Apparatus according to any of claims 11 to 13, wherein the search means is operable to search data stored in the database in accordance with search criteria relating to any one or more of the first, second, third or fourth archive data.
15. Apparatus according to any preceding claim, further comprising display means for displaying an image to a user.
16. Apparatus for generating data for archiving still image data in a database, comprising: means for receiving image data defining an input image; means for generating first archive data defining a person or animal in the input image; and means for generating second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking.
17. Apparatus according to claim 16, further comprising means for generating third archive data defining text.
18. Apparatus according to claim 17, wherein the means for generating third archive data comprises means for performing optical character recognition.
19. Apparatus according to claim 17, wherein the means for generating third archive data comprises means operable to allow a user to input the text information manually.
20. Apparatus according to any of claims 16 to 19, further comprising means for generating fourth archive data defining information about when the input image was recorded.
21. Apparatus according to claim 20, wherein the means for generating fourth archive data comprises means for performing processing to read a recording date from the received image data.
22. Apparatus according to claim 20, wherein the means for generating fourth archive data comprises means operable to allow a user to input manually the information about when the input image was recorded.
23. Apparatus according to any of claims 16 to 22, wherein the means for generating first archive data comprises means for performing image recognition processing.
24. Apparatus according to any of claims 16 to 22, wherein the means for generating first archive data comprises means operable to allow a user to input the first archive data manually.
25. Apparatus according to any of claims 16 to 24, wherein the means for generating second archive data comprises means operable to allow a user to input the second archive data manually.
26. Apparatus according to any of claims 16 to 25, further comprising display means for displaying an image to a user.
27. Apparatus for searching data stored in a database comprising image data defining a plurality of still images and, for each of the still images, first archive data defining a person or animal in the input image and second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking, the apparatus comprising search means for searching the data in the database in accordance with search criteria relating to the second archive data to identify each stored image which meets the criteria.
28. Apparatus according to claim 27, wherein the search means is operable to search the data in the database in accordance with search criteria relating to the first and second archive data.
29. Apparatus according to claim 27 or claim 28, wherein the search means is further operable to search data stored in the database in accordance with search criteria relating to third archive data, the third archive data defining text.
30. Apparatus according to any of claims 27 to 29, wherein the search means is further operable to search data stored in the database in accordance with search criteria relating to fourth archive data, the fourth archive data defining information about when an image was recorded.
31. A method of archiving still image data in a computer database, comprising the steps of: receiving image data defining an input image; generating first archive data defining a person or animal in the input image; generating second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking; and storing the first and second archive data in the database such that it is associated with image data defining the input image.
32. A method according to claim 31, further comprising the steps of generating third archive data defining text, and storing the third archive data in the database such that it is associated with the first and second archive data and the stored image data defining the input image.
33. A method according to claim 32, wherein the step of generating third archive data comprises performing optical character recognition.
34. A method according to claim 32, wherein the step of generating third archive data comprises storing text information manually input by a user.
35. A method according to any of claims 31 to 34, further comprising the step of generating fourth archive data defining information about when the input image was recorded, and storing the fourth archive data in the database such that it is associated with image data defining the input image and related archive data.
36. A method according to claim 35, wherein the step of generating fourth archive data comprises performing processing to read a recording date from the received image data.
37. A method according to claim 35, wherein the step of generating fourth archive data comprises storing information manually input by a user about when the input image was recorded.
38. A method according to any of claims 31 to 37, wherein the step of generating first archive data comprises performing image recognition processing.
39. A method according to any of claims 31 to 37, wherein the step of generating first archive data comprises storing first archive data manually input by a user.
40. A method according to any of claims 31 to 39, wherein the step of generating second archive data comprises storing second archive data manually input by a user.
41. A method according to any of claims 31 to 40, further comprising a search step of searching data stored in the database in accordance with search instructions to identify each stored image which meets criteria defined in the search instructions.
42. A method according to claim 41, wherein, in the search step, data stored in the database is searched in accordance with search criteria relating to the second archive data.
43. A method according to claim 42, wherein, in the search step, data stored in the database is searched in accordance with search criteria relating to the first and second archive data.
44. A method according to any of claims 41 to 43, wherein, in the search step, data stored in the database is searched in accordance with search criteria relating to any one or more of the first, second, third or fourth archive data.
45. A method according to any of claims 31 to 44, further comprising the step of generating a signal conveying the database storing at least one input image and the archive data.
46. A method according to claim 45, further comprising the step of recording the signal either directly or indirectly to generate a recording thereof.
47. A method of generating data for archiving still image data in a database, comprising: receiving image data defining an input image; generating first archive data defining a person or animal in the input image; and generating second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking.
48. A method according to claim 47, further comprising a step of generating third archive data defining text.
49. A method according to claim 48, wherein the step of generating third archive data comprises performing optical character recognition.
50. A method according to claim 48, wherein the step of generating third archive data comprises storing text information manually input by a user.
51. A method according to any of claims 47 to 50, further comprising a step of generating fourth archive data defining information about when the input image was recorded.
52. A method according to claim 51, wherein the step of generating fourth archive data comprises performing processing to read a recording date from the received image data.
53. A method according to claim 51, wherein the step of generating fourth archive data comprises storing information manually input by a user about when the input image was recorded.
54. A method according to any of claims 47 to 53, wherein the step of generating first archive data comprises performing image recognition processing.
55. A method according to any of claims 47 to 53, wherein the step of generating first archive data comprises storing first archive data manually input by a user.
56. A method according to any of claims 47 to 55, wherein the step of generating second archive data comprises storing second archive data manually input by a user.
57. A method according to any of claims 47 to 56, further comprising a step of generating a signal conveying the first and second archive data.
58. A method according to claim 57, further comprising a step of recording the signal either directly or indirectly to generate a recording thereof.
59. A method of searching data stored in a database comprising image data defining a plurality of still images and, for each of the still images, first archive data defining a person or animal in the input image and second archive data defining a person, animal or object at which the person or animal defined in the first archive data is looking, the method comprising searching the data in the database in accordance with search criteria relating to the second archive data to identify each stored image which meets the criteria.
60. A method according to claim 59, wherein the data in the database is searched in accordance with search criteria relating to the first and second archive data.
61. A method according to claim 59 or claim 60, wherein the data stored in the database is searched in accordance with search criteria relating to third archive data, the third archive data defining text.
62. A method according to any of claims 59 to 61, wherein the data stored in the database is searched in accordance with search criteria relating to fourth archive data, the fourth archive data defining information about when an image was recorded.
63. A method according to any of claims 59 to 62, further comprising the step of displaying an image identified in the search.
64. A storage device storing computer-useable instructions for causing a programmable processing apparatus to become configured as an apparatus as set out in any of claims 1 to 30.
65. A storage device storing computer-useable instructions for causing a programmable processing apparatus to become operable to perform a method as set out in any of claims 31 to 63.
66. A signal conveying computer-useable instructions for causing a programmable processing apparatus to become configured as an apparatus as set out in any of claims 1 to 30.
67. A signal conveying computer-useable instructions for causing a programmable processing apparatus to become operable to perform a method as set out in any of claims 31 to 63.
GB9905158A 1999-03-05 1999-03-05 Image processing apparatus Expired - Fee Related GB2349761B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB9905158A GB2349761B (en) 1999-03-05 1999-03-05 Image processing apparatus
US09/519,178 US7139767B1 (en) 1999-03-05 2000-03-06 Image processing apparatus and database

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9905158A GB2349761B (en) 1999-03-05 1999-03-05 Image processing apparatus

Publications (3)

Publication Number Publication Date
GB9905158D0 GB9905158D0 (en) 1999-04-28
GB2349761A true GB2349761A (en) 2000-11-08
GB2349761B GB2349761B (en) 2003-06-11

Family

ID=10849086

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9905158A Expired - Fee Related GB2349761B (en) 1999-03-05 1999-03-05 Image processing apparatus

Country Status (1)

Country Link
GB (1) GB2349761B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2376145A (en) * 2001-04-13 2002-12-04 Hewlett Packard Co Viewing and retrieving images over a network
GB2403305A (en) * 2003-06-25 2004-12-29 Canon Kk Image and date information processing
GB2403304A (en) * 2003-06-25 2004-12-29 Canon Kk Image and date information processing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499146A (en) * 1994-05-24 1996-03-12 Texas Instruments Incorporated Method and apparatus for recording images for a virtual reality system
US5819286A (en) * 1995-12-11 1998-10-06 Industrial Technology Research Institute Video database indexing and query method and system
WO1999065223A2 (en) * 1998-06-12 1999-12-16 Anivision, Inc. Method and apparatus for generating virtual views of sporting events

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499146A (en) * 1994-05-24 1996-03-12 Texas Instruments Incorporated Method and apparatus for recording images for a virtual reality system
US5819286A (en) * 1995-12-11 1998-10-06 Industrial Technology Research Institute Video database indexing and query method and system
WO1999065223A2 (en) * 1998-06-12 1999-12-16 Anivision, Inc. Method and apparatus for generating virtual views of sporting events

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
http://garuda.imag.fr/MPEG4/syssite/syspub/version1/index *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2376145A (en) * 2001-04-13 2002-12-04 Hewlett Packard Co Viewing and retrieving images over a network
GB2403305A (en) * 2003-06-25 2004-12-29 Canon Kk Image and date information processing
GB2403304A (en) * 2003-06-25 2004-12-29 Canon Kk Image and date information processing

Also Published As

Publication number Publication date
GB2349761B (en) 2003-06-11
GB9905158D0 (en) 1999-04-28

Similar Documents

Publication Publication Date Title
EP0555027B1 (en) Information processing apparatus and method utilising useful additional information packet
US6499016B1 (en) Automatically storing and presenting digital images using a speech-based command language
US7533129B2 (en) Method software program for creating an image product having predefined criteria
DE60112212T2 (en) AGENT FOR INTEGRATED COMMENTING AND RECALLING PICTURES
US5625771A (en) Method for making cursor form
CN101546588B (en) Image processing apparatus and method
JPH0695629A (en) Automated system and method for acquisition, control and playback for presentation
JPH11250071A (en) Image database constructing method, image database device and image information storage medium
JPH05128166A (en) Electronic image filing device
JP2000276484A (en) Device and method for image retrieval and image display device
JP2002049907A (en) Device and method for preparing digital album
GB2349761A (en) Apparatus for archiving still image data
US20060100976A1 (en) Method for searching image files
JP5342509B2 (en) CONTENT REPRODUCTION DEVICE, CONTENT REPRODUCTION DEVICE CONTROL METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM
JPS60114971A (en) Document information processing unit with memorandum
JP2004214759A (en) Method and apparatus of classifying image, and program
JP2000067079A (en) Method for retrieving book and device therefor and record medium recorded with book retrieval program
JP3334949B2 (en) Image processing apparatus and method
JPH10334121A (en) Retrieval attribute giving method and retrieving method for image, image processor, and recording medium where retrieval attribute giving program and retrieving program are recorded
JPH0962709A (en) Device and method for image retrieval
JPH09319755A (en) Pet retrieving system
JPS5820992Y2 (en) Kanji document search device
JPH0215374A (en) Image information retrieving device
JPS62248375A (en) Picture processor
JPH0544059B2 (en)

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20150305