AU2012232990A1 - Image selection based on correspondence of multiple photo paths - Google Patents

Image selection based on correspondence of multiple photo paths

Info

Publication number
AU2012232990A1
Authority
AU
Australia
Prior art keywords
images
image
path
paths
sets
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2012232990A
Inventor
Julie Ray Kowald
Mark Ronald Tainsh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2012232990A priority Critical patent/AU2012232990A1/en
Publication of AU2012232990A1 publication Critical patent/AU2012232990A1/en


Abstract

IMAGE SELECTION BASED ON CORRESPONDENCE OF MULTIPLE PHOTO PATHS

Disclosed is a method (800) of identifying at least two image sets from a plurality of image sets, each image set having been captured by a different image capture device. The method determines, for each of the plurality of image sets (810), using location information (814) associated with the images of the image set, a geographic path (816) travelled by at least an image capture device used to capture the images of the image set. Path correspondence measures are calculated (820) to define the correspondence between at least two of the determined geographic paths. The method then identifies (830) at least two image sets from the plurality of image sets, the at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures.

[Fig. 7, reproduced with the abstract, is a flowchart of the photo processing method 700 (step 640): all photos from the event are made available for processing; photo information is made consistent; high-level filtering of photos; then, under the label "Select preferred photos from the photo set", photos are rated based on correlation of photographer paths, the photo path based rating is combined with other ratings, and final photo selection is based on ratings and other factors; an album or photo product (740) is created with the preferred photos; End (799).]

Description

S&F Ref: P044904

AUSTRALIA
PATENTS ACT 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan
Actual Inventor(s): Julie Ray Kowald, Mark Ronald Tainsh
Address for Service: Spruson & Ferguson, St Martins Tower, Level 35, 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Image selection based on correspondence of multiple photo paths

The following statement is a full description of this invention, including the best method of performing it known to me/us:

IMAGE SELECTION BASED ON CORRESPONDENCE OF MULTIPLE PHOTO PATHS

Technical Field

[0002] The current invention relates to image selection and, in particular, to the use of information related to the capture of the images to assist in the selection.

Background

[0003] A motivation to capture photographs at events such as weddings, parties and team sporting events, amongst many others, includes reliving the memories and sharing the experience with other people.

[0004] To effectively share the experience with other people requires a user to select the images that best represent the event from a collective point of view. One individual's point of view may not be appreciated by another individual.

[0005] People who attend events such as those mentioned above can take photographs using Wi-Fi cameras, smart cameras and smart phones. Such devices enable the user to share the photos with others directly from those devices to repositories operated by, or accessible to, the other people. Cloud repositories and shared network drives enable the photographs to be stored and provide access control methods for sharing. Applications such as image browsers, for example as executable on mobile devices, can then be used to provide the connectivity and interfaces to the shared networks and cloud repositories. Some applications provide for automatic upload directly from the capture device to storage repositories with photo stream presentation layers, while other applications require the user to authorise the upload to selected presentation services. A photo stream is set up by a single user and typically given a name associated with some form of access control, such as private, public or specific group membership, for the photographic content. Multiple photographers who have access to these services can contribute to shared "photo streams".

[0006] Photographs in the photo stream can be shared using a variety of presentation methods including photo albums, slide shows, albums on social networks, galleries, and emails.

[0007] Remote or co-located event participants or guest users can view the event photographs using viewer services such as image browsers while the event is still taking place or later, after the event has completed. The images can be arranged in photo streams, printed or digital photo books, slide shows and single or montage portraits. Viewers can annotate individual photographs in the photo stream or collections of photographs with their own comments or simply mark those using popular tags, such as "favourite", "like", "dislike" or "love".

[0008] One problem, where multiple photographers contribute to the set, is that the photograph set often becomes much larger and less usable than a photograph set captured by a single photographer. Multiple photographers typically take photographs of the same scenes when significant moments or sub-events occur at relatively the same time.
Duplication of scenes can occur, which often results in a large photograph set that contains large clusters of similar looking photographs. The large cumulative photograph set can be difficult to browse. When looking at images in time order, the user can be slowed down by large clusters of similar looking photographs taken at similar times. The user can become disorientated and not find the images they want. Some photographers may also contribute photographs not related, or not strongly related, to the event, thereby making the set larger and more difficult to browse. Finding the photographs in the set based on sorting can become a time consuming activity for the user. Some tools that help a user find images in the set include using the publisher's social network, which can provide context about those social groups to which the images relate, facial recognition, and location tracking methods such as GPS, which provide the viewer with geographic context about where the photograph was captured.

[0009] Ranking of photographs is used in social networking and photograph hosting services to promote popular photographs and photographers, and to help sort and filter the large number of photographs. Photograph ranking methods include algorithms that consider user interaction, such as the number of times an image has been viewed, the number of comments the image has, and the number of "likes", "favourites" or "dislikes" users have assigned. This can help the user to find images that are popular with other users.

[0010] There are several problems with the above approaches. These include the ranking being an individual point of view, and that ranking algorithms are performed on only the images that have been seen and tagged by users. Many images may not have been viewed and tagged by users and thus can become lost in the larger image set. In this case, users are presented again with the problem of searching through large sets, typically sorted by metadata such as time, photographer, and location, to find images that are of interest.
[0011] Photo presentation software such as photo book creation tools use various algorithms to automatically select images from image collections. A goal of such algorithms is to select the images that best represent the event in the photo book. This is a difficult problem. Some images in the set may be representative of the event while others may be of unrelated side events that have less relevance to the collective audience.

Summary

[0012] It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.

[0013] The arrangements presently disclosed operate by deriving popularity of image subjects by using coinciding photo paths, such as photographer paths and subject paths, thereby providing users with a useful image filtering method.

[0014] According to a first aspect of the present disclosure, there is provided a method of identifying at least two image sets from a plurality of image sets, each said image set being captured by a different image capture device, said method including the steps of: determining, for each of the plurality of image sets, using location information associated with the images of the image set, a geographic path travelled by at least an image capture device used to capture the images of the image set; calculating path correspondence measures defining the correspondence between at least two of the determined geographic paths; and identifying at least two image sets from the plurality of image sets, said at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures.

[0015] Desirably the identifying of the at least two image sets comprises prioritising the images in the at least two image sets over images of any other image sets. Alternatively the identifying of the at least two image sets comprises ordering the images in the selected image sets based on the associated determined geographic paths. Preferably the determination of the paths is further based on time of capture information, and the calculation of the correspondence between the paths is further based on the time of capture of the images.

[0016] The method may further comprise determining correspondence between the geographic paths based on at least one subject captured in the images. Desirably this may involve identifying the subjects of the images based on at least one of: (i) shot orientation metadata of the captured images, (ii) imaging metadata of the captured images, and (iii) image content analysis. Preferably the imaging metadata comprises at least one of a frame size, a lens focal length, an aperture setting, and an angle of view of the image capture device by which the corresponding set of images is captured.

[0017] Preferably the calculating of the path correspondence measures comprises: (a) calculating a correspondence measure of a first pair of paths comprising a first geographic path relative to a second geographic path, by assessing a match between each image of the first path against each image of the second path and forming a match value (1040) for the first path; (b) repeating step (a) for at least one further pair of paths; and (c) correlating the match values to form a correspondence measure for the pairs of paths.
Desirably (i) step (b) comprises repeating step (a) by assessing a match between each image of the second path against each image of the first path and forming a match value (1040) for the second path, and step (c) provides a first correlation value of the first path with respect to the second path and a second correlation value of the second path with respect to the first path; and (ii) the method comprises repeating step (i) for each pair of paths associated with the plurality of sets of images to form a correlation value between each path and each other path.

[0018] Advantageously the assessing comprises determining those images of the second path captured within a threshold time period of the image of the first path; determining those images of the second path captured within a threshold distance from the image of the first path; and incrementing the match value based on those images so determined. Preferably the determining of images of the second path captured within a threshold time period of the image of the first path forms a first subset of images, the determining of those images of the second path captured within a threshold distance from the image of the first path comprises determining those images of the first subset that were captured within a threshold distance from the image of the first path, and the match value is incremented only for those images satisfying both criteria.

[0019] According to another aspect of the present disclosure there is provided a method of selecting representative images from a plurality of image sets, each said image set being captured by a different image capture device, said method comprising: identifying those images and the corresponding image sets that are associated with a particular event; performing the method as described above to identify at least two of said plurality of image sets having the highest corresponding geographic paths; ordering the images of the plurality of image sets based upon images associated with the identified at least two image sets; and selecting representative images for the event from the ordered images.
Preferably the identifying of the images comprises identifying metadata associated with images captured by different cameras operated by a single photographer, and modifying the identified metadata so that all such images are treated as a single set of images.

[0020] According to another aspect of the present disclosure, there is provided an apparatus for implementing any one of the aforementioned methods.

[0021] According to another aspect of the present disclosure there is provided a computer program product including a computer readable medium having recorded thereon a computer program for implementing any one of the methods described above.

[0022] Other aspects of the invention are also disclosed.

Brief Description of the Drawings

[0023] At least one embodiment of the present invention will now be described with reference to the drawings and appendices, in which:

[0024] Fig. 1A schematically illustrates the context of the presently disclosed arrangements, with images selected from those captured at an event by multiple photographers being filtered for presentation;

[0025] Fig. 1B shows a set of images captured by three photographers (P1, P2 and P3) at the event of Fig. 1A;

[0026] Fig. 2 depicts the location of the photographs of Fig. 1B in a spatial layout;

[0027] Fig. 3 illustrates the order in which the photographers took the images of Fig. 1B and the scenes associated therewith;

[0028] Fig. 4 illustrates the spatial-temporal correlation of two of the photographers of the event of Fig. 1A;

[0029] Fig. 5 is an illustration of a page layout application that utilises the spatial-temporal correlation of the images to produce a usable narrative of the event of Fig. 1A;

[0030] Fig. 6 is a flow diagram showing the capturing of photos at an event by multiple photographers;

[0031] Fig. 7 is a flow diagram showing the high level steps in processing the captured images of Fig. 6 to create an album or photo product;

[0032] Fig. 8 is a flow diagram showing the steps in using correspondence of the photo paths to rate images;

[0033] Fig. 9 is a diagram of paths captured by photographers at an event to demonstrate aspects relevant to determining path distances;

[0034] Fig. 10 is a flow diagram showing the steps in determining the correspondence strength between two paths;

[0035] Fig. 11 is a flow diagram showing the steps in using correspondence of the subject paths to rate images;

[0036] Fig. 12 is a diagram showing a photo path that relates to subject location;

[0037] Fig. 13 is a diagram showing photo paths for three photographers that relate to subject location;

[0038] Fig. 14 is a diagram showing two potential photographic subjects and the camera settings that relate to subject identification;

[0039] Fig. 15 is a diagram showing two potential photographic subjects and the camera settings, including an alternate field of view, that relate to subject identification;

[0040] Fig. 16 is a schematic block diagram representation of a system including the capture devices, photograph shared storage and album presentation repository and display; and

[0041] Figs. 17A and 17B form a schematic block diagram of a general purpose computer system upon which the arrangements described can be practised.

Detailed Description including Best Mode

[0042] Fig. 1A shows an image processing arrangement 100 in which images, captured by cameras 120 operated by multiple photographers P1 - P4 at an event 110, are collected at a storage device 130.
A processing device 140 operates to filter, from a large set of candidate images, a smaller selection of images to be used for various purposes. In this example, the smaller selection of images is depicted as displayed in an image browser such as a gallery 150, and also a possibly different selection of images being presented in a photo book 160. There are many ways in which the selected images can be presented, such as slide shows and posters. The processing that is implemented within the processing device 140 supports the selection process when images are captured by multiple photographers at the event 110 using the cameras 120, and preferably provides for input associated with image ratings. The processing device 140 uses the image ratings as one input when selecting images, with higher rated images having precedence over lower rated images. In Fig. 1A the location of the storage device 130 and the processing device 140 can be distributed. A typical configuration would have the storage device 130 and processing device 140 located within the cloud, with the photographers uploading the images from the cameras 120 to the image storage 130.

[0043] Notwithstanding the typical implementation mentioned above, Figs. 17A and 17B depict a general-purpose computer system 1700, upon which the various arrangements described can also be practised.

[0044] As seen in Fig. 17A, the computer system 1700 includes: a computer module 1701; input devices such as a keyboard 1702, a mouse pointer device 1703, a scanner 1726, a camera 1727, representative of one of the cameras 120, and a microphone 1780; and output devices including a printer 1715, a display device 1714 and loudspeakers 1717. An external Modulator-Demodulator (Modem) transceiver device 1716 may be used by the computer module 1701 for communicating to and from a communications network 1720 via a connection 1721. The communications network 1720 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 1721 is a telephone line, the modem 1716 may be a traditional "dial-up" modem. Alternatively, where the connection 1721 is a high capacity (e.g., cable) connection, the modem 1716 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 1720. The computer system 1700 may be a stand-alone device with which the cameras 120 (1727) may couple, for example substantially simultaneously in real time via a Wi-Fi connection, or physically via wired connections for post-capture downloading of images to the computer module 1701. The computer system 1700 may alternatively be considered as part of the aforementioned "cloud", where the cameras 120 (1727) couple to the networks 1720, 1722 via other means, such as wireless (cellular) telephone networks and the like.
[0045] The computer module 1701 typically includes at least one processor unit 1705, and a memory unit 1706. For example, the memory unit 1706 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 1701 also includes a number of input/output (I/O) interfaces including: an audio-video interface 1707 that couples to the video display 1714, loudspeakers 1717 and microphone 1780; an I/O interface 1713 that couples to the keyboard 1702, mouse 1703, scanner 1726, camera 1727 and optionally a joystick or other human interface device (not illustrated); and an interface 1708 for the external modem 1716 and printer 1715. In some implementations, the modem 1716 may be incorporated within the computer module 1701, for example within the interface 1708. The computer module 1701 also has a local network interface 1711, which permits coupling of the computer system 1700 via a connection 1723 to a local-area communications network 1722, known as a Local Area Network (LAN). As illustrated in Fig. 17A, the local communications network 1722 may also couple to the wide network 1720 via a connection 1724, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 1711 may comprise an Ethernet™ circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practised for the interface 1711.

[0046] The I/O interfaces 1708 and 1713 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 1709 are provided and typically include a hard disk drive (HDD) 1710. The HDD 1710 and/or the memory 1706 may operate as the store 130 discussed above. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 1712 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 1700.

[0047] The components 1705 to 1713 of the computer module 1701 typically communicate via an interconnected bus 1704 and in a manner that results in a conventional mode of operation of the computer system 1700 known to those in the relevant art. For example, the processor 1705 is coupled to the system bus 1704 using a connection 1718. Likewise, the memory 1706 and optical disk drive 1712 are coupled to the system bus 1704 by connections 1719. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or like computer systems.
[0048] The methods of image selection processing to be described may be implemented using the computer system 1700, wherein the processes of Figs. 1B to 15 may be implemented as one or more software application programs 1733 executable within the computer system 1700. In particular, the steps of the method of image selection are effected by instructions 1731 (see Fig. 17B) in the software 1733 that are carried out within the computer system 1700. The software instructions 1731 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the image selection methods and a second part and the corresponding code modules manage a user interface between the first part and the user.

[0049] The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 1700 from the computer readable medium, and then executed by the computer system 1700. A computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product. The use of the computer program product in the computer system 1700 preferably effects an advantageous apparatus for image selection from multiple sources, being the cameras 120 (1727).

[0050] The software 1733 is typically stored in the HDD 1710 or the memory 1706. The software is loaded into the computer system 1700 from a computer readable medium, and executed by the computer system 1700. Thus, for example, the software 1733 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 1725 that is read by the optical disk drive 1712. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 1700 preferably effects an apparatus for image selection.

[0051] In some instances, the application programs 1733 may be supplied to the user encoded on one or more CD-ROMs 1725 and read via the corresponding drive 1712, or alternatively may be read by the user from the networks 1720 or 1722. Still further, the software can also be loaded into the computer system 1700 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 1700 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 1701. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 1701 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
[0052] The second part of the application programs 1733 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 1714. Through manipulation of typically the keyboard 1702 and the mouse 1703, a user of the computer system 1700 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 1717 and user voice commands input via the microphone 1780.

[0053] Fig. 17B is a detailed schematic block diagram of the processor 1705 and a memory 1734. The memory 1734 represents a logical aggregation of all the memory modules (including the HDD 1709 and semiconductor memory 1706) that can be accessed by the computer module 1701 in Fig. 17A.

[0054] When the computer module 1701 is initially powered up, a power-on self-test (POST) program 1750 executes. The POST program 1750 is typically stored in a ROM 1749 of the semiconductor memory 1706 of Fig. 17A. A hardware device such as the ROM 1749 storing software is sometimes referred to as firmware. The POST program 1750 examines hardware within the computer module 1701 to ensure proper functioning and typically checks the processor 1705, the memory 1734 (1709, 1706), and a basic input-output systems software (BIOS) module 1751, also typically stored in the ROM 1749, for correct operation. Once the POST program 1750 has run successfully, the BIOS 1751 activates the hard disk drive 1710 of Fig. 17A. Activation of the hard disk drive 1710 causes a bootstrap loader program 1752 that is resident on the hard disk drive 1710 to execute via the processor 1705. This loads an operating system 1753 into the RAM memory 1706, upon which the operating system 1753 commences operation. The operating system 1753 is a system level application, executable by the processor 1705, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.

[0055] The operating system 1753 manages the memory 1734 (1709, 1706) to ensure that each process or application running on the computer module 1701 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 1700 of Fig. 17A must be used properly so that each process can run effectively. Accordingly, the aggregated memory 1734 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 1700 and how such is used.

[0056] As shown in Fig. 17B, the processor 1705 includes a number of functional modules including a control unit 1739, an arithmetic logic unit (ALU) 1740, and a local or internal memory 1748, sometimes called a cache memory. The cache memory 1748 typically includes a number of storage registers 1744 - 1746 in a register section. One or more internal busses 1741 functionally interconnect these functional modules. The processor 1705 typically also has one or more interfaces 1742 for communicating with external devices via the system bus 1704, using a connection 1718.
The memory 1734 is coupled to the bus 1704 using a connection 1719.

[0057] The application program 1733 includes a sequence of instructions 1731 that may include conditional branch and loop instructions. The program 1733 may also include data 1732 which is used in execution of the program 1733. The instructions 1731 and the data 1732 are stored in memory locations 1728, 1729, 1730 and 1735, 1736, 1737, respectively. Depending upon the relative size of the instructions 1731 and the memory locations 1728-1730, a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 1730. Alternately, an instruction may be segmented into a number of parts each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 1728 and 1729.

[0058] In general, the processor 1705 is given a set of instructions which are executed therein. The processor 1705 waits for a subsequent input, to which the processor 1705 reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 1702, 1703, data received from an external source across one of the networks 1720, 1722, data retrieved from one of the storage devices 1706, 1709 or data retrieved from a storage medium 1725 inserted into the corresponding reader 1712, all depicted in Fig. 17A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 1734.

[0059] The disclosed image selection arrangements use input variables 1754, which are stored in the memory 1734 in corresponding memory locations 1755, 1756, 1757. The image selection arrangements produce output variables 1761, which are stored in the memory 1734 in corresponding memory locations 1762, 1763, 1764. Intermediate variables 1758 may be stored in memory locations 1759, 1760, 1766 and 1767.

[0060] Referring to the processor 1705 of Fig. 17B, the registers 1744, 1745, 1746, the arithmetic logic unit (ALU) 1740, and the control unit 1739 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 1733. Each fetch, decode, and execute cycle comprises: (i) a fetch operation, which fetches or reads an instruction 1731 from a memory location 1728, 1729, 1730; (ii) a decode operation in which the control unit 1739 determines which instruction has been fetched; and (iii) an execute operation in which the control unit 1739 and/or the ALU 1740 execute the instruction.

[0061] Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 1739 stores or writes a value to a memory location 1732.

[0062] Each step or sub-process in the processes of Figs. 1B to 15 may be associated with one or more segments of the program 1733 and is performed by the register section 1744, 1745, 1747, the ALU 1740, and the control unit 1739 in the processor 1705 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 1733.

[0063] One or more components of the image selection methods may in some implementations be performed in dedicated hardware such as one or more integrated circuits.
Such dedicated hardware may include graphics processors, digital signal processors, or one or more microprocessors and associated memories.
[0064] An event, such as the event 110, is typically defined in terms of the captured images by a boundary formed by the location(s) and time(s) of image capture, and the people who attend or participate and thus form the subject of image capture. This boundary may be established or specified by metadata associated with each captured image.

[0065] Event participants and attendees may physically or remotely attend the event, or view evidence of the event after the event has occurred.

[0066] Event purposes include social gatherings, such as weddings and birthdays, special events such as product launches or exhibitions, and social photography where photographers get together to take photographs of interesting locations such as city landmarks.

[0067] Events hold significance to the attendees and are often recorded in photographs for posterity. The event photographs can be stored unedited using shared repositories and further edited into collections such as albums and photo merchandise.

[0068] Many event attendees are likely to take photographs of the primary subjects of the event as these hold the most significance to the purpose of the event. Primary subjects include, for example, the bride and groom arriving at the wedding reception, the bride and groom cutting the cake, the speeches and the bridal waltz. Photographers are likely to have coincident paths relating to the capture of the primary subject(s) of the event.

[0069] An event will generally also include "secondary subjects" and "unrelated subjects". Secondary subjects are less frequently photographed compared to primary subjects and may only be photographed by relatively few of the event photographers. Examples of secondary subjects include scenes at a wedding such as the flowers on the altar, church pews or the mother of the bride. Unrelated events are those that occur and are photographed but are not related to the main theme of the event. Examples of unrelated events at a wedding might include opportunistic shots of scenes at the location of the event, of subjects such as people, sunsets or landscapes, or erroneous photographs (of the dance floor for example) that do not specifically tell the story of the wedding.

[0070] Most attendees of the event who are taking photographs tend to commonly take photos of the primary subjects.
[0071] An event is typically organised into temporal and spatial sub-events where the primary subjects will potentially move between different locations over time. As an event attendee takes photographs of the subjects, a temporal-spatial travel path is formed. The event attendee's travel path is called the photographer path. The travel path of the subject is called the subject path. A photo path can be derived from the photographer path or the subject path, or both.

[0072] During the course of the event, many photographs will be captured that include a mix of primary, secondary and unrelated subjects and will be available for event users to view.

[0073] A photo path will form for every event attendee that shares two or more photographs. The photo path can be used to filter the collection of photographs. The photo path enables a user to find images using the points of view of the photographer. Often more important than using individual photo paths to filter the photograph set, however, is a filtering of the set according to a path that is most significant in conveying the event story to the collective group of photographers. Thus a collective path of the event attendees may be determined for use in filtering the collective set of images of the event. The photo path may be a photographer path or a subject path.

[0074] In a preferred implementation, images that are most significant to the purpose of the event are identified by correlating the photographers' paths for strongly parallel capture times and locations. This method is used to rank photographs in the set based on capture popularity.

[0075] In another implementation, a subject path is formed using correlated capture paths and camera trajectories to derive subject locations and times. A trajectory can be formed from the photographer's camera to the subject. The trajectory can be used to identify the subject location, with the time of capture being available in the metadata. The subject's travel path can be formed using this combined metadata. The subject path provides a more accurate representation of the primary subjects' actual geographic location and travel path than the photographer path.

[0076] At an event where many photographers are taking photographs in a shared collection, the photograph image set can become large and difficult to use. As the photographs are a record of the event, it is useful to know which photographs best tell the story that represents the event. One way of accessing the representative photographs is to identify the behaviour of the photographers during the event and look for common behaviour. Photographer behaviour can be identified from observing the photographer's travel path, using capture location and time to identify a common subject interest. A measure of shared interest in a photograph can be identified from points at which many photographers' paths, such as the paths 906 and 909 seen in Fig. 9, correlate by time and location. The photographs that have the highest level of shared interest are considered to best represent the event. Images that best represent the event may be presented in some form, such as an album, which may be displayed in many forms including a video, a slideshow, or a digital or printed album. The subject path, such as the path 1200 seen in Fig. 12, tells more about the relevance of the photographer's path, or validates and identifies it.
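To make the notion of a photo path concrete, the following is a minimal sketch of one possible in-memory representation, in which a path is simply one contributor's images held in capture-time order. The class, field and function names are illustrative assumptions and do not appear in the specification.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PhotoRecord:
    """Capture metadata for one image (illustrative fields only)."""
    photographer_id: str   # identifies the contributing capture device
    capture_time: float    # seconds relative to the start of the event
    latitude: float        # capture (or derived subject) latitude, degrees
    longitude: float       # capture (or derived subject) longitude, degrees

def photo_path(photos: List[PhotoRecord]) -> List[PhotoRecord]:
    """A photo path: one photographer's images in time order."""
    return sorted(photos, key=lambda p: p.capture_time)
```

Under this representation, whether the latitude/longitude pair records the camera position (photographer path) or the estimated subject position (subject path) is the only difference between the two path types.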
[0077] To illustrate the image processing and selection, this description makes use of the example of a wedding where guests share the photographs they take during the event. Figs. 1 to 6 show a use case that illustrates the formation of photo paths at a wedding event. Fig. 1B shows a set 1000 of images 1001 captured by three photographers (P1, P2 and P3) at the same wedding event, for example each using a corresponding camera 1727, such as seen in Fig. 17A. The set 1000 consists of 13 images from a wedding reception event and is simplified to explain the use case of the collective set of images captured by wedding guests of the event. At a typical wedding event there may be hundreds or thousands of images in the set. The problem this presents is how to select the images that will best represent the event in an album summary. In Fig. 1B, photographs 1.1.1 - 1.1.3 are three photographs that show the arrival of the bride and groom at an entrance of the reception centre, captured by two photographers (P1) and (P2). Photographs 1.2.1 - 1.2.2 show two photographs of the speeches made for the bride and groom, captured by two photographers (P1) and (P2). Photographs 1.3.1 - 1.3.3 are three photographs of the cutting of the cake ceremony by the bride and groom. Photograph 1.4.1 is one photograph of a group of wedding guests by one photographer (P3). Photographs 1.5.1 - 1.5.3 are three photographs of the bridal waltz captured by three photographers (P1), (P2) and (P3). Lastly, photograph 1.6.1 is one photograph of two bridesmaids captured by one photographer (P3).

[0078] Fig. 2 shows a spatial arrangement of locations indicating both (i) where the photographer was when they captured the photograph, and (ii) the location of the subject when they were photographed. The photographs generally include the arrival 201 of the bride and groom at the beginning of the event, and a photograph 206 of two bridesmaids leaving at the end of the event, these being shown at the location of the entrance of the reception centre. Photographs 202 captured during the speeches and photographs 204 of wedding guests talking after the cutting of the cake are shown at the location of the reception centre dining room. Photographs of the bride and groom cutting the cake 203 are shown at the location of the reception centre stage. Photographs taken of the bridal waltz 205 are shown at the location of the dance floor. The location of the photographs tells us something about the context of the event; however there are some photographs, such as the photographs 206 and 204 in the spatial arrangement, that are temporally out of context. For example, photographs 204 and 206 are photographs of secondary subjects in relation to the photo paths of the wedding event. Photograph 204 captures wedding guests socialising and photograph 206 captures two bridesmaids leaving the event at the end of the night.

[0079] Fig. 3 shows the time order in which the photographs were captured by the three photographers. Photographs are annotated with numbers starting from 0 to show the commencement of the time order. Photographs with the same number were photographed at substantially the same time, for example within a predetermined time window, which may be set to be, say, 5 seconds, 30 seconds, or 2 minutes, for example. For example, with respect to the cutting of the cake at 8:00 pm, being "Scene 3", the three images are not identical, nor could all have been captured simultaneously, due to the different physical dispositions of the bride and groom.
These, however, may all have been captured within a time window of 2 minutes, which can be considered to be "at the same time" for the purposes of this description as it applies to a wedding scenario. The same may apply to "Scene 5", the bridal waltz as seen in Fig. 3. By contrast, for a sporting event, "at the same time" may be predetermined to be within a time window of, say, 5 seconds or shorter. Fig. 4 shows the photographs with time order and spatial data combined. Photographer 1 and Photographer 2 follow photograph capture paths 401 and 402 respectively, which are strongly correlated in time order and spatial location, indicating that the context of their respective captured photographs is the same. Photographer 3 follows photograph capture path 403, which is correlated in spatial layout; however the time order of Photographer 3 is not correlated, thereby indicating the photographs of Photographer 3 are not correlated with those of Photographers 1 and 2.

[0080] Fig. 5 shows those photographs selected to best represent the event presented in a photo album 500. In this example, the photograph selection was based on the correlated photographer paths from Photographer 1 and Photographer 2. The photographs of Photographer 3 are filtered out as they are not correlated with those on the photo paths of Photographers 1 and 2. The photographs in the album are presented in time order. Photographs 501 show the arrival of the bride and groom and were captured in the location of the entrance of the reception centre at approximately 6:00 pm, at the start of the event. Photographs 502 show the speeches and toasts and were captured in the reception centre dining room at approximately 7:00 pm during the event. Photographs 504 show the bridal waltz and were captured near the dance floor around 10:00 pm.

[0081] Fig. 6 is a flowchart of a method 600 that describes the steps that occur in preparation for the processing of images. The method 600 will be described with reference to the arrangements of Figs. 1A and 17A, but also with respect to the generic system configuration 1600 of Fig. 16. The method 600 commences with a capture step 610 where multiple photographers capture photos at the event of interest.

[0082] Cameras 120, 1601, 1727 and other digital photo capturing devices, such as mobile telephone handsets, tablet devices etc., are capable of storing metadata together with the captured images. The metadata can include time and date of capture, as well as location information. The location information will commonly be GPS co-ordinates recorded by the capturing device via a satellite or satellite-aware positioning device. In addition, altitude information may be recorded. GPS commonly has limitations when capture devices are indoors or in other locations where the satellite signals necessary for GPS calculation cannot be obtained. In these indoor situations, alternate mechanisms are available for determining location. In particular, Indoor Positioning Systems (IPS) have been developed by companies such as Google Inc., Nokia Corporation and the chip manufacturer, Broadcom Corporation. Such IPS can, for example, operate using wireless technologies over existing or introduced equipment and methods such as "Time of Arrival" and "Received Signal Strength Indication". These developments mean that individual image capturing devices may be made capable of recording locations in both outdoor and indoor areas.
Alternate or additional systems involve the use of image content recognition and device settings to set the location of the photo capture device at the time of capture. For example, landmarks and even indoor features may be recognised, and this can be augmented with information such as zoom settings to position the capture device at a location at the time of capture. These processes of obtaining the time and location information of image capture are performed during step 620. Additional information may be stored with the metadata that can support processing the image to determine the location of the image subject. This information may include orientation parameters recorded by the imaging device 120, 1601, 1727, as well as additional information such as zoom settings and focus points.

[0083] Captured images can be shared directly from the capture devices 120, 1601, 1727. Step 630 represents a process of collecting images for the purpose of creation of a photo product. Particularly, in the example step 630, photos captured by multiple photographers at the event are made available for processing, for selected ones of those photos to make an album or photo product. Cloud services and shared network drives 1602 of Fig. 16 can be used to enable access to the images and content, with mobile image capture devices providing the connectivity and interfaces to the shared networks and cloud services. Some services include automatic upload from photo streams, while others require the user to authorise the upload to selected presentation services. Alternately, the images captured at the event can be manually transferred to a common device, such as the computer module 1701, using a physical cable or by loading memory devices, or by a combination of the previously described methods. In this fashion, captured images are made available in a shared computing environment, such as the networks 1720, 1722 including the computer 1701. The images may then be processed in step 640. The processing of step 640 is preferably performed according to the method 700 of Fig. 7.

[0084] Fig. 7 depicts a preferred method 700 for processing images captured by multiple photographers. Initially, step 710 provides that all captured photos from the particular event are made available for processing. This may involve the shared service, for example executing a program on the computer 1701, identifying all photos having generally associated time and location information. For example, for the wedding event of Fig. 1A, the photos of Fig. 1B may have been captured on date 2012:08:31 between times 19:00:00 and 23:59:00 and at GPS locations 35.12345xxS and 151.12345yyE, whereas other photos stored on or accessible to the computer 1701 may be for events that occurred perhaps on the same date, 2012:08:31, but between 12:15:00 and 14:30:00 at GPS locations 35.456767xS and 151.23456yE (a lunch-time picnic for example).

[0085] Step 720 then operates to ensure that the images associated with the particular event (i.e. the wedding, as opposed to the picnic) have consistent time/date and location information. This can be used to compensate for incorrect time settings on the capture devices, for example where the time setting on one device may have wandered compared to another device. In some implementations, the capture devices can also automatically synchronise their date/time settings, negating the requirement for this step. A method for correcting time information is to synchronise to shared photographic events using image content processing.
A cutting of the cake image at a wedding, for example, could be recognised and used for time synchronisation. The cutting of the cake will occur within a time period that has some duration, and photographers will capture photographs within this duration range. For example, both devices may capture the cutting of the cake, such as images 1.3.2 (P2) and 1.3.3 (P3), which occur over a period of, say, 60 seconds, whereas basic time settings on the cameras may differ by, say, 5 minutes, thereby providing that images associated with at least one of the cameras could benefit from capture times being adjusted by a period of between 4 and 5 minutes. Step 720 may involve execution of an algorithm that uses the most highly correlated images and an average of their time of capture to determine a time point for synchronisation.

[0086] Once the images are made consistent, an image selection step 730, later expanded on by steps 732, 734, 736 and 738, takes place. Step 730 is a process of assigning priorities for the images of the event. The priorities can be used to select the images most suitable for inclusion in the photo product that is to be constructed in step 740. The step 740 of creating the album or image product 1603, 1604, 500 may be automated, with the images that make up the product being determined by the processing of step 740. Alternately, step 740 may provide a set of the most suitable candidate images from which a user can further filter out images, leaving those that are to be included in the photo album or image product. The album or image product may include, but is not necessarily restricted to, printed photo albums, other printed photo merchandise, image galleries uploaded to social network sites, photos posted to social network sites and photographic network sites.

[0087] Steps 732-738, also seen in Fig. 7, expand the process of step 730. Step 732 provides for high level filtering of images where desired, for example to remove unsuitable images. Optionally, images of low quality, such as blurry or out of focus images, can be removed at this point by applying a quality rating algorithm that identifies low quality images, including those that are not well composed or are out of focus, and applies a ranking value that potentially excludes a particular image from the set of images for the event. Also, to avoid the inclusion of images that are visually similar, an image similarity algorithm can optionally be used to identify and remove repeated images. The result of step 732 is a possibly reduced set of images meeting basic quality and uniqueness criteria which are available for selection into the photo album or image product.

[0088] Step 734, discussed later with reference to Fig. 8, then operates to select the images to be included in the photo album or image product. In step 734, the paths of photographers are determined and the correspondence of the paths is used to obtain a rating factor. That rating factor is then combined with other ratings in step 736 to obtain a final image rating. Methods for combining ratings include selection of the highest rating amongst those being combined, and averaging the ratings of those that are being combined. Other sources of image ratings include explicit ratings by either photographers or viewers. For example, a rating may be derived from a viewer's consideration of an image via a social media database of the images.
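The two combination strategies named for step 736, taking the highest rating or the average, might be sketched as follows. This is a minimal illustration assuming all ratings have been normalised to a common scale; the function name and signature are assumptions, not part of the specification.

```python
from typing import List

def combine_ratings(path_rating: float, other_ratings: List[float],
                    method: str = "max") -> float:
    """Combine the photo-path based rating (step 734) with other ratings,
    e.g. explicit photographer or viewer ratings, as in step 736."""
    ratings = [path_rating] + other_ratings
    if method == "max":
        return max(ratings)                 # highest rating amongst those combined
    if method == "average":
        return sum(ratings) / len(ratings)  # average of the ratings combined
    raise ValueError("unknown combination method: " + method)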
A final image selection step 738 operates to select the photos for inclusion in the album or image product based on the ratings and on other factors. The other factors can include balancing images, thereby ensuring representative coverage of the event, and having representative images of important event occurrences. Included in this could be, for example, ensuring there is an image of the cake cutting at a wedding. The factors can include a simple numerical maximum or other number to give a desired or targeted photo album size. The rating factor is desirably associated with at least two photo paths, and the images associated with those two photo paths can be used to order or rank the universe of all images from all photographers for the particular event, to thereby characterise those images that are considered to be more representative of the event based upon photo path. The final selection of representative images for inclusion in the photo album may then be based upon the ordering.

[0089] Fig. 8 describes in more detail a preferred method 800 of using photo path correspondence to rate images, being step 734. In an initial step 810, the images captured by each photographer are used to create a photo path for that photographer. Examples of photo paths are shown in Fig. 9. Steps 812, 814 and 816 as seen in Fig. 8 show a preferred expansion of step 810. Step 812 operates to obtain a capture time for each photo in the set of photos pertaining to the particular photographer. Step 814 then obtains a capture location for each of those photos. These details are typically extracted from metadata or modified metadata of each photo. Step 816 then operates to correlate the photos of the particular photographer into a "path". Path data can be represented by pairs of time and location data. The form of the path can be simply straight line connections between the data pairs or, in some implementations, a path interpolated from the data pairs. With a path so formed, data pairs associated with other images captured by other photographers can be correlated with the particular photographer path. An alternative implementation can also include altitude information in the path, so that events taking place, for example, on multiple floor levels within a building, can be correctly processed.

[0090] After the paths are determined for each of the photographers, the correspondences between the different paths are determined in step 820. These correspondences provide for the identification of at least two image sets, associated with corresponding photo paths, that preferably correlate better than image sets of other photo paths. Step 820 is discussed in more detail below, particularly in relation to Fig. 10.
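Under the illustrative PhotoRecord representation sketched earlier, steps 810 - 816 amount to grouping the shared collection by photographer and ordering each group into time and location data pairs. The following is a non-authoritative sketch of that expansion; the helper name and the Path alias are assumptions.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# A path is a time-ordered list of (capture_time, latitude, longitude) pairs.
Path = List[Tuple[float, float, float]]

def build_photographer_paths(photos: List["PhotoRecord"]) -> Dict[str, Path]:
    """Sketch of steps 810-816: obtain a capture time (812) and capture
    location (814) for each photo, then correlate each photographer's
    photos into a path of time/location data pairs (816)."""
    by_photographer = defaultdict(list)
    for p in photos:
        by_photographer[p.photographer_id].append(p)
    return {
        pid: [(p.capture_time, p.latitude, p.longitude)
              for p in sorted(group, key=lambda q: q.capture_time)]
        for pid, group in by_photographer.items()
    }
```

Straight-line connection or interpolation between the data pairs, as mentioned in [0089], would be a presentation concern layered on top of this raw sequence.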
[0091] Step 830 then operates, where two or more of the photo paths correlate highly as determined in step 820, to rate or prioritise images from those highly correlated photo paths over, or higher than, photos from other photo paths for the event. Step 830 can also operate to order the images in the selected image sets based on the associated determined geographic paths. The net result of step 830, and hence step 734, is typically a limited or reduced set of photos from the event that are ordered or prioritised based on correlation with the best correlated photo paths. The photos in the reduced set remain associated with the corresponding photo path of the capturing photographer, but nevertheless may correlate well (or perhaps poorly) with other photos and/or other photo paths.

[0092] Photos that belong to paths that correlate highly are given a high rating. Fig. 9 shows example photo paths for four photographers. The path of photographer 1 is shown as path 906, with shots captured by photographer 1 along the path shown using the "+" symbol 918. The time of capture relative to a starting time (0) is shown for each of the photographers as a number 930. Similarly, the path for photographer 2 is shown as the path 909, with the capture positions for photographer 2 shown using the symbol "o" 921. A path 912 for photographer 3 has capture positions identified by the symbol "x" 924. A path 915 for photographer 4 has capture positions indicated by a square symbol 927. Arrows 903 show the direction of each path, which can also be determined by the time order of photo capture along the paths. The paths 906, 909, 912, 915 are shown as smooth curves interpolating between the corresponding capture positions. This use of smoothed interpolated curves is for descriptive purposes and is not considered when determining path correspondence according to step 820. A high level of correspondence is a measure of the closeness of two photographers' capture points in both location and time. In Fig. 9, the path 915 is seen to have low correspondence with the other three paths because the capture locations on the path 915 are located at relatively large distances from shots on the other paths. The capture locations on the path 912 are close to those on paths 906 and 909; however the times when the images were captured do not closely match. By noting the times of capture on the path 912 it will be apparent that the corresponding photographer travelled in the opposite direction to the two photographers associated with paths 906 and 909. This means that path 912 has relatively low correspondence to both path 906 and path 909. In contrast, the locations and times of capture on paths 906 and 909 are relatively close, and this indicates a relatively higher correspondence between these two paths. A preferred method for calculating the correspondence between a pair of paths, as used in step 820, is shown as the method 1000 in Fig. 10.
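Before the flowchart of Fig. 10 is walked through in the following paragraphs, a compact sketch of such a pairwise calculation is given below. It assumes the path representation from the earlier sketches, illustrative time and distance thresholds, and a simple planar distance approximation; the location-diversity adjustment of step 1080 is omitted for brevity, and none of the names are from the specification.

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_m(a, b):
    """Approximate ground distance in metres between two (time, lat, lon)
    data pairs; an equirectangular approximation is adequate at event scale."""
    _, lat1, lon1 = a
    _, lat2, lon2 = b
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return EARTH_RADIUS_M * math.hypot(dx, dy)

def match_count(first, second, time_s=120.0, dist_m=5.0):
    """Steps 1010-1040: for each image of the first path, identify images
    of the second path within the time threshold (1020), test whether any
    of those is also within the distance threshold (1030), and if so
    increment the match count (1040)."""
    matches = 0
    for img in first:
        near_in_time = [o for o in second if abs(o[0] - img[0]) <= time_s]
        if any(distance_m(img, o) <= dist_m for o in near_in_time):
            matches += 1
    return matches

def path_correspondence(path_a, path_b, **thresholds):
    """Steps 1050-1070: repeat with the paths swapped (1060), then take the
    ratio of total matches to total images considered (1070)."""
    total = (match_count(path_a, path_b, **thresholds) +
             match_count(path_b, path_a, **thresholds))
    return total / (len(path_a) + len(path_b))
```

Applied to the paths of Fig. 9, such a calculation would yield a high value for the pair (906, 909) and low values for any pair involving 912 (times mismatch) or 915 (locations mismatch).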
[0093] The method 1000 operates initially to consider each of the images in a first path, which is selected in sequence according to step 1010. The selection of the current image from the first path in step 1010 includes processing according to steps 1020, 1030 and 1040. Step 1020 involves identifying images from the second path of the pair that occur within a threshold time period of the currently selected image from the first path. Step 1030 then considers whether at least one of the images identified in step 1020 is within a threshold distance of the currently selected image from the first path, as selected in step 1010. The threshold distance may be predetermined for the particular event being processed (e.g. for a wedding the threshold distance may be 5 metres). The consideration of step 1030 can be performed using the corresponding capture locations to determine the distances. If there is at least one image from the second path within the distance limit, and thus within the time limit, thereby satisfying both criteria, then there is considered to be a match with the currently selected image from the first path, and a match count is incremented in step 1040.

[0094] According to step 1010, the sequence of steps 1020, 1030 and 1040 is performed for each image in the first path.

[0095] When all images in the first path have been processed in step 1010 then, according to step 1050, on a first pass, step 1060 is processed. In step 1060, the paths are swapped and the calculations of step 1010 are made again. On conclusion of step 1050 for the second pass, match count values exist for each image of each path, representing the number of images of the other path that were captured within a threshold distance and threshold time of the particular image.

[0096] Step 1070 then operates to determine the correspondence between the pair of paths by calculating a correlation value based on the ratio of the total number of matches over the total number of images considered. Further, the locations where the matches occurred are considered in step 1080, and the diversity of the locations is used to adjust the correspondence. This can be done, for example, by reducing the correlation to zero if the two image paths do not span a threshold number of different locations.
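A minimal Python sketch of the matching logic of steps 1010 to 1070 follows. It assumes each image record carries a capture time in seconds and a planar (x, y) capture location in metres; the 60 second threshold is an illustrative assumption, the 5 metre threshold echoes the wedding example above, and the step 1080 diversity adjustment is omitted.

    import math

    def match_count(path_a, path_b, t_thresh=60.0, d_thresh=5.0):
        # Steps 1010-1040: count images of path_a having at least one
        # image of path_b within both the time and distance thresholds.
        matches = 0
        for img in path_a:
            near_in_time = [o for o in path_b
                            if abs(o["time"] - img["time"]) <= t_thresh]
            if any(math.hypot(o["x"] - img["x"], o["y"] - img["y"]) <= d_thresh
                   for o in near_in_time):
                matches += 1
        return matches

    def path_correspondence(path_a, path_b, t_thresh=60.0, d_thresh=5.0):
        # Steps 1050-1070: swap the paths for a second pass, then take
        # the ratio of total matches to total images considered.
        total = (match_count(path_a, path_b, t_thresh, d_thresh)
                 + match_count(path_b, path_a, t_thresh, d_thresh))
        return total / max(1, len(path_a) + len(path_b))

Applied pairwise to the paths of Fig. 9, this ratio would be relatively high for the pair 906/909 and low for pairs involving the path 915, consistent with the discussion above.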
[0097] An alternate implementation may use the location of the subject captured by a given photographer, and not the position of the photographer, to create the photo path. The path created in such a circumstance may be called a "subject path", to distinguish it from the "photographer path" which is the specific example described above. One issue with the photographer path is that two co-located photographers can be substantially simultaneously capturing different subjects (e.g. one photographer captures the bride, whilst the other captures the groom). In such an instance, the subject path can provide a more accurate representation of the primary subject's actual geographic location and travel path than the photographer path. To determine the subject path, additional information must be determined and stored by the camera, and additional methods are required to determine the location of the subject. This additional information can include the focal setting, direction (azimuth) and inclination, which may be made available as image metadata. Techniques for assessment and determination of the subject path are now discussed.

[0098] Fig. 11 shows a method 1100 for using the subject location to determine the photographer path. The method 1100 is a complement to the method of Fig. 8 for step 734 of Fig. 7, with the subject location replacing the use of capture location in determining the photographer path. In this regard, steps 1110 to 1130 are the same as those seen in Fig. 8, except that in step 1114, as compared to step 814, the method 1100 obtains the subject location for each photo in the set of images.

[0099] Figs. 12, 13, 14, 15, 16 and 17 demonstrate characteristics of the subject path and mechanisms for determining the subject. The term "subject path" is used herein to reference a photographer path that uses the subject location, and not the capture location.

[00100] Fig. 12 shows example photographer paths 1206 and 1209 correlated by capture time and location, and the subject paths 1201 and 1202 based on subject locations, for two photographers. Photographs captured by photographer 1 on the photographer path 1206 are illustrated along the first subject path 1201, and the direction is shown by arrows. Photographs captured by photographer 2 on the path 1209 are illustrated along the second subject path 1202. The estimated subject location 1208 is shown along the photographer's camera shooting trajectory 1210 at a distance from the camera location. That distance may be determined by the camera lens and focus setting used at the time of image capture. Mechanisms for determining the subject location are shown later in Figs. 14 and 15.

[00101] The time of image capture relative to a starting time (0) 1203, corresponding to an initial image capture, is shown for each of the photographs on the subject paths 1201 and 1202 as a number 1205.

[00102] Fig. 13 shows that, by identifying photographer paths by capture time and subject position, it is possible to strengthen existing photographer path correspondence, or to identify photographer paths that are not necessarily correlated by capture location. Three example photographer paths based on capture position are shown in Fig. 13 as photographer paths 1301, 1302 and 1303. When the method 1000 described in Fig. 10 is applied, based on capture position, it will be appreciated that the photographer path 1303 would not correlate to either of the paths 1301 and 1302. However, when the same method 1000 is applied to the photographer paths based on subject position, being the paths 1311, 1312 and 1313, all three paths correlate.

[00103] Note that in some implementations multiple photographers may be capturing only one subject, and hence a determination of the correspondence between the geographic paths of the photographers may be based on the one subject in the captured images.

[00104] Fig. 14 and Fig. 15 show how the settings of a capture device (camera) can be used to locate a subject when combined with the location of the camera. These camera settings can be stored as metadata with the photo for later processing, in the same way as photo location and date-time information is stored as photo metadata. Fig. 14 and Fig. 15 show how adjusting camera settings can result in the selection of a different subject despite the photographer changing neither location nor the shooting trajectory 1408 from the photographer path. Fig. 14 shows, for a camera 1499, the depth of field 1403, a first object (person 1401) outside the depth of field (out of focus), and another object (person 1402) within the depth of field at the ideal focal point 1410 (in focus). When an object such as the person 1402 is within the depth of field, that object is in focus and has a high probability of being the photographer's subject.
The depth of field lies between a near point 1412 and a far point 1411. The far point 1411 may extend to infinity. The extent of the depth of field is defined by the frame size 1407 of the camera 1499, the current aperture setting 1406, the lens focal length 1405, and the related diagonal angle of view 1409. The subject distance 1404 is measured from the aperture 1406 to the ideal focal point 1410. The subject distance 1404 and the angle of the shooting sight trajectory 1408 can be used to identify the position of the subject, which is used to create a subject path.

[00105] Fig. 15 shows that altering the corresponding settings 1505, 1506, 1507 and 1508 of the camera 1499 can result in a new depth of field position and a redefined subject (person 1502) from the scene. The ideal focal point 1510 and resulting subject distance 1504 will be shorter if the focal length of the camera 1499 is changed, for example from a 200 mm focal length 1511 to a 100 mm focal length 1509, resulting in a new subject position for the photographer and thus affecting the configuration of the subject path.
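By way of illustration, the following Python sketch estimates a subject location from the camera position, the shooting azimuth (as recorded, for example, in the EXIF GPSImgDirection tag) and the subject distance 1404/1504 (as recorded, for example, in the EXIF SubjectDistance tag). The flat-earth projection used here is an assumption that is adequate over event-scale distances, and inclination is ignored.

    import math

    EARTH_RADIUS_M = 6371000.0

    def subject_location(cam_lat, cam_lon, azimuth_deg, subject_dist_m):
        # Project the subject position by moving subject_dist_m from
        # the camera along the shooting azimuth (degrees clockwise
        # from true north), using a local flat-earth approximation.
        bearing = math.radians(azimuth_deg)
        d_north = subject_dist_m * math.cos(bearing)
        d_east = subject_dist_m * math.sin(bearing)
        lat = cam_lat + math.degrees(d_north / EARTH_RADIUS_M)
        lon = cam_lon + math.degrees(
            d_east / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat))))
        return lat, lon

The resulting (latitude, longitude) pairs can then be supplied to the path construction and correspondence sketches above in place of the capture locations, yielding a subject path.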
[00106] Fig. 14 and Fig. 15 illustrate the use of camera settings and corresponding metadata to determine the position of the subject of a photo. In addition to the use of metadata to determine the subject, image content processing can determine whether two or more photos refer to the same object. An example of this is the use of face or clothing recognition, which could be used to determine that a particular person is the subject present in two photos captured by two photographers. If, according to the method 1000 described with reference to Fig. 10, two photographers took a photo of the same person within a threshold time as specified in step 1020, then the distance between the two photo subjects (zero) would fall within the threshold distance considered in step 1030 and a match would be recorded. In this case, the absolute positional location of the subject does not need to be known.

[00107] For some events, it is not uncommon for an individual photographer to use two or more cameras to capture images at the event. One example includes sporting events, where a photographer may have different cameras configured with different lenses to capture action at significantly different distances. In such instances, the photographer path associated with images from the two cameras is coincident. Therefore, it is highly desirable to treat, for the purposes of the processing described herein, those cameras as a single camera delivering a single set of images. This can be achieved by additional processing in step 630, where metadata associated with images captured by different cameras of the same photographer is tagged or modified so that all such images are treated as a single set.

Industrial Applicability

[00108] The arrangements described are applicable to the computer and data processing industries, and particularly to the processing of images that have been captured by multiple devices substantially simultaneously.

[00109] The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

[00110] (Australia Only) In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (20)

1. A method of identifying at least two image sets from a plurality of image sets, each said image set being captured by a different image capture device, said method including the steps of: determining, for each of the plurality of image sets, using location information associated with the images of the image set, a geographic path travelled by at least an image capture device used to capture the images of the image set; calculating path correspondence measures defining the correspondence between at least two of the determined geographic paths; and identifying at least two image sets from the plurality of image sets, said at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures.
2. A method according to claim 1 wherein the identifying of the at least two image sets comprises prioritising the images in the at least two image sets over images of any other image sets.
3. A method according to claim 1, wherein the identifying of the at least two image sets comprises ordering the images in the selected image sets based on the associated determined geographic paths.
4. A method according to claim 1, wherein the determination of the paths is further based on time of capture information, and the calculation of the correspondence between the paths is further based on the time of capture of the images.
5. A method according to claim 1 further comprising determining correspondence between the geographic paths based on at least one subject captured in the images.
6. A method according to claim 5, further comprising identifying the subjects of the images based on at least one of: (i) shot orientation metadata of the captured images, (ii) imaging metadata of the captured images, and (iii) image content analysis.
7. A method according to claim 6 wherein the imaging metadata comprises at least one of a frame size, a lens focal length, an aperture setting, and an angle of view of the image capture device by which the corresponding set of images is captured.
8. A method according to claim 1 wherein the calculating of the path correspondence measures comprises: (a) calculating a correspondence measure of a first pair of paths comprising a first geographic path relative to a second geographic path, by assessing a match between each image of the first path against each image of the second path and forming a match value (1040) for the first path; (b) repeating step (a) for at least one further pair of paths; (c) correlating the match values to form a correspondence measure for the pairs of paths.
9. A method according to claim 8, wherein: (i) step (b) comprises repeating step (a) by assessing a match between each image of the second path against each image of the first path and forming a match value (1040) for the second path, and step (c) provides a first correlation value of the first path with respect to the second path and a second correlation value of the second path with respect to the first path; and (ii) the method comprises repeating step (i) for each pair of paths associated with the plurality of sets of images to form a correlation value between each path and each other path.
10. A method according to claim 8 wherein the assessing comprises: determining those images of the second path captured within a threshold time period of the image of the first path; determining those images of the second path captured within a threshold distance from the image of the first path; and incrementing the match value based on those images so determined.
11. A method according to claim 10 wherein the determining of images of the second path captured within a threshold time period of the image of the first path forms a first subset of images, the determining of those images of the second path captured within a threshold distance from the image of the first path comprises determining those images of the first subset that were captured within a threshold distance from the image of the first path, and the match value is incremented only for those images satisfying both criteria.
12. A method of selecting representative images from a plurality of image sets, each said image set being captured by a different image capture device, said method comprising: identifying those images and the corresponding image sets that are associated with a particular event; performing the method of any one of claims 1 to 10 to identify at least two of said plurality of image sets having highest corresponding geographic paths; ordering the images of the plurality of image sets based upon images associated with the identified at least two image sets; and selecting representative images for the event from the ordered images.
13. A method according to claim 12 wherein the identifying of the images comprises identifying metadata associated with images captured by different cameras operated by a single photographer, and modifying the identified metadata so that all such images are treated as a single set of images.
14. A computer readable storage medium having a program recorded thereon, the program being executable by computerised apparatus to identify at least two image sets from a plurality of image sets, each said image set being captured by a different image capture device, said program comprising: code for determining, for each of the plurality of image sets, using location information associated with the images of the image set, a geographic path travelled by at least an image capture device used to capture the images of the image set; code for calculating path correspondence measures defining the correspondence between at least two of the determined geographic paths; and code for identifying at least two image sets from the plurality of image sets, said at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures.
15. A computer readable storage medium according to claim 14 wherein the code for identifying the at least two image sets comprises at least one of: code for prioritising the images in the at least two image sets over images of any other image sets; and code for ordering the images in the selected image sets based on the associated determined geographic paths; and the code for the determination of the paths is further based on time of capture information, and the code for the calculation of the correspondence between the paths is further based on the time of capture of the images.
16. A computer readable storage medium according to claim 14 further comprising: code for determining correspondence between the geographic paths based on at least one subject captured in the images; and code for identifying the subjects of the images based on at least one of: (i) shot orientation metadata of the captured images, (ii) imaging metadata of the captured images, and (iii) image content analysis; wherein the imaging metadata comprises at least one of a frame size, a lens focal length, an aperture setting, and an angle of view of the image capture device by which the corresponding set of images is captured.
17. A computer readable storage medium according to claim 14 wherein the calculating of the path correspondence measures comprises code for: (a) calculating a correspondence measure of a first pair of paths comprising a first geographic path relative to a second geographic path, by assessing a match between each image of the first path against each image of the second path and forming a match value (1040) for the first path; (b) repeating step (a) for at least one further pair of paths by assessing a match between each image of the second path against each image of the first path and forming a match value (1040) for the second path; and (c) correlating the match values to form a correspondence measure for the pairs of paths to provide a first correlation value of the first path with respect to the second path and a second correlation value of the second path with respect to the first path; (d) repeating execution of (b) and (c) for each pair of paths associated with the plurality of sets of images to form a correlation value between each path and each other path.
18. A computer readable storage medium having a program recorded thereon, the program being executable by computer apparatus to select representative images from a plurality of image sets, each said image set being captured by a different image capture device, said program comprising: code for identifying those images and the corresponding image sets that are associated with a particular event; code for identifying at least two of said plurality of image sets having highest corresponding geographic paths, comprising: code for determining, for each of the plurality of image sets, using location information associated with the images of the image set, a geographic path travelled by at least an image capture device used to capture the images of the image set; code for calculating path correspondence measures defining the correspondence between at least two of the determined geographic paths; and code for identifying at least two image sets from the plurality of image sets, said at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures; code for ordering the images of the plurality of image sets based upon images associated with the identified at least two image sets; and code for selecting representative images for the event from the ordered images.
19. An image selection system, said system comprising: a processor coupled to at least one memory upon which are stored images from a plurality of image sets, each said image set being captured by a different image capture device; a program executable by said processor to select representative images from the plurality of image sets, said program comprising: code for identifying those images and the corresponding image sets that are associated with a particular event; code for identifying at least two of said plurality of image sets having highest corresponding geographic paths, comprising: code for determining, for each of the plurality of image sets, using location information associated with the images of the image set, a geographic path travelled by at least an image capture device used to capture the images of the image set; code for calculating path correspondence measures defining the correspondence between at least two of the determined geographic paths; and code for identifying at least two image sets from the plurality of image sets, said at least two image sets being those associated with geographic paths having the highest of the calculated correspondence measures; code for ordering the images of the plurality of image sets based upon images associated with the identified at least two image sets; and code for selecting the representative images for the event from the ordered images.
20. The invention substantially as described herein with reference to any one of the embodiments as that embodiment is illustrated in the drawings.

Dated this 28th day of September 2012
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant/Nominated Person
SPRUSON & FERGUSON
AU2012232990A 2012-09-28 2012-09-28 Image selection based on correspondence of multiple photo paths Abandoned AU2012232990A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2012232990A AU2012232990A1 (en) 2012-09-28 2012-09-28 Image selection based on correspondence of multiple photo paths

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2012232990A AU2012232990A1 (en) 2012-09-28 2012-09-28 Image selection based on correspondence of multiple photo paths

Publications (1)

Publication Number Publication Date
AU2012232990A1 true AU2012232990A1 (en) 2014-04-17

Family

ID=50479194

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2012232990A Abandoned AU2012232990A1 (en) 2012-09-28 2012-09-28 Image selection based on correspondence of multiple photo paths

Country Status (1)

Country Link
AU (1) AU2012232990A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10409858B2 (en) 2013-08-02 2019-09-10 Shoto, Inc. Discovery and sharing of photos between devices
US11954402B1 (en) * 2022-11-29 2024-04-09 Henk B. Rogers Talk story system and apparatus

Similar Documents

Publication Publication Date Title
KR101557297B1 (en) 3d content aggregation built into devices
US8896709B2 (en) Method and system for image and metadata management
US8989506B1 (en) Incremental image processing pipeline for matching multiple photos based on image overlap
US20160191434A1 (en) System and method for improved capture, storage, search, selection and delivery of images across a communications network
US20150237268A1 (en) Multiple Camera Imaging
JP2016119508A (en) Method, system and program
WO2021088417A1 (en) Movement state information display method and apparatus, electronic device and storage medium
US8943020B2 (en) Techniques for intelligent media show across multiple devices
US10958837B2 (en) Systems and methods for determining preferences for capture settings of an image capturing device
TWI619037B (en) Method and system for generating content through cooperation among users
CN105005599A (en) Photograph sharing method and mobile terminal
US20150242405A1 (en) Methods, devices and systems for context-sensitive organization of media files
JP2004280254A (en) Contents categorizing method and device
CN105809618A (en) Picture processing method and device
CN111480168B (en) Context-based image selection
JPWO2013132557A1 (en) Content processing apparatus and integrated circuit, method and program thereof
CN107203646A (en) A kind of intelligent social sharing method and device
US10469739B2 (en) Systems and methods for determining preferences for capture settings of an image capturing device
US10778855B2 (en) System and method for creating contents by collaborating between users
AU2012232990A1 (en) Image selection based on correspondence of multiple photo paths
JP2014182650A (en) Image sharing device, method for controlling image sharing device and program
JP2015139001A (en) Information processing device, information processing method and program
JP2010068247A (en) Device, method, program and system for outputting content
CN104981753B (en) Method and apparatus for content manipulation
WO2016079609A1 (en) Generation apparatus and method for evaluation information, electronic device and server

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application