AU2009243525A1 - Network-based collaborative image filtering and backup - Google Patents

Network-based collaborative image filtering and backup

Info

Publication number
AU2009243525A1
AU2009243525A1
Authority
AU
Australia
Prior art keywords
images
network
server
image
resolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2009243525A
Inventor
Dino Talic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2009243525A priority Critical patent/AU2009243525A1/en
Publication of AU2009243525A1 publication Critical patent/AU2009243525A1/en
Abandoned legal-status Critical Current


Abstract

NETWORK-BASED COLLABORATIVE IMAGE FILTERING AND BACKUP

A method of transferring one or more high-resolution images, from a set of images captured by an image capture device (202), over a network (120) to a server (101) connected to the network (120). A reduced resolution representation of each of the set of images is received from a first computer readable storage medium (1609) associated with the image capture device (202). The reduced resolution representations are published on the server (101). User input associated with the published reduced resolution representations is compared with a predetermined threshold and one or more high-resolution images are selected from the set of images based on the comparison. The selected high-resolution images are transferred from the first computer readable storage medium (1609), over the network (120), to a second computer readable storage medium (106) associated with the server (101).

Fig. 3: receive reduced resolution representations of newly captured images (302); images are published on the user's gallery (303); measure and aggregate image ratings (305); filter images by comparing user image rating results with a predetermined threshold (306); select high resolution images to be uploaded from the host device for each image above the threshold (307); host device uploads requested full resolution image to central server (308).
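The flow of Fig. 3 can be sketched in code as follows. This is an illustrative reading of the abstract only: every name, the mean-rating aggregation, and the upload callback are assumptions, not the patent's actual implementation.

```python
# Illustrative sketch of steps 305-308 of Fig. 3. All names, the rating
# scheme (simple mean) and the callback are assumptions for illustration.

def collaborative_backup(thumbnails, ratings_by_image, threshold, upload_full_res):
    """thumbnails: {image_id: thumbnail} received from the camera and
    published on the user's gallery (steps 302-303).
    ratings_by_image: {image_id: [rating, ...]} collected from the user
    group after publication (step 305).
    upload_full_res: callable that asks the host device to upload the
    original full resolution image to the server (steps 307-308)."""
    selected = []
    for image_id in thumbnails:
        scores = ratings_by_image.get(image_id, [])
        # Step 305: aggregate the ratings (mean, as one plausible choice).
        aggregate = sum(scores) / len(scores) if scores else 0.0
        # Step 306: keep only images whose rating exceeds the threshold.
        if aggregate > threshold:
            selected.append(image_id)
            upload_full_res(image_id)  # steps 307-308
    return selected
```

A caller would publish the thumbnails, collect interaction data for some period, then invoke this routine with the stored ratings.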

Description

S&F Ref: 924583 AUSTRALIA PATENTS ACT 1990 COMPLETE SPECIFICATION FOR A STANDARD PATENT Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan Actual Inventor(s): Dino Talic Address for Service: Spruson & Ferguson, St Martins Tower, Level 35, 31 Market Street, Sydney NSW 2000 (CCN 3710000177) Invention Title: Network-based collaborative image filtering and backup The following statement is a full description of this invention, including the best method of performing it known to me/us: NETWORK-BASED COLLABORATIVE IMAGE FILTERING AND BACKUP FIELD OF INVENTION The present invention relates to image filtering and, in particular, to a method and apparatus for transferring a plurality of high-resolution images from a set of images captured by an image capture device to a network storage system. The present invention also relates to a computer program product including a computer readable medium having recorded thereon a computer program for transferring a plurality of high-resolution images from a set of images captured by an image capture device to a network storage system. DESCRIPTION OF BACKGROUND ART Modern digital cameras often include compact flash-based digital storage, which allows users to capture a large number of digital images and store them on the camera before eventually saving or printing them. Such digital images may include, for example, electronic photographs captured with a digital camera or scanned from an original document. The ability to capture and store such a large quantity of images poses difficulties for users in selecting images for printing or further processing. The stored images are often unclassified, meaning that the images have not been sorted according to some criteria. 
The difficulties arise in the fact that users typically require a long period of time to manually search through the unclassified collection of images and select the images desired for printing or further processing. The problem is further exacerbated by the fact that many of the captured images are of a similar setting. Online image backup systems allow a user to upload and store images on an Internet-based storage network. It is often safer to store images on an online image backup system than on a local storage medium, as doing so reduces the likelihood of data loss. The images are also accessible from any terminal with an Internet connection, and such systems allow for easy republication of images to other Internet sites, such as blogs and image sharing websites. However, online image backup systems generally pose restrictions as to the amount of data that can be stored on the backup system by each individual user. Further, bandwidth available for image upload and download from the online image backup system is often limited. The restrictions on data and bandwidth associated with an online image backup system force the user to select a subset of images for upload to the online backup system. The process of sorting through images and finding the best images to upload to the online image backup system is a time consuming and tedious process. As a result, many consumers never upload their images to an online image backup system. One known method allows images to be automatically uploaded to an online image backup system over a wireless network, thus alleviating the user from the need to manually upload images. However, such a method does not solve the issue of constrained storage on the online image backup system or the bandwidth constraints of most users, particularly where wireless technologies are concerned. The user still needs to filter the images. 
SUMMARY OF THE INVENTION It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements. According to one aspect of the present invention there is provided a method of transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the method comprising the steps of: receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; publishing the reduced resolution representations on the server; comparing user input associated with the published reduced resolution representations with a predetermined threshold; selecting one or more high-resolution images from the set of images based on the comparison; and transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server. 
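The "comparing user input ... with a predetermined threshold" step leaves open how heterogeneous user input (views, comments, ratings) is reduced to a single comparable value. One hedged possibility is a weighted sum; the weights and the formula below are illustrative assumptions only, not anything the specification prescribes.

```python
# Hypothetical reduction of per-image user input to one number for the
# claim's comparing step. Weights and formula are invented for illustration.

def interest_score(views, comments, ratings, w_view=0.1, w_comment=1.0, w_rating=2.0):
    """Aggregate heterogeneous user input for one published thumbnail."""
    mean_rating = sum(ratings) / len(ratings) if ratings else 0.0
    return w_view * views + w_comment * comments + w_rating * mean_rating

def exceeds_threshold(views, comments, ratings, threshold):
    """The comparing step: True selects the image for full-resolution transfer."""
    return interest_score(views, comments, ratings) > threshold
```

Any monotone aggregation (mean rating alone, comment count alone) would slot into the same comparison.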
According to another aspect of the present invention there is provided a computer system for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to the computer system connected to the network, the system comprising: a memory for storing data and a computer program; a processor coupled to said memory for executing the computer program, the computer program comprising instructions for: receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; publishing the reduced resolution representations on the server; comparing user input associated with the published reduced resolution representations with a predetermined threshold; selecting one or more high-resolution images from the set of images based on the comparison; and transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server. 
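The publishing step recited above is later illustrated by a hyper-text mark-up language page showing a predetermined number of images (cf. Fig. 8). A minimal sketch of how a server might render such a page follows; the markup, URL scheme and page size are invented for illustration and are not part of the specification.

```python
# Hypothetical rendering of published thumbnails as an HTML gallery page
# (cf. Fig. 8). Markup, URL paths and the per-page limit are assumptions.
from html import escape

def gallery_page(image_ids, per_page=9):
    """Emit an HTML page with up to per_page thumbnails and rating links."""
    rows = []
    for image_id in image_ids[:per_page]:
        safe = escape(image_id, quote=True)
        rows.append(
            f'<figure><img src="/thumbs/{safe}.jpg" alt="{safe}">'
            f'<a href="/rate/{safe}">rate</a></figure>'
        )
    return "<html><body>\n" + "\n".join(rows) + "\n</body></html>"
```

Each "rate" link would feed the user input that the comparing step later measures against the threshold.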
According to still another aspect of the present invention there is provided an apparatus for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the apparatus comprising: means for receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; means for publishing the reduced resolution representations on the server; means for comparing user input associated with the published reduced resolution representations with a predetermined threshold; means for selecting one or more high-resolution images from the set of images based on the comparison; and means for transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server. According to still another aspect of the present invention there is provided a computer readable medium including a computer program recorded thereon for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the computer program comprising: code for receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; code for publishing the reduced resolution representations on the server; code for comparing user input associated with the published reduced resolution representations with a predetermined threshold; code for selecting one or more high-resolution images from the set of images based on the comparison; and code for transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated 
with the server. Other aspects of the invention are also disclosed. BRIEF DESCRIPTION OF THE DRAWINGS One or more embodiments of the invention will now be described with reference to the following drawings, in which: Figs. 1A and 1B form a schematic block diagram of a general purpose computer module upon which a server of Fig. 2 may be practiced; Fig. 2 shows a network-based storage and collaborative image rating system; Fig. 3 is a schematic flow diagram showing a method of transferring a plurality of high resolution images, from a set of images captured by an image device, to the server of Figs. 1A and 1B; Fig. 4 shows an example collection of images reviewed in accordance with the method of Fig. 12; Fig. 5 is a schematic flow diagram showing a method of uploading a reduced resolution (or thumbnail) representation of each of a set of images; Fig. 6 is a schematic flow diagram showing a method of publishing the uploaded reduced resolution images on the server of Figs. 1A and 1B, as executed in the method of Fig. 3; Fig. 7 is a table showing a typical metadata table containing image filename and timestamp data; Fig. 8 shows a predetermined number of images on a hyper-text mark-up language page; Fig. 9 is a schematic flow diagram showing a method of measuring and aggregating the image rating results, as executed at step 305 of the method of Fig. 3; Fig. 10 is a schematic flow diagram showing a method of comparing user input, associated with published reduced resolution representations, with a predetermined threshold, as executed in the method of Fig. 3; Fig. 11 is a schematic flow diagram showing a method of uploading one or more high resolution images from a set of images; Fig. 12 is a schematic flow diagram showing a method of reviewing a large unsorted collection of images; Fig. 13 shows an example collection of images reviewed in accordance with the method of Fig. 12; Fig. 
14 shows an alternative network-based storage and collaborative image rating system; Fig. 15 is a schematic flow diagram showing a method of transferring (or transmitting) a plurality of high-resolution images to a network enabled device; and Fig. 16 is a schematic block diagram of the digital camera 202 upon which some of the methods to be described may be implemented. DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION Where reference is made in any one or more of the accompanying drawings to steps and/or features, which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears. Methods of transferring a plurality of high-resolution images from a set of images captured by an image capture device to a network storage system will be described below. The described methods allow appropriate images to be selected from a plurality of unclassified images. The methods described aid the process of selection of high interest images from a large collection of images by using aspects of social collaboration to reduce the workload on a single user and speed up the process of image selection. The term "social media" refers to online media which is created through collaboration between different users and is facilitated by highly accessible and scalable publishing techniques. There is currently a significant trend towards social media in online media forms. Examples of popular social media sites include Facebook™ and MySpace™. The tendency for users to use social media to collaborate and communicate poses a significant potential resource for image filtering. Users regularly browse, comment on and rate personal images on social media websites. The comments and ratings may be used to collaboratively filter the images. The typical consumer has a "bursty" image capture pattern in that there are periods, such as holidays or significant family events, when a relatively large volume of images is captured over a short period of time. The described methods use collaborative filtering based on social networking principles. The described methods allow the filtering of images of high interest to a user to be farmed out to multiple users. Such methods are effective because a single user typically finds a large volume of images, taken over a relatively short time, time-consuming to filter. Thus, although the overall volume of images taken amongst a user group is the same, collaborative image filtering may help users deal with the bursty nature of most image capturing. The described methods use network-based collaborative filtering to filter an unclassified image collection. In accordance with the described methods, reduced resolution images are uploaded to a central server 101 (see Figs. 1A, 1B and 2), and based on collaborative image filtering, one or more full resolution images are uploaded to the server 101. The reduced resolution images comprising an image collection are uploaded over a communications network 120 (see Figs. 1A, 1B and 2) to the central server 101. The uploaded reduced resolution images are then published on the server 101, allowing other users to view and interact with the uploaded images. Data records of the interactions of the users with the reduced resolution images are stored and, based on this interaction data, the methods determine which images are likely to be of high interest to the owner. 
The interaction data for each image is compared to a predetermined threshold and, if the interaction data exceeds the predetermined threshold, the image is deemed likely to be of high interest to the user. Finally, for each image which is determined likely to be of high interest to the user, the original or full resolution image is requested to be uploaded to the server 101 from a host device (e.g., 202 as seen in Fig. 2). In this instance, the host device 202 then uploads the requested full resolution images to the server 100 for permanent storage on a storage medium connected to the network 120. Fig. 2 shows a system 200 on which the described methods may be implemented. The system 200 may be referred to as a network-based storage and collaborative image rating system. The system 200 comprises image capture devices 201, such as a digital camera 202 or a camera enabled mobile phone 203, which may be used to capture a set of images. The image capture devices 201 may be enabled for wireless communication. The image capture device 201 sends reduced resolution versions of the captured images over a wireless network (such as WiFi or 3G) 204 to a central server 101. The wireless network 204 connects to a communications network 120 via a gateway 205, and data communications are carried out over the network 120 to which the server 101 is also connected. The central server 207 aggregates and publishes the images to a user group 210 associated with a submitting user. The images from the central server 101 are presented to the user group 210 via a presentation layer 209. The user group 210 communicates with the central server 101 using the communications network 120 through a gateway 208. Figs. 1A and 1B collectively form a schematic block diagram of a general purpose computer module upon which the central server 101 can be practiced. As seen in Fig. 
1A, the server 101 may be connected to input devices such as a keyboard 102, a mouse pointer device 103, a scanner 126, a camera 127, and a microphone 180, and output devices including a printer 115, a display device 114 and loudspeakers 117. An external Modulator-Demodulator (Modem) transceiver device 116 may be used by the server 101 for communicating to and from the communications network 120 via a connection 121. The network 120 may be a wide-area network (WAN), such as the Internet or a private WAN. Where the connection 121 is a telephone line, the modem 116 may be a traditional "dial-up" modem. Alternatively, where the connection 121 is a high capacity (eg: cable) connection, the modem 116 may be a broadband modem. A wireless modem may also be used for wireless connection to the network 120. The server 101 typically includes at least one processor unit 105, and a memory unit 106, for example formed from semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The server 101 also includes a number of input/output (I/O) interfaces including an audio-video interface 107 that couples to the video display 114, loudspeakers 117 and microphone 180, an I/O interface 113 for the keyboard 102, mouse 103, scanner 126, camera 127 and optionally a joystick (not illustrated), and an interface 108 for the external modem 116 and printer 115. In some implementations, the modem 116 may be incorporated within the server 101, for example within the interface 108. The server 101 also has a local network interface 111 which, via a connection 123, permits coupling of the server 101 to a local computer network 122, known as a Local Area Network (LAN). As also illustrated, the local network 122 may also couple to the wide network 120 via a connection 124, which would typically include a so-called "firewall" device or device of similar functionality. 
The interface 111 may be formed by an Ethernet™ circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement. The interfaces 108 and 113 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 109 are provided and typically include a hard disk drive (HDD) 110. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 112 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (eg: CD-ROM, DVD), USB-RAM, and floppy disks for example, may then be used as appropriate sources of data to the server 101. The components 105 to 113 of the server 101 typically communicate via an interconnected bus 104 and in a manner which results in a conventional mode of operation of the server 101 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or alike computer systems evolved therefrom. Some of the described methods may be implemented using the server 101, wherein some of the processes to be described may be implemented as one or more software application programs 133 executable within the server 101. In particular, the steps of some of the described methods are effected by instructions 131 in the software that are carried out within the server 101. The software instructions 131 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules performs the described methods and a second part and the corresponding code modules manage a user interface between the first part and the user. 
The software may be stored in a computer readable medium, including the storage devices described below, for example. The software 133 is typically stored in the HDD 110 or the memory 106. The software is loaded into the server 101 from the computer readable medium, and is then executed by the server 101. Thus, for example, the software may be stored on an optically readable CD-ROM medium 125 that is read by the optical disk drive 112. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the server 100 preferably effects an advantageous apparatus for implementing one or more of the described methods. In some instances, the application programs 133 may be supplied to the user encoded on one or more CD-ROM 125 and read via the corresponding drive 112, or alternatively may be read by the user from the networks 120 or 122. Still further, the software can also be loaded into the server 101 from other computer readable media. Computer readable storage media refers to any storage medium that participates in providing instructions and/or data to the server 101 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the server 101. Examples of computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the server 101 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like. 
The second part of the application programs 133 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114. Through manipulation of typically the keyboard 102 and the mouse 103, a user of the server 100 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 117 and user voice commands input via the microphone 180. Fig. 1B is a detailed schematic block diagram of the processor 105 and a "memory" 134. The memory 134 represents a logical aggregation of all the memory modules (including the HDD 109 and semiconductor memory 106) that can be accessed by the server 101 in Fig. 1A. When the server 101 is initially powered up, a power-on self-test (POST) program 150 executes. The POST program 150 is typically stored in a ROM 149 of the semiconductor memory 106. A hardware device such as the ROM 149 is sometimes referred to as firmware. The POST program 150 examines hardware within the server 101 to ensure proper functioning, and typically checks the processor 105, the memory (109, 106), and a basic input-output systems software (BIOS) module 151, also typically stored in the ROM 149, for correct operation. Once the POST program 150 has run successfully, the BIOS 151 activates the hard disk drive 110. Activation of the hard disk drive 110 causes a bootstrap loader program 152 that is resident on the hard disk drive 110 to execute via the processor 105. This loads an operating system 153 into the RAM memory 106, upon which the operating system 153 commences operation. 
The operating system 153 is a system level application, executable by the processor 105, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface. The operating system 153 manages the memory (109, 106) in order to ensure that each process or application running on the server 101 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 100 must be used properly so that each process can run effectively. Accordingly, the aggregated memory 134 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 100 and how such is used. The processor 105 includes a number of functional modules including a control unit 139, an arithmetic logic unit (ALU) 140, and a local or internal memory 148, sometimes called a cache memory. The cache memory 148 typically includes a number of storage registers 144-146 in a register section. One or more internal busses 141 functionally interconnect these functional modules. The processor 105 typically also has one or more interfaces 142 for communicating with external devices via the system bus 104, using a connection 118. The application program 133 includes a sequence of instructions 131 that may include conditional branch and loop instructions. The program 133 may also include data 132 which is used in execution of the program 133. The instructions 131 and the data 132 are stored in memory locations 128-130 and 135-137 respectively. 
Depending upon the relative size of the instructions 131 and the memory locations 128-130, a particular instruction may be stored in a single memory location, as depicted by the instruction shown in the memory location 130. Alternately, an instruction may be segmented into a number of parts, each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 128-129. In general, the processor 105 is given a set of instructions which are executed therein. The processor 105 then waits for a subsequent input, to which it reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 102, 103, data received from an external source across one of the networks 120, 122, data retrieved from one of the storage devices 106, 109, or data retrieved from a storage medium 125 inserted into the corresponding reader 112. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 134. One or more of the described methods may use input variables 154 that are stored in the memory 134 in corresponding memory locations 155-158. Such described methods produce output variables 161 that are stored in the memory 134 in corresponding memory locations 162-165. Intermediate variables may be stored in memory locations 159, 160, 166 and 167. The register section 144-146, the arithmetic logic unit (ALU) 140, and the control unit 139 of the processor 105 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 133. 
Each fetch, decode, and execute cycle comprises: (a) a fetch operation, which fetches or reads an instruction 131 from a memory location 128; (b) a decode operation in which the control unit 139 determines which instruction has been fetched; and (c) an execute operation in which the control unit 139 and/or the ALU 140 execute the instruction. Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 139 stores or writes a value to a memory location 132. Each step or sub-process in the processes of Figs. 3 to 15 is associated with one or more segments of the program 133, and is performed by the register section 144-147, the ALU 140, and the control unit 139 in the processor 105 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 133. One or more of the described methods may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub-functions of the described methods. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories. Fig. 16 is a schematic block diagram of the digital camera 202 upon which some of the methods to be described may be implemented. The camera 202 comprises embedded components. Alternatively, the methods implemented on the camera 202 may be implemented on the camera enabled mobile phone 203, a portable media player, a personal data assistant or the like, in which processing resources are limited. Some of the methods implemented on the camera 202 may also be performed on higher-level devices such as desktop computers, server computers, and other such devices with significantly larger processing resources. As seen in Fig. 16, the camera 202 comprises an embedded controller 1602. 
Accordingly, the camera 202 may be referred to as an "embedded device." In the present example, the controller 1602 comprises a processing unit (or processor) 1605 which is bi-directionally coupled to an internal storage module 1609. The storage module 1609 may be formed from non-volatile semiconductor read only memory (ROM) 1660 and semiconductor random access memory (RAM) 1670. The RAM 1670 may be volatile, non-volatile or a combination of volatile and non-volatile memory. The camera 202 comprises a display controller 1607, which is connected to a video display 1614, such as a liquid crystal display (LCD) panel or the like. The display controller 1607 is configured for displaying graphical images on the video display 1614 in accordance with instructions received from the processor 1605. The camera 202 also comprises user input devices 1613 which are typically formed by keys, a keypad or like controls. In some implementations, the user input devices 1613 may include a touch sensitive panel physically associated with the display 1614 to collectively form a touch-screen. Such a touch-screen may thus operate as one form of graphical user interface (GUI) as opposed to a prompt or menu driven GUI typically used with keypad-display combinations. Other forms of user input devices may also be used, such as a microphone (not illustrated) for voice commands or a joystick/thumb wheel (not illustrated) for ease of navigation about menus. As seen in Fig. 16, the camera 202 also comprises a portable memory interface 1606, which is coupled to the processor 1605 via a connection 1619. The portable memory interface 1606 allows a complementary portable memory device 1625 to be coupled to the camera 202 to act as a source or destination of data or to supplement the internal storage module 1609. 
Examples of such interfaces permit coupling with portable memory devices such as Universal Serial Bus (USB) memory devices, Secure Digital (SD) cards, Personal Computer Memory Card International Association (PCMCIA) cards, optical disks and magnetic disks. The camera 202 also comprises a communications interface 1608 to permit coupling of the camera 202 to the computer or wireless communications network 204 via a connection 1621. The connection 1621 may be wired or wireless. For example, the connection 1621 may be radio frequency or optical. An example of a wired connection includes Ethernet. Examples of wireless connections include Bluetooth™ type local interconnection, Wi-Fi (including protocols based on the standards of the IEEE 802.11 family), Infrared Data Association (IrDA) and the like. The embedded controller 1602, possibly in conjunction with further special function components 1610, performs the special functions of the camera 202. The components 1610 represent a lens, focus control and image sensor of the camera 202. Some of the methods described below may be implemented using the embedded controller 1602, wherein some of the described processes may be implemented as one or more software application programs 1633 executable within the embedded controller 1602. The camera 202 is an effective and advantageous apparatus for implementing some of the described methods. In particular, the steps of some of the described methods are effected by instructions in the software 1633 that are carried out within the controller 1602. The software instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the described methods, and a second part and the corresponding code modules manage a user interface between the first part and the user.
The software 1633 is generally loaded into the controller 1602 from a computer readable medium, and is then typically stored in the ROM 1660 of the internal storage module 1609, as illustrated in Fig. 16, after which the software 1633 can be executed by the processor 1605. In some instances, the processor 1605 may execute software instructions that are located in RAM 1670. Software instructions may be located in RAM 1670 by the processor 1605 initiating a copy of one or more code modules from ROM 1660 into RAM 1670. Alternatively, the software instructions of one or more code modules may be pre-installed in a non-volatile region of RAM 1670 by a manufacturer. After one or more code modules have been located in RAM 1670, the processor 1605 may execute software instructions of the one or more code modules. As described herein, the application program 1633 is typically pre-installed and stored in the ROM 1660 by a manufacturer, prior to distribution of the camera 202. However, in some instances, the application programs 1633 may be supplied to the user encoded on one or more CD-ROMs (not shown) and read via the portable memory interface 1606 prior to storage in the internal storage module 1609 or in the portable memory 1625. In another alternative, the software application program 1633 may be read by the processor 1605 from the network 1620 or loaded into the controller 1602 or the portable storage medium 1625 from other computer readable media as described above. The second part of the application programs 1633 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 1614.
Through manipulation of the user input device 1613 (e.g., the keypad), a user of the camera 202 and the application programs 1633 may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via loudspeakers (not illustrated) and user voice commands input via the microphone (not illustrated). The camera enabled mobile phone 203 may have a similar configuration to the camera 202, albeit with different functional components. A method 300 of transferring a plurality of high-resolution images, from a set of images captured by an image capture device (i.e., in the form of the camera 202), over the network 120 to the server 101 connected to the network 120, will now be described. The method 300 uses collaborative digital image filtering and image backup performed using the system 200. However, the method 300 is not limited to the system 200 described in Fig. 2 and may be implemented on any suitable hardware platform or a variation thereof. In accordance with the method 300, reduced resolution representations of captured images are uploaded to the central server 101. The reduced resolution representations of the captured images are uploaded to the server 101 instead of the full resolution images for a number of reasons. Firstly, there are significant bandwidth limitations when sending data over a communications network such as the Internet, particularly using a wireless network. The limitations in bandwidth mean that the cost and time required to transmit a large or full resolution image are high. Furthermore, the server 101 would require a higher storage capacity in order to store full resolution images of the entire image collection.
Finally, the user is not likely to be interested in the entire image collection, but rather only in the images which are of high interest to them. Thus, it is inefficient to upload the entire image collection at the full resolution. On the central server 101, the uploaded images are published in a user gallery, where members of a user group may view and rate those images. A user group is defined as the users to whom an owner (i.e., a first user) of the image allows access to the image gallery. The user image ratings are measured and aggregated and, based on those image ratings, the images deemed to be popular or of high interest to the user are separated into a gallery of highly interesting images, where the gallery is associated with the first user. For each image which is deemed to be of high interest to the user, the full resolution image is requested for upload from the host device 202 to the server 101. The method 300 will be described with reference to images captured with the camera 202. However, the method 300 may be implemented to process images captured with any other suitable device, such as the mobile phone 203 seen in Fig. 2 or even a network camera (e.g., a web-cam) connected to a portable computer or the like. The method 300 may be implemented as one or more code modules of the software application program 133 resident on the hard disk drive 110 of the server 101 and being controlled in its execution by the processor 105 of the server 101. The method 300 begins at step 302, where the processor 105 performs the step of receiving a reduced resolution (or thumbnail) representation of each of the set of images from a first computer readable storage medium associated with the image capture device in the form of the camera 202. The first computer readable storage medium is accessible by the image capture device in the form of the camera 202.
Prior to the execution of the method 300, the set of images is captured by the camera 202 and the reduced resolution representation of each of the set of images is uploaded (or transferred) to the server 101. The reduced resolution representation images are transferred via the network 120. The reduced resolution images may be transferred via a low bandwidth network connection. A method 500 of uploading the reduced resolution representation of each of the set of images will be described in detail below. At the next step 303, the processor 105 performs the step of publishing the reduced resolution representations on the server 101. As described in detail below, the images are published for a user in a user gallery associated with the first user. The reduced resolution representations may be published to an account on the server 101 so that access to the reduced resolution representations is limited to selected users in a plurality of users in a user group. The reduced resolution representations may be published on third-party websites, such as social network sites, to allow users to rate the images in a native environment. A method 600 of publishing the reduced resolution representations, as executed at step 303, will be described in detail below with reference to Fig. 6. The method 300 continues at the next step 305, where the processor 105 performs the step of measuring and aggregating the image rating results, based on users within the user group viewing and rating the images. A method 900 of measuring and aggregating the image rating results, as executed at step 305, will be described in detail below with reference to Fig. 9. At the next step 306, the processor 105 performs the step of comparing user input, associated with the published reduced resolution representations, with a predetermined threshold.
Based on the comparison, images which have an aggregate user rating above the predetermined threshold are stored in a separate gallery of images of high interest. A method 1000 of comparing user input, associated with the published reduced resolution representations, with the predetermined threshold, will be described in detail below with reference to Fig. 10. The method 300 continues at the next step 307, where the processor 105 performs the step of selecting one or more high-resolution images from the set of images based on the comparison performed at step 306. Images above the predetermined threshold are selected. The processor 105 requests an original resolution image for each of the images rated above the predetermined threshold (i.e., the selected images) to be uploaded from the camera or host device 202. At the next step 308, the processor 105 performs the step of transferring the selected high resolution images from the first computer readable storage medium associated with the camera 202, over the network 120, to a second computer readable storage medium 106 associated with the server 101. The selected high-resolution images may be transferred via a high bandwidth network connection. A method 1100 of uploading one or more high-resolution images from the set of images will be described in detail below with reference to Fig. 11. In the method 1100, the selected high resolution images are also transferred from the first computer readable storage medium, over the network 120, to the second computer readable storage medium 106 associated with the server 101. The method 500 of uploading the reduced resolution (or thumbnail) representation of each of the set of images will now be described with reference to Fig. 5.
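Before turning to method 500, the overall flow of method 300 (steps 302 to 308) can be sketched as below. The callables and data shapes are assumptions for illustration only; the individual steps are elaborated in methods 500 to 1100.

```python
def method_300(thumbnails, ratings, threshold, fetch_full, store):
    """Sketch of method 300: select and transfer high-interest images.

    thumbnails - ids of the reduced resolution representations (step 302)
    ratings    - aggregated user rating per image id (step 305)
    threshold  - predetermined rating threshold (step 306)
    fetch_full - requests a full resolution image from the host device
    store      - writes the image to the server's storage medium (step 308)
    """
    # Steps 306-307: compare aggregate ratings with the threshold and select.
    selected = [i for i in thumbnails if ratings.get(i, 0) > threshold]
    # Step 308: transfer each selected high-resolution image to the server.
    for image_id in selected:
        store(image_id, fetch_full(image_id))
    return selected
```

The sketch keeps the server-side decision (which images are worth a full-resolution transfer) separate from the transport, which mirrors the split between methods 1000 and 1100 below.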
The method 500 may be implemented as one or more software code modules of the software application program 1633 resident within the camera 202 and being controlled in its execution by the processor 1605 of the camera 202. The method 500 begins at step 502, where, in response to a signal from the user input devices 1613, the processor 1605 captures one or more digital images. Once the digital images have been captured, at the next step 503, the captured images are stored by the processor 1605 locally on the camera 202 (i.e., the image capture device) within the internal storage device 1609. At the next step 504, the processor 1605 then polls whether the camera 202 is in wireless network range. If the camera 202 is not in wireless network range, the processor 1605 keeps polling at regular intervals until the processor 1605 finds a wireless network 204 to which the camera 202 can connect. The wireless network 204 provides, either directly or indirectly, a gateway 205 to the communications network 120 in order to allow the camera 202 to communicate with the central server 101. Once a suitable wireless network has been detected, at the next step 505, the processor 1605 of the camera 202 (i.e., the image capture device) establishes a connection with the central server 101. The method in which the connection between the camera 202 and the server 101 is established may take one of many forms. For example, the connection between the camera 202 and the server 101 may be established using token-based authentication based on Secure Sockets Layer (SSL). Once a user registers the camera 202 on the central server 101, an account is created for the user and an authentication token is issued, associating that particular camera 202 with the user account. The authentication token is stored within the storage device 1609 of the camera 202.
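The token scheme might look like the following sketch. The HMAC construction and the token format are assumptions for illustration (the description only says that a token is issued on registration and stored on the camera), and the SSL transport itself is elided.

```python
import hashlib
import hmac
import secrets

SERVER_KEY = secrets.token_bytes(32)  # held only by the central server

def issue_token(camera_id, account_id):
    """On registration: bind a camera to a user account with a signed token."""
    payload = f"{camera_id}:{account_id}".encode()
    sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return f"{camera_id}:{account_id}:{sig}"

def verify_token(token):
    """On each connection: recover the account if the signature is valid."""
    camera_id, account_id, sig = token.rsplit(":", 2)
    payload = f"{camera_id}:{account_id}".encode()
    expected = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return account_id if hmac.compare_digest(sig, expected) else None
```

A camera presenting a stored token is thus mapped back to the correct user account without re-entering credentials, while a tampered token is rejected.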
Thus, each time the camera 202 attempts to establish a connection with the central server 101, the processor 1605 uses the issued authentication token to identify the camera 202 and connect to the correct user account. The handshaking or connection process may be carried out using SSL in order to prevent eavesdropping of exchanged data by third parties. After the successful connection and authentication of the camera 202 with the central server 101, at the next step 506, a first image to be uploaded to the server 101 is downsampled by the processor 1605. Images are uploaded to the central server 101 in chronological order, that is, in the order that the images were captured. If a downsampled version of the image is already available on the camera 202, then this already downsampled version may be used for the upload without any additional processing needing to be carried out on the camera 202. The image downsampling is carried out in order to produce a digital image which is of a lower (or reduced) resolution than the original image. The particular method employed to downsample the image is implementation specific and may take any suitable form. One example of a suitable digital image downsampling method for use at step 506 is bilinear interpolation. Following successful downsampling of the first image in the collection of images on the camera 202 to be uploaded to the server 101, at the next step 507, the reduced resolution image is then uploaded to the central server 101 by the processor 1605. The upload of the reduced resolution image is carried out using the connection established with the server 101 previously. The upload of the image involves transferring the data across the communication network 120 to the central server 101.
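Bilinear downsampling of the kind suggested for step 506 can be sketched in pure Python for a grayscale image stored as a list of rows. This is a minimal sketch; a real camera implementation would operate on full-colour sensor data and would typically pre-filter before resampling.

```python
def downsample_bilinear(img, new_w, new_h):
    """Resize a grayscale image (list of rows) with bilinear interpolation."""
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        sy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0
        y0 = int(sy); y1 = min(y0 + 1, old_h - 1); fy = sy - y0
        row = []
        for x in range(new_w):
            sx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0
            x0 = int(sx); x1 = min(x0 + 1, old_w - 1); fx = sx - x0
            # Blend the four surrounding source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```

Shrinking a 3x3 gradient to 2x2 keeps the corner samples, and resizing to the original dimensions returns the original values.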
At step 508, if the processor 1605 determines that there is an error in the upload of the reduced resolution image, then the method 500 returns to step 507, where the processor 1605 reattempts to upload the image. Once the reduced resolution image has been successfully uploaded, at the next step 509, the image is marked within the internal storage module 1609 as having been uploaded. The marking may be done by setting a field in metadata of the image or by keeping a separate list within the storage module 1609 of images which have been uploaded. The reason for marking the image as having been uploaded is to prevent the image from being uploaded unnecessarily a second time. At step 510, if the processor 1605 determines that there are further images in the image collection which have not been uploaded to the server 101, then the method 500 proceeds to step 506 for those images in order to upload the images to the central server 101. Once all images from the image collection have been uploaded to the server 101 in a reduced resolution format, at the next step 511, the processor 1605 signals to the server 101 that there are no further images to be uploaded and the method 500 reverts to step 502, where the processor 1605 waits for further image capturing to take place. The method 600 of publishing the reduced resolution representations, as executed at step 303, will now be described with reference to Fig. 6. The method 600 may be implemented as one or more software code modules of the software application program 133 resident within the server 101 and being controlled in its execution by the processor 105 of the server 101. In accordance with the method 600, the central server 101 publishes the uploaded reduced resolution images in the user gallery. The method 600 begins at step 601, where the processor 105 compiles the uploaded reduced resolution images into a collection or gallery.
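Returning to method 500, the camera-side loop of steps 506 to 511 (upload each pending image, reattempt on error at step 508, mark as uploaded at step 509) can be sketched as follows. The send callable and the retry limit are assumptions; the description only requires that failed uploads are retried and successes are marked.

```python
def upload_pending(images, uploaded, send, max_retries=3):
    """Upload each not-yet-uploaded image, marking successes.

    images   - image ids in chronological (capture) order
    uploaded - set of ids already marked as uploaded (step 509)
    send     - callable returning True on success, False on a transfer error
    """
    for image_id in images:
        if image_id in uploaded:       # already marked: skip (avoids re-upload)
            continue
        for _ in range(max_retries):   # step 508: reattempt on error
            if send(image_id):
                uploaded.add(image_id)  # step 509: mark as uploaded
                break
    return uploaded
```

Marking via a set mirrors the "separate list within the storage module 1609" option mentioned above; the metadata-field option would work equally well.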
Then, at the next step 602, the processor 105 sequences the images in chronological order, in order to maintain the sequence in which the images were taken. However, the images may not necessarily be ordered in chronological order. For example, in other implementations, the reduced resolution images may be sequenced or grouped based on geotag information (or metadata) associated with each image, facial recognition or other appropriate criteria, in order to group the images according to alternate criteria. The chronological ordering of images may be carried out according to any suitable method. In one implementation, the images may be ordered, for example, in accordance with metadata 701 associated with each reduced resolution image as seen in Fig. 7. The types of metadata fields that may be used to order the images in chronological order include, for example, the filename 703 and timestamp 704 metadata fields. Once the images have been sorted in chronological order, the images are published in the image gallery associated with the first user that captured and uploaded the images. The images are published in the gallery associated with the first user in order to make the images available for viewing to one or more other users in the user group. As an example, Fig. 8 shows a predetermined number of images (e.g., 802) on an HTML page 801 accessible using an Internet browser software application. Depending on the number of images in the collection, the images may be split up over a number of pages 804 (i.e., Page 1 ... n). Navigational tools, such as buttons 805, may be provided to the user in order to allow for navigation between the pages in a library. In the example of Fig. 8, the images 802 are ordered in their chronological order 803, corresponding to the sorting at step 602 based on the metadata 701.
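The step 602 ordering can be sketched with the two metadata fields named above, sorting on the timestamp 704 and breaking ties with the filename 703. Representing the metadata 701 as a dictionary per image is an assumption for illustration.

```python
def sequence_chronologically(images):
    """Step 602: order image metadata records by timestamp, then filename."""
    return sorted(images, key=lambda m: (m.get("timestamp", ""), m["filename"]))
```

ISO 8601 timestamp strings sort correctly as plain text, which is why no date parsing is needed in this sketch.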
The method 600 concludes at the next step 604, where the processor 105 notifies users in the user group that new images have been published in the image gallery associated with the first user. The notification may take one of many forms, such as an email notification or a notification on the central server website. The method 900 of measuring and aggregating the image rating results, as executed at step 305, will now be described with reference to Fig. 9. The method 900 may be implemented as one or more software code modules of the software application program 133 resident within the server 101 and being controlled in its execution by the processor 105 of the server 101. In accordance with the method 900, a second one of the users within the user group views and rates images, and the processor 105 measures and aggregates the image rating results. The method 900 begins upon a user 901 from the user group commencing an image review session. The processor 105 of the server 101 may facilitate the image review session through either an authenticated user session or a guest user session. At the first step 902, if the processor 105 determines that the second user is not registered, then the method 900 proceeds to step 905, where the processor 105 creates a new session in which the second user may still participate in the image review process through a temporary session. The new temporary session is a session in which the activity of the second user is captured by the processor 105, but does not impact any future user sessions. If, however, the second user is a registered user, then the method 900 proceeds to step 903. At step 903, the processor 105 executes an authentication process in order to identify the user. The authentication process performed at step 903 may be carried out in any suitable manner.
In one implementation, the processor 105 associates a username and password with each user, which may then be used in order to securely identify the second user. The processor 105 may perform the step of authenticating a user based on a token or a password. Alternatively, authentication may be device-based. For example, the processor 105 may be configured to authenticate a unique number, such as a camera serial number used to identify the camera 202 on the server 101, after the camera 202 has initially been registered to a user account. In this instance, no authentication may be required for the user in the future. Once the user has been identified, at the next step 904, the processor 105 checks whether session data from previous sessions for the second user is available. If no session data from previous user sessions is available, then the method 900 proceeds to step 905, where the processor 105 creates a new session for the second user. If, however, previous session information is available for the identified second user, then at the next step 906, the processor 105 retrieves the previous session information and the information is used in the current session. The previous session information may contain a variety of types of data, such as previously reviewed images and/or the ratings given to images. The previous session information is stored in the memory 106 and/or the hard disk drive 110. The next step 908 in the method 900 involves the processor 105 detecting the second user interacting with digital images presented in the user gallery. At step 908, these user interactions are recorded by the processor 105 within the memory 106 and/or hard disk drive 110 in order to assess the popularity of the images reviewed by the second user. The types of user interaction that the processor 105 monitors in order to gauge user preference of images may vary with implementation.
In one implementation, one or more of the following metrics are used for monitoring user preferences for images:

* Number of image views (click to enlarge image)
* Number of times indicated as a favourite image
* Rating (up/down rating, scale-based rating, etc.)
* Number of times commented
* Uses in image content (slideshows, merchandise, etc.)
* Number of downloads

The method 900 continues at the next step 909, where the processor 105 weights the user interaction data stored in the memory 106 and/or hard disk drive 110, if necessary. The weighting of interaction data may be necessary in order to compensate for the different reviewing habits of different users. There are numerous methods by which user interaction data can be weighted. For example, users who consistently rate images highly may have a lower weighting applied to their image ratings in order to normalise the overall results for the image collection. In another case, if the same person views a particular image multiple times, a lower weighting may be applied to this interaction, whereas if a number of different people view the image, a higher weighting may be applied. Some of the possible factors which may be used in order to weight the images include:

* Number of image views
* Reviewer rating
* The overall activity of the reviewer
* Number of ratings
* Characteristics of the viewer (e.g., identity, location, etc.)

In the final step 910 of the method 900, the processor 105 aggregates the user data for the image collection under review. In particular, at step 910, the processor 105 performs the step of combining the weighted interaction data from all reviewing users in order to determine the images which are deemed popular. Also at step 910, the results of the aggregation for each image in the collection are stored by the processor 105 in a permanent data store 911, such as a database, within the hard disk drive 110.
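Steps 909 and 910 might be sketched as below. The metric names, the weighting factors and the per-reviewer normalisation are illustrative assumptions, since the description explicitly leaves the weighting scheme implementation specific.

```python
def aggregate_rating(interactions, weights):
    """Step 910: combine per-metric interaction counts into one score.

    interactions - metric name -> raw count for a single image
    weights      - metric name -> weighting factor (implementation specific)
    """
    return sum(weights.get(metric, 0) * count
               for metric, count in interactions.items())

def normalise_rater(rating, rater_mean, population_mean=3.0):
    """Step 909: compensate for a consistently generous or harsh reviewer
    by shifting their rating towards the population mean."""
    return rating - rater_mean + population_mean
```

A reviewer whose personal average is 4.5 out of 5 thus has a rating of 5 discounted, while the weighted sum lets views count far less than explicit favourites or downloads.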
The aggregation of the usage data may be carried out each time new usage data for a particular image becomes available, that is, immediately after a user has interacted with the image. The method 1000 of comparing user input, associated with the published reduced resolution representations, with the predetermined threshold, will now be described with reference to Fig. 10. The method 1000 filters images which have an aggregate user rating above the predetermined threshold into a separate gallery of images of high interest to the first user. The method 1000 may be implemented as one or more software code modules of the software application program 133 resident within the server 101 and being controlled in its execution by the processor 105 of the server 101. In the first step 1001 of the method 1000, the processor 105 performs the step of retrieving the image rating for each image in the collection under review from the data store 911 within the hard disk drive 110. At the next step 1003, the processor 105 performs the step of comparing the rating for each image to the predetermined threshold 1002. The threshold 1002 is a value which determines whether the aggregate rating of a particular image satisfies the requirements to be classified as an image of high interest to the user. At the next step 1004, if the processor 105 determines that the aggregate image rating of a particular image is above the predetermined threshold, then the processor 105 classifies the particular image as being of high interest to the user and the method 1000 proceeds to step 1005. At step 1005, the image is filtered by the processor 105 into a separate gallery or a subset of images 1006 of the current image collection, representing images of high interest to the user. The subset of images 1006 may be stored within the hard disk drive 110.
The method 1100 of uploading one or more high-resolution images from the set of images will now be described with reference to Fig. 11. In the method 1100, an original resolution image for each image rated above the predetermined threshold at step 307 is requested and the respective images are uploaded from the camera 202 (or host device or host devices). Steps 1101, 1102, 1103, 1104, 1108 and 1110 are implemented as one or more software code modules of the software application program 133 resident within the server 101 and being controlled in its execution by the processor 105 of the server 101. Steps 1106, 1107, 1111 and 1112 are implemented as one or more software code modules of the software application program 1633 resident within the camera 202 and being controlled in its execution by the processor 1605 of the camera 202. The method 1100 begins at a first step 1101, where the processor 105 performs the step of determining the host device for each image which has received a user rating above the predetermined threshold. A host device in this instance refers to the camera 202 from which the image was originally uploaded. In alternative embodiments, however, the host device may refer to any networked device which stores digital images, such as a general purpose computer module connected to the network 120. At the next step 1102, the images are compiled (or sorted) based on the host device from which the images originated. The compiled images of step 1102 are stored in a non-volatile data store 1103, such as a database, configured within the hard disk drive 110. Then, at step 1104, the processor 105 generates and transmits a request 1105 for a full resolution image upload from the camera 202 (or host device). The request 1105 includes a list of the images to be uploaded from the camera 202 (or host device).
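Steps 1101 to 1104 amount to grouping the high-interest images by originating host device and emitting one request 1105 per device. The request structure shown here is an assumption for illustration.

```python
def compile_requests(high_interest, host_of):
    """Steps 1101-1104: sort selected images by host device and build one
    full-resolution upload request (image id list) per device."""
    by_host = {}
    for image_id in high_interest:
        by_host.setdefault(host_of[image_id], []).append(image_id)
    # One request message per host device, each listing the images to upload.
    return [{"host": host, "images": ids} for host, ids in by_host.items()]
```

Batching per device matters because each host may only be reachable intermittently (e.g., the next time the camera 202 comes into wireless range), so a single request can cover everything that device owes the server.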
At step 1106, the processor 1605 of the camera 202 receives the request 1105, upon which the upload of the requested images is initiated by the camera 202 (or host device). Accordingly, at the next step 1107, the processor 1605 performs the step of transferring (or uploading) high resolution versions 1109 of the images listed in the request 1105 from the first computer readable storage medium in the form of the internal storage module 1609 to the second computer readable storage medium in the form of the hard disk drive 110 associated with the server 101. The high resolution versions of the images 1109 are transferred over the network 120. The connection between the server 101 and the camera 202 (or host device) may be established the next time the camera 202 comes into wireless range of a gateway after the server 101 request is issued. The images 1109 are uploaded individually or in batch to the central server 101. At step 1108, the processor 105 of the server 101 performs the step of receiving the images 1109 and storing the images within the hard disk drive 110. If the processor 1605 of the camera 202 determines, at step 1111, that the image upload is successful, then the method 1100 proceeds to step 1112. At step 1112, the uploaded images are marked within the internal storage module 1609 as having been uploaded. If the upload has failed, then the method 1100 returns to step 1107 and the processor 1605 reattempts the upload. In the final step 1110, the central server 101 replaces the reduced resolution images in the subset 1006 stored within the hard disk drive 110 with the high (or full) resolution uploaded images 1109. Fig. 14 shows an alternative system 1400 on which one or more of the described methods may be implemented. The system 1400 is an alternative to the network-based storage and collaborative image rating system 200.
In the system 200, the image capture devices 201, in the form of the camera 202 or mobile phone 203, connect directly to a gateway 205 using a wireless network 204. Such a configuration is not possible in all circumstances, either because the image capture device 201 is not enabled for wireless communication or a wireless network is not available in a particular location. The system 1400 eliminates the need for a wireless network 204. In the system 1400, the camera 202 or mobile phone 203 (or image capture device) connects through a direct link 1403 to a network enabled device 1404, typically a personal computer. The direct link connection 1403 refers to a point to point wired connection, such as a Universal Serial Bus (USB) or a FireWire connection. The captured images are then transmitted from the camera 202 or mobile phone 203 (or image capturing device) to the network enabled device 1404. A method 1500 of transferring a plurality of high-resolution images, from a set of images captured by an image capture device (e.g., in the form of the camera 202), to the network enabled device 1404 will now be described. The method 1500 will be described with reference to images captured with the camera 202. However, the method 1500 may be implemented to process images captured with any other suitable device, such as the mobile phone 203 seen in Fig. 14 or even a network camera (e.g., a web-cam) connected to a portable computer or the like. In order to transmit the image collection contained on the camera 202 (or image capture device) to the network enabled device 1404, at the first step 1502, the processor 1605 establishes a connection. Step 1502 may be performed as a handshaking process which allows for data transfer to take place and depends on the type of technology used; for example, USB and FireWire employ different protocols.
After the connection has been established, at the next step 1503, the processor 1605 of the camera 202 (or image capture device) performs the step of transferring (or transmitting) the high resolution images in the collection across the link 1403 to the network enabled device 1404. Since point to point wired links are typically high speed links, the entire image collection may be transmitted to the network enabled device 1404 from the camera 202 (or image capture device). The images may be stored locally on the network enabled device 1404 in a local data store such as a hard drive 1505. The collection of captured full resolution images is then available on the network enabled device 1404. The images may be transferred to the server 101 and manipulated as described above. In particular, the device 1404, in the form of a personal computer, may perform the step of down-sampling the images. The network enabled device 1404 becomes the image storage device and handles all further requests. When displaying a large unsorted collection of images to a user, such as the second user, for review, the second user will have a distinct bias towards images presented initially in the image collection. Furthermore, if the image collection is large, the user is unlikely to maintain interest in the image collection and review the entire set, especially if the set is of low overall quality. All images in the image collection, however, should preferably be reviewed by at least one user. A method 1200 of reviewing a large unsorted collection of images will now be described with reference to Figs. 4 and 12. The method 1200 may be implemented as one or more software code modules of the software application program 133 resident within the server 101 and being controlled in its execution by the processor 105 of the server 101.
In the method 1200, an automatic filter 402 and view counter 403 are applied to an unsorted image collection 400, as seen in Fig. 4, in order to construct 404 a more ideal image subset 405 for display to a user 407. The subset 405 of the large image collection 401 is displayed to the user, because for large image sets the entire image collection cannot usually be displayed at once using the HTML page 801. The method 1200 increases the likelihood of each image being viewed using the view counter 403. The method 1200 also increases the likelihood that the interest of the reviewer will be maintained for longer by using automatic filtering 402 and feedback-based selection 406.

The method 1200 describes the typical workflow for a user 1201 reviewing a large unsorted image collection 401. At the first step 1202, the processor 105 receives a request, upon input by the user 1201, to view images for the purposes of reviewing. The processor 105 then verifies, at step 1203, whether the image collection 401 is available for review; in other words, whether the image collection 401 has been uploaded to the server 101. If there are no images in the image collection 401 available for review, at the next step 1204, the processor 105 notifies the user 1201 that there are no images available for display. The notification may be generated at step 1204 using a pop-up box or the like, on the page 801. If images are available for display, the method 1200 proceeds to step 1205.

At step 1205, if the processor 105 determines that the image collection 401 contains a higher number of images than the maximum that can be displayed on the page 801 at a single time, then the method 1200 proceeds to step 1206. Otherwise, if the image collection 401 is not greater than what can be displayed on the page 801, the images are immediately displayed within the page 801 at the next step 1208.
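The decision flow of method 1200 can be sketched as follows. The page size, the `quality` and `views` fields, and the helper names are illustrative assumptions chosen for the sketch, not details taken from this specification.

```python
# Hypothetical sketch of the method-1200 decision flow. Images are
# modelled as dicts with assumed 'quality' and 'views' fields.

MAX_DISPLAY = 6  # assumed maximum the page 801 can show at once


def filter_low_quality(images, floor=0.5):
    # Step 1206 (simplified): drop out-of-focus / badly exposed images.
    return [im for im in images if im["quality"] >= floor]


def prioritise_by_views(images):
    # Step 1211 (simplified): least-viewed images get display priority.
    return sorted(images, key=lambda im: im["views"])[:MAX_DISPLAY]


def review_workflow(collection):
    if not collection:                         # steps 1203-1204
        return "no images available"
    if len(collection) <= MAX_DISPLAY:         # step 1205
        return collection                      # step 1208: display directly
    filtered = filter_low_quality(collection)  # step 1206
    if len(filtered) <= MAX_DISPLAY:           # step 1207
        return filtered                        # step 1208
    return prioritise_by_views(filtered)       # step 1211


collection = [{"id": i, "quality": 0.3 if i % 4 == 0 else 0.9, "views": i % 3}
              for i in range(16)]
print(len(review_workflow(collection)))  # 6
```

In the sample run, sixteen images are reduced to twelve by quality filtering, which is still more than the page can show, so view-count prioritisation selects the final six.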
At the next step 1206, the processor 105 applies automatic filtering to the image collection 401 to produce a filtered image collection, using a number of actions to reduce the size of the image collection 401 including, for example:
- elimination of highly similar images;
- elimination of low quality images (i.e., out of focus, overexposed or underexposed images, etc.);
- event classification; and
- context analysis (i.e., grouping profile, landscape, etc. images).

Accordingly, at step 1206 filtering of the reduced resolution representations of the images in the image collection 401 may be performed by eliminating one or more images of low quality. Once the automatic filtering process has been completed, at the next step 1207, if the processor 105 determines that the filtered image collection is still greater than what can be displayed on the page 801, then the method 1200 proceeds to step 1211. Otherwise, the method 1200 proceeds to step 1208.

At step 1208, since after the automatic filtering process at step 1206 the filtered image collection can be displayed on the page 801, the filtered image collection is displayed to the user 1201. However, at step 1211, since the filtered image collection is still too large, view counter weighting is applied to each image of the filtered collection by the processor 105. In particular, each image of the filtered image collection has an associated view counter 1305, as seen for example in Fig. 13, in order to track the number of times that the particular image has been displayed to a user. Accordingly, the processor 105 may perform the step of updating a counter associated with each reduced resolution representation when the respective reduced representation is displayed to a user. The processor 105 may also perform the step of prioritising display of the reduced representations associated with low counts.
Images with a higher view count get a lower priority for display to a user in order to maximise the probability of each image being viewed and rated by a user. Once the automatic filtering at step 1206 and the view counter weighting at step 1211 have been applied, at step 1209, the processor 105 performs the step of selecting the images for display, giving preference to images based on previous user feedback. User preferences from previous rating sessions, if available in the hard disk drive 110, are taken into account in order to prioritise images which may be of higher interest to the user and thus keep the user engaged for longer. Finally, at step 1210, the processor 105 updates the view counters for each image in the subset of images being displayed in the page 801.

The method 1200 will be further described by way of example with reference to Fig. 13. In the example of Fig. 13, an original image collection 1301 to be displayed contains a total of sixteen (16) images (e.g., image 1303). However, a corresponding HTML page (or user interface) 1302 is only able to display six (6) images at a time. As a result, a subset of six (6) images is selected from the image collection 1301 for display in the page 1302. Within this collection, the images referenced as Photo 2, 4, 5, 12 and 16 represent images of poor quality. For example, the images referenced as Photo 2, 4, 5, 12 and 16 may be out of focus or overexposed images. Thus, during the automatic filtering performed by the processor 105 at step 1206, the images referenced as Photo 2, 4, 5, 12 and 16 are filtered out to obtain image collection 1304 as seen in Fig. 13. Since the image collection 1304 contains eleven (11) images, and the page 1302 can only display six (6) images, the image collection 1304 is further narrowed, as at step 1211 of the method 1200. The processor 105 applies the view counter-based weighting to the images of image collection 1304.
In particular, each image (e.g., 1303) in the image collection 1304 is prioritised based on the number of times each image has already been displayed. As seen in Fig. 13, the images of the image collection 1304 are sorted based on their respective view counters 1305 to produce image collection 1306. Finally, six (6) images are selected for display in the user interface 1302 from the image collection 1306, which contains eleven (11) images. As seen in Fig. 13, the first five (5) images of the image collection 1306 (i.e., Photo 13, Photo 14, Photo 15, Photo 6 and Photo 7) have a view count 1305 of two (2) or less and thus will be selected for display in the user interface 1302. However, Photo 8, Photo 9, Photo 10 and Photo 11 all have a view counter of three (3). In order to select the final image from the remaining images of the image collection 1306, previous user feedback 1307, as at step 1209, is used in order to determine which images to display. In the present example, the user feedback 1307 is based on the reviewing user's preference for faces in the images. In particular, the user rated highly images containing person A 40% of the time and person D only 5% of the time. Thus, the processor 105 may determine that the user has the highest preference for images containing person A. As a result, Photo 11 is selected for inclusion in the final display set, as Photo 11 is the only image which has a view counter 1305 of three (3) and contains person A. The final set of images is then displayed to the user in the user interface 1302. Accordingly, the processor 105 may perform the steps of determining an interaction history associated with a particular user and prioritising the display of reduced resolution representations of images to the particular user in accordance with the determined history.
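The combined selection of steps 1211 and 1209 can be sketched against the Fig. 13 example. The dict-based photo model and the individual view counts assigned to the first five photos are assumptions (chosen to be consistent with "two (2) or less"); the face-preference percentages follow the feedback 1307 described above.

```python
PAGE_SIZE = 6
# Share of past high ratings per recognised face, per the Fig. 13 feedback 1307.
PREFERENCE = {"A": 0.40, "D": 0.05}


def feedback_score(photo):
    # Step 1209: score a photo by the user's strongest face preference in it.
    return max((PREFERENCE.get(f, 0.0) for f in photo["faces"]), default=0.0)


def choose_display_set(photos):
    # Step 1211: ascending view count; ties broken by step-1209 feedback.
    ranked = sorted(photos, key=lambda p: (p["views"], -feedback_score(p)))
    return [p["name"] for p in ranked[:PAGE_SIZE]]


# Collection 1306 from Fig. 13 (view counts for the first five are assumed).
photos = [
    {"name": "Photo 13", "views": 1, "faces": []},
    {"name": "Photo 14", "views": 1, "faces": []},
    {"name": "Photo 15", "views": 2, "faces": []},
    {"name": "Photo 6",  "views": 2, "faces": []},
    {"name": "Photo 7",  "views": 2, "faces": []},
    {"name": "Photo 8",  "views": 3, "faces": ["D"]},
    {"name": "Photo 9",  "views": 3, "faces": []},
    {"name": "Photo 10", "views": 3, "faces": []},
    {"name": "Photo 11", "views": 3, "faces": ["A"]},
]
print(choose_display_set(photos))
# ['Photo 13', 'Photo 14', 'Photo 15', 'Photo 6', 'Photo 7', 'Photo 11']
```

Among the four photos tied at three views, Photo 11 wins the last display slot because it contains person A, matching the outcome described for Fig. 13.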
As described above, at step 307, the processor 105 performs the step of selecting one or more high-resolution images from the set of images based on the comparison performed at step 306. In one implementation, a locking mechanism may be associated with certain images, restricting which of the images on the image capture device (or host device), such as the camera 202, may be uploaded. For example, the processor 1605 may perform the step of locking one or more of the images on the internal storage module 1609 (or first computer readable storage medium) until the selected high resolution images are transferred from the internal storage module 1609 to the second computer readable storage medium 106 associated with the server 101. Further, such a locking mechanism may restrict one or more images from being deleted from the image capture device (or host device) until the one or more images have been released by the server 101.

Industrial Applicability

The arrangements described are applicable to the computer and data processing industries and particularly for backing-up stored images.

The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive. In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (20)

1. A method of transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the method comprising the steps of: receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; publishing the reduced resolution representations on the server; comparing user input associated with the published reduced resolution representations with a predetermined threshold; selecting one or more high-resolution images from the set of images based on the comparison; and transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server.
2. The method according to claim 1, further comprising the step of filtering the reduced resolution representations by eliminating images of low quality.
3. The method according to claim 1, further comprising the steps of: updating a counter associated with each reduced resolution representation when the respective reduced representation is displayed to a user; and prioritising display of the reduced representations associated with low counts.
4. The method according to claim 1, further comprising the steps of: determining an interaction history associated with a particular user; and prioritising the display of reduced resolution representations to said particular user in accordance with the determined history.
5. The method according to claim 1, wherein the first computer readable storage medium is accessible by the image capture device.
6. The method according to claim 1, wherein the reduced resolution images are transferred via a low bandwidth network connection.
7. The method according to claim 1, wherein the selected high-resolution images are transferred across a high bandwidth network connection.
8. The method according to claim 1, wherein the reduced resolution representations are published to an account on the server so that access to said reduced resolution representations is limited to selected users in a plurality of users.
9. The method according to claim 1, wherein the reduced resolution representations are published on third party websites.
10. The method according to claim 1, further comprising the step of authenticating a user based on a token.
11. The method according to claim 1, further comprising the step of authenticating a user based on a password.
12. The method according to claim 1, further comprising the step of authenticating a user based on device-based authentication.
13. The method according to claim 1, wherein the reduced resolution images are grouped based on geotag metadata associated with each image.
14. The method according to claim 1, wherein the high-resolution images are selected based on facial recognition.
15. The method according to claim 1, further comprising the step of down-sampling the images on a personal computer.
16. The method according to claim 1, further comprising the step of locking one or more of the images on the first computer readable storage medium until the selected high resolution images are transferred from the first computer readable storage medium to the second computer readable storage medium.
17. A computer system for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to the computer system connected to the network, the system comprising: a memory for storing data and a computer program; a processor coupled to said memory for executing the computer program, the computer program comprising instructions for: receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; publishing the reduced resolution representations on the server; comparing user input associated with the published reduced resolution representations with a predetermined threshold; selecting one or more high-resolution images from the set of images based on the comparison; and transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server.
18. An apparatus for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the apparatus comprising: means for receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; means for publishing the reduced resolution representations on the server; means for comparing user input associated with the published reduced resolution representations with a predetermined threshold; means for selecting one or more high-resolution images from the set of images based on the comparison; and means for transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server.
19. A computer readable medium including a computer program recorded thereon for transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the computer program comprising: code for receiving a reduced resolution representation of each of the set of images from a first computer readable storage medium associated with the image capture device; code for publishing the reduced resolution representations on the server; code for comparing user input associated with the published reduced resolution representations with a predetermined threshold; code for selecting one or more high-resolution images from the set of images based on the comparison; and code for transferring the selected high resolution images from the first computer readable storage medium, over the network, to a second computer readable storage medium associated with the server.
20. A method of transferring one or more high-resolution images, from a set of images captured by an image capture device, over a network to a server connected to the network, the method being substantially as herein before described with reference to any one of the embodiments as that embodiment is shown in the accompanying drawings.

DATED this 3rd Day of December 2009
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant
SPRUSON & FERGUSON
AU2009243525A 2009-12-04 2009-12-04 Network-based collaborative image filtering and backup Abandoned AU2009243525A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2009243525A AU2009243525A1 (en) 2009-12-04 2009-12-04 Network-based collaborative image filtering and backup

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2009243525A AU2009243525A1 (en) 2009-12-04 2009-12-04 Network-based collaborative image filtering and backup

Publications (1)

Publication Number Publication Date
AU2009243525A1 true AU2009243525A1 (en) 2011-06-23

Family

ID=45398450

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2009243525A Abandoned AU2009243525A1 (en) 2009-12-04 2009-12-04 Network-based collaborative image filtering and backup

Country Status (1)

Country Link
AU (1) AU2009243525A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104025071A (en) * 2011-12-27 2014-09-03 高通股份有限公司 Crowd Determined File Size Uploading Methods, Devices And Systems
EP2798511A4 (en) * 2011-12-27 2015-08-26 Qualcomm Inc Crowd determined file size uploading methods, devices and systems
US20170228292A1 (en) * 2016-02-10 2017-08-10 International Business Machines Corporation Privacy Protection of Media Files For Automatic Cloud Backup Systems
US10133639B2 (en) 2016-02-10 2018-11-20 International Business Machines Corporation Privacy protection of media files for automatic cloud backup systems

Similar Documents

Publication Publication Date Title
AU2015336948B2 (en) Camera application
US8473454B2 (en) System and method of on-demand document processing
US9569658B2 (en) Image sharing with facial recognition models
US8521857B2 (en) Systems and methods for widget rendering and sharing on a personal electronic device
US10453181B2 (en) Systems and methods for transforming an image
US9467518B2 (en) System, a method and a computer program product for automated remote control
US20080147684A1 (en) Enhancing User Experiences Using Aggregated Device Usage Data
US20140006977A1 (en) Integrated social network internet operating system and management interface
JP2016520887A (en) Content, service aggregation, management and presentation system
US20120158846A1 (en) Digital content management
KR102108849B1 (en) Systems and methods for multiple photo feed stories
US20110148857A1 (en) Finding and sharing of digital images based on shared face models
US20090213228A1 (en) Method For Specifying Image Handling For Images On A Portable Device
US20130104205A1 (en) Account creating and authenticating method
US10430125B1 (en) System, network architecture and method for accessing and controlling an electronic device
US9516003B2 (en) Unified cloud computing network interface
US10891576B2 (en) System and method for recommending a transaction to replace a device based upon total cost of ownership
AU2009243525A1 (en) Network-based collaborative image filtering and backup
US20120096369A1 (en) Automatically displaying photos uploaded remotely to a digital picture frame
US10762058B2 (en) System and method for providing user-centric content to an electronic device
CN104756087B (en) Information terminal device and storage service application method
US20140351330A1 (en) Service profile maintenance
TWI459300B (en) An application execution method
JP7410373B2 (en) Program, information processing method, information processing device
US20160034174A1 (en) System and method for single-touch engagement with social media and other sites

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application